LiteLLM custom provider throwing an error

I am self-hosting a LiteLLM instance to proxy all my AI stuff. I managed to add it as a custom provider and fetch the models by prefixing the API key with ‘Bearer ‘, because the LiteLLM logs showed the auth header was missing that prefix.
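For context, a minimal sketch of what the proxy expects: LiteLLM (like any OpenAI-compatible server) wants the key sent in a standard `Authorization: Bearer <key>` header. The base URL and key below are placeholders, not values from this post.

```python
import urllib.request

base_url = "http://localhost:4000/v1"  # assumption: LiteLLM proxy's default local address
api_key = "sk-..."                     # placeholder for your LiteLLM virtual key

# If the client sends only the raw key, LiteLLM logs an auth error about
# the missing "Bearer " scheme -- the symptom described above.
req = urllib.request.Request(
    f"{base_url}/models",
    headers={"Authorization": f"Bearer {api_key}"},
)
print(req.get_header("Authorization"))
```

So a workaround of pasting `Bearer ` into the key field effectively reproduces this header, but a well-behaved client should add the scheme itself.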

Now that I’ve added it as a provider and selected a model to chat with, I run into this error:

In the LiteLLM logs I don’t see any sign of an API call, so it seems to fail somewhere before it even hits the endpoint. I’ve tried adding /v1 to the end of the base URL, to no effect.


Status: In Review
Board: 📱 BoltAI Mobile
Date: 30 days ago
Author: boltai
