I'm self-hosting a LiteLLM instance to proxy all my AI traffic. I managed to add it as a custom provider and fetch the model list by prefixing the API key with `Bearer ` (with a trailing space), because the LiteLLM logs showed the Authorization header was missing that prefix.
Now that I've added it as a provider and selected a model to chat with, I run into this error:
In the LiteLLM logs I don't see any sign of an API call, so it seems to fail somewhere before the request reaches the endpoint. I've also tried appending /v1 to the base URL, to no effect.
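For anyone debugging the same thing, one way to isolate the problem is to call the LiteLLM proxy's OpenAI-compatible chat endpoint directly with curl, bypassing the app entirely. This is only a sketch: the base URL, key, and model name below are placeholders, not values from the original report.

```shell
#!/usr/bin/env sh
# Placeholder values -- substitute your own LiteLLM base URL, key, and model.
BASE_URL="http://localhost:4000/v1"   # assumption: LiteLLM's default port
API_KEY="sk-1234"                     # assumption: a LiteLLM virtual key
MODEL="gpt-4o"                        # assumption: a model name from your LiteLLM config

# Call the OpenAI-compatible chat endpoint directly, bypassing the client app.
# Note the explicit "Bearer " prefix in the Authorization header.
curl -sS "$BASE_URL/chat/completions" \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d "{\"model\": \"$MODEL\", \"messages\": [{\"role\": \"user\", \"content\": \"ping\"}]}" \
  || echo "request failed -- is the LiteLLM proxy reachable at $BASE_URL?"
```

If this request shows up in the LiteLLM logs and returns a completion, the proxy side is fine and the failure is in how the client app builds the request (e.g. URL path or auth header).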
In Review
BoltAI Mobile
30 days ago
boltai