What happened?
Testing the chat with this existing model gives the 400 error below. This was working in version main-v1.59.8. My proxy config is this:
litellm_params:
  model: fireworks_ai/accounts/fireworks/models/deepseek-r1
  api_key: secret
Did anything change that might cause this? I don't see deepseek as a supported model for Fireworks; I believe it was there before. Can I just make this an OpenAI-compatible model?
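For example, would a config along these lines work, pointing LiteLLM's generic OpenAI-compatible provider (the openai/ prefix) at Fireworks' endpoint? The api_base and the model_list wrapper here are my assumptions:

model_list:
  - model_name: deepseek-r1
    litellm_params:
      # openai/ prefix routes through the generic OpenAI-compatible provider
      model: openai/accounts/fireworks/models/deepseek-r1
      # assumed: Fireworks' OpenAI-compatible base URL
      api_base: https://api.fireworks.ai/inference/v1
      api_key: secret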
Relevant log output

litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=deepseek-r1
Pass model as E.g. For 'Huggingface' inference endpoints pass in `completion(model='huggingface/starcoder',..)` Learn more: https://docs.litellm.ai/docs/providers
Are you an ML Ops Team?
No
What LiteLLM version are you on?
v1.61.11
Twitter / LinkedIn details
No response