Issues: BerriAI/litellm
[Feature]: aiohttp migration - 10-100x Higher RPS Master ti... (#7544, opened Jan 4, 2025 by ishaan-jaff)
[Bug]: Embeddings request fails on /v1/embeddings (#8744, opened Feb 23, 2025 by deepanshululla; labels: bug)
[Bug]: LiteLLM Proxy error: LiteLLM.LoggingError: [Non-Blocking] Exception occurred while failure logging cannot pickle 'FrameLocalsProxy' object (#8727, opened Feb 21, 2025 by hnykda; labels: bug)
[Feature]: Programmatic Management of Virtual Keys and Teams via Python SDK (#8721, opened Feb 21, 2025 by jim-halpert-ai; labels: enhancement)
[Bug]: wrong model input length values for amazon Nova (#8714, opened Feb 21, 2025 by NicolasGDM; labels: bug)
[Bug]: Bedrock code incompatible with both structured outputs and tool calling (#8713, opened Feb 21, 2025 by andrzej-pomirski-yohana; labels: bedrock, bug, feb 2025)
[Feature]: Support default custom_llm_provider field in files settings + enable team/key based access to specific files endpoint (#8712, opened Feb 21, 2025 by krrishdholakia; labels: enhancement, feb 2025)
[Bug]: GroqException - list index out of range (#8710, opened Feb 21, 2025 by ChenghaoMou; labels: bug)
[Bug]: invalid_image_url with S3 signed URL (#8709, opened Feb 21, 2025 by aguadoenzo; labels: bug)
[Bug]: chat completions api gives 500 when llm provider is aiohttp_openai (#8708, opened Feb 21, 2025 by jaswanth8888; labels: bug, feb 2025, help wanted, llm translation, openai)
[Bug]: UI says "unlimited" for key creation values even when limited (#8707, opened Feb 21, 2025 by parkerkain-8451; labels: bug, mlops user request)
[Bug]: Caching on litellm proxy does not work when using structured output (response_format) (#8706, opened Feb 21, 2025 by dalssoft; labels: bug)
[Bug]: 'CompletionUsage' object is not subscriptable (#8705, opened Feb 21, 2025 by ChenghaoMou; labels: bug)
[Bug]: Transitive dependency on tenacity not understood by bazel (#8704, opened Feb 21, 2025 by regb; labels: bug)
[Bug]: model: fireworks_ai/accounts/fireworks/models/deepseek-r1 gives 400 error in version v1.61.11 (#8699, opened Feb 21, 2025 by numenbit; labels: bug)
[Feature]: Improve proxy error reporting (#8698, opened Feb 21, 2025 by hnykda; labels: enhancement)
[Feature]: Add o3 Data Zone model prices (#8692, opened Feb 20, 2025 by jasonpnnl; labels: azure openai, enhancement, feb 2025, spend tracking)
[Bug]: /openai/deployment/{model}/completions API with Key Model Limit (#8688, opened Feb 20, 2025 by km322; labels: bug, feb 2025, security)
[Bug]: Internal user account possibly overwritten; Team pointing to user ID that does not exist (#8685, opened Feb 20, 2025 by bopowers9; labels: bug)
[Bug]: Model Hub model names are out of card width (#8681, opened Feb 20, 2025 by dipakparmar; labels: bug, feb 2025, help wanted, ui)
[Feature]: Add support for Mistral Codestral through Azure (#8679, opened Feb 20, 2025 by mboret; labels: azure openai, enhancement, feb 2025, llm translation)
[Bug]: Duplicated Content in Fields text and content Using Together AI on Streaming Response (#8675, opened Feb 20, 2025 by josperrod9; labels: bug, help wanted)
[Feature]: Support user_continue_message for gemini - when only system message given (#8673, opened Feb 20, 2025 by nickprock; labels: awaiting: user response, enhancement)
Increased around 40ms ASR Latency at P50 After Integrating with LiteLLM (#8671, opened Feb 20, 2025 by yin250)