Issues: BerriAI/litellm
[14/04/2024 - 19/05/2024] 🐛 💪 Bug Bash \ New Features \ Provi... (open) #3045, opened Apr 15, 2024 by krrishdholakia
[Feature]: deepgram STT as an alternative to openai's whisper (enhancement) #3978, opened Jun 2, 2024 by thiswillbeyourgithub
[Bug]: Division by zero error when accessing model metric on proxy server UI (bug) #3977, opened Jun 2, 2024 by dlebech
[Bug]: /health with unpriv user API key gives a very confusing response (bug) #3972, opened Jun 2, 2024 by Manouchehri
[Feature]: Fallbacks based on request priority + available usage thresholds (enhancement) #3971, opened Jun 2, 2024 by krrishdholakia
[Feature]: Allow setting multiple orgs for an openai call (Router) (enhancement) #3949, opened May 31, 2024 by krrishdholakia
[Docs]: OIDC configs / usage (documentation) #3946, opened May 31, 2024 by Manouchehri (8 tasks)
[Feature]: LiteLLM.INFO (enhancement) #3937, opened May 31, 2024 by anmolbhatia05
[Feature]: Add Codestral API provider (enhancement) #3922, opened May 30, 2024 by SamArgillander
[Bug]: user field and user_api_key_* is sometimes omitted randomly (bug) #3920, opened May 30, 2024 by Manouchehri
[Bug]: Anthropic models return arguments from function calls unreliably after proxy update (bug) #3919, opened May 30, 2024 by demux79
[Bug]: KeyError when using ollama function call (inconsistent ollama api response) (bug) #3912, opened May 30, 2024 by cdoneshot
[Feature]: Add NVIDIA NIM API provider (enhancement) #3896, opened May 29, 2024 by tuanlv14
[Feature]: Add messages corresponding to each LLM given in the fallback list (enhancement) #3893, opened May 29, 2024 by Aviral-Cactus
[Feature]: Support gemini pdf mime type (enhancement) #3890, opened May 29, 2024 by krrishdholakia
[Feature]: admin UI - improving analytics tab (enhancement) #3880, opened May 28, 2024 by ishaan-jaff
[Bug]: clarifai provider in litellm[proxy] does not work (bug) #3862, opened May 27, 2024 by tuanlv14
[Feature]: Make S3 cache async (enhancement) #3860, opened May 27, 2024 by Manouchehri
[Bug]: S3 cache doesn't work well with time to live (bug) #3852, opened May 27, 2024 by pharindoko
[Feature]: Support azure_ad_token_provider for Azure openai endpoint if only using LiteLLM Proxy (enhancement) #3851, opened May 27, 2024 by RyoYang
[Feature]: REST API endpoint api.litellm.ai/public_models?key=<my-key> to show available models and supported params (enhancement) #3833, opened May 25, 2024 by ishaan-jaff
[Bug]: Anthropic 1s latency - tool calling (bug) #3827, opened May 24, 2024 by krrishdholakia
[Feature]: Support Fine-Tuned VertexAI models (enhancement) #3821, opened May 24, 2024 by ishaan-jaff
[Bug]: clarifai provider in litellm[proxy] does not work (bug) #3796, opened May 23, 2024 by ytpk
[Bug]: Langfuse Failure callback - Content Filter Exceptions on Streaming Requests don't get logged to Langfuse (bug) #3795, opened May 23, 2024 by ishaan-jaff