What happened?

A non-fatal error seems to be reported somewhat randomly:

Relevant log output
{"message": "LiteLLM.LoggingError: [Non-Blocking] Exception occurred while failure logging cannot pickle 'FrameLocalsProxy' object", "level": "ERROR", "timestamp": "2025-02-21T23:40:35.475080"}

The "stacktrace" field, with the terminal-wrapping artifacts removed:

Traceback (most recent call last):
  File "/usr/lib/python3.13/site-packages/litellm/llms/anthropic/chat/handler.py", line 228, in acompletion_function
    response = await async_handler.post(
        api_base, headers=headers, json=data, timeout=timeout
    )
  File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/logging_utils.py", line 131, in async_wrapper
    result = await func(*args, **kwargs)
  File "/usr/lib/python3.13/site-packages/litellm/llms/custom_httpx/http_handler.py", line 236, in post
    raise e
  File "/usr/lib/python3.13/site-packages/litellm/llms/custom_httpx/http_handler.py", line 192, in post
    response.raise_for_status()
  File "/usr/lib/python3.13/site-packages/httpx/_models.py", line 761, in raise_for_status
    raise HTTPStatusError(message, request=request, response=self)
httpx.HTTPStatusError: Server error '529 Internal Server Error' for url 'https://us-east5-aiplatform.googleapis.com/v1/projects/varuna-400921/locations/us-east5/publishers/anthropic/models/claude-3-5-haiku@20241022:rawPredict'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/529

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.13/site-packages/litellm/main.py", line 467, in acompletion
    response = await init_response
  File "/usr/lib/python3.13/site-packages/litellm/llms/anthropic/chat/handler.py", line 247, in acompletion_function
    raise AnthropicError(
    ...<3 lines>...
    )
litellm.llms.anthropic.common_utils.AnthropicError: {"type":"error","error":{"type":"overloaded_error","message":"Overloaded"}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 1253, in wrapper_async
    result = await original_function(*args, **kwargs)
  File "/usr/lib/python3.13/site-packages/litellm/main.py", line 486, in acompletion
    raise exception_type(
        model=model,
        ...<3 lines>...
        extra_kwargs=kwargs,
    )
  File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2202, in exception_type
    raise e
  File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2171, in exception_type
    raise APIConnectionError(
    ...<4 lines>...
    )
litellm.exceptions.APIConnectionError: litellm.APIConnectionError: Vertex_aiException - {"type":"error","error":{"type":"overloaded_error","message":"Overloaded"}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/litellm_logging.py", line 1899, in failure_handler
    capture_exception(exception)
  File "/usr/lib/python3.13/site-packages/sentry_sdk/api.py", line 148, in capture_exception
    return Scope.get_current_scope().capture_exception(
        error, scope=scope, **scope_kwargs
    )
  File "/usr/lib/python3.13/site-packages/sentry_sdk/scope.py", line 1168, in capture_exception
    event, hint = event_from_exception(
        exc_info, client_options=Scope.get_client().options
    )
  File "/usr/lib/python3.13/site-packages/sentry_sdk/utils.py", line 1035, in event_from_exception
    "values": exceptions_from_error_tuple(
        exc_info, client_options, mechanism
    )
  File "/usr/lib/python3.13/site-packages/sentry_sdk/utils.py", line 910, in exceptions_from_error_tuple
    single_exception_from_error_tuple(
        exc_type, exc_value, tb, client_options, mechanism
    )
  File "/usr/lib/python3.13/site-packages/sentry_sdk/utils.py", line 730, in single_exception_from_error_tuple
    serialize_frame(
        tb.tb_frame,
        ...<3 lines>...
        max_value_length=max_value_length,
    )
  File "/usr/lib/python3.13/site-packages/sentry_sdk/utils.py", line 619, in serialize_frame
    rv["vars"] = copy(frame.f_locals)
  File "/usr/lib/python3.13/copy.py", line 88, in copy
    rv = reductor(4)
TypeError: cannot pickle 'FrameLocalsProxy' object

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/litellm_logging.py", line 2016, in failure_handler
    capture_exception(e)
  File "/usr/lib/python3.13/site-packages/sentry_sdk/api.py", line 148, in capture_exception
    return Scope.get_current_scope().capture_exception(
        error, scope=scope, **scope_kwargs
    )
  File "/usr/lib/python3.13/site-packages/sentry_sdk/scope.py", line 1168, in capture_exception
    event, hint = event_from_exception(
        exc_info, client_options=Scope.get_client().options
    )
  File "/usr/lib/python3.13/site-packages/sentry_sdk/utils.py", line 1035, in event_from_exception
    "values": exceptions_from_error_tuple(
        exc_info, client_options, mechanism
    )
  File "/usr/lib/python3.13/site-packages/sentry_sdk/utils.py", line 910, in exceptions_from_error_tuple
    single_exception_from_error_tuple(
        exc_type, exc_value, tb, client_options, mechanism
    )
  File "/usr/lib/python3.13/site-packages/sentry_sdk/utils.py", line 730, in single_exception_from_error_tuple
    serialize_frame(
        tb.tb_frame,
        ...<3 lines>...
        max_value_length=max_value_length,
    )
  File "/usr/lib/python3.13/site-packages/sentry_sdk/utils.py", line 619, in serialize_frame
    rv["vars"] = copy(frame.f_locals)
  File "/usr/lib/python3.13/copy.py", line 88, in copy
    rv = reductor(4)
TypeError: cannot pickle 'FrameLocalsProxy' object
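For context, the innermost TypeError comes from sentry_sdk.utils.serialize_frame calling copy(frame.f_locals): on Python 3.13, frame.f_locals is a FrameLocalsProxy (PEP 667) rather than a plain dict, and copying the proxy can fail exactly as in the log above. A minimal sketch of the failure mode and a dict(...) workaround; capture_locals and demo are hypothetical names, not LiteLLM or sentry-sdk code:

```python
import copy
import sys

def capture_locals(frame):
    # copy.copy(frame.f_locals) mirrors what serialize_frame does in the
    # traceback above. On Python 3.13, f_locals is a FrameLocalsProxy
    # (PEP 667) that may not support copy/pickle, raising TypeError.
    try:
        return copy.copy(frame.f_locals)
    except TypeError:
        # Materializing the proxy into a plain dict snapshot always works.
        return dict(frame.f_locals)

def demo():
    x = 42  # a local we want captured
    return capture_locals(sys._getframe())

snapshot = demo()
print(snapshot["x"])  # 42
```

This behaves the same on pre-3.13 interpreters (where f_locals is already a dict, so the try branch succeeds), which is why a defensive fallback like this is a reasonable shape for a fix in the logging path.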
Are you an ML Ops Team?
No
What LiteLLM version are you on?
litellm-helm-0.1.614
Twitter / LinkedIn details
No response