Topic hub pagination

LiteLLM errors - page 5

Continue browsing this topic cluster with SEO-safe static pagination.

LiteLLM · Updated May 13, 2026

LiteLLM Performance Regression v1.81.x — UI and API Slowness

Fix LiteLLM performance degradation after upgrading to v1.81.x. Includes evidence for LiteLLM troubleshooting demand.

Significant performance regression after upgrading from 1.80.5 to 1.81.x (UI + API slowness)
LiteLLM · Updated May 13, 2026

LiteLLM RouterRateLimitError Missing Retry-After Header Fix

Fix the missing Retry-After header in LiteLLM's RouterRateLimitError so downstream clients can properly handle rate-limit cooldowns.

RouterRateLimitError: No deployments available for selected model, Try again in X seconds (no Retry-After header)
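Until the header is emitted upstream, a client can only recover the cooldown by parsing it out of the error text itself. A minimal sketch, assuming the message follows the "Try again in X seconds" wording shown in the issue title above (the helper name is hypothetical):

```python
import re

def cooldown_seconds(message: str, default: float = 1.0) -> float:
    """Hypothetical fallback: extract the cooldown hinted at in a
    RouterRateLimitError message when no Retry-After header is present."""
    match = re.search(r"Try again in (\d+(?:\.\d+)?) seconds", message)
    return float(match.group(1)) if match else default

msg = ("RouterRateLimitError: No deployments available for selected model, "
       "Try again in 60 seconds")
print(cooldown_seconds(msg))  # 60.0
```

A client would sleep for this duration before retrying; the `default` covers messages that carry no hint, since sleeping a short fixed interval is safer than retrying immediately.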
LiteLLM · Updated May 13, 2026

LiteLLM Structured Output Fails for Anthropic Models on Bedrock Converse

Fix structured output JSON schema errors when using Anthropic models via LiteLLM on the AWS Bedrock Converse API.

BedrockException - The model returned the following errors: output_config.format: Extra inputs are not permitted
LiteLLM · Updated May 13, 2026

Claude Code Multi-Turn Conversation Hangs with LiteLLM Gateway 400 Error

Fix Claude Code getting stuck or hanging during multi-turn conversations when routed through a LiteLLM gateway.

API Error: 400 litellm bad request — Claude Code multi-turn conversation hangs via LiteLLM gateway
LiteLLM · Updated May 13, 2026

LiteLLM MCP Server Returns Opaque Error Instead of Upstream 401

Fix the LiteLLM MCP server returning an opaque error when the upstream rejects a forwarded Bearer token with a 401.

fix(mcp): surface upstream 401 for token-forwarding MCP servers
LiteLLM · Updated May 13, 2026

LiteLLM Structured Output Fails for Anthropic Models on AWS Bedrock

Fix structured output / JSON schema responses failing when using Anthropic models via AWS Bedrock through LiteLLM.

BedrockException - structured output (JSON schema) does not work correctly for Anthropic models on Bedrock/converse
LiteLLM · Updated May 13, 2026

LiteLLM Tool Registry Empty for Anthropic Messages Endpoint — Tool Calling Fails

Fix the LiteLLM tool registry not populating for the /v1/messages (anthropic_messages) endpoint, which causes tool/function calling failures.

Tool registry (LiteLLM_ToolTable / LiteLLM_SpendLogToolIndex) not populated for /v1/messages (anthropic_messages) path
LiteLLM · Updated May 13, 2026

LiteLLM Reports 'No deployments available' Instead of Clear Rate Limit Error on 429

Fix LiteLLM's confusing rate-limit error message, which reports 'No deployments available' instead of a clear 429 rate-limit error.

RateLimitError: Error code: 429 - {'error': {'message': 'No deployments available for selected model.', 'type': 'None', 'param': 'None', 'code': '429'}}
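Until the message is clarified upstream, a client can translate the payload itself: the error body above carries `code: '429'`, which signals that every deployment for the model is cooling down after upstream rate limits, even though the message only says "No deployments available". A minimal sketch, with the payload shape copied from the error above and a hypothetical helper name:

```python
def describe_429(error_body: dict) -> str:
    """Hypothetical helper: rewrite LiteLLM's opaque 429 payload into a
    clearer client-facing message."""
    err = error_body.get("error", {})
    if err.get("code") == "429":
        return ("Rate limited: all deployments for the selected model are "
                f"cooling down (upstream said: {err.get('message', '')!r})")
    return err.get("message", "unknown error")

body = {"error": {"message": "No deployments available for selected model.",
                  "type": "None", "param": "None", "code": "429"}}
print(describe_429(body))
```

Keying on the `code` field rather than the message text keeps the check stable if the wording changes between LiteLLM versions.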