Topic hub pagination

LiteLLM errors - page 3

Continue browsing this topic cluster with SEO-safe static pagination.

LiteLLM · Updated May 15, 2026

LiteLLM SMTP Configuration Crash - Nonce Length Validation Error

Fix LiteLLM crashing with nacl.exceptions.ValueError (the nonce must be exactly 24 bytes long) when configuring SMTP. Includes evidence of LiteLLM troubleshooting demand.

nacl.exceptions.ValueError: The nonce must be exactly 24 bytes long
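The constraint behind this error is PyNaCl's SecretBox, which rejects any nonce that is not exactly 24 bytes. A minimal stdlib sketch of that length check (check_nonce is a hypothetical helper, not LiteLLM's actual code):

```python
import os

# Matches nacl.secret.SecretBox.NONCE_SIZE; PyNaCl raises ValueError
# for any other nonce length.
NONCE_SIZE = 24

def check_nonce(nonce: bytes) -> bytes:
    """Validate nonce length the way PyNaCl's SecretBox does."""
    if len(nonce) != NONCE_SIZE:
        raise ValueError(
            "The nonce must be exactly %d bytes long" % NONCE_SIZE
        )
    return nonce

good = check_nonce(os.urandom(24))   # accepted: correct length
try:
    check_nonce(b"too-short")        # rejected: wrong length
except ValueError as exc:
    print(exc)  # The nonce must be exactly 24 bytes long
```

The practical fix is to ensure whatever secret or nonce material the SMTP configuration feeds into encryption is exactly 24 bytes, not a truncated or padded value.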
LiteLLM / Anthropic API · Updated May 14, 2026

LiteLLM Responses API Drops cache_control on input_text Content Blocks

Fix LiteLLM cache_control not being forwarded to Anthropic when using the Responses API. Includes evidence of LiteLLM / Anthropic API troubleshooting demand.

LiteLLM Responses API silently drops cache_control on input_text content blocks during transformation
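The failure mode is a block-level transformation that rebuilds each Responses-API input_text block as an Anthropic text block without copying the cache_control field. A sketch of a transformation that preserves it (to_anthropic_block is a hypothetical helper; the field names mirror the two APIs' content-block formats):

```python
def to_anthropic_block(block: dict) -> dict:
    """Convert a Responses-API 'input_text' block to an Anthropic 'text' block,
    carrying the prompt-caching hint through instead of dropping it."""
    out = {"type": "text", "text": block["text"]}
    if "cache_control" in block:
        # Without this branch, the caching hint is silently lost.
        out["cache_control"] = block["cache_control"]
    return out

src = {
    "type": "input_text",
    "text": "long system prompt ...",
    "cache_control": {"type": "ephemeral"},
}
print(to_anthropic_block(src))
```

Until the transformation forwards the field, Anthropic never sees the ephemeral cache marker, so prompt caching silently does not engage.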
LiteLLM · Updated May 14, 2026

LiteLLM Azure OpenAI Authentication Broken in v1.84.0 with enable_azure_ad_token_refresh

Fix LiteLLM Azure OpenAI authentication errors after upgrading to v1.84.0 when enable_azure_ad_token_refresh is set. Includes evidence of LiteLLM troubleshooting demand.

LiteLLM AuthenticationError: AzureException AuthenticationError - The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
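The error message describes the client's credential resolution: an api_key must be passed explicitly or found in the OPENAI_API_KEY environment variable, and when AD token refresh is expected to supply credentials instead, neither is present. A minimal sketch of that resolution order (resolve_api_key is a hypothetical helper, not the actual client code):

```python
import os

def resolve_api_key(explicit_key=None, azure_ad_token=None):
    """Mimic the credential check described by the error message:
    explicit key first, then an AD token, then the environment."""
    if explicit_key:
        return explicit_key
    if azure_ad_token:
        # An AD token satisfies auth; no static key is required.
        return None
    env_key = os.environ.get("OPENAI_API_KEY")
    if env_key:
        return env_key
    raise ValueError(
        "The api_key client option must be set either by passing api_key "
        "to the client or by setting the OPENAI_API_KEY environment variable"
    )
```

In the broken upgrade path, the AD-token branch is effectively never reached, so the client falls through to the missing-key error even though token refresh is enabled.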
LiteLLM · Updated May 14, 2026

LiteLLM Missing Retry-After Header on Rate Limit Errors

Fix LiteLLM not returning a Retry-After header when all deployments are in cooldown. Includes evidence of LiteLLM troubleshooting demand.

No Retry-After header on RouterRateLimitError (all deployments in cooldown)
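While the proxy omits the header, callers can compensate client-side. A sketch of a fallback (retry_delay is a hypothetical helper): honor Retry-After in delta-seconds form when present, otherwise use capped exponential backoff:

```python
def retry_delay(headers: dict, attempt: int,
                base: float = 1.0, cap: float = 60.0) -> float:
    """Return seconds to wait after a 429: prefer Retry-After,
    else exponential backoff capped at `cap`."""
    value = headers.get("Retry-After")
    if value is not None:
        try:
            return float(value)          # delta-seconds form
        except ValueError:
            pass                         # HTTP-date form not handled here
    return min(cap, base * (2 ** attempt))

print(retry_delay({"Retry-After": "5"}, 0))  # 5.0
print(retry_delay({}, 3))                    # 8.0
```

Note that RFC 9110 also allows Retry-After as an HTTP date; this sketch ignores that form and falls back to backoff, which is safe but conservative.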
LiteLLM · Updated May 14, 2026

LiteLLM Incorrect TPM Rate Limiting for Virtual Keys

Fix LiteLLM virtual keys not enforcing TPM rate limits correctly. Includes evidence of LiteLLM troubleshooting demand.

Incorrect TPM limiting for virtual keys — rate limit not enforced correctly
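For context on what correct enforcement looks like, here is an illustrative per-key TPM limiter (hypothetical, not LiteLLM's implementation): track tokens spent per virtual key in a sliding one-minute window and reject any request that would push the key over its limit:

```python
import time
from collections import defaultdict, deque

class TPMLimiter:
    """Sliding-window tokens-per-minute limiter keyed by virtual key."""

    def __init__(self, tpm_limit: int):
        self.tpm_limit = tpm_limit
        self.events = defaultdict(deque)  # key -> deque of (timestamp, tokens)

    def allow(self, key: str, tokens: int, now: float = None) -> bool:
        now = time.monotonic() if now is None else now
        window = self.events[key]
        # Evict spends older than one minute.
        while window and now - window[0][0] >= 60:
            window.popleft()
        spent = sum(t for _, t in window)
        if spent + tokens > self.tpm_limit:
            return False  # would exceed the key's TPM limit
        window.append((now, tokens))
        return True

limiter = TPMLimiter(tpm_limit=1000)
print(limiter.allow("key-1", 600, now=0.0))   # True
print(limiter.allow("key-1", 600, now=1.0))   # False: 1200 > 1000 TPM
print(limiter.allow("key-1", 600, now=61.0))  # True: first spend aged out
```

A bug in either the window eviction or the per-key accounting produces exactly the reported symptom: limits that are not enforced, or are enforced against the wrong key's usage.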