CRITICAL: Malicious litellm_init.pth in litellm 1.82.8 — credential stealer (supply chain compromise)
Check whether the installed LiteLLM version is compromised, understand the scope of the credential theft, and remediate. Updated May 13, 2026.
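As a first triage step for the advisory above, one can compare the locally installed litellm version against the releases named on this page. A minimal sketch, assuming the compromised set is exactly the versions listed here (1.82.7 and 1.82.8); a non-match only means the version string is not on this list, not that the environment is safe:

```python
from importlib.metadata import PackageNotFoundError, version

# Releases named as compromised in the advisories on this page.
COMPROMISED_VERSIONS = {"1.82.7", "1.82.8"}

def is_compromised(ver: str) -> bool:
    """True if the given litellm version string is a known-bad release."""
    return ver in COMPROMISED_VERSIONS

def check_installed_litellm() -> str:
    """Report whether the locally installed litellm matches a compromised release."""
    try:
        installed = version("litellm")
    except PackageNotFoundError:
        return "litellm is not installed"
    if is_compromised(installed):
        return f"WARNING: litellm {installed} is a known-compromised release; rotate credentials"
    return f"litellm {installed} is not in the known-compromised set"
```

If a match is found, the articles above recommend treating all credentials readable by that environment as stolen.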
Fireworks AI rejects tool schemas with "default": null — drop_params doesn't sanitize nested schemas
Fix LiteLLM Fireworks AI tool call failures when a JSON Schema contains "default": null in nested properties. Updated May 13, 2026.
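Since drop_params reportedly does not reach nested schema properties, one workaround is to sanitize tool schemas before handing them to the provider. A hypothetical pre-processing sketch (strip_null_defaults is not a LiteLLM API; it simply removes "default": null entries at any nesting depth):

```python
def strip_null_defaults(schema):
    """Recursively drop '"default": null' entries from a JSON Schema fragment.

    Leaves non-null defaults and all other keywords untouched.
    """
    if isinstance(schema, dict):
        return {
            key: strip_null_defaults(value)
            for key, value in schema.items()
            if not (key == "default" and value is None)
        }
    if isinstance(schema, list):
        return [strip_null_defaults(item) for item in schema]
    return schema
```

Running each tool's parameter schema through this before the request avoids the provider-side rejection at the cost of losing the (already meaningless) null defaults.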
LiteLLM APIConnectionError: Cannot connect to host 127.0.0.1:11434
Fix LiteLLM failing to connect to a local Ollama instance at 127.0.0.1:11434. Updated May 13, 2026.
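Before debugging LiteLLM configuration for the error above, a quick sanity check is whether anything is listening on the Ollama port at all. A small standard-library sketch, with host and port defaulting to the values in the error message:

```python
import socket

def ollama_reachable(host: str = "127.0.0.1", port: int = 11434,
                     timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

If this returns False, the problem is the Ollama server (not started, bound to a different interface, or blocked) rather than LiteLLM itself.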
ExceededBudget: API blocks free (no cost) models when budget is exceeded
Fix LiteLLM blocking self-hosted free models when a user exceeds the budget for paid models. Updated May 13, 2026.
OpenAI inference broken in SDK on v1.81.x — completions API failure with GPT models
Fix LiteLLM OpenAI completions API failures after upgrading to v1.81.x. Updated May 13, 2026.
aembedding() missing num_retries kwarg — no failover for embedding model groups
Fix the LiteLLM embedding router not retrying or failing over when an embedding host is unreachable. Updated May 13, 2026.
Unsupported parameter: 'max_tokens' is not supported
Fix the LiteLLM OCI adapter incorrectly mapping max_completion_tokens to maxTokens for GPT-5 models. Updated May 13, 2026.
APIConnectionError
Fix LiteLLM proxy failover not working when a deployment host is unreachable. Updated May 13, 2026.
LiteLLM /user/daily/activity overstates totals for single-day window in non-UTC timezones
Fix the LiteLLM daily activity API timezone counting bug that overstates totals. Updated May 13, 2026.
LiteLLM Router.aspeech() bypasses async_function_with_fallbacks — TTS requests have no retry or failover
Fix LiteLLM Router.aspeech having no retry or failover for TTS requests. Updated May 13, 2026.
tpm_limit × N_replica — TPM enforcement per-pod instead of cross-pod in multi-replica deployments
Fix the LiteLLM proxy TPM limit not being enforced correctly across multiple replicas in production deployments. Updated May 12, 2026.
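The title above implies that when each replica enforces tpm_limit independently, the effective global limit becomes tpm_limit × N_replica. Until cross-pod enforcement is available, a stopgap is to divide the intended global limit across replicas. A hypothetical helper illustrating the arithmetic (not a LiteLLM setting):

```python
def per_pod_tpm(desired_global_tpm: int, n_replicas: int) -> int:
    """Per-pod TPM limit so that n_replicas pods, each enforcing it
    independently, admit roughly desired_global_tpm tokens in total."""
    if n_replicas < 1:
        raise ValueError("n_replicas must be >= 1")
    # Floor division under-provisions slightly rather than overshooting;
    # never return 0, which would block all traffic.
    return max(1, desired_global_tpm // n_replicas)
```

This only approximates a global limit, since load is rarely spread evenly across pods; a shared counter (e.g. in Redis) is the real fix the article describes.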
litellm PyPI package (v1.82.7 + v1.82.8) compromised — credential theft via malicious code in proxy_server.py and litellm_init.pth
Check whether the litellm package is safe and recover from a compromised PyPI install. Updated May 12, 2026.
litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call.
Fix the LiteLLM llm_as_a_judge guardrail failing with BadRequestError when using a vLLM self-hosted model. Updated May 12, 2026.
x-ratelimit-* headers dropped on streaming and plain-dict responses (v3 parallel_request_limiter)
Fix missing x-ratelimit headers on LiteLLM proxy streaming responses. Updated May 12, 2026.
400 Bad Request — LiteLLM passes unsupported parameters to Chinese models for Codex CLI
Fix LiteLLM returning 400 errors when using the OpenAI Codex CLI with Chinese domestic models via protocol translation. Updated May 12, 2026.
400 Bad Request — unsupported parameter passed to model API
Fix a 400 error when using LiteLLM with Codex CLI and Chinese/domestic LLM models. Updated May 12, 2026.
Enable JavaScript and cookies to continue
Fix LiteLLM ChatGPT image generation returning a 403 Cloudflare challenge page. Updated May 12, 2026.
Enable JavaScript and cookies to continue — 403 Cloudflare challenge on image generation
Fix LiteLLM image generation failing with a Cloudflare 403 challenge for ChatGPT subscriptions. Updated May 12, 2026.
Database connection failed when running in database mode on Windows
Fix a LiteLLM proxy database connection failure on Windows. Updated May 12, 2026.
spend_log_cleanup.py:153 - Error during cleanup
Fix a LiteLLM spend_log_cleanup silent failure on Kubernetes.