Latest archive

Latest developer error fixes

Recently updated troubleshooting pages, kept as a crawlable archive so new and refreshed fixes stay easy to discover.

LiteLLM LiteLLM Updated May 17, 2026

LiteLLM IP Usage-Based Dynamic Priority Queuing (Fair Queuing)

Implement fair queuing in the LiteLLM proxy to prevent a single client from monopolizing model capacity and starving other tenants. Includes evidence for LiteLLM troubleshooting demand.

No fair queuing — traffic burst from one client starves other clients' requests
Vercel Deployment Updated May 17, 2026

Vercel deployment build failed (out of memory)

Fix a Vercel deployment build failure caused by running out of memory during the build step. Includes evidence for Vercel troubleshooting demand.

build exited with code 1 (OOM during Vite/vanilla-extract build bundle)
Anthropic API Anthropic API Updated May 17, 2026

Anthropic SDK Mid-Stream Transport Error Leaks as Bare httpx.Exception Instead of APIConnectionError

A developer using the Anthropic streaming API loses the connection mid-stream (TransportError) and can't catch it with standard except anthropic.APIConnectionError blocks; they must know to also catch httpx.TransportError. Includes evidence for Anthropic API troubleshooting demand.

Mid-stream httpx.TransportError leaks through as a bare httpx exception because SSE body iteration has no try/except — customers' standard retry ladder misses mid-stream drops
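Until the SDK wraps mid-stream drops, the practical workaround is to catch both exception families around stream iteration. A minimal sketch, using stand-in exception classes so it is self-contained; in real code the RETRYABLE tuple would be (anthropic.APIConnectionError, httpx.TransportError):

```python
class APIConnectionError(Exception):  # stand-in for anthropic.APIConnectionError
    pass

class TransportError(Exception):  # stand-in for httpx.TransportError
    pass

# In real code: RETRYABLE = (anthropic.APIConnectionError, httpx.TransportError)
RETRYABLE = (APIConnectionError, TransportError)

def consume_stream(events):
    """Collect stream events, surfacing mid-stream drops instead of leaking them."""
    chunks = []
    try:
        for event in events:
            chunks.append(event)
    except RETRYABLE as exc:
        return chunks, exc  # caller decides whether to retry or resume
    return chunks, None

def flaky_stream():
    # simulates a stream that dies after the first event
    yield "hello"
    raise TransportError("connection dropped mid-stream")

chunks, err = consume_stream(flaky_stream())
# chunks holds the events received before the drop; err is the transport error
```

The caller can then route err through the same retry ladder used for pre-stream connection failures.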
Claude Code AI Coding Tools Updated May 17, 2026

Claude Code /login Shows Success But Reverts to Not Logged In — Auth Persistence Bug

A user ran /login in Claude Code on Windows and saw a success message, but the tool remains unauthenticated and rejects all subsequent commands. Includes evidence for Claude Code troubleshooting demand.

/login shows "Login successful" but immediately reverts to "Not logged in". Any subsequent command fails with "Not logged in".
Docker Docker Updated May 17, 2026

Docker 29.5.0 Rootless Daemon Socket Disappears — Cannot Connect to Docker API

A user upgraded to Docker 29.5.0 and rootless mode stopped working: the docker socket file disappears and no docker commands can run. Includes evidence for Docker troubleshooting demand.

failed to connect to the docker API at unix:///run/user/1000/docker.sock; check if the path is correct and if the daemon is running: dial unix /run/user/1000/docker.sock: connect: no such file or directory
LangChain AI Coding Tools Updated May 17, 2026

LangChain with_structured_output() Silently Drops Previously Bound Tools

A developer calls langgraph.with_structured_output() expecting it to preserve pre-bound tools, but they are silently dropped: a functional regression that breaks LLM function-calling workflows. Includes evidence for LangChain troubleshooting demand.

[langchain-openai] with_structured_output() silently drops previously bound tools and lacks support for OpenAI native tool bindings
LiteLLM LiteLLM Updated May 17, 2026

LiteLLM Proxy GET /v1/models Ignores user.models Restriction — Unauthorized Model List Exposure

A developer using the LiteLLM proxy finds that restricted users can see the full model list via the GET /v1/models endpoint despite access group restrictions, creating a security/authorization inconsistency. Includes evidence for LiteLLM troubleshooting demand.

GET /v1/models ignores user.models restriction — shows all proxy models regardless of user access groups
OpenAI API OpenAI API Updated May 17, 2026

OpenAI Python SDK Background Mode Returns No Typed Exception When Response Fails

A developer using client.responses.create(background=True) cannot handle failure cases programmatically because the SDK returns 200 with a failed payload instead of raising a typed exception. Includes evidence for OpenAI API troubleshooting demand.

Background responses failures lack a stable code/name that maps to an exception class — HTTP poll returns 200 OK with status=failed but no typed exception is raised
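Since the poll comes back 200 OK with status=failed, the caller has to promote the payload into a typed exception itself. A minimal sketch; BackgroundResponseFailed and the dict shape shown are illustrative assumptions, not SDK types:

```python
class BackgroundResponseFailed(Exception):
    """Local typed exception raised when a background response reports failure."""
    def __init__(self, error):
        super().__init__(str(error))
        self.error = error

def check_background(resp: dict) -> dict:
    """Convert a failed background-response payload into a typed exception."""
    if resp.get("status") == "failed":
        raise BackgroundResponseFailed(resp.get("error") or {"message": "unknown failure"})
    return resp

# a completed poll passes through unchanged
ok = check_background({"status": "completed"})

# a failed poll now raises something a retry/alerting layer can catch
try:
    check_background({"status": "failed", "error": {"message": "model overloaded"}})
except BackgroundResponseFailed as exc:
    failure = exc.error
```

Wrapping every poll in check_background gives background mode the same except-based error handling as synchronous calls.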
Claude Code AI Coding Tools Updated May 17, 2026

Claude Code MCP server auth returns HTTP 431 error on Linux

Fix Claude Code MCP server authentication returning an HTTP 431 error on Linux. Includes evidence for Claude Code troubleshooting demand.

HTTP ERROR 431 — MCP server authentication failure (browser restart!)
Docker Docker Updated May 17, 2026

Docker rootless daemon not connecting after update to version 29.5.0

Fix Docker rootless mode broken after upgrading to 29.5.0, with docker.sock missing. Includes evidence for Docker troubleshooting demand.

failed to connect to the docker API at unix:///run/user/1000/docker.sock: dial unix /run/user/1000/docker.sock: connect: no such file or directory
OpenAI API OpenAI API Updated May 17, 2026

OpenAI Batch API returns 404 for GPT-5 mini/nano models

Fix the OpenAI Batch API 404 error when using GPT-5 mini or nano models with the chat completions endpoint. Includes evidence for OpenAI API troubleshooting demand.

The model gpt-5-mini-2025-08-07-batch does not exist or you do not have access (HTTP 404 on /v1/chat/completions batch endpoint)
Anthropic API Anthropic API Updated May 17, 2026

Anthropic Bedrock Server-Side Errors — Mapping Non-200 Stream Events

Properly handle Anthropic Bedrock server-side errors (4xx/5xx responses) during streaming; understand transient-error retry behavior and HTTP status exception handling. Includes evidence for Anthropic API troubleshooting demand.

non-200 stream events raise ValueError instead of APIStatusError
OpenAI API OpenAI API Updated May 17, 2026

Azure OpenAI Enterprise S0 Tier Rate Limit Issues With GPT-4.1 Models

Fix Azure OpenAI enterprise S0 tier rate limiting for GPT-4.1 models; understand token rate limits vs free tier restrictions and how to increase default rate limits. Includes evidence for OpenAI API troubleshooting demand.

RateLimitError — Requests to the ChatCompletions_Create Operation under Azure OpenAI API version 2025-01-01-preview have exceeded token rate limit of your current OpenAI S0 pricing tier
Claude Code AI Coding Tools Updated May 17, 2026

Claude Code OAuth Login Broken After Auto-Update v2.1.79

Fix a Claude Code OAuth authentication failure after an auto-update that prevents logging in to the AI coding assistant. Includes evidence for Claude Code troubleshooting demand.

OAuth Request Failed — This isn't working right now. You can try again later
Cursor Cursor Updated May 17, 2026

Cline (Cursor Alternative) HealthCheck Timed Out in JetBrains

Fix the Cline plugin failing to load in JetBrains IDEs due to a health check timeout; Cline is a Cursor alternative, so this error falls in the same category. Includes evidence for Cursor troubleshooting demand.

Healthcheck timed out — Failed to load Cline in IntelliJ/JetBrains
OpenAI API OpenAI API Updated May 17, 2026

n8n Workflow OpenAI 429 Insufficient Quota Despite Available Credits

Fix the OpenAI 429 insufficient_quota error in n8n workflows despite having $18+ in credits and low token usage; covers strategies for working around RPM limits. Includes evidence for OpenAI API troubleshooting demand.

429 – You exceeded your current quota, please check your plan and billing details. Type: insufficient_quota
OpenAI API OpenAI API Updated May 17, 2026

OpenAI File Upload API Returns 400 Bad Request with PDF Streaming

Fix the OpenAI Files API 400 bad request when uploading a PDF and referencing it in streaming responses; understand file input limitations with Azure OpenAI. Includes evidence for OpenAI API troubleshooting demand.

Uploading PDF via Files API and using in Streaming gives 400 bad request
Cloudflare Cloudflare Updated May 17, 2026

Cloudflare Error 524 while loading large files for embedding (timeout)

Fix Cloudflare 524 timeout errors when serving large files through the Cloudflare proxy, especially for AI/ML embedding workloads. Includes evidence for Cloudflare troubleshooting demand.

524 Error from Cloudflare while loading big file for embedding — origin server timed out
LiteLLM LiteLLM Updated May 17, 2026

Fixing LiteLLM APIConnectionError: Connection timed out

Fix LiteLLM API connection timeout errors by adjusting request_timeout or retry settings. Includes evidence for LiteLLM troubleshooting demand.

litellm.APIConnectionError: Request timed out. Please increase the max_retries parameter.
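Raising request_timeout and num_retries on the LiteLLM call is the first lever; if timeouts persist, an application-level exponential backoff wrapper helps. A stdlib sketch, with fake_call standing in for the actual litellm.completion(..., request_timeout=60) call:

```python
import time

def with_backoff(call, retries=3, base_delay=1.0, sleep=time.sleep):
    """Retry `call` on TimeoutError with exponential backoff (1s, 2s, 4s, ...)."""
    for attempt in range(retries + 1):
        try:
            return call()
        except TimeoutError:
            if attempt == retries:
                raise  # out of retries: surface the timeout to the caller
            sleep(base_delay * (2 ** attempt))

attempts = []
def fake_call():
    # stands in for litellm.completion(..., request_timeout=60)
    attempts.append(1)
    if len(attempts) < 3:
        raise TimeoutError("Request timed out")
    return "ok"

# sleep is stubbed out here so the sketch runs instantly
result = with_backoff(fake_call, sleep=lambda s: None)
```

The injected sleep parameter also makes the backoff schedule easy to unit-test.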
Vercel Deployment Updated May 17, 2026

Vercel MIDDLEWARE_INVOCATION_FAILED: Cannot find module (ESM resolver mismatch)

Fix a Vercel deployment failure caused by the Vercel runtime resolver using require() instead of the ESM path for middleware modules. Includes evidence for Vercel troubleshooting demand.

MIDDLEWARE_INVOCATION_FAILED on Vercel deploy: Cannot find module — local build artifact does not reference ESM path
Anthropic API Anthropic API Updated May 17, 2026

Anthropic Bedrock streaming SSE events return TypeError instead of typed APIStatusError — NoneType has no attribute 'model'

Fix intermittent streaming crashes when calling Anthropic Claude via AWS Bedrock cross-region inference profiles; rate-limited responses appear as HTTP 200 error frames, causing a TypeError instead of a catchable APIStatusError. Includes evidence for Anthropic API troubleshooting demand.

AttributeError: 'NoneType' object has no attribute 'model' — Bedrock cross-region profile returns HTTP 200 with error payload type='rate_limit_error', SDK decodes as BetaRawMessageStartEvent with message=None
Docker Docker Updated May 17, 2026

Docker 29.4.1 daemon crash panic: crypto: requested hash function #0 is unavailable

Fix the Docker daemon unexpectedly crashing with a "crypto hash function unavailable" panic, blocking all container operations until a daemon restart. Includes evidence for Docker troubleshooting demand.

panic: crypto: requested hash function #0 is unavailable — goroutine crash in github.com/opencontainers/go-digest.Algorithm.Hash triggered by Docker image digest computation
Ollama Ollama Updated May 17, 2026

Ollama Codex App integration ignores the model's num_ctx setting and generates an excessively large context_window, causing severe slowdowns

Developers running local Ollama models via the Codex App see extreme slowdowns; the root cause is the Codex App not reading the model's num_ctx parameter and defaulting to maximum context sizes. Includes evidence for Ollama troubleshooting demand.

Codex App sends requests with context_window=128000-262144 tokens instead of model's configured num_ctx (e.g., 32768) — causing severe generation slowdowns on local models
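When the client won't read the Modelfile's num_ctx, the context window can be pinned per request instead. A sketch of building such a request; the "options" field matches Ollama's /api/chat payload, while build_request itself is a hypothetical helper:

```python
def build_request(model: str, messages: list, num_ctx: int = 32768) -> dict:
    """Build an Ollama /api/chat payload with an explicit context window."""
    return {
        "model": model,
        "messages": messages,
        # per-request override so the client's oversized default never applies
        "options": {"num_ctx": num_ctx},
    }

req = build_request("llama3.1:8b", [{"role": "user", "content": "hi"}])
```

Passing options per request sidesteps the integration layer entirely, at the cost of repeating the setting in every call site.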
Vercel Deployment Updated May 17, 2026

Vercel deploy MIDDLEWARE_INVOCATION_FAILED + Cannot find module @swc/helpers — Next.js 16.2.x + proxy.ts middleware regression

Fix an HTTP 500 MIDDLEWARE_INVOCATION_FAILED on a Vercel production deployment caused by an @swc/helpers module resolution failure when using the Next.js 16.2.x App Router with the proxy.ts middleware convention and Sentry installed. Includes evidence for Vercel troubleshooting demand.

MIDDLEWARE_INVOCATION_FAILED: Cannot find module '/var/task/node_modules/@swc/helpers/esm/_interop_require_default.js' — next@16.2.x + proxy.ts + @sentry/nextjs
Cloudflare Workers Cloudflare Updated May 17, 2026

Cloudflare Wrangler Does Not Support Next.js 16 proxy.ts Convention

Fix Cloudflare Wrangler failing to bundle Next.js 16 apps that use proxy.ts instead of the deprecated middleware.ts. Includes evidence for Cloudflare Workers troubleshooting demand.

wrangler Next.js integration does not recognize proxy.ts; still depends on old middleware.ts convention
Vercel / Next.js Deployment Updated May 17, 2026

Next.js Standalone Mode Cache Components Memory Leak Causes OOM

Fix a Next.js 16.2.2 standalone deployment running out of memory because cached streamed fetches leak arrayBuffers under load. Includes evidence for Vercel / Next.js troubleshooting demand.

cached internal streamed fetches cause unbounded arrayBuffers growth and OOM
OpenAI Python SDK OpenAI API Updated May 17, 2026

OpenAI Python SDK Streaming Tool Call Fragmentation Drops Arguments

Fix streaming tool calls arriving with incomplete function arguments when speculative decoding produces multiple tool_calls entries per chunk. Includes evidence for OpenAI Python SDK troubleshooting demand.

accumulate_delta drops tool_call fragments when one chunk has multiple entries at the same index
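The safe accumulation pattern is to fold every fragment in a chunk into a per-index map, not just the first one. A minimal sketch; the fragment dicts are simplified stand-ins for the SDK's delta objects, and merge_tool_calls is a hypothetical helper:

```python
def merge_tool_calls(acc: dict, fragments: list) -> dict:
    """Fold tool_call fragments into acc: index -> {"name": str, "arguments": str}."""
    for frag in fragments:  # iterate every fragment, even several per chunk
        slot = acc.setdefault(frag["index"], {"name": "", "arguments": ""})
        if frag.get("name"):
            slot["name"] = frag["name"]
        # argument JSON arrives as string pieces; concatenate in order
        slot["arguments"] += frag.get("arguments", "")
    return acc

acc = {}
# chunk 1: one fragment for tool call 0
merge_tool_calls(acc, [{"index": 0, "name": "get_weather", "arguments": '{"ci'}])
# chunk 2: two fragments in the same chunk (indexes 0 and 1)
merge_tool_calls(acc, [{"index": 0, "arguments": 'ty": "SF"}'},
                       {"index": 1, "name": "get_time", "arguments": "{}"}])
```

After both chunks, acc holds the complete argument string for each tool call, which is exactly what an accumulator that keeps only one fragment per chunk would lose.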
Anthropic API Anthropic API Updated May 17, 2026

Anthropic SDK Mid-stream SSE Overloaded Error Returns Wrong Status Code (200 vs 529)

A developer streaming with the anthropic SDK sees overloaded_error arrive as an HTTP 200 SSE event; the SDK creates a bare APIStatusError(200) instead of OverloadedError(529), and fallback/retry logic breaks because the status >= 500 check never matches. Includes evidence for Anthropic API troubleshooting demand.

Mid-stream SSE overloaded_error returns status_code=200 instead of 529 — fails fallback retry logic
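Until the SDK maps mid-stream overloads to 529, retry logic has to key on the error type as well as the status code. A sketch of that check; StreamError is a stand-in for the SDK's APIStatusError, and is_retryable is a hypothetical helper:

```python
class StreamError(Exception):
    """Stand-in for anthropic.APIStatusError with its status_code attribute."""
    def __init__(self, status_code: int, error_type: str):
        super().__init__(error_type)
        self.status_code = status_code
        self.error_type = error_type

# error types worth retrying even when the transport-level status looks fine
RETRYABLE_TYPES = {"overloaded_error", "api_error"}

def is_retryable(exc: StreamError) -> bool:
    # a bare status >= 500 check misses overloaded_error reported with status 200
    return exc.status_code >= 500 or exc.error_type in RETRYABLE_TYPES

hit = is_retryable(StreamError(200, "overloaded_error"))    # caught despite 200
miss = is_retryable(StreamError(400, "invalid_request_error"))  # correctly skipped
```

With the real SDK the error type would come from the exception's body payload; the point is that the retry decision no longer depends on the misleading status code alone.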