#12585 · @T1mn · opened Feb 7, 2026 at 6:56 AM UTC · last updated Mar 21, 2026 at 8:38 AM UTC
fix: generate fallback tool call IDs for providers missing id field
Summary
This PR implements a fallback to generate tool call IDs for AI providers like NVIDIA NIM, GLM, AWS Bedrock, and Chutes that often omit the 'id' field in their streaming responses. It resolves multiple AI_InvalidResponseDataError crashes reported by users. The change is small and aligns with existing codebase patterns.
Description
Fixes #6290 Fixes #1880 Fixes #10885 Fixes #10189
Adds fallback ID generation for tool calls when providers like NVIDIA NIM, GLM, AWS Bedrock, and Chutes omit the 'id' field in their streaming responses. Instead of throwing an error, we now generate a UUID using generateId() for these known non-compliant providers while maintaining strict validation for others.
This aligns with the existing pattern used elsewhere in the codebase (toolCall.id ?? generateId()).
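The fallback described above can be sketched as follows. This is a minimal illustration, not the actual diff: the names `ToolCallDelta`, `KNOWN_NONCOMPLIANT`, and `resolveToolCallId` are hypothetical, and `randomUUID()` stands in for the codebase's `generateId()`.

```typescript
// Sketch of fallback tool-call ID resolution; identifiers here are
// illustrative assumptions, not the PR's actual names.
import { randomUUID } from "node:crypto";

interface ToolCallDelta {
  id?: string;
  function?: { name?: string; arguments?: string };
}

// Providers known to omit `id` in streaming tool-call chunks (assumed keys).
const KNOWN_NONCOMPLIANT = new Set(["nvidia-nim", "glm", "bedrock", "chutes"]);

function resolveToolCallId(delta: ToolCallDelta, provider: string): string {
  if (typeof delta.id === "string" && delta.id.length > 0) {
    return delta.id;
  }
  if (KNOWN_NONCOMPLIANT.has(provider)) {
    // Generate a fallback UUID instead of throwing for known offenders.
    return randomUUID();
  }
  // Strict validation is preserved for compliant providers.
  throw new Error("Expected 'id' to be a string");
}
```

Scoping the fallback to a known-provider list keeps the strict check intact for providers that are expected to be OpenAI-compatible, so genuine protocol regressions still surface as errors.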
Linked Issues
#6290 Add graceful fallback for missing response.id from NVIDIA NIM
#10885 NVIDIA connector is experiencing CRITICAL issues
#10189 nvidia glm4.7 Error 404
#1880 Opencode with GLM 4.5 String issue
Comments
PR comments
SurealCereal
I hope this gets merged soon - GPT OSS fails running on the latest freshly-built vLLM main branch. Every tool call fails with AI_InvalidResponseDataError: Expected 'id' to be a string.
TimothyStiles
It's just a hot-fix until this gets merged, but I managed to work around this issue for NVIDIA NIM tool usage with a plugin I put together.
PwccaCode
Let's hope this gets merged in a reasonable timeframe.
dongguantuandaiwang
This issue still exists.
Device: Mac Air
Version information: OpenCode 1.2.27, model kimi-k2-thinking.
Error message: AI_InvalidResponseDataError: Expected 'id' to be a string.
Background: the error occurred when executing mkdir -p /Users/user/Documents/work/code/limax/migrate_support_bieti/{core,adapters,services,cli,tests,config,requirements,utils}. I hadn't encountered this before, and removing the curly braces didn't resolve the issue.
Changed Files
packages/opencode/src/provider/sdk/copilot/chat/openai-compatible-chat-language-model.ts
+11−3