Unable to use Claude models normally
Arthur
I primarily use DigitalOcean's AI services through the Serverless Inference API. After using it for some time, I found that GPT-series models work normally, but Claude models do not function properly.
- Unable to enable reasoning for Claude models: the Serverless Inference API uses an OpenAI-compatible API format and therefore only exposes the `reasoning_effort` parameter. The upstream Claude models, however, use the Claude API format, and the Serverless Inference API fails to convert `reasoning_effort` into the upstream `thinking` parameter.
- Unable to use tool calling properly: similarly, because the Claude API's tool-calling format is incompatible with the OpenAI format, errors occur when using MCP (which depends on tool calling) in conversations.
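To illustrate the first issue, here is a minimal sketch of the translation a gateway would need to perform: mapping the OpenAI-compatible `reasoning_effort` values onto Anthropic's `thinking` parameter. The token budgets below are illustrative assumptions, not DigitalOcean's actual mapping.

```python
def openai_to_claude_reasoning(request: dict) -> dict:
    """Translate an OpenAI-style chat request into Claude Messages API shape."""
    # Assumed mapping from named effort levels to thinking-token budgets.
    effort_to_budget = {"low": 1024, "medium": 4096, "high": 16384}
    out = {
        "model": request["model"],
        "max_tokens": request.get("max_tokens", 4096),
        "messages": request["messages"],
    }
    effort = request.get("reasoning_effort")
    if effort is not None:
        # The Claude API expects a `thinking` object with an explicit
        # token budget instead of a named effort level.
        out["thinking"] = {
            "type": "enabled",
            "budget_tokens": effort_to_budget[effort],
        }
    return out

req = {
    "model": "claude-sonnet",
    "messages": [{"role": "user", "content": "hi"}],
    "reasoning_effort": "high",
}
print(openai_to_claude_reasoning(req)["thinking"])
# {'type': 'enabled', 'budget_tokens': 16384}
```

If the gateway drops `reasoning_effort` without emitting `thinking`, the upstream Claude model simply never reasons, which matches the behavior described above.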
The root cause of both issues is the same: the Serverless Inference API does not correctly translate OpenAI-format requests into the Claude API format.
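The tool-calling mismatch can be made concrete with a small sketch of what the missing translation looks like: OpenAI nests each tool definition under a `"function"` key with a `"parameters"` schema, while the Claude API uses a flat object with `"input_schema"`. The converter function name here is hypothetical.

```python
def openai_tool_to_claude(tool: dict) -> dict:
    """Convert one OpenAI-format tool definition into Claude's format.

    OpenAI:  {"type": "function", "function": {"name", "description", "parameters"}}
    Claude:  {"name", "description", "input_schema"}
    """
    fn = tool["function"]
    return {
        "name": fn["name"],
        "description": fn.get("description", ""),
        "input_schema": fn["parameters"],
    }

openai_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}
print(openai_tool_to_claude(openai_tool)["input_schema"]["required"])
# ['city']
```

If a gateway forwards the OpenAI shape unchanged, the Claude API rejects the request (it sees no `input_schema`), which would explain the errors observed when MCP tools are attached to a conversation.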