I primarily use DigitalOcean's AI services through the Serverless Inference API. After using it for a while, I found that GPT models work as expected, but Claude models do not function properly.
  1. Unable to enable reasoning for Claude models: the Serverless Inference API uses an OpenAI-compatible API format and therefore only accepts the `reasoning_effort` parameter. However, the upstream Claude models use the Claude API format, and the Serverless Inference API fails to convert `reasoning_effort` into the upstream `thinking` parameter.
  2. Unable to use tool calling properly: similar to the first issue, the Claude API's tool-calling format is incompatible with the OpenAI format, so errors occur when using MCP (which requires tool-calling support) in conversations.
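To make the second issue concrete, here is a minimal sketch of the two tool-definition shapes. The field names follow the public OpenAI and Anthropic API documentation, but the `get_weather` tool itself and the exact payloads DigitalOcean forwards upstream are illustrative assumptions:

```python
# OpenAI Chat Completions style: the tool is wrapped in a
# {"type": "function", "function": {...}} envelope, with the JSON
# Schema under "parameters".
openai_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical example tool
        "description": "Get the weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

def to_claude_tool(tool: dict) -> dict:
    """Sketch of the conversion the proxy would need: Claude's Messages
    API expects name/description/input_schema at the top level, with no
    "function" wrapper."""
    fn = tool["function"]
    return {
        "name": fn["name"],
        "description": fn.get("description", ""),
        "input_schema": fn["parameters"],
    }
```

Passing the OpenAI-shaped object through unconverted would leave Claude looking for a top-level `input_schema` that isn't there, which matches the errors seen with MCP.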
The root cause of both issues is that the Serverless Inference API fails to correctly translate requests in OpenAI API format into the Claude API format.
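A minimal sketch of the missing translation step for the first issue, mapping an OpenAI-style `reasoning_effort` value onto Claude's `thinking` parameter. The parameter names come from the two public APIs, but the budget numbers here are illustrative assumptions, not documented values:

```python
# Assumed effort-to-budget mapping; the actual token budgets a proxy
# should choose are a policy decision, not specified by either API.
EFFORT_TO_BUDGET = {"low": 1024, "medium": 8192, "high": 32768}

def translate_reasoning(body: dict) -> dict:
    """Rewrite an OpenAI-format request body in place so that
    reasoning_effort becomes Claude's thinking parameter."""
    effort = body.pop("reasoning_effort", None)
    if effort is not None:
        body["thinking"] = {
            "type": "enabled",
            "budget_tokens": EFFORT_TO_BUDGET[effort],
        }
    return body
```

A translation layer like this (plus the tool-format conversion) is essentially what the Serverless Inference API appears to be skipping for Claude models.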