o3 Mini

Compact o3 reasoning model balancing performance and cost for complex tasks.

o3-mini
Stability: stable
Context: 200,000 tokens
Pricing: starting at $1.10/M input tokens, $4.40/M output tokens
Capabilities: streaming, JSON output
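The listed rates make per-request costs easy to estimate. A minimal sketch (the helper function is illustrative, not part of any gateway SDK):

```python
# Illustrative cost estimate at the listed o3-mini rates:
# $1.10 per million input tokens, $4.40 per million output tokens.

INPUT_PRICE_PER_M = 1.10   # USD per 1M input tokens
OUTPUT_PRICE_PER_M = 4.40  # USD per 1M output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated request cost in USD."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# e.g. a 10k-token prompt with a 2k-token reply:
print(round(estimate_cost(10_000, 2_000), 4))  # 0.0198
```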

Providers for o3 Mini

LLM Gateway routes each request to the best provider that can handle your prompt size and parameters.
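The actual routing logic is not public, but the idea can be sketched as: filter out providers whose context window is too small, then prefer stable ones. Provider IDs and limits below mirror this page; everything else is a hypothetical simplification.

```python
# Hypothetical routing sketch: pick the first provider whose context
# window fits the prompt, preferring stable providers. This is NOT the
# gateway's real algorithm, only an illustration of the described idea.

PROVIDERS = [
    {"id": "openai/o3-mini", "context": 200_000, "stable": True},
    {"id": "azure/o3-mini",  "context": 200_000, "stable": False},
]

def route(prompt_tokens: int):
    # Keep only providers that can fit the prompt.
    candidates = [p for p in PROVIDERS if p["context"] >= prompt_tokens]
    # Stable providers sort first; unstable ones are fallbacks.
    candidates.sort(key=lambda p: not p["stable"])
    return candidates[0]["id"] if candidates else None

print(route(50_000))   # openai/o3-mini
print(route(300_000))  # None: prompt exceeds every context window
```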

OpenAI

Model ID: openai/o3-mini
Context size: 200k
Stability: stable
Pricing: $1.10/M input, $4.40/M output (cached input pricing not listed)
Capabilities: streaming, JSON output
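The provider-prefixed ID above doubles as the model parameter when calling the gateway. A minimal sketch of a request exercising both listed capabilities, assuming an OpenAI-compatible chat-completions endpoint (the base URL is a placeholder, not a documented value):

```python
# Build a chat-completions payload for o3-mini via the gateway,
# using both advertised capabilities: streaming and JSON output.
payload = {
    "model": "openai/o3-mini",
    "messages": [
        {"role": "user", "content": "Reply with a JSON object {\"ok\": true}."}
    ],
    "stream": True,                              # streaming is supported
    "response_format": {"type": "json_object"},  # JSON output is supported
}

# With the official openai Python client this would be sent as, e.g.:
#   client = OpenAI(base_url="https://<gateway-host>/v1", api_key="...")
#   stream = client.chat.completions.create(**payload)
print(payload["model"])  # openai/o3-mini
```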

Azure

Model ID: azure/o3-mini
Context size: 200k
Stability: unstable
Pricing: $1.10/M input, $4.40/M output (cached input pricing not listed)
Capabilities: streaming, JSON output