Mistral Small 3
Mistral Small 3 is a 24B-parameter language model optimized for low-latency performance across common AI tasks. Released under the Apache 2.0 license, it is available in both pre-trained and instruction-tuned variants.
- Creator: Mistral AI
- Lifecycle: Active
- Context: 32.8K tokens
- Max output: 16.4K tokens
- Released: Jan 30, 2025
- Status: up
- Input: $0.05 / 1M tokens
- Output: $0.08 / 1M tokens
- Cached read: — / 1M tokens
- Cached write: — / 1M tokens
- Batch discount: —%
- Source: OpenRouter
- Verified: Apr 5, 2026 (High)
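The listed rates make per-request cost a simple linear sum. A minimal sketch of that arithmetic, assuming the prices and limits from the table above (rates are USD per 1M tokens):

```python
# Per-token rates derived from the listed prices (assumption: table above).
INPUT_RATE = 0.05 / 1_000_000   # $0.05 per 1M input tokens
OUTPUT_RATE = 0.08 / 1_000_000  # $0.08 per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of a single request at the listed rates."""
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

# A worst-case request: full 32.8K context in, full 16.4K max output back.
print(f"${request_cost(32_800, 16_400):.6f}")  # → $0.002952
```

Even a maximal request costs well under a cent at these rates, which is the practical upshot of the low pricing tier.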
Capabilities
- Modalities: text→text
- Capabilities: structuredOutputs
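The structuredOutputs capability means the model can be constrained to emit JSON conforming to a schema. A minimal sketch of an OpenAI-compatible request body for such a call; the model slug (`mistralai/mistral-small-3`) and the `response_format`/`json_schema` shape are assumptions following the common OpenAI-style convention, not confirmed by this page, so check your provider's API reference:

```python
import json

# Hypothetical structured-output request body (slug and field names assumed).
payload = {
    "model": "mistralai/mistral-small-3",
    "messages": [
        {"role": "user",
         "content": "Extract the city and country from: 'Paris, France'."}
    ],
    "response_format": {
        "type": "json_schema",
        "json_schema": {
            "name": "location",
            "strict": True,
            "schema": {
                "type": "object",
                "properties": {
                    "city": {"type": "string"},
                    "country": {"type": "string"},
                },
                "required": ["city", "country"],
            },
        },
    },
}

print(json.dumps(payload, indent=2))
```

With a strict schema like this, a conforming provider guarantees the completion parses as `{"city": ..., "country": ...}`, which removes the usual retry-on-malformed-JSON loop from client code.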
Other models from Mistral AI
Codestral, Codestral 2508, Devstral 2, Devstral 2 2512, Devstral Medium, Devstral Small 1.1, Ministral 3 14B 2512, Ministral 3 3B 2512, Ministral 3 8B 2512, Mistral 7B Instruct v0.1, Mistral Large, Mistral Large 2407, Mistral Large 2411, Mistral Large 3, Mistral Large 3 2512, Mistral Medium 3, Mistral Medium 3.1, Mistral Nemo, Mistral Small 3.1 24B, Mistral Small 3.2 24B, Mistral Small 4, Mistral Small Creative, Mixtral 8x22B Instruct, Mixtral 8x7B Instruct, Pixtral Large 2411, Saba, Voxtral Small 24B 2507