Ministral 3 14B 2512
The largest model in the Ministral 3 family, Ministral 3 14B offers frontier capabilities and performance comparable to its larger counterpart, Mistral Small 3.2 24B. A powerful and efficient language model.
- Creator
- Mistral AI
- Lifecycle
- Active
- Context
- 262.1K
- Max output
- —
- Released
- Dec 2, 2025
- Status
- up
- Input
- $0.20 / 1M tokens
- Output
- $0.20 / 1M tokens
- Cached read
- $0.02 / 1M tokens
- Cached write
- —
- Batch discount
- —
- Source
- OpenRouter
- Verified
- Apr 5, 2026 (High)
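Given the listed rates ($0.20 per 1M input and output tokens, $0.02 per 1M cached-read tokens), the cost of a single request can be estimated with simple arithmetic. A minimal sketch; the token counts in the example are illustrative, not taken from this page:

```python
# Per-million-token rates as listed for Ministral 3 14B 2512.
INPUT_PER_M = 0.20        # $ per 1M input tokens
OUTPUT_PER_M = 0.20       # $ per 1M output tokens
CACHED_READ_PER_M = 0.02  # $ per 1M cached input tokens

def request_cost(input_tokens: int, output_tokens: int, cached_tokens: int = 0) -> float:
    """Estimated USD cost for one request; cached tokens bill at the cached-read rate."""
    fresh_input = input_tokens - cached_tokens
    return (fresh_input * INPUT_PER_M
            + output_tokens * OUTPUT_PER_M
            + cached_tokens * CACHED_READ_PER_M) / 1_000_000

# e.g. 100K input tokens (40K served from cache) plus 2K output tokens
print(f"${request_cost(100_000, 2_000, cached_tokens=40_000):.4f}")  # → $0.0132
```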
Capabilities
- Modalities
- text, image → text
- Capabilities
- image input, prompt caching, function calling, structured outputs
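The capability flags above (image input, function calling, structured outputs) map onto fields of an OpenAI-compatible chat-completions request, the interface OpenRouter exposes. A minimal sketch of such a request body; the model slug and image URL are assumptions for illustration, not confirmed by this page:

```python
import json

# Sketch of an OpenAI-compatible chat-completions body exercising two of the
# listed capabilities: image input and structured (JSON-schema) output.
body = {
    "model": "mistralai/ministral-3-14b-2512",  # hypothetical slug
    "messages": [{
        "role": "user",
        "content": [
            {"type": "text", "text": "Describe this image as JSON."},
            {"type": "image_url",
             "image_url": {"url": "https://example.com/cat.png"}},  # placeholder
        ],
    }],
    # structuredOutputs: constrain the reply to a JSON schema
    "response_format": {
        "type": "json_schema",
        "json_schema": {
            "name": "caption",
            "schema": {
                "type": "object",
                "properties": {"caption": {"type": "string"}},
                "required": ["caption"],
            },
        },
    },
}

print(json.dumps(body, indent=2))
```

Function calling would add a `tools` array to the same body; prompt caching is applied by the provider to repeated prefixes and is reflected in the cached-read rate above.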
Other models from Mistral AI
Codestral, Codestral 2508, Devstral 2, Devstral 2 2512, Devstral Medium, Devstral Small 1.1, Ministral 3 3B 2512, Ministral 3 8B 2512, Mistral 7B Instruct v0.1, Mistral Large, Mistral Large 2407, Mistral Large 2411, Mistral Large 3, Mistral Large 3 2512, Mistral Medium 3, Mistral Medium 3.1, Mistral Nemo, Mistral Small 3, Mistral Small 3.1 24B, Mistral Small 3.2 24B, Mistral Small 4, Mistral Small Creative, Mixtral 8x22B Instruct, Mixtral 8x7B Instruct, Pixtral Large 2411, Saba, Voxtral Small 24B 2507