Ministral 3B
by Mistral
Ministral 3B is a 3B parameter model optimized for on-device and edge computing. It excels in knowledge, commonsense reasoning, and function-calling, outperforming larger models like Mistral 7B on most benchmarks. Supporting up to 128k context length, it’s ideal for orchestrating agentic workflows and specialist tasks with efficient inference.
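The function-calling support mentioned above is usually exercised through an OpenAI-compatible chat-completions API. The sketch below is a minimal, hypothetical example of such a request: the base URL, the API-key environment variable, and the `get_weather` tool are illustrative assumptions, not details taken from this page.

```python
# Hypothetical sketch: a function-calling request to Ministral 3B via an
# OpenAI-compatible chat-completions endpoint. The base_url and the API-key
# environment variable are assumptions for illustration.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",   # assumed OpenAI-compatible gateway
    api_key=os.environ["OPENROUTER_API_KEY"],  # assumed env var name
)

# One tool the model may choose to call; the schema follows the standard
# OpenAI tool/function format.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="mistralai/ministral-3b",
    messages=[{"role": "user", "content": "What's the weather in Lyon?"}],
    tools=tools,
)

# If the model decided to call the tool, the call (name plus JSON arguments)
# appears on the first choice's message.
print(response.choices[0].message.tool_calls)
```

In an agentic workflow, the returned tool call would be executed by your own code and its result appended to the conversation before asking the model for a final answer.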
Pricing
- Input Tokens: Free (per 1M tokens)
- Output Tokens: Free (per 1M tokens)
- Image Processing: $0.00 per 1M tokens
Supported Modalities
- Input: text
- Output: text
Performance Benchmarks
| Benchmark | Description | Score |
| --- | --- | --- |
| Intelligence Index | Overall intelligence score | 10.9 |
| Coding Index | Programming capability | 5.4 |
| Math Index | Mathematical reasoning | 0.3 |
| GPQA | Graduate-level questions | 26.0% |
| MMLU Pro | Multitask language understanding | 33.9% |
| HLE | Humanity's Last Exam | 5.5% |
| LiveCodeBench | Real-world coding tasks | 6.9% |
| AIME 2025 | Advanced mathematics | 0.3% |
| MATH 500 | Mathematical problem solving | 53.7% |
Specifications
- Context Length: 131K tokens
- Provider: Mistral
- Throughput: 257.474 tokens/s
- Released: Oct 17, 2024
- Model ID: mistralai/ministral-3b
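For a plain (non-tool) request, the model ID above is passed as the `model` field of a chat-completions payload. The snippet below is a minimal sketch using raw HTTP; the endpoint URL and API-key variable are assumptions, and any OpenAI-compatible gateway hosting `mistralai/ministral-3b` would look similar.

```python
# Minimal, hypothetical chat request using the model ID from the specifications.
# The endpoint URL and API-key environment variable are assumptions.
import os
import requests

resp = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",  # assumed endpoint
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json={
        "model": "mistralai/ministral-3b",
        "messages": [{"role": "user", "content": "Summarize the benefits of edge inference in one sentence."}],
        "max_tokens": 256,  # completion budget; the context window itself is 131K tokens
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```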