
Ministral 8B

by Mistral

Ministral 8B is an 8-billion-parameter model featuring an interleaved sliding-window attention pattern for faster, memory-efficient inference. Designed for edge use cases, it supports a context length of up to 128k tokens and performs strongly on knowledge and reasoning tasks, outperforming peers in the sub-10B category. That combination makes it well suited to low-latency, privacy-first applications.
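
For intuition, here is a minimal sketch of what an interleaved sliding-window attention pattern can look like, expressed as boolean attention masks in Python/NumPy. The window size, layer count, and the rule for which layers get full attention are illustrative assumptions, not Ministral 8B's actual configuration.

```python
import numpy as np

def causal_mask(seq_len: int) -> np.ndarray:
    # Full causal attention: each position sees itself and everything before it.
    return np.tril(np.ones((seq_len, seq_len), dtype=bool))

def sliding_window_mask(seq_len: int, window: int) -> np.ndarray:
    # Causal attention restricted to the most recent `window` positions,
    # which keeps per-token attention cost and KV-cache pressure bounded.
    i = np.arange(seq_len)[:, None]  # query position
    j = np.arange(seq_len)[None, :]  # key position
    return (j <= i) & (i - j < window)

def interleaved_masks(num_layers: int, seq_len: int, window: int, period: int = 4):
    # Hypothetical schedule: every `period`-th layer uses full causal attention,
    # the remaining layers use the sliding window. Ministral's real layer-by-layer
    # pattern is not documented on this page, so treat this purely as an illustration.
    return [
        causal_mask(seq_len) if layer % period == 0 else sliding_window_mask(seq_len, window)
        for layer in range(num_layers)
    ]

masks = interleaved_masks(num_layers=8, seq_len=16, window=4)
print(masks[0].sum(), masks[1].sum())  # a global layer attends to far more pairs than a windowed layer
```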


Pricing

Input Tokens: Free (per 1M tokens)
Output Tokens: Free (per 1M tokens)
Image Processing: $0.00 per 1M tokens

Supported Modalities

Input: text
Output: text

Performance Benchmarks

Intelligence Index (overall intelligence score): 12.4
Coding Index (programming capability): 7.6
Math Index (mathematical reasoning): 3.0
GPQA (graduate-level questions): 27.6%
MMLU Pro (multitask language understanding): 38.9%
HLE (Humanity's Last Exam): 4.9%
LiveCodeBench (real-world coding tasks): 11.2%
AIME 2025 (advanced mathematics): 3.0%
MATH 500 (mathematical problem solving): 57.1%

Specifications

Context Length: 131K tokens
Provider: Mistral
Throughput: 189.889 tokens/s
Released: Oct 17, 2024
Model ID: mistralai/ministral-8b
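
If the hosting platform exposes an OpenAI-compatible chat-completions API, the model ID above can be passed directly as the `model` parameter. The sketch below assumes such an endpoint; the base URL and API-key environment variable are placeholders, not values taken from this page.

```python
import os
from openai import OpenAI

# Hypothetical endpoint and credential name; substitute your provider's values.
client = OpenAI(
    base_url="https://example-provider.com/api/v1",
    api_key=os.environ["PROVIDER_API_KEY"],
)

response = client.chat.completions.create(
    model="mistralai/ministral-8b",  # model ID from the specifications above
    messages=[
        {"role": "user", "content": "Explain sliding-window attention in two sentences."}
    ],
    max_tokens=256,
)
print(response.choices[0].message.content)
```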

Ready to try it?

Start chatting with Ministral 8B right now. No credit card required.
