Ministral 8B

by Mistral

Ministral 8B is an 8-billion-parameter model featuring an interleaved sliding-window attention pattern for faster, memory-efficient inference. Designed for edge use cases, it supports context lengths of up to 128k tokens and performs strongly on knowledge and reasoning tasks, outperforming peers in the sub-10B category. This makes it well suited to low-latency, privacy-first applications.
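To give a feel for what sliding-window attention does, here is a minimal sketch of the attention masks involved. The window size, layer count, and the exact alternation of full vs. windowed layers below are illustrative assumptions, not Ministral 8B's published configuration:

```python
import numpy as np

def causal_mask(seq_len, window=None):
    """Boolean attention mask: mask[i, j] is True if position i may attend to j.

    window=None gives full causal attention; otherwise each token only
    attends to the most recent `window` tokens (sliding-window attention).
    """
    i = np.arange(seq_len)[:, None]
    j = np.arange(seq_len)[None, :]
    mask = j <= i  # causal: no attending to future positions
    if window is not None:
        mask &= (i - j) < window  # restrict to the sliding window
    return mask

def layer_masks(num_layers, seq_len, window):
    """'Interleaved' pattern: alternate full-attention and windowed layers.

    The even/odd alternation here is a hypothetical layout for illustration.
    """
    return [
        causal_mask(seq_len, window if layer % 2 else None)
        for layer in range(num_layers)
    ]
```

The payoff is memory: with a window of size `w`, a windowed layer's KV cache stays bounded at `w` entries regardless of sequence length, instead of growing with the full context as in standard causal attention.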

Capabilities

Text Generation 131K Context

