Mistral Nemo

by Mistral

A 12B-parameter model with a 128k-token context window, built by Mistral in collaboration with NVIDIA. The model is multilingual, supporting English, French, German, Spanish, Italian, Portuguese, Chinese, Japanese, Korean, Arabic, and Hindi. It supports function calling and is released under the Apache 2.0 license.
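As a rough illustration of the function-calling support mentioned above, here is a minimal sketch of a tool definition and chat request body in the JSON schema format commonly used by OpenAI-compatible APIs. The `get_weather` function and its parameters are hypothetical, and the model identifier `open-mistral-nemo` is an assumption about the API name; check the Mistral documentation for the exact request shape.

```python
import json

# Hypothetical tool definition (illustrative only) in the JSON-schema
# style used for function calling by OpenAI-compatible chat APIs.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Return the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

# A chat request body pairing the tool with a user message; the model
# id is an assumed API identifier, not confirmed by this page.
request_body = {
    "model": "open-mistral-nemo",
    "messages": [
        {"role": "user", "content": "What's the weather in Paris?"}
    ],
    "tools": [get_weather_tool],
}

# Serialize as it would be sent over HTTP.
print(json.dumps(request_body, indent=2))
```

In this pattern the model does not execute `get_weather` itself; it returns a structured tool call (function name plus JSON arguments) that the client is expected to run before sending the result back.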

Capabilities

Text Generation · 131K context
