May 28th update to the [original DeepSeek R1](/deepseek/deepseek-r1). Performance is on par with [OpenAI o1](/openai/o1), but the model is fully open-source with open reasoning tokens. It is 671B parameters in size, with 37B active per inference pass.
Capabilities: Text Generation, 163K context
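A minimal sketch of calling the model through an OpenAI-compatible chat completions endpoint. The base URL, API key variable, and model identifier (`deepseek/deepseek-r1-0528`) are assumptions for illustration and are not confirmed by this page.

```python
# Hedged example: query the model via an assumed OpenAI-compatible endpoint.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",       # assumed endpoint
    api_key=os.environ["OPENROUTER_API_KEY"],      # assumed env variable
)

response = client.chat.completions.create(
    model="deepseek/deepseek-r1-0528",             # hypothetical slug for this update
    messages=[
        {"role": "user", "content": "Explain mixture-of-experts in one paragraph."}
    ],
)

print(response.choices[0].message.content)
```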