
Mistral Thinks It Through—Magistral Brings Lightning-Fast, Transparent Reasoning

Dual-release model ships open 24 B weights and enterprise muscle, scoring 70–73 % on AIME 2024 while answering up to 10× faster.


Good morning,

France-based Mistral AI just raised the bar for auditable reasoning with the launch of Magistral, a two-tier model built to solve multi-step problems quickly and show its work. Below, we break down the who-what-when-where-why, then sprint through three stealth updates you can bolt into your stack this week.

Lead Story — Magistral

Who: Paris-founded Mistral AI, the open-weights upstart behind Codestral and Le Chat.

What: Magistral, its first reasoning-first large language model. It ships in two flavors: Magistral Small, a 24 B-parameter model under the Apache-2.0 license, and Magistral Medium, an enterprise version with stronger weights and a hosted API.

Released mid-June 2025 via GitHub for weights and through Mistral’s Le Chat interface for inference, Magistral emphasizes transparent, chain-of-thought reasoning in eight major languages—English, French, Spanish, German, Italian, Arabic, Russian, and Simplified Chinese. Each answer reveals step-by-step logic, a must for regulated verticals like healthcare and finance.

On the math-heavy AIME 2024 benchmark, Magistral Small scores 70.7 % and Medium 73.6 %, climbing to 83–90 % with majority voting, beating many closed competitors at similar sizes. In Le Chat, a Flash Answers mode returns solutions up to 10× faster than rival chatbots, thanks to optimized decoding and caching.
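The majority-voting trick behind that jump to 83–90 % is easy to reproduce: sample the model several times on the same problem and keep the most common final answer. A minimal sketch in Python, where the sampled answers are illustrative placeholders standing in for real model calls:

```python
from collections import Counter

def majority_vote(answers):
    """Return the most frequent answer among sampled candidates."""
    counts = Counter(answers)
    winner, _ = counts.most_common(1)[0]
    return winner

# Eight hypothetical samples for one AIME problem (stand-ins for model outputs):
samples = ["113", "113", "047", "113", "113", "047", "113", "200"]
print(majority_vote(samples))  # -> 113
```

The gain comes from the same place as any self-consistency scheme: independent reasoning chains tend to agree on correct answers more often than they agree on any single wrong one.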

My take: By merging speed with auditability, Mistral tackles two enterprise deal-blockers—latency and compliance. Being able to trace every reasoning step back to the source of truth should ease adoption in “red-tape” sectors and curb hallucinations before they hit production.

Fun Fact

The term “API” first appeared in a 1968 paper on software design, not web tech. Fifty-seven years later, interfaces like MCP and Project Mariner let LLM agents read email or drive a live browser, proving the acronym’s staying power.

Stay curious, keep those GPUs cool,
— The AI Sailor ⚓️

Find out why 1M+ professionals read Superhuman AI daily.

In 2 years you will be working for AI

Or an AI will be working for you

Here's how you can future-proof yourself:

  1. Join the Superhuman AI newsletter – read by 1M+ people at top companies

  2. Master AI tools, tutorials, and news in just 3 minutes a day

  3. Become 10X more productive using AI

Join 1,000,000+ pros at companies like Google, Meta, and Amazon who are using AI to get ahead.
