Mistral AI Releases New Lightweight Model Mistral Small 3.1

2025-03-18

Mistral AI has officially released its latest lightweight model, Mistral Small 3.1, as open source. The company claims the model outperforms comparable offerings from OpenAI and Google.

Mistral Small 3.1 contains only 24 billion parameters, making it significantly smaller than many state-of-the-art large models. Despite its compact size, it handles both text and images and delivers performance comparable to that of much larger models.

Compared with its predecessor, Mistral Small 3, the new version shows notable improvements in text handling and multimodal understanding, and its context window has been expanded to 128,000 tokens. The model also generates output at roughly 150 tokens per second, making it well suited to applications that require rapid responses.

Rather than relying on ever-greater computational resources, Mistral AI attributes these gains to algorithmic improvements and training optimizations, an approach aimed at extracting as much capability as possible from smaller model architectures.

Mistral Small 3.1 can run on relatively modest hardware, such as a single RTX 4090 GPU or a Mac laptop with 32GB of RAM. This makes it practical to deploy advanced AI on-device or in locations with limited connectivity, broadening access to AI capabilities.
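A rough weights-only memory estimate helps explain why a 24-billion-parameter model fits on that class of hardware. The sketch below is a back-of-the-envelope calculation, not an official requirement from Mistral: it counts only the model weights at common precisions and ignores the KV cache and activation memory, which add further overhead.

```python
# Back-of-the-envelope estimate of weight memory for a 24B-parameter model
# at common precisions. Weights only; KV cache and activations add overhead.
PARAMS = 24e9

for name, bytes_per_param in [("fp16/bf16", 2.0), ("8-bit", 1.0), ("4-bit", 0.5)]:
    gib = PARAMS * bytes_per_param / 1024**3
    print(f"{name:>9}: ~{gib:.0f} GiB of weights")

# fp16/bf16: ~45 GiB -> too large for a single 24 GB consumer GPU
#     8-bit: ~22 GiB -> borderline on a 24 GB RTX 4090
#     4-bit: ~11 GiB -> fits comfortably on an RTX 4090 or a 32 GB Mac
```

In practice, it is quantized builds along these lines that make single-GPU or laptop deployment feasible for a model of this size.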

Founded in 2023 by former researchers from Google DeepMind and Meta Platforms, Mistral AI has quickly become one of Europe's leading AI companies. To date, the company has secured over $1.04 billion in funding, achieving a valuation of approximately $6 billion.

Mistral Small 3.1 is part of a series of recent product launches by the company. Earlier releases include the Saba model, which focuses on Arabic language and culture, and the Mistral OCR model, which uses optical character recognition to convert PDF documents into Markdown files.

Mistral AI's product lineup also features Mistral Large 2, its flagship model; Pixtral, a multimodal model; Codestral, designed for code generation; and Les Ministraux, a series of highly optimized models for edge devices.

Mistral AI's commitment to open-sourcing its models contrasts sharply with the industry trend toward closed, proprietary systems. The strategy has already paid off: several high-quality reasoning models have been built on top of its previous lightweight model, Mistral Small 3.

Mistral Small 3.1 is now available for download on Hugging Face and can also be accessed through Mistral AI's API and Google Cloud's Vertex AI platform. In the coming weeks, it will also become available via Nvidia's NIM microservices and Microsoft's Azure AI Foundry.
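For hosted access, Mistral's platform exposes a chat-completions endpoint. The minimal sketch below assumes an API key stored in the MISTRAL_API_KEY environment variable and that Mistral Small 3.1 is reachable under the "mistral-small-latest" alias; check Mistral's published model list for the exact identifier.

```python
# Minimal sketch: calling Mistral Small via Mistral's chat-completions API.
# The model alias "mistral-small-latest" is an assumption; verify it against
# Mistral's model list before use.
import os
import requests

API_KEY = os.environ["MISTRAL_API_KEY"]

response = requests.post(
    "https://api.mistral.ai/v1/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "mistral-small-latest",  # assumed alias for Mistral Small 3.1
        "messages": [
            {"role": "user", "content": "Summarize Mistral Small 3.1 in one sentence."}
        ],
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```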