French AI startup Mistral has introduced its first generative AI models designed for edge devices, such as laptops and smartphones.
The new models, collectively named "Les Ministraux," are intended for a range of applications, from basic text generation to working in tandem with more capable models to complete tasks.
There are two versions of Les Ministraux — Ministral 3B and Ministral 8B — both with a 128,000-token context window, roughly the length of a 50-page book.
In a blog post, Mistral says its customers have increasingly sought local, privacy-focused AI for critical uses such as on-device translation, offline smart assistants, local data analytics, and autonomous robotics. Les Ministraux are designed to serve these needs with an efficient, low-latency solution.
Ministral 8B is available for download, but only for research purposes. Developers and companies interested in self-deploying Ministral 3B or Ministral 8B must contact Mistral to obtain a commercial license.
Alternatively, developers can access Ministral 3B and 8B through Mistral's cloud platform, La Plateforme, and through partnered cloud services in the coming weeks. Pricing is set at 10 cents per million input/output tokens (roughly 750,000 words) for Ministral 8B and 4 cents per million tokens for Ministral 3B.
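For developers who take the hosted route, La Plateforme exposes the models through Mistral's standard chat-completions API. The snippet below is a minimal sketch of such a call over plain HTTP; the model identifier "ministral-8b-latest" and the exact request shape are assumptions based on Mistral's existing API conventions, not details confirmed in this article.

```python
# Minimal sketch of querying a hosted Ministral model via Mistral's
# chat-completions endpoint. The model name "ministral-8b-latest" is an
# assumption; check Mistral's documentation for the identifier it exposes.
import os
import requests

API_URL = "https://api.mistral.ai/v1/chat/completions"

def ask_ministral(prompt: str) -> str:
    resp = requests.post(
        API_URL,
        headers={
            "Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}",
            "Content-Type": "application/json",
        },
        json={
            "model": "ministral-8b-latest",  # assumed model identifier
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=30,
    )
    resp.raise_for_status()
    # Return the first completion's text content.
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask_ministral("Summarize this paragraph in one sentence: ..."))
```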
Recently, smaller AI models have been trending, as they are faster and more affordable to train and deploy compared to larger models. Google continues to add to its Gemma model family, Microsoft has its Phi models, and Meta’s latest Llama release includes models optimized for edge devices.
Mistral claims that Ministral 3B and 8B surpass similar models, including Meta’s Llama and Google’s Gemma, as well as its own Mistral 7B, in key benchmarks for instruction following and problem-solving.
Based in Paris, Mistral, which recently secured $640 million in venture funding, is steadily expanding its AI product line. In recent months, the company has launched a free service for developers to test its models, an SDK for customers to fine-tune the models, and even a code-focused generative AI model called Codestral.
Founded by former Meta and Google DeepMind employees, Mistral aims to create top-tier models to rival industry leaders like OpenAI’s GPT-4 and Anthropic’s Claude, while finding ways to generate revenue. While turning a profit has proven difficult for many AI startups, Mistral reportedly began generating revenue this summer.