Paris-based Mistral AI is positioning itself as an alternative to OpenAI and Anthropic, as its latest announcement shows. The company is launching Mistral Large, its new flagship large language model, designed to rival GPT-4 and Claude 2 in reasoning capabilities.
Alongside Mistral Large, the company is also launching Le Chat, its own ChatGPT-style chat assistant, which is currently available in beta.
Mistral AI, perhaps best known for its cap table, has raised a large amount of money in a very short time to build foundation AI models. The company was founded in May 2023, and just a few weeks later it secured $113 million in venture funding. In December, Andreessen Horowitz (a16z) led a $415 million funding round.
Founded by former employees of DeepMind and Meta, Mistral AI originally positioned itself as an open-source AI company. Its first model was released under an open-source license with access to model weights, but its larger models are not open source.
Mistral Large is available through a paid API with usage-based pricing, which makes Mistral AI’s business model similar to OpenAI’s. Querying Mistral Large currently costs $8 per million input tokens and $24 per million output tokens. In AI jargon, tokens are small chunks of words; a model would split the word “Eltrys” into two tokens, “Elt” and “rys,” for example.
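To make the usage-based pricing concrete, here is a minimal Python sketch of the cost arithmetic. The token counts and the estimate_cost helper are illustrative assumptions for this article, not part of Mistral’s API.

```python
# Rough cost estimate for a single Mistral Large API call, using the
# published list prices. The token counts below are made-up examples;
# a real application would read them from the API response.

INPUT_PRICE_PER_MILLION = 8.00    # USD per 1M input tokens
OUTPUT_PRICE_PER_MILLION = 24.00  # USD per 1M output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the approximate USD cost of one request."""
    return (input_tokens / 1_000_000) * INPUT_PRICE_PER_MILLION \
         + (output_tokens / 1_000_000) * OUTPUT_PRICE_PER_MILLION

# Example: a 2,000-token prompt that produces a 500-token answer
print(f"${estimate_cost(2_000, 500):.4f}")  # ≈ $0.028
```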
By default, Mistral Large works with a 32K-token context window (around 20,000 words in English). It supports English, French, Spanish, German and Italian.
For comparison, GPT-4 with a 32k-token context window costs $60 per million input tokens and $120 per million output tokens, which makes Mistral Large 7.5x cheaper on input tokens and 5x cheaper on output tokens than GPT-4-32k. Things are moving quickly in this space, though, so AI companies change their pricing regularly.
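As a quick sanity check on that comparison, the following sketch recomputes the input and output price gaps from the list prices quoted above; treat the figures as a snapshot, since prices shift often.

```python
# Recompute the "5x to 7.5x cheaper" comparison from the quoted list
# prices (USD per 1M tokens). Prices change often, so this is only a
# snapshot, not a definitive cost benchmark.
mistral_large = {"input": 8.00, "output": 24.00}
gpt_4_32k = {"input": 60.00, "output": 120.00}

for kind in ("input", "output"):
    ratio = gpt_4_32k[kind] / mistral_large[kind]
    print(f"{kind} tokens: GPT-4-32k is {ratio:.1f}x the price of Mistral Large")

# input tokens: GPT-4-32k is 7.5x the price of Mistral Large
# output tokens: GPT-4-32k is 5.0x the price of Mistral Large
```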
How does Mistral Large compare to GPT-4 and Claude 2? As always, it is hard to say. Mistral AI claims it ranks second behind GPT-4 on several benchmarks, but there may be some benchmark cherry-picking, and results can differ in real-world usage. We’ll have to dig into how it performs in our own tests.
An alternative to ChatGPT
Le Chat, Mistral AI’s chat assistant, launches today. Anyone can sign up at chat.mistral.ai and try it out. The company warns that the beta release may have “quirks.”
Users can choose between three models: Mistral Small, Mistral Large and Mistral Next, a prototype model designed to be brief and concise. The service is free for now, and Le Chat cannot access the web while you use it.
Enterprise customers will also get a premium version of Le Chat, with central billing and moderation features for corporate clients.
Mistral AI’s partnership with Microsoft
Finally, Mistral AI is also announcing a partnership with Microsoft today. In addition to Mistral’s own API platform, Microsoft will offer Mistral models to its Azure customers.
At first glance, one more model in Azure’s model catalogue might seem insignificant. However, Mistral AI and Microsoft appear to be discussing a broader partnership. For a start, this agreement could help Mistral AI reach more customers through this new distribution channel.
Microsoft is the main investor in OpenAI’s capped-profit subsidiary, but it also welcomes other AI models on its cloud platform. For instance, Microsoft and Meta have partnered to offer Llama large language models on Azure.
This open collaboration model conveniently keeps Azure customers within Microsoft’s product ecosystem. It might also help when it comes to anticompetitive scrutiny.