Mistral AI, a Paris-based AI startup, has announced Mistral Large, its large language model and its rival to the flagship models of OpenAI and Anthropic. In a blog post, the company described Mistral Large as a “cutting-edge text generation model” with “top-tier reasoning capabilities.”
According to Mistral AI, the model can be used for “complex multilingual reasoning tasks” such as code generation, text transformation, and reading comprehension. The company also launched its own answer to ChatGPT with Le Chat, which is currently only available in beta. Initially, Mistral AI emphasized its open-source focus as its main selling point. Its first model was released under an open-source license, but subsequent, larger models have not been.
Introducing Mistral Large
Like OpenAI, Mistral AI offers Mistral Large via a paid API with usage-based pricing. According to TechCrunch, querying Mistral Large currently costs $8 per million input tokens and $24 per million output tokens. Tokens, the outlet added, represent small chunks of words, usually divided into syllables. So, for instance, “ReadWrite” would be split into “read” and “write” and processed separately by the AI language model.
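To make the usage-based pricing concrete, here is a minimal back-of-envelope cost calculator using the per-million-token rates cited by TechCrunch. The function name and the example token counts are illustrative, not part of any official SDK:

```python
# Illustrative cost calculator for the per-token pricing quoted above.
# Rates are the figures cited by TechCrunch (USD per million tokens).
INPUT_RATE = 8.0    # $8 per million input tokens
OUTPUT_RATE = 24.0  # $24 per million output tokens

def query_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate the cost in USD of a single API call."""
    return (input_tokens / 1_000_000) * INPUT_RATE \
         + (output_tokens / 1_000_000) * OUTPUT_RATE

# A hypothetical prompt of 500 tokens with a 1,500-token reply:
print(round(query_cost(500, 1_500), 4))  # 0.04
```

In other words, at these rates a typical short exchange costs a fraction of a cent, while generating a full million tokens of output would run $24.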
Also according to the outlet, Mistral Large works by default with a context window of 32,000 tokens. This translates into over 20,000 English words, and the model supports several other European languages, including Italian, French, German, and Spanish.
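The token-to-word figures above imply an average of roughly 1.6 tokens per English word. A quick sketch of that arithmetic, with the ratio being an assumption derived from those two numbers rather than anything Mistral AI has published:

```python
# Back-of-envelope: a 32,000-token window equating to roughly 20,000
# English words implies about 1.6 tokens per word on average.
TOKENS_PER_WORD = 1.6  # illustrative ratio implied by the cited figures

def approx_words(context_tokens: int) -> int:
    """Convert a token budget into an approximate English word count."""
    return int(context_tokens / TOKENS_PER_WORD)

print(approx_words(32_000))  # 20000
```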
But that’s not all. As mentioned, Mistral AI is launching Le Chat, its own version of ChatGPT. It’s available at chat.mistral.ai and is currently in beta.
Specifically, users can choose between three models: Mistral Large, Mistral Small, and Mistral Next, a prototype that, according to TechCrunch, is “designed to be brief and concise.” For now, at least, Le Chat is free to use, though that may change in the future.
Featured Image: Photo by Possessed Photography on Unsplash