Mistral AI releases new open-source large model Mixtral MoE 8x7B

2023-12-11

Open-source model startup Mistral AI has released a new LLM (large language model) in the form of a torrent link.

The move stands in stark contrast to Google's Gemini launch this week and has sparked widespread discussion in the community.

A Gemini demo video from Google has drawn heavy criticism over the past 24 hours for being heavily edited and staged.

Mistral, on the other hand, simply released a torrent link pointing to the large file that contains its new model, named MoE 8x7B.

A Reddit post describes Mistral's LLM as a "scaled-down version of GPT-4," built as "a MoE (Mixture of Experts) with 8 experts of 7B parameters each." The post goes on to explain that only 2 experts are used to infer each token, adding: "From leaked information about GPT-4, we can infer that GPT-4 is a MoE model with 8 experts, each having 111B parameters, plus 55B shared attention parameters (166B parameters per model). There, too, only 2 experts are used for each token inference."
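The routing idea described above (8 experts, only the top 2 active per token) can be illustrated with a minimal sketch. This is not Mistral's actual code; the function and weight names are hypothetical, and each "expert" is reduced to a single matrix where a real model would use a feed-forward network:

```python
import numpy as np

def top2_moe_layer(x, gate_w, expert_ws):
    """Sketch of top-2 Mixture-of-Experts routing.

    x:         (tokens, d_model) token activations
    gate_w:    (d_model, n_experts) router weights (hypothetical names)
    expert_ws: one (d_model, d_model) matrix per expert; real experts
               are small FFNs, a single matrix keeps the sketch short
    """
    logits = x @ gate_w                          # router score per (token, expert)
    top2 = np.argsort(logits, axis=-1)[:, -2:]   # indices of the 2 best experts
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        # softmax over just the 2 selected experts' scores
        sel = logits[t, top2[t]]
        w = np.exp(sel - sel.max())
        w /= w.sum()
        # only the 2 chosen experts run for this token; the other 6 are skipped,
        # which is why inference cost is far below the total parameter count
        for weight, e in zip(w, top2[t]):
            out[t] += weight * (x[t] @ expert_ws[e])
    return out

rng = np.random.default_rng(0)
d, n_experts, tokens = 16, 8, 4
x = rng.standard_normal((tokens, d))
gate_w = rng.standard_normal((d, n_experts))
expert_ws = [rng.standard_normal((d, d)) for _ in range(n_experts)]
y = top2_moe_layer(x, gate_w, expert_ws)
print(y.shape)  # (4, 16)
```

The point of the design is the cost profile: all 8 experts' weights must be stored, but each token only pays the compute of 2 of them.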

AI consultant and founder of "Machine & Deep learning Israel" community, Uri Eliabayev, stated that Mistral is "well-known" for this type of release, "without any papers, blogs, code, or press releases." Open-source AI advocate Jay Scambler also commented that this release method is "definitely unusual, but I think that's exactly why it's generating so much discussion."

This style of release has been praised by many in the AI community; entrepreneur George Hotz, for example, commented on it approvingly.

Eric Jang, VP of AI at 1X Technologies and a former robotics research scientist at Google, wrote that Mistral "has become one of my favorite brands in the AI field."

Mistral is a Paris-based startup that reached a $2 billion valuation in a funding round led by Andreessen Horowitz. It is known for its record-breaking $118 million seed round (said to be the largest seed funding in European history) and for its first large language model, Mistral 7B, released in September.