Meta has introduced Llama 4, its latest set of AI models now supporting Meta AI on the web, WhatsApp, Messenger, and Instagram Direct. These models are available for download through Meta or Hugging Face and include Llama 4 Scout, a compact model that can “fit in a single Nvidia H100 GPU,” and Llama 4 Maverick, comparable to GPT-4o and Gemini 2.0 Flash. The company is also in the midst of training Llama 4 Behemoth, which Meta CEO Mark Zuckerberg recently highlighted on Instagram as “the highest performing base model worldwide.”
Meta reports that the Scout model boasts a 10-million-token context window (the amount of text a model can hold in working memory at once) and surpasses Google's Gemma 3 and Gemini 2.0 Flash-Lite models, as well as the open-source Mistral 3.1, across various widely reported benchmarks, all while "fitting in a single Nvidia H100 GPU." The company makes similar claims for the much larger Maverick model against OpenAI's GPT-4o and Google's Gemini 2.0 Flash, asserting that its results match those of DeepSeek-V3 in coding and reasoning tasks while utilizing "less than half the active parameters," the subset of a model's parameters that is actually engaged when processing a given input.
Meanwhile, Llama 4 Behemoth has 288 billion active parameters out of a total of 2 trillion. Meta claims that Behemoth outperforms its rivals, specifically GPT-4.5 and Claude Sonnet 3.7, "in several STEM benchmarks."
For Llama 4, Meta says it switched to a "mixture of experts" (MoE) architecture, an approach that conserves resources by activating only the parts of a model that are needed for a given task. The company says it will share more about its upcoming AI models and products at LlamaCon, which takes place on April 29th.
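To make the MoE idea concrete, here is a minimal, hypothetical sketch of top-k expert routing in plain Python. It is not Meta's implementation; the expert count, the toy linear "experts," and the gating function are all illustrative assumptions. The point is that for each input, a gate scores every expert but only the top-scoring few actually run, which is why a model's "active parameters" can be a small fraction of its total parameters.

```python
# Hypothetical sketch of top-k mixture-of-experts (MoE) routing.
# All names and sizes here are illustrative, not Meta's actual design.
import math
import random

random.seed(0)

DIM = 4          # toy hidden dimension
NUM_EXPERTS = 8  # total experts (these contribute to "total parameters")
TOP_K = 2        # experts actually run per input ("active parameters")

# Toy parameters: one gating weight vector per expert, and a scalar
# transform standing in for each expert's feed-forward block.
gate_weights = [[random.uniform(-1, 1) for _ in range(DIM)]
                for _ in range(NUM_EXPERTS)]
expert_scale = [random.uniform(0.5, 1.5) for _ in range(NUM_EXPERTS)]

def moe_forward(x):
    # 1. Gate: score every expert for this input.
    scores = [sum(w * xi for w, xi in zip(gate_weights[e], x))
              for e in range(NUM_EXPERTS)]
    # 2. Route: keep only the top-k experts; the rest stay inactive,
    #    so their parameters do no work for this input.
    top = sorted(range(NUM_EXPERTS), key=lambda e: scores[e],
                 reverse=True)[:TOP_K]
    # 3. Softmax over the selected scores to weight each expert's output.
    exps = [math.exp(scores[e]) for e in top]
    total = sum(exps)
    weights = [v / total for v in exps]
    # 4. Combine: weighted sum of the chosen experts' outputs.
    out = [0.0] * DIM
    for e, w in zip(top, weights):
        for i in range(DIM):
            out[i] += w * expert_scale[e] * x[i]
    return out, top

output, active = moe_forward([1.0, -0.5, 0.3, 0.8])
print(f"active experts: {sorted(active)} of {NUM_EXPERTS}")
```

In this sketch, only 2 of the 8 experts execute for any given input, mirroring how an MoE model like Behemoth can hold 2 trillion parameters in total while using only 288 billion "active" ones per token.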
As with its past models, Meta calls the Llama 4 collection "open-source," although it has been criticized for its licenses' less-than-open requirements. For instance, the Llama 4 license requires commercial entities with more than 700 million monthly active users to request a license from Meta before using its models, a restriction the Open Source Initiative wrote in 2023 takes it "out of the category of 'Open Source.'"