Team Eela
Continuing its hot streak of AI advancements, Google has introduced Gemini 1.5, a next-generation model, and its upgrades, like Gemini 1.5 Pro. Even though Gemini 1.0 launched only in December, the new model promises substantial upgrades, such as a longer context window, refined understanding abilities, and enhanced performance.
Google CEO Sundar Pichai expressed his enthusiasm about Gemini 1.5 Pro, stating, “Longer context windows show us the promise of what is possible.” He added, “They will enable new capabilities and help developers build more useful models and applications.”
At the core of Gemini 1.5 lies a new version of the Mixture-of-Experts (MoE) architecture. This enables the model to learn and selectively activate the most relevant pathways within its neural network, thereby increasing efficiency and performance.
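The routing idea behind a Mixture-of-Experts layer can be illustrated with a toy sketch. This is not Google's actual Gemini implementation, whose details are unpublished; the expert count, top-k value, and dimensions below are arbitrary placeholders chosen only to show how a gating network activates a few experts while the rest stay idle.

```python
# Toy Mixture-of-Experts routing sketch (illustrative only, not Gemini's
# real architecture): a gating network scores every expert, and only the
# top-k highest-scoring experts process the input, saving compute.
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 8   # hypothetical number of expert sub-networks
TOP_K = 2         # experts activated per input
DIM = 16          # toy hidden dimension

# Each "expert" here is just a small feed-forward weight matrix.
experts = [rng.standard_normal((DIM, DIM)) for _ in range(NUM_EXPERTS)]
gate_w = rng.standard_normal((DIM, NUM_EXPERTS))

def moe_forward(x):
    """Route input x through only the top-k highest-scoring experts."""
    scores = x @ gate_w                   # one gating score per expert
    top = np.argsort(scores)[-TOP_K:]     # indices of the k best experts
    weights = np.exp(scores[top])
    weights /= weights.sum()              # softmax over the selected experts
    # Only the selected experts run; the others are skipped entirely.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

token = rng.standard_normal(DIM)
out = moe_forward(token)
print(out.shape)  # (16,)
```

The efficiency gain comes from the last line of `moe_forward`: although the model holds `NUM_EXPERTS` sets of weights, each input only pays the compute cost of `TOP_K` of them.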
Gemini 1.5 Pro introduces a paradigm shift in processing capabilities and can run up to one million tokens in production. This massive increase from its predecessor, Gemini 1.0, significantly expands the model’s capacity to absorb information, resulting in more informed and accurate responses.
In a demo, Gemini 1.5 Pro effortlessly processed a 44-minute silent film and then answered a range of questions about it, including multimodal queries. Furthermore, the model excelled across various benchmarks, outperforming its predecessor in 87% of Google’s benchmark evaluations, including the Needle In A Haystack (NIAH) and Machine Translation from One Book (MTOB) assessments. Moreover, Google stated it has conducted extensive evaluations to ensure the safe and responsible use of Gemini 1.5 Pro.
Gemini 1.5 Pro is available in a limited preview at no cost to developers and enterprise customers through AI Studio and Vertex AI. Once the model is ready for wider release, Google plans to introduce 1.5 Pro with pricing tiers ranging from the standard 128,000-token context window up to one million tokens, catering to diverse user needs.