StepFun Releases Preview of Step-2 Trillion-Parameter MoE Language Model

2024-03-25

At the 2024 Global Developer Pioneer Conference, held in Shanghai on March 23, the much-anticipated foundation-model startup StepFun made its official debut. Its self-developed Step-1V trillion-parameter multimodal model topped the multimodal leaderboard of OpenCompass ("Sinan"), an authoritative large-model evaluation platform, with performance comparable to GPT-4V.

At the conference, Dr. Jiang Daxin, StepFun's founder and CEO, announced a preview version of the Step-2 trillion-parameter MoE language model. Step-2 adopts a Mixture-of-Experts (MoE) architecture, is aimed at the exploration of deeper intelligence, and has opened API access for trial use by selected partners.

Scaling from hundred-billion-parameter to trillion-parameter models poses significant challenges in computing power, systems, data, and algorithms. By overcoming these difficulties and training a trillion-parameter model, StepFun has demonstrated both its core technical capability and its determination to catch up with OpenAI, the global leader in general artificial intelligence. This progress marks a solid step forward for StepFun in the field of artificial intelligence and lays the groundwork for its future development.
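The article does not disclose Step-2's internals, but the general idea behind an MoE (Mixture-of-Experts) layer can be sketched as follows. This is a minimal, generic illustration, not StepFun's actual design: a router scores each token against a set of expert networks, and only the top-k experts run for that token, so the model can hold trillions of parameters while activating only a fraction of them per token. The function name `moe_forward` and all shapes here are illustrative assumptions.

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=2):
    """Generic top-k MoE layer sketch (illustrative, not StepFun's design).

    x:       (tokens, d) input activations
    gate_w:  (d, n_experts) router weights
    experts: list of (d, d) weight matrices, one per expert
    """
    logits = x @ gate_w                            # router scores: (tokens, n_experts)
    topk = np.argsort(logits, axis=1)[:, -k:]      # indices of the k best experts per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        sel = logits[t, topk[t]]
        weights = np.exp(sel - sel.max())
        weights /= weights.sum()                   # softmax over the selected experts only
        for w, e in zip(weights, topk[t]):
            out[t] += w * (x[t] @ experts[e])      # only k of n_experts run per token
    return out

# Toy usage: 4 tokens, hidden size 8, 4 experts, top-2 routing.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
gate_w = rng.normal(size=(8, 4))
experts = [rng.normal(size=(8, 8)) for _ in range(4)]
y = moe_forward(x, gate_w, experts, k=2)
```

The key property this sketch shows is sparse activation: total parameter count grows with the number of experts, while per-token compute grows only with k.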