Meituan Open-Sources AI Model to Challenge Alibaba and DeepSeek

Summary
– LongCat-Flash-Chat has 560 billion parameters and uses a Mixture-of-Experts (MoE) architecture.
– The MoE architecture divides the model into specialized sub-networks called “experts” that handle subsets of input data.
– Meituan’s release of this model is part of its effort to build an AI business after acquiring Light Year.
– Light Year was established by a co-founder of Meituan, the food delivery services giant.
– Chinese AI companies are narrowing the gap with US peers by open-sourcing models for third-party use and modification.
Meituan has unveiled its new open-source artificial intelligence model, signaling a direct challenge to established players like Alibaba and DeepSeek in China’s rapidly advancing AI sector. The move underscores the company’s strategic push to build a competitive AI business following its acquisition of Light Year, a firm founded by one of Meituan’s own co-founders.
The newly released model, named LongCat-Flash-Chat, has 560 billion total parameters, the variables that determine how the model processes and generates information. It employs a Mixture-of-Experts (MoE) architecture, a design that splits the network into specialized sub-networks called “experts”. A routing mechanism sends each piece of input to only a small number of experts, so just a fraction of the total parameters is active for any given input, letting the model handle complex tasks more efficiently.
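To illustrate the general idea, the sketch below shows how top-k expert routing typically works: a small router scores every expert for each input token and forwards the token to only the best-scoring few. The names and sizes used here (NaiveMoELayer, num_experts, top_k) are illustrative assumptions for a generic MoE layer, not LongCat-Flash-Chat's actual code or configuration.

```python
# Minimal sketch of top-k Mixture-of-Experts routing (illustrative only;
# not LongCat-Flash-Chat's architecture or hyperparameters).
import torch
import torch.nn as nn
import torch.nn.functional as F


class NaiveMoELayer(nn.Module):
    """A router picks a few experts per token; only those experts run."""

    def __init__(self, d_model: int, d_hidden: int, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Each "expert" is an ordinary feed-forward sub-network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(), nn.Linear(d_hidden, d_model))
            for _ in range(num_experts)
        )
        # The router produces a score for every expert for every token.
        self.router = nn.Linear(d_model, num_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_tokens, d_model)
        scores = self.router(x)                             # (tokens, experts)
        weights, indices = scores.topk(self.top_k, dim=-1)  # keep the best k experts per token
        weights = F.softmax(weights, dim=-1)                 # normalise their mixing weights

        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[:, slot] == e                 # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out


if __name__ == "__main__":
    layer = NaiveMoELayer(d_model=64, d_hidden=256, num_experts=8, top_k=2)
    tokens = torch.randn(10, 64)
    print(layer(tokens).shape)  # torch.Size([10, 64]); only 2 of 8 experts ran per token
```

Because only top_k of the experts run for each token, the compute per token stays close to that of a much smaller dense model even though the total parameter count is very large, which is the efficiency argument behind MoE designs in general.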
This development highlights a broader trend among Chinese tech firms, which are increasingly leveraging open-source strategies to accelerate innovation and narrow the technological gap with leading US companies. By making their models and code available for third-party use, modification, and distribution, companies like Meituan are fostering a more collaborative and transparent ecosystem for AI development.
(Source: South China Morning Post)