This Chinese Institute Has Created The World’s Largest AI Pre-Training Model – With Trillions Of Parameters

Damo Academy, Alibaba's research arm, has officially confirmed that its "Multi-Modality-to-Multi-Modality Multitask Mega-transformer" (M6) artificial intelligence (AI) system has grown from one trillion to ten trillion parameters, surpassing the trillion-parameter models previously released by Google and Microsoft.
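To put that figure in perspective, here is a rough back-of-envelope sketch of what ten trillion parameters implies just for storing the model's weights. The byte sizes are standard numeric precisions used for illustration, not details disclosed by Alibaba.

```python
# Illustrative estimate only: storage needed to hold 10 trillion parameters
# at common numeric precisions (this is not an official Alibaba figure).
PARAMS = 10 * 10**12  # 10 trillion parameters, as reported for M6

for name, bytes_per_param in [("fp32", 4), ("fp16/bf16", 2), ("int8", 1)]:
    terabytes = PARAMS * bytes_per_param / 10**12
    print(f"{name:>9}: ~{terabytes:,.0f} TB just to hold the weights")
```

Even at half precision, the weights alone would occupy roughly 20 TB, which is why models at this scale are trained and served across large clusters of accelerators rather than on a single machine.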

M6 is now the world’s largest AI pre-training model. “It has cognition and creativity beyond traditional AI, is good at drawing, writing, and question answering, and has broad application prospects in many fields such as e-commerce, manufacturing, literature, and art,” InfoQ, a popular Chinese tech magazine, stated.

M6 is a general AI model that can perform multi-modal tasks. Its cognitive and creative capabilities outperform most AI systems available today, and it excels at design, writing, and question answering. According to the academy, the model could be applied in e-commerce, manufacturing, literature and the arts, scientific research, and other fields.

M6 stands out because it has far more parameters, the trainable “neurons” of the network, than other AI systems currently being tested, which Alibaba says allows it to learn in a way closer to the human brain. According to the company, M6 has already been used in more than 40 scenarios, with daily call volumes in the hundreds of millions.
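For readers unfamiliar with the term, “parameters” are simply the trainable weights of a neural network. The toy model below is a minimal sketch (not related to M6's actual architecture) showing how such weights are counted.

```python
import torch.nn as nn

# Toy illustration only: the "parameters" cited for M6 are trainable weights,
# analogous to (but vastly more numerous than) the connections in this tiny model.
toy = nn.Sequential(
    nn.Linear(512, 2048),
    nn.ReLU(),
    nn.Linear(2048, 512),
)

n_params = sum(p.numel() for p in toy.parameters())
print(f"Toy model: {n_params:,} parameters")  # roughly 2.1 million
# M6's reported count is about 10,000,000,000,000 -- millions of times larger.
```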

“Next, we will study the cognitive mechanisms of the brain in depth and strive to raise M6’s cognitive ability to a level close to that of humans. For example, by simulating how humans extract and understand cross-modal knowledge, we will build the underlying framework for general AI algorithms,” said Zhou Jingren, head of the data analytics and intelligence lab at Damo Academy.

“We will also continue to strengthen M6’s creativity in different scenarios so that it delivers greater application value.”
