Adopting Self-Made Chips to Reduce AI Training Costs

Bloomberg, citing sources familiar with the matter, reported that Ant Group has used domestically manufactured chips, including hardware supplied by Alibaba and Huawei, to train its AI models. Using "Mixture of Experts" (MoE) machine learning techniques, Ant Group achieved results comparable to those from Nvidia's H800 chips while cutting AI model training costs by about 20%.

The report pointed out that while Ant Group still uses Nvidia's chips for AI development, it now increasingly relies on AMD and domestically produced semiconductors as alternatives, reducing its dependence on chips imported from the US. According to the research, training on 1 trillion tokens with high-performance hardware costs about 6.35 million RMB (approximately 880,000 USD), but with optimization techniques this can be reduced to about 5.1 million RMB (approximately 700,000 USD).
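The roughly 20% saving cited earlier can be checked directly from the two cost figures in the report (a quick sanity check, not part of Ant Group's methodology):

```python
# Training cost per 1 trillion tokens, per the figures in the report (RMB).
baseline_cost = 6.35e6   # high-performance hardware, unoptimized
optimized_cost = 5.10e6  # with Ant Group's optimization techniques

# Fractional saving relative to the baseline.
saving = 1 - optimized_cost / baseline_cost
print(f"{saving:.1%}")  # ≈ 19.7%, consistent with the reported ~20% reduction
```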

This research suggests that Ant Group is exploring more cost-effective ways of training AI models, attempting to break through the limitations of expensive GPUs and promote the widespread adoption of AI technology.

The MoE architecture has become a major trend in the AI field, with both Google and Chinese startup DeepSeek using it to improve computational efficiency. The approach boosts training efficiency by splitting the model into specialized sub-networks ("experts") and routing each input to only a few of them, similar to dividing work among specialized teams.
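The routing idea described above can be sketched in a few lines of NumPy. This is a deliberately tiny, illustrative toy (the dimensions, random weights, and `moe_forward` helper are all invented for this sketch), not Ant Group's implementation: a gating network scores the experts, and only the top-k experts actually run, which is where the compute savings come from.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes for illustration only; real MoE models are vastly larger.
d_model, d_hidden, n_experts, top_k = 8, 16, 4, 2

# Each "expert" is a tiny two-layer feed-forward network: relu(x W1) W2.
experts = [
    (rng.standard_normal((d_model, d_hidden)) * 0.1,
     rng.standard_normal((d_hidden, d_model)) * 0.1)
    for _ in range(n_experts)
]
router = rng.standard_normal((d_model, n_experts)) * 0.1  # gating weights

def moe_forward(x):
    """Route one token vector to its top-k experts and mix their outputs."""
    logits = x @ router
    top = np.argsort(logits)[-top_k:]           # indices of the k best-scoring experts
    weights = np.exp(logits[top] - logits[top].max())
    weights /= weights.sum()                    # softmax over the chosen experts only
    out = np.zeros_like(x)
    for w, i in zip(weights, top):
        w1, w2 = experts[i]
        out += w * (np.maximum(x @ w1, 0) @ w2) # only k of n experts are evaluated
    return out

token = rng.standard_normal(d_model)
print(moe_forward(token).shape)  # (8,)
```

Because only `top_k` of the `n_experts` sub-networks run per token, total parameter count can grow without a proportional increase in per-token compute, which is the efficiency argument behind MoE.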

Ant Group has open-sourced its "Ling" series of AI models. Among them, "Ling-Lite" has 16.8 billion parameters, while "Ling-Plus" reaches 290 billion parameters. For comparison, experts estimate that OpenAI's GPT-4.5 has about 1.8 trillion parameters, while DeepSeek-R1 has about 671 billion.

Recent research papers published by Ant Group claim that its AI models outperform those of Meta (Facebook's parent company) on certain benchmark tests, and that both Ling-Lite and Ling-Plus surpassed DeepSeek's models on Chinese-language benchmarks. If verified, these results would be further evidence that China's AI capabilities are steadily improving.

Jensen Huang, CEO of Nvidia, previously stated that enterprises will continue to rely on more efficient GPUs to increase computing power rather than seeking cheaper alternatives. However, Ant Group’s research shows that, in the context of US technology blockades against China, Chinese AI companies are looking for alternatives to develop their technologies at lower costs.

China’s Growing AI Technological Strength

Robert Lea, an analyst at Bloomberg Intelligence, pointed out that Ant Group's research shows China's pace of AI innovation is accelerating and its dependence on US high-end chips is gradually decreasing.

China’s Path to AI Self-Sufficiency

However, the research also indicates that Ant Group still faces stability challenges when training its AI models: small changes in hardware or model structure could cause error rates to spike, suggesting that Chinese-made chips still need further optimization for AI training workloads.
