Entering 2025, AI's popularity continues to rise worldwide. The influence of large models now reaches every corner of production, daily life, society and culture, and the financial markets. We can vaguely sense that the AI industry is changing rapidly, its operating logic and supply-demand relationships being integrated and reorganized at speed. But what is the real core of these changes? What is the latest consensus of the global AI industry? The picture still seems somewhat obscured.
To see the new changes in AI technology and industry clearly, the most important thing is to combine the judgments of technology leaders with the trends of the industry itself. On February 11, the World Governments Summit 2025 opened in Dubai, United Arab Emirates. Baidu founder Robin Li, Google CEO Sundar Pichai, and Tesla and xAI founder Elon Musk all shared their latest views and predictions on the AI industry at the summit.
These technology leaders are remarkably unanimous on one point: large models are entering a critical cost-reduction cycle. Robin Li, for example, said: "We live in a very exciting era. In the past, when we talked about Moore's Law, performance would double and cost would halve every 18 months; but today, when we talk about large language models, inference costs can fall by more than 90% every 12 months."
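To put the two rates Robin Li contrasts side by side, here is a rough, purely illustrative sketch. It treats "more than 90% every 12 months" as exactly 90% (a 10x annual reduction), which if anything understates the claim; everything beyond the two quoted rates is just arithmetic for illustration.

```python
# Rough comparison of the two cost-decline rates quoted above.
# Assumption: "more than 90% every 12 months" is taken as exactly 90%
# (a 10x reduction per year), a conservative reading of the claim.

def moores_law_cost(months: float) -> float:
    """Relative cost remaining if cost halves every 18 months."""
    return 0.5 ** (months / 18)

def llm_inference_cost(months: float) -> float:
    """Relative cost remaining if cost drops 90% (10x) every 12 months."""
    return 0.1 ** (months / 12)

for months in (12, 24, 36):
    print(f"after {months:2d} months: "
          f"Moore's Law leaves {moores_law_cost(months):.1%} of the cost, "
          f"a 10x/year decline leaves {llm_inference_cost(months):.2%}")
```

Even under that conservative reading, three years of LLM-style decline leaves about 0.1% of the original cost, versus roughly 25% under the classic Moore's Law schedule.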
Echoing this global consensus on falling model costs, on February 13 Wenxin Yiyan (ERNIE Bot) and ChatGPT, the flagship large language models of East and West, both announced that they would become free to use. To predict where global AI is heading, then, we first need to understand the trend and logic behind the falling cost of large models.
So let us start from this gathering of technology leaders in Dubai, combine it with the latest moves by AI companies such as Baidu and OpenAI, and look closely at why large models are inevitably entering a cost-reduction cycle and what industrial momentum this will generate.
Where will AI go in the period ahead? The answer lies in one phrase: the era of cost reduction.
Let us first make a judgment: any general-purpose technological innovation must pass through four stages: experimentation, high-priced trial, cost reduction, and popularization.
The clearest example is household electricity. For a long time after Edison and other inventors created the electric light, a household generator was considered a necessity. Yet the high cost and safety hazards of home power generation put it beyond the reach of most families. At that stage, innovation had happened, but the cost was too high and application was difficult.
Then came high-voltage transmission and regional power grids. Households no longer needed their own generation systems, and the cost of electricity fell dramatically. Only then did home electricity and home lighting truly become widespread. Today's large language models are at exactly this "power grid" moment of their own cost-reduction cycle.
At the World Governments Summit 2025, UAE Minister of State for Artificial Intelligence Omar Sultan Al Olama held a conversation with Robin Li. Li noted that, looking back over the past few hundred years, most innovations have been about cost reduction, not just in artificial intelligence or even the IT industry: "If you can reduce costs by a certain amount, a certain percentage, then it means your productivity has increased by the same percentage. I think this is almost the essence of innovation. Today, innovation is happening much faster than before."
Over the past cycle, the training and inference costs of large models have been falling rapidly. That does not make the earlier, expensive phase of innovation meaningless. On the contrary, it is precisely the huge prior investment in AI infrastructure that gives subsequent innovation enough foundation for trial, error, and exploration. Likewise, it is because scaling laws were discovered and put into practice, and extremely high-quality large models were trained, that we can now think about reducing cost and optimizing model architecture on top of that work. It is like having lights first and power grids afterwards: the two cycles cannot substitute for each other.
As large language models continue to advance, mainstream AI companies around the world have come to see that cost reduction is feasible.
Google CEO Sundar Pichai, for example, said at the just-concluded Artificial Intelligence Action Summit in Paris that AI technology is progressing rapidly and the drop in cost is especially striking: over the past 18 months, the cost of processing a million tokens has fallen by as much as 97%, from $4 to 13 cents.
A few days earlier, OpenAI CEO Sam Altman wrote on social media: "The cost of using a given level of artificial intelligence drops about 10x every 12 months, and lower prices will bring much more use."
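The two figures quoted above are roughly consistent with each other, which is worth a quick sanity check. The arithmetic below uses only the numbers already cited (Pichai's $4 to 13 cents over 18 months, Altman's 10x per 12 months); the variable names are ours.

```python
# Pichai's figures: $4.00 -> $0.13 per million tokens over 18 months.
old_price, new_price = 4.00, 0.13
drop = 1 - new_price / old_price
print(f"price drop over 18 months: {drop:.1%}")   # ~96.8%, i.e. the ~97% cited

# Altman's rate: 10x cheaper every 12 months.
# Compounded over 18 months that is 10**(18/12) ≈ 31.6x,
# which corresponds to a ~96.8% drop, close to Pichai's number.
factor_18_months = 10 ** (18 / 12)
print(f"implied drop over 18 months at 10x/year: {1 - 1 / factor_18_months:.1%}")
```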
In other words, the world's leading AI companies, Baidu, Google, and OpenAI alike, are all seeing the cost of large models fall sharply year after year. Cheaper AI innovation means foundation models and AI-native applications will develop even faster, just as a single standout app could set off a trend in the Internet era. Next we will see "small effort, big miracles": AI models and AI applications of every kind blooming at once.
This is the inevitable trajectory of large models, and it will push the AI market into a new stage: free use of high-quality models.
As large models enter the cost-reduction cycle, they will bring many surprises: astonishing new models, and mature models moving toward free, inclusive availability. Supported by falling costs, making the most mature, cutting-edge large models free to use is becoming the industry's new consensus.
The development that best supports this view came on February 13, when Wenxin Yiyan and ChatGPT simultaneously announced they would be free to use.
With the iterative upgrades of the Wenxin models and the continued decline in costs, the official Wenxin Yiyan website announced that the service will be completely free from 00:00 on April 1. All PC and app users will be able to use the latest models in the Wenxin series, along with features such as ultra-long document processing, enhanced professional search, advanced AI image generation, and multilingual dialogue.
At the same time, Wenxin Yiyan has launched a deep-search feature, giving the large model stronger reasoning and planning abilities and tool-calling capabilities. It can provide users with expert-level answers, handle tasks across many scenarios, and support multimodal input and output.
Across the ocean, OpenAI shared its latest plans for GPT-4.5 and GPT-5, and announced that the free tier of ChatGPT will be able to chat with GPT-5 without restriction at the standard intelligence setting. Earlier, OpenAI had already announced that ChatGPT Search would be open to everyone without registration.
Google, too, announced earlier that it would open the latest Gemini 2.0 models to everyone, including the Flash, Pro Experimental, and Flash-Lite versions.
It is not hard to see that, driven by the cost-reduction cycle of large models, broad accessibility is becoming the common choice of AI companies worldwide. Next, large models will reach users at lower cost and greater speed, and these ubiquitous, low-barrier capabilities will reveal the application imagination of intelligence changing everything, much as the spread of the Internet once did.
So, facing this wave of AI accessibility and cost reduction, what new trends await enterprises, developers, and individuals? Baidu's actions offer a useful reference blueprint.
As large models enter the era of cost reduction, companies need to adjust their strategic direction in time so that their plans match the coming market demand and user expectations. What opportunities does this era hold? Baidu's moves suggest an answer. In a development cycle where cost reduction and broad accessibility are the main theme, three opportunities matter most:
1. Continue to build AI infrastructure.
Falling model costs do not mean infrastructure investment can stall. On the contrary, lower costs give large models more room to develop and a larger user base, and solid infrastructure is the cornerstone that keeps AI innovation advancing and reaching the public. Robin Li believes that continued investment in chips, data centers, and cloud infrastructure is still needed to build better and smarter next-generation models. From Kunlun chips to the PaddlePaddle deep learning framework, foundation models, and Baidu AI Cloud, Baidu is investing broadly and heavily in full-stack AI infrastructure.
2. Explore AI-native applications.
The large language model is itself a valuable application form, but AI's capabilities go far beyond that. Using large models as a base to explore AI-native application forms and capabilities is the next real highlight of the field. Users can look forward to a rich wave of AI-native applications; for enterprises and developers, channeling large-model capabilities into the application layer has always been the core opportunity.
Baidu has already made progress with AI-native applications: the Baidu app, Baidu Library, and the Wenxin Intelligent Platform have all shown their strengths. Robin Li believes that even at its current level, the large language model has created substantial value across many scenarios, and that attention must also go to value creation at the application layer.
3. Create the next generation of large models.
The continuing decline in training and inference costs makes it possible to build large models with stronger capabilities and better results. The next generation of large models is the vanguard driving the AI industry's evolution and the central axis connecting AI infrastructure with AI applications.
Western AI giants such as OpenAI have already announced plans for their next-generation models. According to a CNBC report on February 12, Baidu plans to release its next-generation AI model, Ernie 5.0, in the second half of this year, with significantly enhanced multimodal capabilities.
In both strategic timing and technical upgrades, Wenxin (Ernie) 5.0 can answer the expectations that every industry holds for the next generation of large models. How to push past the current ceiling of large-model capability during a cost-reduction cycle is the question Wenxin 5.0 will need to answer.
Robin Li said: "Technology is advancing very rapidly. Although we are satisfied with what we have achieved so far, think about where things will be in six months or two years; the situation will have changed a great deal."
With technology and infrastructure advancing rapidly, AI may develop faster than anyone expects. Achieving better results at lower cost, using a small fulcrum to lever miracles of infinite possibility: this may well be the defining backdrop of the intelligent era.