Jefferies favours China's AI strategy over US, citing energy advantage and open-source model


New Delhi [India], October 31 (ANI): Jefferies, in its latest report, highlighted that it favours the Chinese approach to artificial intelligence (AI) over the US approach, citing structural advantages in China's energy capacity and the open-source model driving innovation.

The report said that China's open-source approach to AI is expected to lead to an "inference boom" as entrepreneurs develop targeted and cost-effective AI use cases.

"The open-source model should lead to an inference boom as entrepreneurs come up with targeted cost-effective use cases for AI," it said.

Jefferies also pointed out that China holds a massive advantage over the United States when it comes to access to cheap and abundant energy, a critical factor in powering AI data centers and computational infrastructure.

The report noted that China's power generation capacity increased by 426 GW in 2024, while the US's net addition to electricity generation capacity -- new installations minus retirements -- was only 30 GW.

The report stated it "continues to favour the Chinese approach to AI over the US. China has a massive advantage in terms of its access to almost unlimited cheap energy."

On the investment side, the report maintained its base case that the ongoing AI capital expenditure (capex) boom will likely lead to a phase of massive over-investment in data centers and related infrastructure.

This over-investment, Jefferies said, is being driven by a fear among major technology players of being disrupted.

"AI capex mania will culminate in massive over-investment in data centers and the like as the Big Tech players feel compelled to participate for fear of being disrupted," the report stated.

At the same time, the report drew attention to the demand side of AI, which it said is being fueled by growing belief in the "S-curve" pattern of consumer demand.

This belief, Jefferies noted, is motivating significant investments and growing collaborations between AI leaders such as OpenAI and technology giants including Nvidia, Broadcom, and Oracle.

However, the report cautioned that it remains unclear who will ultimately emerge as the winner in the global race to build large language models (LLMs).

It observed that no mass-market "killer app" has yet emerged that could define the practical application of AI.

Jefferies also highlighted that when the current AI investment boom eventually cools, it could result in a sharp drop in the cost of "inference" -- the process of running AI models -- much like what happened after the Dotcom bust.

During that period, the excess capacity in fiber optics led to a steep decline in broadband costs, triggering an explosion in e-commerce.

"The upside of the seemingly inevitable over-investment bust when it happens, in terms of the amount being spent on chips and AI data centers, is that the costs of so-called 'inference' should collapse and demand should surge," Jefferies said.

The report also noted a key difference between the current AI boom and the Dotcom era: AI chips have a much shorter shelf life of around 3-4 years, compared with the 25-year lifespan of fiber optic cables, implying that the pace of technological obsolescence in AI is far faster. (ANI)

(This content is sourced from a syndicated feed and is published as received. The Tribune assumes no responsibility or liability for its accuracy, completeness, or content.)
