AI's 'mad cow disease' problem tramples into earnings season

This is The Takeaway from today's Morning Brief, which you can sign up to receive in your inbox every morning along with:

  • The chart of the day

  • What we're watching

  • What we're reading

  • Economic data releases and earnings

It used to be enough to mention AI on an earnings call for Wall Street to celebrate. But a more discerning mood is setting in.

Grand ambitions for AI technologies rest on gargantuan costs, from extreme demands on natural resources to immense hardware investments. Big Tech's enormous valuations look less justified as they butt up against the hard limits of AI development.

If 2024 is the "show me" year, we're still waiting.

Now earnings season is coming, and AI is once again driving a handful of mega stocks. But the latest wave of skepticism suggests the hyped-up returns may never arrive.

The web's collection of content — the material that inspires advanced models to generate contrived images or churn out convincing LinkedIn posts — is itself a finite resource. Even the vastness of the internet ends somewhere.

That's triggered a mad dash among AI companies to find more content: pilfering copyrighted works, transmogrifying videos into text, or even using AI-generated material as training data for AI systems.

But relying on synthetic data degrades the quality and reliability of AI models, as research has shown. That highlights a major limitation in the promise of advanced AI.

Researchers at Rice University likened the danger of training generative models on synthetic material to "feeding cattle with the remains (including brains) of other cattle," drawing an analogy between AI training and mad cow disease.

The explosion in AI tools has already littered the web with synthetic content, which makes up an ever-greater share of the internet. You've probably noticed it gaming search engine results: authorless, synthetic, and ultimately useless articles that grab your click and brief attention as you search for trustworthy, human information.

This, of course, means that existing AI systems have already ingested their own results.

"It really is about brains corrupting future brains," said Richard Baraniuk, professor of electrical and computer engineering at Rice University, who co-authored the paper.

The limits of human-made content are just the latest example of the AI story confronting impassable boundaries. There's an array to choose from.

AI models "are just insatiable in terms of their thirst" for electricity, Rene Haas, the CEO of the chip design company Arm (ARM), said earlier this week.

"By the end of the decade, AI data centers could consume as much as 20% to 25% of US power requirements," Haas told the Wall Street Journal. "That's hardly very sustainable."

And these words are coming from a CEO, not a hater.

His remarks echo a January report from the International Energy Agency, which said a ChatGPT query requires almost 10 times as much electricity as the average Google search. From 2023 levels, the AI industry's power demand is expected to grow at least tenfold by 2026, the agency said.

Other drags on the AI dream are closer to home.

Tech companies are scrambling to reduce their dependence on outside suppliers of AI chips, pouring billions into hardware and infrastructure. Google (GOOG, GOOGL) and Meta (META) unveiled new homegrown chips this week, flashing their costly commitments.

The investments are tickets to prosperity in an AI-led future. But the spending, like the warnings over data and resources, brings these companies closer to having to prove it.


Hamza Shaban is a reporter for Yahoo Finance covering markets and the economy. Follow Hamza on Twitter @hshaban.

