High costs cast shadow over ChatGPT revolution
WASHINGTON
The explosion of generative AI has taken the world by storm, but one question all too rarely comes up: Who can afford it?
OpenAI bled around $540 million last year as it developed ChatGPT and says it needs $100 billion to meet its ambitions.
“We’re going to be the most capital-intensive startup in Silicon Valley history,” OpenAI founder Sam Altman told a U.S. Senate panel recently.
And when Microsoft, which has poured billions of dollars into OpenAI, is asked how much its AI adventure will cost, the company answers with assurances that it is keeping an eye on its bottom line.
Building something even near the scale of what OpenAI, Microsoft or Google have on offer would require an eye-watering investment in state-of-the-art chips and prize-winning researchers.
“People don’t realize that to do a significant amount of AI things like ChatGPT takes huge amounts of processing power. And training those models can cost tens of millions of dollars,” said Jack Gold, an independent analyst.
“How many companies can actually afford to go out and buy 10,000 Nvidia H100 systems that go for tens of thousands of dollars apiece?” asked Gold.
The answer is pretty much no one. In tech, if you can’t build the infrastructure, you rent it, and that is what companies already do on a massive scale by outsourcing their computing needs to Microsoft, Google and Amazon’s AWS.
And with the advent of generative AI, this dependency on cloud computing and tech giants deepens, leaving the same players in the driver’s seat, experts warned.
From Main Street to the Fortune 500, this dependence on AI-amped cloud providers will be an expensive one, and companies and investors are drumming up alternatives to at least reduce the bill.
“AI training, GPT training will become a very important cloud service going forward,” said Spectro Cloud CEO Tenry Fu. “But after training, a company will be able to get their model back for real AI application,” and the dependence on the cloud giants will hopefully be reduced, he added.
Regulators are hoping they can keep up, and not leave the giants in charge and imposing their terms on smaller companies.
But it might be too late, at least when it comes to which companies have the means to provide the groundwork of generative AI.
“It is absolutely true that the number of companies that can train the true frontier models is going to be small just because of the resources required,” Altman told the panel last week.
“And so I think there needs to be incredible scrutiny on us and our competitors,” he added.