Keeping an AI service running is expensive, and the cost grows by the day. OpenAI is looking for ways to cut it.
According to a report by The Information, OpenAI was spending around $700,000 a day to keep ChatGPT's servers running. That figure has grown now that the service uses GPT-4, which requires more computing power.
Few people realize that artificial intelligence demands resources comparable to cryptocurrency mining. An AI works by performing calculations and making decisions, which means millions of operations per second. That is why it needs enormous processing power, and with it a heavy consumption of electricity.
All this translates into a high cost: the hardware for the servers that run the AI, plus the electricity bill. And that cost keeps climbing: the more the AI improves and the more users it gains, the more hardware and electricity it needs.
ChatGPT's not-so-little expense
With GPT-4 and the growth in users, OpenAI may already be spending over a million dollars a day to maintain ChatGPT's servers, The Information reports, via Futurism.
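To put those figures in perspective, here is a rough annualization of the reported daily costs (simple arithmetic on the numbers from The Information's reporting, not OpenAI's actual accounting):

```python
# Annualize the reported daily server costs (back-of-envelope only;
# the daily figures are the ones cited in The Information's reporting).

daily_cost_gpt3_era = 700_000    # USD per day, earlier reported figure
daily_cost_gpt4_era = 1_000_000  # USD per day, current estimate

print(f"Earlier annual cost:  ${daily_cost_gpt3_era * 365:,}")  # $255,500,000
print(f"Current annual cost:  ${daily_cost_gpt4_era * 365:,}")  # $365,000,000
```

At roughly $365 million a year just for servers, it is easy to see why OpenAI wants cheaper hardware.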
The first reason is that OpenAI relies on NVIDIA graphics processors, which are very powerful but expensive and power-hungry. Fortunately, gamers are not going to be left without graphics cards this time, as happened during the cryptocurrency nightmare, because this AI runs on specialized GPUs built for the purpose.
This is the case of the popular NVIDIA A100 chip, which holds about 95% of this market. According to The Guardian, NVIDIA recently sold tens of thousands of A100 cards to Microsoft to power ChatGPT, as well as 20,000 H100 cards, the A100's successor, to Amazon, and another 16,000 to Oracle.
NVIDIA GPUs are very powerful, but they are expensive and consume a considerable amount of electricity. That is why, according to The Information, OpenAI is looking for cheaper alternatives, and it seems to have found one… at Microsoft itself.
Microsoft has invested tens of billions of dollars in OpenAI: an alternative to buying the company outright that still grants it privileges, such as integrating ChatGPT into Bing and Office.
Microsoft has also spent three years developing an artificial intelligence chip, codenamed Athena, that is cheaper than NVIDIA's. And apparently, employees at both Microsoft and OpenAI are already testing it.
Not only is it cheaper, it also consumes less power, which could save OpenAI a lot of money.
ChatGPT’s creators are also working on shrinking their AI model, which reportedly already handles over a trillion parameters. The bigger the model, the more powerful and realistic it is, but the more power and hardware it needs.
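That link between model size and hardware can be sketched with some rough arithmetic. The numbers below are illustrative assumptions, not figures from OpenAI: a trillion-parameter model stored in 16-bit precision, served from GPUs with 80 GB of memory each (comparable to an A100 80 GB card):

```python
# Back-of-envelope estimate: GPU memory needed just to HOLD a large model.
# Assumptions (illustrative only): 1 trillion parameters, 2 bytes per
# parameter (fp16), GPUs with 80 GB of memory each.

PARAMS = 1_000_000_000_000   # 1 trillion parameters
BYTES_PER_PARAM = 2          # fp16 precision
GPU_MEMORY_GB = 80           # memory per card

model_gb = PARAMS * BYTES_PER_PARAM / 1e9    # model size in gigabytes
gpus_needed = -(-model_gb // GPU_MEMORY_GB)  # ceiling division

print(f"Model weights alone: {model_gb:,.0f} GB")          # 2,000 GB
print(f"Minimum GPUs just to store them: {gpus_needed:.0f}")  # 25
```

And that is only storage: actually serving millions of users in parallel multiplies the hardware required many times over, which is why every reduction in model size translates directly into savings.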
ChatGPT costs OpenAI more than a million dollars a day in hardware and electricity. The solution could lie in a new AI chip created by Microsoft, codenamed Athena. Even so, the relationship with NVIDIA seems secure: the two companies have signed a multi-year collaboration agreement.