OpenAI is considering making its own computer chips for artificial intelligence. Sam Altman, OpenAI's CEO, says the company doesn't have enough GPUs, the specialized processors that act as the "brains" for AI tasks. The shortage is causing problems for its services and costing the company a lot of money.
Every time someone asks ChatGPT, OpenAI's chatbot, a question, it costs the company about 4 cents. That might not sound like much, but ChatGPT has 100 million monthly users. That adds up to a lot of questions and a lot of money.
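To see how quickly those pennies add up, here is a back-of-envelope sketch. The 4-cent cost and 100 million monthly users come from the article; the queries-per-user rate is a purely hypothetical assumption for illustration, since the article doesn't report one.

```python
# Back-of-envelope estimate of ChatGPT serving costs.
# COST_PER_QUERY and MONTHLY_USERS are the article's figures;
# QUERIES_PER_USER_PER_MONTH is a hypothetical assumption.
COST_PER_QUERY = 0.04              # ~4 cents per query
MONTHLY_USERS = 100_000_000        # 100 million monthly users
QUERIES_PER_USER_PER_MONTH = 10    # assumed usage rate (illustrative only)

monthly_queries = MONTHLY_USERS * QUERIES_PER_USER_PER_MONTH
monthly_cost = monthly_queries * COST_PER_QUERY
annual_cost = monthly_cost * 12

print(f"Monthly queries: {monthly_queries:,}")        # 1,000,000,000
print(f"Monthly serving cost: ${monthly_cost:,.0f}")  # $40,000,000
print(f"Annual serving cost: ${annual_cost:,.0f}")    # $480,000,000
```

Even at a modest ten questions per user per month, the assumed numbers put serving costs near half a billion dollars a year, which is why per-query economics dominate the chip discussion.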
Stacy Rasgon of Bernstein Research paints a grim picture. If ChatGPT’s query volume reaches even a tenth of what Google handles, OpenAI would require GPUs worth an eye-watering $48.1 billion. The annual expenditure on chips would be a staggering $16 billion. These numbers make it clear why OpenAI is exploring making its own AI chips.
Currently, Nvidia holds the reins of the AI chip market. Even Microsoft, which backs OpenAI, is trying to make its own chips. It has been at it since 2019 and has a chip, code-named Athena, that OpenAI has tested.
While OpenAI's chip-making plans are still under consideration, the stakes are high. The company faces the monumental task of not just designing efficient chips but also making them cost-effective. Acquiring an established chip-making company could be a shortcut. Either way, it will take time and a lot of money, and there's no guarantee it will work.