“The unanticipated demand to create AI software has outstripped the cloud service capacities of Amazon Web Services (AWS), Microsoft, Google Cloud, and Oracle”
The big challenge in generative AI is the GPU shortage and its cost. It affects big players like OpenAI and Stability AI and small startups alike. That’s partly why the big cloud platforms will continue to be active investors in the space: they get their investment dollars back in the cloud bill. Microsoft’s recent $10 billion investment in OpenAI (much of it reportedly in the form of cloud credits) and Google’s $300M investment in Anthropic are good examples of this.
The costs should come down with time, but for now there’s a mild GPU shortage and the costs are significant. ChatGPT reportedly costs approximately $100,000 per day, or $3 million per month, to run on Microsoft’s Azure cloud, with each generated word costing about $0.0003. Assuming ChatGPT’s exponential growth continues — it has become one of the fastest-growing consumer products ever — the company needs deep pockets to continue providing the service.
In comparison, Google spends about $100 billion a year on infrastructure, which translates to roughly 5 cents for every query we type into the search box. The difference is that Google knows how to make $2 for every $1 it spends, thanks to AdWords. On the monetisation front, OpenAI is expected to make $200M in revenue in 2023 and $1 billion in 2024.
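The back-of-envelope arithmetic behind these figures can be sketched in a few lines. All inputs are the reported estimates quoted above, not official numbers, and the derived values (words per day, queries per year) are simply what those estimates imply:

```python
# Back-of-envelope maths using the reported figures above.
# Inputs are reported estimates, not official disclosures.

chatgpt_daily_cost = 100_000   # USD per day to run ChatGPT (reported)
cost_per_word = 0.0003         # USD per generated word (reported)

# Implied monthly cost and daily word volume
monthly_cost = chatgpt_daily_cost * 30               # ~ $3M per month
words_per_day = chatgpt_daily_cost / cost_per_word   # ~ 333M words per day

google_infra_annual = 100e9    # USD per year on infrastructure (reported)
cost_per_query = 0.05          # USD per search query (reported)

# Implied annual search volume at that per-query cost
implied_queries = google_infra_annual / cost_per_query  # ~ 2 trillion queries/year

print(f"ChatGPT monthly cost: ${monthly_cost:,.0f}")
print(f"Implied words generated/day: {words_per_day:,.0f}")
print(f"Implied Google queries/year: {implied_queries:,.0f}")
```

The point of the sketch is scale: at these reported numbers, ChatGPT would be generating on the order of hundreds of millions of words a day, while Google serves roughly three orders of magnitude more queries — which is why the $2-back-for-every-$1 monetisation engine matters so much.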
The big winner from this rapid increase in demand is Nvidia, the leader in chips designed to excel at AI computations. Demand for Nvidia GPUs has seen its stock nearly double in the first 3.5 months of 2023. But even Nvidia is two to three months behind on fulfilling new orders for cloud server chips. Intel, once the leader in semiconductors, has a lot of catching up to do, especially as the US CHIPS Act tries to bring manufacturing back to the US.
Regardless of cost, it’s clear that generative AI and ChatGPT are rapidly changing consumer behaviour, and the big tech generative AI race is on, as previously covered on VC Cafe. Generally speaking, Google has little to worry about when it comes to search market share, as you can see in the chart below. But Bing, however small, is growing thanks to its Bing AI integration, and reports that Samsung is considering replacing Google search with Bing have put Sundar Pichai in panic mode.
Google’s response in AI has so far been perceived as that of a follower, playing catch-up with Bard and slowly rolling out AI features in GSuite, compared with Microsoft’s breakneck pace of AI integrations in Bing, Office 365, Microsoft Teams, etc. Google is hoping to change this perception with “Project Magi”, a new search experience that would offer users a far more personalised experience than the company’s current service, attempting to anticipate users’ needs, as you can see in the video below.
Amazon has also positioned itself in the generative AI space with Amazon Bedrock, which provides a way to build generative AI-powered apps using pre-trained models from startups including AI21 Labs, Anthropic and Stability AI. Available in a “limited preview”, Bedrock also offers access to Titan FMs (foundation models), a family of models trained in-house by AWS.
The latest entrant to the generative AI space is Elon Musk, who, after signing an open letter calling for a pause on the training of new AI models, is rumoured to be starting a company to rival OpenAI under the newly purchased X.ai domain. Even at this stage in the game, the opportunity is huge. According to Ori Goshen, co-founder and CEO of AI21 Labs, an Israeli startup developing LLMs that rival OpenAI’s:
“Every business will enter the world of generative AI, because every other business in its field will implement solutions from this world.”
For now, the main implication for generative AI startups is that they need either a clear business model, deep pockets, or a deal with one of the leading cloud providers in order to serve customers despite limited GPU capacity and high cloud costs.