In recent months, the world’s tech giants have rushed their artificial intelligence platforms into the mainstream market. ChatGPT, Bing, and Bard are the big names competing for users, and they are already being integrated into online search systems.
But there is a bigger problem, one that dwarfs even the $100 billion Google lost from its market value after an error appeared in an advertisement for its AI.
Platforms similar to the popular ChatGPT have impressive content-creation capabilities, in some scenarios rivaling those of humans. These systems use so-called generative algorithms to produce human-like output from the material they were trained on: from just a few lines of text submitted as guidance, they can write thousand-word essays, song lyrics, or movie scripts.
All of this takes seconds, sometimes less.
But there is a thing that is hidden from our eyes: the cost of operating such artificial intelligence.
These costs come from the energy and computational power required to crunch the data. For example, OpenAI Chief Executive Sam Altman recently noted on Twitter that ChatGPT’s computing costs are “eye-watering”, amounting to a few cents or more per conversation. It is safe to assume that similar costs are typical of competing platforms, too.
A few cents may not seem like much. However, Google alone receives around 9 billion search queries every day, or more than 100,000 searches every second. If a generative AI were integrated into Google’s search platform and each search cost just 1 cent, that would cost the company around $90 million per day.
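The arithmetic behind these figures can be checked directly. This sketch uses only the numbers quoted in the text (9 billion daily queries, an assumed 1 cent per AI-assisted search); the inputs are illustrative, not real cost data.

```python
# Back-of-envelope check of the article's figures.
QUERIES_PER_DAY = 9_000_000_000   # ~9 billion Google searches per day
COST_PER_QUERY = 0.01             # assumed $0.01 per AI-assisted search

queries_per_second = QUERIES_PER_DAY / 86_400   # 86,400 seconds in a day
daily_cost = QUERIES_PER_DAY * COST_PER_QUERY

print(f"{queries_per_second:,.0f} searches per second")  # ~104,167
print(f"${daily_cost:,.0f} per day")                     # $90,000,000
```

Both of the article's round numbers (over 100,000 searches per second, roughly $90 million a day) fall out of this simple multiplication.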
Of course, some optimization is possible, because intricate machine learning is not needed to process every single query. But the number is still exceptionally high.
According to Alphabet Chairman John Hennessy, using generative algorithms and so-called large language models is about 10 times more expensive than the technology currently used to sift through online content by keyword. Still, Hennessy is optimistic that these expenses can be brought down.
But analysts predict that even after accounting for the additional revenue the new algorithms are expected to bring in, Google may face several billion dollars in extra costs. A single search query currently costs the company roughly a fifth of a cent, but if the engine were upgraded with ChatGPT-like AI handling half of all queries with answers averaging 50 words, the company’s expenses may rise by as much as $3 to $6 billion.
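Working backwards from the analysts' range gives a feel for what it implies per answer. This is a hedged sketch using only figures quoted in the text; the implied per-answer surcharge is derived here for illustration, not reported by the analysts.

```python
# Implied extra cost per AI-generated answer, derived from the article's
# quoted figures: 9B queries/day, AI handling half, $3B-$6B added per year.
QUERIES_PER_DAY = 9_000_000_000
AI_SHARE = 0.5                    # AI answers half of all queries
DAYS_PER_YEAR = 365

ai_answers_per_year = QUERIES_PER_DAY * DAYS_PER_YEAR * AI_SHARE

for extra_annual_cost in (3e9, 6e9):
    surcharge_cents = extra_annual_cost / ai_answers_per_year * 100
    print(f"${extra_annual_cost / 1e9:.0f}B/year -> "
          f"+{surcharge_cents:.2f} cents per AI answer")
```

The $3–6 billion range works out to an added cost of roughly 0.2 to 0.4 cents per AI-generated answer, on the same order as the fifth of a cent a conventional query costs today.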
But the race is on. Microsoft announced earlier this month that it plans to embed its AI chat technology into its own search engine Bing, a smaller but significant competitor to Google Search, clearly taking aim at Alphabet's 91% market share.
Companies maintain an optimistic outlook, arguing that technology gets cheaper with time and scale. Even so, energy and hardware costs are an obvious factor limiting the pace at which generative chatbot-like algorithms can be rolled out as a full replacement for existing search technology.