AI has evolved from predictive AI to generative AI (GenAI) and multi-modal GenAI over a relatively short period of time.
AI has already been adopted in many areas, and GenAI is pushing the envelope further, creating new possibilities. Use cases range from disease diagnosis and assisted application development (code generation) to marketing content creation and demand forecasting for retailers. Against this backdrop, GlobalData expects global enterprise spend on GenAI to grow rapidly, from $2.4bn in 2023 to $33bn by 2027. Amid the excitement, however, there are concerns. One area AI technology providers have not been highlighting is the physical facilities required to host the hardware for AI workloads.
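As a rough illustration only (not part of GlobalData's methodology), the growth rate implied by those two spend figures can be worked out directly; the short Python sketch below shows the arithmetic, with the compound-growth framing being an assumption made for the calculation.

    # Back-of-the-envelope check of the growth rate implied by the GlobalData
    # forecast cited above ($2.4bn in 2023 rising to $33bn by 2027).
    # The compound-growth framing is an assumption for illustration only.
    spend_2023_bn = 2.4
    spend_2027_bn = 33.0
    years = 2027 - 2023

    implied_cagr = (spend_2027_bn / spend_2023_bn) ** (1 / years) - 1
    print(f"Implied compound annual growth rate: {implied_cagr:.0%}")  # roughly 93% a year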
Surge in power for GenAI
The processing power required for GenAI workloads has pushed the semiconductor industry to develop faster, AI-optimised silicon. Training a large language model (LLM), such as the one behind ChatGPT, can take a very long time even with multiple GPUs, and years on a single GPU. The industry is responding with solutions such as NVIDIA's H200 processors, Google's TPU architecture, and Amazon's ARM-based Graviton CPU series.
However, this hardware consumes more power and requires more efficient cooling. Most LLM development and AI model training is done in data centres, which is driving demand for data centre space and changing how those facilities are designed.
Yet building new facilities in key data centre hubs will become more difficult because of power requirements. Growing energy demand is a major concern for countries that have committed to carbon emission targets. Singapore, for example, imposed a four-year moratorium on new data centre development in 2019 because their high energy consumption was undermining its sustainability goals. While the moratorium was lifted in mid-2023, Singapore has since been more selective in approving new data centres.
Lack of capacity in power and cooling
In London, it has also been reported that in some boroughs to the west of the city a lack of electrical capacity, driven partly by data centre demand, has held up approvals for new housing developments. Power supply and sustainability considerations are pushing the data centre industry to build more sustainable facilities through self-generation using fuel cells, renewables, and batteries. The International Energy Agency (IEA) estimates that data centres consumed around 460 terawatt hours (TWh) of electricity globally in 2022 and expects that figure to double by 2026.
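For context, a doubling from the IEA's 2022 estimate implies roughly 920TWh by 2026. The sketch below lays out that projection, assuming smooth compound growth between the two endpoints purely for illustration; the intermediate yearly values are not IEA estimates.

    # Illustrative projection from the IEA figures cited above: an estimated
    # 460TWh of global data centre electricity consumption in 2022, doubling
    # by 2026. Smooth compound growth between those two points is assumed
    # only to sketch a year-by-year path; the interim values are not IEA data.
    base_year, base_twh = 2022, 460
    target_year, target_twh = 2026, 2 * 460

    annual_growth = (target_twh / base_twh) ** (1 / (target_year - base_year)) - 1
    for year in range(base_year, target_year + 1):
        print(f"{year}: ~{base_twh * (1 + annual_growth) ** (year - base_year):.0f} TWh")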
Data centres packed with high-density servers will also require new systems to remove heat. Existing systems, mostly based on air cooling, have limitations. According to Vertiv, legacy facilities are ill-equipped to support widespread deployment of the high-density computing required for AI, with many lacking the infrastructure for more efficient liquid cooling.
Companies will need to retrofit their data centres, which requires additional investment. However, this is also an opportunity to become more sustainable by adopting more efficient cooling systems. While enterprises are keen to harness the power of GenAI, they need to review their data centre requirements to pave the way for innovation at speed and scale.