Report

AI Changes Big and Small Computing
At a Glance
  • AI’s voracious appetite for computing power will spur growth in data centers, from today’s 50–200 megawatts to more than a gigawatt.
  • AI will also transform edge computing, as small, domain-specific language models will support tasks requiring lower latency.
  • These changes will strain already-stressed supply chains as leaders vie for resources, especially labor and electricity.
  • As data centers and edge computing evolve, enterprises may need to reassess market positions and revisit strategic ambitions.

This article is part of Bain's 2024 Technology Report.

AI’s need for computing power will radically expand the scale of large data centers over the next five to 10 years. Today, big data centers run by hyperscale cloud service providers range from 50 megawatts to more than 200 megawatts. The massive loads demanded by AI will lead these companies to explore data centers of 1 gigawatt and more. That will have huge implications for the ecosystems that support these centers (including infrastructure engineering, power production, and cooling) and will affect market valuations. The architectural requirements for achieving the necessary computing, electrical power, and cooling density in gigawatt data centers will influence the design of many smaller data centers (see Figure 1).

Figure 1
Data center requirements will rise significantly to meet AI’s computing demands
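
To make the gigawatt scale concrete, here is a back-of-envelope sketch of facility power for a large AI accelerator fleet. The fleet size, per-accelerator power, and power usage effectiveness (PUE) below are illustrative assumptions, not figures from the report.

```python
# Back-of-envelope estimate of facility power for an AI data center.
# All input values are illustrative assumptions.

accelerators = 500_000      # hypothetical fleet of AI accelerators
kw_per_accelerator = 1.2    # GPU plus its share of CPU, memory, and networking
pue = 1.3                   # power usage effectiveness (cooling, power conversion)

it_load_mw = accelerators * kw_per_accelerator / 1_000  # 600 MW of IT load
facility_mw = it_load_mw * pue                          # 780 MW at the meter

print(f"IT load:       {it_load_mw:,.0f} MW")
print(f"Facility load: {facility_mw:,.0f} MW")  # approaching a gigawatt
```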

The ubiquity of AI will also change the nature of edge computing. Domain-specific language models (smaller, simpler, and optimized for specific purposes) will be necessary to handle workloads that demand faster response and lower latency, or that can rely on a simpler model because of their narrow focus. Innovation at the edge will extend to the form factor of user devices, which will also change to meet the needs of people engaging with AI.
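
To illustrate why such models suit the edge, the sketch below compares approximate weight-memory footprints under 4-bit quantization. The parameter counts are hypothetical, and activations and caches are ignored.

```python
# Approximate weight-storage footprint of a language model, assuming
# quantized weights. Parameter counts are hypothetical examples.

def weight_memory_gb(params_billions: float, bits_per_weight: int = 4) -> float:
    """Gigabytes needed to store the weights alone."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

print(f"General-purpose model (~400B params): {weight_memory_gb(400):.0f} GB")  # 200 GB: data center class
print(f"Domain-specific model (~3B params):   {weight_memory_gb(3):.1f} GB")    # 1.5 GB: fits on edge devices
```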

The implications of these changes will be transformative across a number of critical dimensions, including the speed of technology development, sector leadership, power generation and consumption, construction and industrial supply chains, environmental considerations, market economics, national security interests, and financing and investment. To remain in the top tier of the market, leaders will need to make unprecedented levels of investment in technology infrastructure. Large data centers currently cost between $1 billion and $4 billion to build; five years from now, data centers could cost between $10 billion and $25 billion.
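
As a rough sanity check, those two cost ranges are consistent if build cost scales with power capacity. The per-megawatt figure below is derived from the report’s ranges, not stated in it.

```python
# Cost-scaling check: if $4B buys a 200 MW facility today, a 1 GW facility
# at the same cost per megawatt would run about $20B. Derived, not reported.

cost_today_musd = 4_000      # $4B (top of today's range), in millions of dollars
capacity_today_mw = 200      # top of today's capacity range

musd_per_mw = cost_today_musd / capacity_today_mw  # ~$20M per MW
gigawatt_cost_busd = musd_per_mw * 1_000 / 1_000   # ~$20B for 1,000 MW

print(f"Implied cost per MW: ${musd_per_mw:.0f}M")
print(f"Implied 1 GW build:  ${gigawatt_cost_busd:.0f}B (within $10B-$25B)")
```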

Strain on resources

The power demands and price tags of these large data centers will impose limits on how many can be built and how quickly. The scramble to acquire AI resources is already creating extreme competition at the high end of the market, and growing data center requirements will further strain capabilities.

Power consumption is one critical example. Utilities are already fielding requests from hyperscaler customers to significantly expand electrical capacity over the next five years. Those needs will compete with rising demand from electric vehicles and the re-shoring of manufacturing, stressing the electric grid. Electricity demand has been essentially flat for the last 15 to 20 years, but investments to expand and strengthen the grid and to add new power sources (including on-site generation and renewables) will need to increase significantly.

Infrastructure providers and technology supply chains, including networking, memory, and storage, are also investing to meet the demands for high-performance compute from hyperscalers, digital service companies, and enterprises. Large data centers will push the limits and unleash innovation in physical design, advanced liquid cooling, silicon architecture, and highly efficient hardware and software co-design to support the rise of AI.

Large data centers are major construction efforts, requiring five years or more to complete. Demand for construction workers and specialized tradespeople (as many as 6,000 to 7,000 at peak levels) will strain the labor pool, and shortages in the electrical and cooling trades may be particularly acute. Many projects occurring at once will stress the entire supply chain, from laying cables to installing backup generators.

Innovation at the edge

As companies weigh the trade-offs between cloud and edge computing for AI, deciding where to handle inferencing is critical. One consideration is how narrowly to focus on specific domains and tasks: better-curated, more focused data can be used to build targeted models that reduce the compute infrastructure burden.

Another issue is how to move more computing power closer to the edge for AI in environments with low tolerance for latency, such as autonomous driving. Smaller models, and specialized compute capable of running them at the edge, are important steps in this direction. Meanwhile, the industry is rapidly developing new form factors for the edge, including edge AI servers, AI PCs, robots, speakers, and wearables.
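
A simple latency budget shows why such workloads favor the edge. All millisecond values below are hypothetical assumptions for a latency-sensitive task.

```python
# Illustrative latency budget: cloud inference versus on-device inference.
# All timing values are hypothetical assumptions.

deadline_ms = 50       # e.g., a perception loop in autonomous driving
cloud_rtt_ms = 40      # network round trip to a regional data center
cloud_infer_ms = 30    # large model on data center accelerators
edge_infer_ms = 25     # small domain-specific model on local hardware

for path, total_ms in [("Cloud", cloud_rtt_ms + cloud_infer_ms), ("Edge ", edge_infer_ms)]:
    verdict = "meets" if total_ms <= deadline_ms else "misses"
    print(f"{path} path: {total_ms} ms ({verdict} the {deadline_ms} ms deadline)")
```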

Preparations for expansion

The changing nature of data centers and edge computing increases the likelihood of AI reshuffling the technology sector and establishing a new order for the next era. Enterprises across the sector should be examining their market position and rethinking strategic ambitions to ensure they remain competitive in their chosen domains.

  • Cloud and data center service providers. The overriding challenge for large players at this end of the market will be to find ways for their AI capabilities to meet the future demands of their customers. Providers will need to decide what to deliver as a service and what to provide as enabling technologies at the industry level. Their efforts will also center on accelerating model development and working through the supply chain to construct large and distributed data centers. This will require the ability to refocus on compelling opportunities, build capabilities rapidly, and form partnerships that strengthen the platform. Meta, for example, is competing with OpenAI, Alphabet, and others to secure a leadership role in large language models. To support these ambitions, Meta has massively increased the scale of its compute capacity over the past two years. Meta has also released Llama as an open-source language model, to serve as an enabler in the broader ecosystem.
  • Infrastructure providers. AI workloads require more specialization than prior generations of compute. Companies that design and manufacture servers, networks, storage, cooling, power, cabling, and all the other elements that go into building a data center will need to design their products to support AI. They will develop at-scale solutions that optimize compute and AI software performance. These companies also play significant roles in delivering infrastructure and services to customers, and accelerating AI’s time to market is an important opportunity for them.
  • Software providers will continue infusing AI into their core products to remain competitive. Increasingly, their business will need to focus on capturing and interpreting data insights while optimizing language models to deliver better (and faster) outcomes for customers. These aspects of their business will complement each other as software vendors build up their capabilities to augment the skills of their customers’ workforce.
  • Edge device makers will find ways to capitalize on innovation across the ecosystem, testing new form factors and interfaces, and using AI to increase personalization across devices. Sorting out users’ privacy preferences will be critical to boosting adoption rates.
  • Data center supply chain providers have a formative opportunity to reshape their roles in the market as mega centers proliferate and edge computing evolves. These players will focus on building capacity to scale and developing meaningful partnerships with engineering firms that can help meet the challenges of large data centers and more sophisticated edge computing.

As hyperscalers and other large companies plan for the large data centers necessary to accommodate AI’s needs, additional factors will also require consideration. Paramount among these may be the investment requirements, as companies compete for funding of many massive projects at once. Stresses on the power grid are another area where companies have limited direct control. They may also have to manage the environmental implications of expanding data centers and electricity usage, including the effect on their carbon footprints and emission-reduction promises. The challenges are broad and complex, but as the global race to win in AI heats up, no company in this ecosystem can afford to stand by and wait; the time to act is now.
