AI EDUCATION: What Is a Data Center?


Each week we find a new topic for our readers to learn about in our AI Education column. 

This week in AI Education, we’re going to visit the now-ubiquitous topic of data centers. Data centers are not just a decades-old computing concept; they have been in practical use for decades as well. 

However, the continued proliferation and evolution of artificial intelligence is inextricably tied to data centers. Behind every deployment of artificial intelligence is a large amount of data and computing power, and that data and computing power most often resides in a data center. 

So Just What Is a Data Center? 

A data center, according to Cisco Systems, “is a physical facility that organizations use to house their critical applications and data.” In other words, it is a place that houses computer systems, information storage and telecommunications capabilities to link that computing power and information to users.  

Data centers exist to supply users—usually workers in one or more businesses—with shared applications and data. Data centers may be housed on-site at a business, within their own building or groups of buildings nearby, or may be a geographically dispersed network of facilities. 

A data center includes routers, switches, firewalls, storage systems, servers, and application-delivery controllers, according to Cisco. They also may include large and powerful computers like mainframes, racks for storage devices, cooling systems, redundant and continuous power and water supply, backup generators, security systems, fire suppression, and systems to control humidity and static electricity.  

Why Are We Talking Data Centers? 

News about data centers is always on the periphery of artificial intelligence news, and every week there seems to be at least something new happening on the data center front—this week was no different. For one thing, several large new data centers opened or were announced. 

Edged Energy opened Edged Atlanta, a data center designed for artificial intelligence workloads, the first in what will become a 168-megawatt “high tech campus” in the Atlanta area. Novva Data Centers announced plans to build a $3 billion, 300-megawatt data center in the Mesa, Ariz. area. Overseas, Singtel and Hitachi announced a partnership to develop more sustainable, next-generation data centers in Japan and potentially the wider Asia Pacific region.  

Also of interest, the Texas Advanced Computing Center at the University of Texas at Austin is partnering with Sabey Data Centers, an offsite Austin-based data center provider, as a colocation partner for a new academic supercomputer funded in part by the National Science Foundation. 

Different Types of Data Centers 

IBM classifies data centers into a few different types. An enterprise data center, according to IBM, hosts all of a single company’s IT infrastructure on-premises. This is the traditional model for handling a business’s computational and storage needs, and it remains the easiest model for highly regulated industries to meet their compliance requirements. 

A public cloud data center houses IT infrastructure for shared use by multiple customers via an internet connection, according to IBM. Large cloud data center access is offered by big technology companies like Amazon (Amazon Web Services, or AWS), Google (Google Cloud Platform), Microsoft (Azure), IBM (IBM Cloud) and Oracle (Oracle Cloud Infrastructure). Public cloud data centers may offer hyperscale data centers in a centralized location, edge data centers located closest to customers and their clients, or a blend of the two. 

IBM defines two options for organizations that lack the space or expertise to manage their IT infrastructure on premises but can’t, or prefer not to, expose their IT infrastructure to a public cloud center: colocation facilities and managed data centers. In a colocation facility, the client company leases space to host IT hardware that it owns. Colocation facilities can offer management and monitoring services on behalf of client companies, but often do not. In a managed data center, the client company leases hardware from the data center provider, which handles management, administration and monitoring on the client’s behalf. 

How We Got Here 

Data centers are really just another iteration of distributed computing, which dates back to some of the earliest electronic computers of the 1940s. Those huge machines required their own room or facility, with distinct power, cooling, security and design needs. Rather than sending everyone who needed computing time to that facility, it became easier to use terminals that could access the centralized computing power from a separate location. As smaller and personal computers emerged and smaller enterprises adopted them, networking equipment and servers continued to be stored in a dedicated room, which became the data center. 

With the growth of the internet, it became desirable for many institutions and companies to be able to maintain continuous connections and operations—but few had the resources to install and store the necessary equipment, which led to the use of off-site data centers.  

Modern data centers, according to Cisco, are designed to support virtual networks in which data may be “connected across multiple data centers, the edge and public and private clouds.” 

What Is The Link Between Data Centers and AI? 

There’s a natural give-and-take between artificial intelligence and data centers. In some ways, AI will augment the power of the data center and distributed computing. In other ways, AI is going to make tremendous demands on existing infrastructure, and a lack of data infrastructure may throttle the future development of AI. 

An AI data center, according to data center provider Sunbird, can refer either to a facility designed to meet the computational demands of artificial intelligence or to a separate, purpose-built class of facilities. 

With its appetite for data and computing power, AI has accelerated the demand for data centers, says EY, and demand for power and capacity has the potential to become more volatile. Global data creation is expected to triple in the five years between 2021 and 2025, and market demand for data center power is expected to double between 2022 and 2030.  
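As a back-of-the-envelope check, the annual growth rates implied by those two figures can be computed directly. This is only a sketch: the exact year spans are taken from the column as written, and the source figures are projections, not measurements.

```python
# Back-of-the-envelope sketch: the compound annual growth rate (CAGR)
# implied by a total growth factor over n years is
#   CAGR = factor ** (1 / n) - 1

def implied_cagr(factor: float, years: int) -> float:
    """Annual growth rate implied by a total growth factor over `years` years."""
    return factor ** (1 / years) - 1

# Data creation tripling over the five years the column counts (2021-2025):
print(f"Data creation: ~{implied_cagr(3.0, 5):.1%} per year")   # ~24.6% per year

# Data center power demand doubling over the eight years 2022-2030:
print(f"Power demand:  ~{implied_cagr(2.0, 8):.1%} per year")   # ~9.1% per year
```

Even a steady doubling over eight years, in other words, means roughly 9% compounding growth in power demand every single year.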

Sunbird believes that AI will ultimately change the way data centers are designed, with denser concentrations of hardware, new cooling methods and new networking infrastructure to improve hardware performance. While AI is making for more efficient and powerful data centers, those efficiencies are being outpaced by rising demand, according to EY. 

What Might the Future Bring? 

The data center power problem cannot be overstated. According to Goldman Sachs, a ChatGPT query requires nearly 10 times as much electricity to process as a Google search. The bank estimates that data center power demand will grow 160% between this year and 2030. Data centers presently consume 1-2% of generated power globally; by the end of this decade, Goldman Sachs believes that figure will likely rise to 3-4% of generated electricity, doubling the carbon dioxide emissions attributable to data centers. In the U.S., data centers may come to account for 4.6% to 9.1% of generated electricity, according to the Electric Power Research Institute. 
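For a sense of scale, the Goldman Sachs projection can be translated into an implied annual growth rate. A quick sketch, with one labeled assumption: the source says only “this year” to 2030, so the six-year span below is an assumption, not a figure from the report.

```python
# Rough arithmetic on the Goldman Sachs projection above: a 160% increase
# means 2030 demand is 2.6x today's level. The six-year span is an
# assumption (it treats "this year" as 2024).
growth_pct = 160
factor = 1 + growth_pct / 100          # total multiple: 2.6x
years = 6                              # assumed span, 2024 -> 2030

annual = factor ** (1 / years) - 1     # implied compound annual growth
print(f"Total multiple: {factor:.1f}x")          # 2.6x
print(f"Implied annual growth: ~{annual:.1%}")   # ~17.3% per year
```

Under that assumption, demand would have to compound at roughly 17% per year, which is why grid capacity, not chips, is so often cited as the binding constraint.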

This is occurring as AI adoption accelerates, pushing the demands on the power grid higher. Solutions include using more energy-efficient cooling techniques, modernizing the grid and adopting alternative power sources, including renewable energy and small modular nuclear reactors, according to JLL. 

However, the real problem is that there may not be enough hardware, power, space or human resources to meet the demands of artificial intelligence. Data centers—and their needs—place a hard limit on AI’s ability to grow and proliferate.