Data Centers and AI: The New Challenge

21/05/2025

By María Gayo, Communications Manager at LACNIC

Artificial intelligence is redefining the role of data centers in our region. As AI adoption accelerates, it brings new demands: higher energy consumption, advanced cooling systems, low-latency connectivity, and scalability.
At LACNIC 43, industry leaders addressed these challenges during the panel “Data Centers in the Age of AI,” led by Tomás Lynch.

Infrastructure Is Growing, but Not Evenly Distributed

“Over 60% of data center infrastructure is still managed in-house,” said Heubert River, Head of Data Center Operations at Cirion in Brazil. In other words, a significant share of the infrastructure remains under internal company management—an approach that often falls short when it comes to meeting AI’s growing scalability demands.

In countries like Brazil, the majority of large data centers are located in the South and Southeast, while regions such as the Northeast continue to lag behind.
Even though the country’s energy mix is made up of 93% clean sources, the distribution infrastructure poses a significant challenge. As River noted, “The energy exists, but it doesn’t get where it needs to go.”

Diversifying Geography to Improve Connectivity

Esther Fernández, from Telxius, emphasized that many coastal cities—such as Barranquilla, Fortaleza, or Valparaíso—offer key advantages.
“These cities serve as landing points for submarine cables and are located closer to renewable energy sources,” she explained. Their strategic location positions them as ideal alternatives to relieve pressure on overcrowded data center hubs like São Paulo or Santiago, while also improving service latency and resilience.

Air Cooling Is No Longer Enough

AI is pushing energy consumption and heat generation in data centers to unprecedented levels. In warmer climates, relying on free cooling—that is, using outside air—is no longer a viable option. This has led to the rise of more efficient technologies such as direct liquid cooling (DLC).

“A data center is basically a toaster—all the energy that goes in turns into heat,” said Frederico Neves of NIC.br.

Today, the goal is to lower Power Usage Effectiveness (PUE) to values between 1.2 and 1.3, compared to the 1.7 range that was considered efficient just ten years ago.
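To make these PUE figures concrete, here is a minimal sketch of how the metric is computed and what the difference between 1.7 and 1.25 means for energy overhead. The 10 MW IT load is an assumed example for illustration, not a figure cited by the panel:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by
    the power consumed by the IT equipment alone. 1.0 is the
    theoretical minimum (zero cooling/distribution overhead)."""
    return total_facility_kw / it_load_kw

# Hypothetical facility with a 10 MW IT load:
it_load_kw = 10_000

# At the decade-old benchmark of 1.7, the site draws 17 MW in total.
old_total_kw = it_load_kw * 1.7
# At today's target of 1.25, the same IT load needs only 12.5 MW.
new_total_kw = it_load_kw * 1.25

overhead_saved_kw = old_total_kw - new_total_kw
print(pue(old_total_kw, it_load_kw))   # 1.7
print(overhead_saved_kw)               # 4500.0 kW of non-IT overhead eliminated
```

In this example, the same compute capacity runs with 4.5 MW less overhead, which is why operators treat every tenth of a point of PUE as significant at AI scale.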

Developing Talent: A Critical Priority

Rafael Astuto, from Ascenty, pointed out that the biggest challenge is not building data centers—it is finding qualified people to run them. “There simply aren’t enough technicians trained in key areas like liquid cooling and high-density infrastructure management,” he noted. “It is urgent that technical schools and universities begin incorporating these subjects into their training programs.”

Looking Ahead: Edge, Repatriation, and Sustainability

River highlighted two key trends: the repatriation of workloads from the public cloud back to on-premise infrastructure, and the growth of edge data centers located closer to users. In the coming years, large facilities in the 200–300 MW range are expected to multiply, though they will remain concentrated in a few locations rather than widely distributed.

The views expressed by the authors of this blog are their own and do not necessarily reflect the views of LACNIC.
