Cost of data centres’ energy use hidden by secrets and NDAs

Sandra García, a 38-year-old factory worker who lives with her husband and son, opens a faucet in her home, but no water comes out. This has become the new normal for her and many of the inhabitants of Colón, a small municipality in the central Mexican state of Querétaro, a region increasingly hard hit by drought. In May this year, the national water authority, Conagua, declared that 17 of the state's 18 municipalities had not received enough rainfall, leaving its dams running dangerously dry.
The situation was so bad that the local government began to ration water, with some families only getting access one day out of three. García cannot always wait for her share and makes the trek to her landlord’s house to fill jerry cans just to get by.
While many residents of Colón make the connection between failing rains and dry taps, fewer are aware that they face a major new industrial competitor for their scant water resources. The state of Querétaro is fast becoming famous for another thing: it is one of the Latin American hubs for the rapid expansion of the data centre industry. According to the Mexican Data Centre Association, there are 14 facilities in the state. In response to a public information request from our reporters, the local development secretary listed 19 permits. Most of these data centres are in Colón or El Marqués, both municipalities near the regional capital, also called Querétaro.
The cloud is a loaded image that evokes weightlessness and the idea that we are pulling down the digital services that many of us rely on from some ethereal space. But the reality of the cloud could scarcely be more grounded. Google Drive, OneDrive, Amazon Web Services (AWS) and others require gigantic physical infrastructure. The nodes in this network are data centres: buildings stacked with computers called servers that store reams of information, from your personal details, to the images and texts that load when you doomscroll on social media, to the encrypted data that allows you to bank online.
Data centres have existed for decades, but the full implications of this industry are only now beginning to come into focus. Since the emergence of generative artificial intelligence, new, more complex, and more expensive AI data centres have been built, or are planned, across the globe. This new generation of infrastructure generally requires more power and water than its earlier counterparts. These greater demands are driving a global expansion that makes comparatively less exploited regions like Latin America a target for the biggest technology companies, since many of the region’s countries have vast natural resources, cheap energy and governments willing to lower taxes for these companies so as not to miss out on the future.
Real estate and construction industry pressure governments
The real-world industry that lies beneath the cloud is more than just the big tech titans in the US and their counterparts in China. It also includes actors from real estate, construction, energy and hardware. And this industry has been investing in strengthening its relations with national authorities, often influencing regulations that will later benefit it. Latin America, with its mix of fragile governance and resource riches, is now witnessing what this unprecedented infrastructure campaign means at ground level.
Data centre companies argue that AI is the future of the economy, and that countries that refuse to embrace its potential will be left behind, missing out on tax revenue, jobs and services. Critics who warn that the industry consumes water and power at the expense of communities and the environment are told that AI will solve these problems in the near future as it advances sufficiently to develop new technologies.
At the heart of the construction campaign in Latin America is an industry wishlist around deregulation and tax cuts, and an unproven economic manifesto promising future prosperity. For nine months, a coalition of 17 newsrooms in 15 countries has used freedom of information laws, confidential sources and community reporting to capture how this wishlist has been pursued, and to test the promises made against the reality on the ground in Brazil, Chile, El Salvador, Mexico and Paraguay.
The resulting investigation, Big Tech's Invisible Hand, led by Brazil's Agência Pública and the Latin American Centre for Investigative Reporting (CLIP), found that promises of hundreds of thousands of direct and indirect jobs were unfounded; that economic benefits for local economies remain unproven; that industry claims of renewable energy disguise the deployment of new fossil fuel facilities to power data centres; and that multinational corporations' complex structures deflect regulation and complicate efforts to monitor environmental impacts and resource consumption.
Time and again, this investigation found that promises of near-future solutions were being made to push through decisions that do not make sense in today’s technology and resource context.
Keep it cool
One of the central design challenges for data centre providers is cooling, and this is where the need for water comes from. Servers consist of boards with processing units that provide the computing power. They run constantly and must be kept within a stable temperature range, most often using cooling systems that rely on heavy water consumption. Central processing units (CPUs) were the key component in data centres providing cloud services until recently, when graphics processing units (GPUs) proved much more effective at meeting the computing demands created by AI. The new generation of AI data centres runs far hotter and therefore requires far more water.
“Water is often the last consideration when making siting decisions for data centres because it’s cheap compared to the cost of real estate and power,” Sharlene Leurig, a managing member of water consulting firm Fluid Advisors, told Bloomberg.

When authorities do question water consumption, companies — including Microsoft — argue they are solving the water issue by creating a “closed-loop system” in which, theoretically, no water evaporates, so there is no need for continuous water access.
In Mexico’s Querétaro state, our coalition partner N+ discovered that both the state and federal governments had launched a new strategy to attract data centres, despite reports from the Comisión Nacional del Agua (Conagua), the agency of the Ministry of the Environment in charge of regulating water use, recommending against granting new water-use permits. The state capital faced its worst drought in a century in 2024, when 14.8% of the population lacked drinking water and farmers struggled to irrigate crops.
N+ found that big tech companies and international agencies worked together to present AI infrastructure as part of the solution to water shortages. Microsoft partnered with the United Nations agency UN-Habitat in a joint call for an 82 million Mexican peso (approx AUD $6.7 million) investment to boost the local economy in Querétaro. The accompanying report, released after visits to eight communities, identified drought as one of the main challenges.
However, the report did not recommend investments to solve this drought. Instead, it recommended investing in infrastructure for the region, such as paving roads or building a roof for a town square. The report claimed that “data centres represent an opportunity for the socioeconomic transformation of the state of Querétaro, and more specifically, for the municipalities of El Marqués and Colón.”
Despite the international backing and Microsoft's resources, the company has made no investment in infrastructure to combat drought or any other socioeconomic problem in the state, as this investigation verified by visiting seven of the eight municipalities mentioned in the report and talking to local authorities. Microsoft did, however, build a "data centre region" in Querétaro, which began operating in early 2024.
Searching for answers from Microsoft, Amazon and Google
This investigation reached out to Microsoft and asked about the purpose of this report, and whether their data centres could aggravate the water shortage in the area. Microsoft replied with a link to a fact sheet.
We also reached out to Amazon, which announced this year that it would launch its own data centre region in Querétaro, to ask about the water use of its data centres in the state. The company replied that the region would use a design that does not rely on water for cooling, adding that this would bring it closer to its goal of being water-positive by 2030.
In Chile, it is another of the world’s richest companies that has led the way in constructing data centres. Google built a facility in 2015 before announcing, in a press conference with then president Sebastián Piñera and some members of his cabinet, a US$140 million expansion in 2018. By the time the tech giant announced plans for a second data centre on the outskirts of the capital, Santiago, it already faced resistance, with public protests after local authorities complained about the potential scale of water consumption.
The dispute reached the Chilean courts, which revoked the license for the second facility, after which Google committed to drafting a new plan for a data centre that would use waterless cooling.
The court decision and the apparent concession by Google were celebrated as wins by campaigners concerned at the impact of runaway AI infrastructure, but reporting by our partner LaBot shows that the Chilean government had quietly allowed the data centre constructors to bypass environmental assessments and gain permits.
The government decision, which was not made public, ended the need for impact assessments that were meant to ensure that resource consumption was clarified and that communities were consulted on the potential effects of developments. When LaBot raised questions about it, Chile’s Environment Ministry acknowledged the change and stated that being able to assess the amount of water consumed by data centres would require new regulation.
The battle over data centres in Chile has led to an emerging awareness of the potential harms and, in turn, to a questioning of who this infrastructure actually benefits. While some data centres provide services to nearby communities — like higher-speed streaming television — AI data centres can often be found dedicating their computing power to the training of large language models like ChatGPT, with little or no benefit accruing to local people or economies.
Half the data centres in Brazil, including those in planning, construction and operation, are in the state of São Paulo, mainly around the city of Campinas, according to data obtained by Núcleo Jornalismo, a partner of this investigation. This figure includes both AI data centres and older facilities, according to information from Data Centre Map, which plots site locations worldwide. This part of São Paulo faces regular water crises, the most serious in 2014, when many municipalities in the state, including Campinas and the city of São Paulo, faced extended water rationing.
Other locations identified by the industry in Brazil have faced similar droughts. Data obtained by a freedom of information request by Agência Pública showed that nine of the 14 cities slated to host new data centres were classified by the Water Risk Atlas as being of medium-high overall water risk, with the other three at high risk. Those high-risk cities include Campo Redondo in the state of Rio Grande do Norte, Igaporã in Bahia, and Caucaia in Ceará — all in north-east Brazil. TikTok is building a data centre in Caucaia, even though it is expected to worsen the drought in the area, according to a report published by Intercept Brasil.
This investigation reached out to Luis Tosse, vice-president of the Brazilian Data Centre Association (ABDC), for comment on, among other things, the impact of water consumption by data centres in areas with serious shortages. Tosse said that “we have water available”, and therefore this shouldn’t be considered a problem in Brazil. Andrei Gutierrez, president of the Brazilian Association of Software Companies (ABES), added that “It’s the same thing as saying: ‘hey, we’re not going to build a road because the road makes the soil waterproof.’ Do you understand? I think it’s all a matter of needing planning.”
With the exponential growth of data centres, even if individual facilities use less water, more energy is going to be needed to power them. The scale and pace of this infrastructure boom are expected to delay or reverse the energy transition away from fossil fuels and deepen the climate crisis, putting water-stressed areas at even greater risk.
Industrial secrets
The energy consumption of AI is difficult to calculate: each individual query depends on various prior processes, such as model training, whose costs are hard to attribute to a single use. The International Energy Agency (IEA) estimates that a traditional Google search consumes 0.3 watt-hours. A ChatGPT query, by contrast, consumes an average of 2.9 watt-hours, almost ten times as much.
The IEA published a report in which it calculated that data centres consumed around 1.5% of all global electricity production in 2024. This is equivalent to 415 terawatt-hours. The IEA said it expected this number to more than double by 2030, a forecast which other experts judged to be conservative.

Ireland devotes the highest share of its overall electricity production to data centres (21% in 2024), much of it generated by burning fossil fuels. This proportion led Irish authorities to restrict new data centres around the capital, Dublin, for fear of causing blackouts.
Data centre companies claim they are solving this problem. Meta, one of the world's main data centre players, has five facilities in Ireland (Data Centre Map lists 135 data centres in the country), as well as offices there. The company says it has partnered with Brookfield Renewable Energy Partners "on a long-term renewable energy supply agreement to provide 100% renewable wind energy to the Clonee data centre and Meta's Irish offices".
An accurate assessment of the sector's energy use is prevented by the companies themselves, who claim this information is an industrial secret. Data from Cushman & Wakefield, a real estate firm, shows that in 2023 the US and China used more electricity to power data centres than any other countries, as measured by megawatt capacity (1 megawatt, or MW, can power about 1,000 homes). The same data shows three Latin American cities among the global top 50 for electricity usage by data centres.
The city of Querétaro in Mexico made the list with a data centre capacity of 150 MW, as did São Paulo, in Brazil, with 122 MW, and Santiago, in Chile, with 61 MW. However, these numbers are changing fast, and it is difficult to give a precise, up-to-date figure. For example, a report from JLL, a real estate company close to the data centre business, claims that in 2023 São Paulo had 670 MW of data centre capacity already in operation, with an additional 382 MW under construction and a further 388 MW planned.
Agência Pública asked the Ministry of Mines and Energy for a list of all data centres in Brazil that requested access to the country's basic energy grid, along with information on how much energy they consume. The response included only 22 data centres. Thirteen of the 22 requested confidentiality for their energy use data (as the country's law allows them to do), so it was not made available. Among those requesting non-disclosure was TikTok, for the facility it is building in Caucaia. Data obtained by Núcleo from Data Centre Map shows there are an estimated 170 data centres in Brazil, a number that includes AI facilities as well as older ones designed solely for data storage and processing.
While an accurate picture remains obscured by secretive industry practices and incomplete public data, the pressure from some of the richest companies in the world to expand electricity production is intense.
In 2024, Mexico’s Federal Electricity Commission (CFE) announced that it would increase Querétaro’s electrical grid capacity by 50%, citing the building of data centres in the region as one of the main drivers. The commission announced a new power station, El Sauz II, which burns gas to generate electricity, adding to climate emissions. In an email to this investigation, Ascenty, one of the data centre companies in Querétaro, said: “The claim that the installation of new data centres in Querétaro, including those developed by Ascenty, would have motivated the construction of a new fossil fuel-powered plant is unfounded.”
In an interview with N+, Arturo Bravo, general manager of Ascenty Mexico, said his company finances part of the reinforcement work needed to cover the electricity demand of its data centres, that it is constantly talking to CFE and CENACE (the National Energy Control Centre), and that its petitions for electricity demand are always accompanied by studies.
Digital Realty Trust, which has a 49% stake in Ascenty, said in response to written questions sent via email by this investigation that “the total capacity at Querétaro is 8 megawatts (MW) — <0.01% of Mexico’s total reported grid capacity (total installed capacity of ~89 gigawatts (GW) in 2023)”.
While some data centres promise to use renewable energy — such as DataTrust in El Salvador, which says it uses photovoltaic energy for its power, an assertion that has been impossible to verify — fossil fuel consumption keeps increasing.
Trump clears the way to power AI with fossil fuels
In the US, the rapid development of data centres is driving a surge in demand for electricity that the Trump administration intends to meet by burning more fossil fuels. The US AI Action Plan, released in July, and an executive order on “Accelerating Federal Permitting of Data Center Infrastructure” both seek to clear regulatory obstacles, including environmental protections.
In last year’s environmental report, Google admitted that its greenhouse gas emissions had increased by 48% since 2019, mainly due to “increased data centre energy consumption and supply chain emissions”. In the same report, Google said that “as we further integrate AI into our products, reducing emissions may be challenging due to increasing energy demands from the greater intensity of AI computing.” The company now describes its much-vaunted net-zero goal for 2030 as “extremely ambitious”.
A Guardian investigation last year found that emissions from data centres owned or operated by four of the largest tech companies in the world — Google, Microsoft, Meta and Apple — were potentially 662% higher than they officially reported.
Data centre associations contacted by this investigation deny that the industry is looking to build more fossil fuel energy plants. Gutierrez, from ABES in Brazil, said: “I don’t see anyone building data centres wanting to encourage the use of thermoelectric, coal energy, [or] diesel energy, which is more expensive.”
In its written response, Digital Realty also said that its company “has a validated science-based target initiative (SBTi) to reduce Scope 1, 2, and 3 emissions in accordance with climate science”, and that Digital Realty “has further committed to climate neutrality for its European operations by 2030 in accordance with the Climate Neutral Data Centre Pact. In 2024, the company matched 75% of its global energy footprint with clean electricity, including 185 data centres matched with 100% clean power. Digital Realty has executed more than 1.5GW of renewable contracts to support our decarbonisation efforts.”
While Brazil’s hydroelectric resources mean its grid is greener than most countries’ — with 86% of electricity coming from renewables and 64% from hydroelectric plants — climate change has affected rainfall, so energy production by thermoelectric plants has grown in recent years.
Nuclear is back
The scale of the data centre explosion has brought nuclear energy back to the forefront. In Brazil, the Minister of Mines and Energy, Alexandre Silveira, argued that the demand for new projects made nuclear energy the only solution, and that this would demand costly investment from the government. In the US state of Pennsylvania, the Three Mile Island nuclear power plant, which was the scene of the worst nuclear accident in the country’s history, will reopen to power Microsoft’s data centres.
In Argentina, President Javier Milei has proposed a plan to increase the country’s production of nuclear energy to power new data centres, among other things. Nuclear power is considered clean since it does not emit greenhouse gases during operation, but its development costs are much higher than those of other energy sources, it has a chequered history of accidents, and there is no long-term solution for nuclear waste, making it very controversial among environmental experts and activists.
Countries with clean energy to spare, like Paraguay, are also being courted by the tech industry, our investigation partner El Surti reports. In May, US Secretary of State Marco Rubio proposed using Paraguay’s energy surplus to power AI data centres. Paraguay owns the Itaipú Hydroelectric Plant along with Brazil, which currently produces more energy than it can use, so it sells its surpluses to Brazil and Argentina at below-market rates.
The courting of Paraguay comes as locals have begun to question the expansion of crypto-mining data centres — the use of power-intensive compute to generate Bitcoin and similar tokens on the blockchain. These facilities also produce serious noise pollution that impacts nearby communities.
“We’ve already suffered from the extractivism of our electricity, yielding it to Brazil and Argentina at miserable prices,” Mercedes Canese, former vice minister of Mines and Energy in Fernando Lugo’s government, told El Surti. “Now we also have crypto-mining and other digital services, which don’t create jobs. The United States’ interest is just another chapter in this extractive logic, which we’re familiar with. It won’t change the reality in Paraguay.”
Paraguay’s current Minister of Information and Communication Technologies, Gustavo Villate, dismissed these concerns and welcomed US interest that “positions us in an unprecedented way”. Villate told El Surti that the government has been working to close deals with data centre companies.
Non-disclosure agreements hide the real stats
Despite the industry’s resistance to disclosing its power and water consumption, there is an awareness of the negative impacts. Internal documents obtained via the US Securities and Exchange Commission’s EDGAR public-access tool show that the board of directors at Digital Realty Trust was asked by a shareholder group to “create a comprehensive policy articulating our company’s respect for and commitment to the human right to water”.
The board rejected this request, arguing that three out of four data centres in its portfolio did not use water for cooling. Digital Realty, in response to written questions sent by this investigation via email, added that “shareholders overwhelmingly (~90%) voted against the shareholder’s proposal at the 2025 annual general meeting (AGM)” and that “42% of Digital Realty’s global water use is sourced from recycled water supplies, minimising the impact on local freshwater resources.” The same document found on EDGAR, however, shows that some of its facilities still use local resources and could impact communities.
“I’ve spent years trying to break down [the data centre industry’s] financial records,” Max Schultze, director of the European think tank Leitmotiv, told this investigation. “But [they] don’t break down where they pay taxes at all. They don’t even break down revenue per country … We have yet to see a cloud provider release the actual energy consumption or water consumption, or something like this, in a way that was either independently verified by a third party, or in a way that we can verify it independently.”
Schultze’s organisation was among those that successfully lobbied the German government to enact a law forcing the industry to report water and energy consumption. However, “half of the data centres in Germany simply refused to report it,” he said. “They are all getting a fine of a hundred thousand euros for not reporting, but they don’t care about a hundred thousand euros.”
Tech Policy Press, the US partner in this investigation, asked leading environmental and consumer advocates how much information they could access on national data centre projects. Most of them said it can be very difficult to get details, especially before the project’s approval.
This asymmetry is exacerbated by the widespread use of non-disclosure agreements (NDAs) signed by data centre companies and local governments. “It’s very difficult with the NDAs, prior to them actually being approved, to know what is going to be in the facility,” Julie Bolthouse, director of land use at the nonprofit Piedmont Environmental Council, told Tech Policy Press. “We don’t know how much water they are going to use, we don’t know what kinds of emissions, and we don’t know how much electricity they will need or what type of service they will require,” she said. “So basically we gather information after the fact.”
This is an edited extract. For the full story, go here.
Big Tech’s Invisible Hand is a cross-border, collaborative journalistic investigation led by Brazilian news organization Agência Pública and the Centro Latinoamericano de Investigación Periodística (CLIP), together with Crikey (Australia), Cuestión Pública (Colombia), Daily Maverick (South Africa), El Diario AR (Argentina), El Surti (Paraguay), Factum (El Salvador), ICL (Brazil), Investigative Journalism Foundation – IJF (Canada), LaBot (Chile), LightHouse Reports (International), N+Focus (Mexico), Núcleo (Brazil), Primicias (Ecuador), Tech Policy Press (USA), and Tempo (Indonesia). Reporters Without Borders and the legal team El Veinte supported the project, and La Fábrica Memética designed the visual identity.

