Down the Rabbit Hole of AI: Data Centers and the Material Impacts of the ‘Cloud’ – An interview with Tamara Kneese
By Yama Chiodi in collaboration with Damny Laya
The common approach of framing AI’s environmental cost as a matter of individual consumerism is a political and material misdirection. Calculating the water wasted per ChatGPT query, for instance, shifts responsibility from corporations to users, obscuring the true scale of the problem. The reality is that individual use, however much it grows, will always be a small fraction of the issue. Data centers operate on an industrial scale, computing vast amounts of data for model training and corporate services, with a single facility capable of consuming in a day the energy an entire city uses in a month.
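To make that asymmetry concrete, here is a minimal back-of-envelope sketch in Python. Every figure in it is an illustrative assumption chosen for round numbers, not a measurement reported by any company; the point is the orders of magnitude, not the exact values.

```python
# Back-of-envelope comparison: one hyperscale facility vs. individual queries.
# All figures are illustrative assumptions, not reported measurements.

FACILITY_POWER_MW = 100        # assumed continuous draw of one large facility
HOUSEHOLD_KWH_PER_MONTH = 150  # assumed average household consumption
WH_PER_QUERY = 3               # assumed energy cost of a single chatbot query

# One day of facility consumption, in kWh (MW -> kW, times 24 hours).
facility_kwh_per_day = FACILITY_POWER_MW * 1_000 * 24

# How many households could run for a whole month on that single day of draw.
households_per_day = facility_kwh_per_day / HOUSEHOLD_KWH_PER_MONTH

# The same day of consumption expressed as individual queries (kWh -> Wh).
equivalent_queries = facility_kwh_per_day * 1_000 / WH_PER_QUERY

print(f"One facility, one day: {facility_kwh_per_day:,.0f} kWh")
print(f"= one month of electricity for ~{households_per_day:,.0f} households")
print(f"= ~{equivalent_queries:,.0f} individual queries")
```

Under these assumptions, a single day of one facility’s draw works out to roughly 800 million queries, or a month of electricity for about 16,000 households, which is why per-query accounting says so little about the industrial scale involved.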
We often imagine AI living in a seamless “cloud,” but it actually runs in massive data centers that demand huge amounts of resources. The social and environmental impacts are severe. Data centers are immense consumers of energy, water, and land, create local noise and air pollution, and follow patterns of environmental racism. Their development often involves secretive deals that bypass community input, repeating a global playbook of injustice. Tracing this infrastructure also reveals its impact on rivers, land, and Indigenous territories, as well as the growing stress that the demand for critical minerals places on the planet.
According to researcher Tamara Kneese, this infrastructure is creating a new form of technological colonialism. The environmental harms are consistently shifted to vulnerable communities, from rural towns to marginalized neighborhoods, which become sacrifice zones for the industry’s growth. Meanwhile, the growing dissatisfaction of communities in the Global North is being met with a distinctly colonial response: outsourcing data centers to the Global South.
Over the past few months, I have been investigating the environmental impacts of AI for a journalistic project. In September, I was joined by my friend Damny Laya, who earlier this year authored a lengthy report on the impacts of data centers in Brazil (in Portuguese). Together we are producing episodes for the Oxigênio Podcast, set to launch in November, with additional material to follow.
This research process led us to the work of Tamara Kneese. After reading her writing and seeing her speak at events on AI, labor, and the environment, we were very impressed and decided to invite her for an interview. She was very generous with her time, and we spoke on October 8. After reviewing our conversation, Thais Lassali, the blog’s editor, agreed it was a perfect fit for this space.
Tamara Kneese is the Director of the Climate, Technology, and Justice program at the Data & Society Research Institute. She currently leads research on the environmental costs of technology, serving as Principal Investigator for a National Science Foundation [United States] project that uses participatory methods to assess AI’s impact. She also holds a grant from the Internet Society Foundation to study community and worker resistance to the expansion of data and energy infrastructures.
Her perspective is shaped by a unique background in both the tech industry and academia. She previously worked on Intel’s Green Software team and was an Assistant Professor of Media Studies and Program Director of Gender and Sexualities Studies at the University of San Francisco. She is the author of “Death Glitch: How Techno-Solutionism Fails Us in This Life and Beyond” and is currently working on a new book about social scientists and ethnographic work within the tech industry.
She holds a Ph.D. from New York University’s Department of Media, Culture, and Communication, and her research, in her own words, “juxtaposes histories of computing and automation with ethnographies of platform labor and care work. I’m currently writing about collective action in the tech industry around AI’s relationship to both labor rights and environmental destruction.” Her work can be found in academic journals, but also on multiple popular outlets such as Wired and The Verge.
In the following interview, we explore her work and the critical social and environmental challenges posed by data centers, a crisis that is intensifying with the rapid acceleration of generative AI tools. A Portuguese version of this interview is also available here.

For those interested in following Kneese’s work more closely, she invited our readers to subscribe to the Climate, Technology, and Justice newsletter. Next month, she will be releasing a report on climate action in the tech industry concerning data centers. You can sign up for Data & Society’s newsletters at this link. We hope you find this conversation as thought-provoking as we did.
Yama Chiodi: Hello Tamara, thank you so much for accepting our invitation to talk. As I told you before, my questions will be around your work on the social and environmental impacts of data centers. First of all, I want to start with the cloud metaphor. I read a very interesting text you wrote about it. You argued that the cloud is a misleading metaphor. Can you explain why that is, and why this metaphor obscures some physical aspects of data centers?
Tamara Kneese: Yeah, totally. I think when most people think about the cloud, they’re kind of relieved about the convenience of being able to store files that aren’t just connected to your hard drive. With cloud services, like Google Drive, you’re able to store a lot of data and access it through your account, wherever you are. The relationship that people have with even just their personal computer becomes much more disembodied; it’s not connected to a particular place. You don’t even think about your hard drive or your computer as the point of access. We also saw that with the rise of platforms like Zoom, where you often feel like you are talking to people through a kind of magical, ethereal system, and it can be easy to forget that all of these interactions, all of this data requires massive data centers, which allow for the storage of all of this information.
There have been a lot of scholars out there, like Mél Hogan, and many others in critical data center studies, who have been tracking the growth of “hyperscalers,” these massive data centers associated with companies like Google or Microsoft. These data centers have been around for a while, but they’ve really taken off in recent years. The pandemic was a turning point where computing resources were even more important for people. You had to be connected in order to go to work, to go to school. And so we saw a growth of data centers, and now with generative AI, there’s even more of an investment.
The problem with our relationship to computing is that, most of the time, we don’t really think that much about the materiality of the computing system and the larger supply chain. Everything we do relies not just on our own device, or the particular cloud services, but on a global supply chain. Where does the hardware come from that we are using, and what kind of labor practices are going into that? Further back in the supply chain, we have raw materials and critical minerals and other forms of extraction, and human rights abuses and labor abuses. There’s a really strange way in which cloud computing becomes software divorced entirely from hardware, even though that is never actually the case.
YC: In the age of generative AI, it has become more important than ever to understand the infrastructure built around data centers. Could you briefly describe its main components?
TK: Essentially, they’re like giant warehouses of chips, of servers, of networked systems. They look basically like nondescript square buildings. In a city environment, they might be quite small, just taking up part of a building. You wouldn’t really know that it’s a data center unless you look at the lighting and realize that it’s not inhabited by people; it’s kind of just glowing with compute. Particularly with the kinds of AI that companies are investing in right now, there’s a need for more powerful chips, GPUs. Data centers are about providing enough energy and computational power for these powerful language models to be trained and then used. Because it’s incredibly energy-intensive, you also need water to cool the servers.
There’s a very strange relationship to heat: while you’re drawing on a lot of electricity, you also need to make sure that the servers don’t overheat, so you need a lot of fans and cooling systems that use water. On top of that, you need backup energy sources. Because there’s such a draw on the power grid, you have to have backup generators, often diesel-fueled, to make sure that the data center can keep going. For people who live near them, they’re very loud. They will talk about light pollution and noise pollution. It’s been interesting to hear from communities near crypto mining facilities, who complain of things like migraine headaches and sleep deprivation. The air pollution is quite noticeable, with a lot of particulate matter from the diesel-fueled backup generators.
The data center itself is not the most exciting place. There usually are not very many people working inside of it. It’s not like a factory or an Amazon warehouse. There are a few technical professionals running it, but it is basically a place where you just have a lot of servers, energy, and the systems that are running, and then people coming in to maintain them.

YC: I might be wrong about this, but at least until now I have not seen complaints about noise pollution specifically from diesel generators at data centers in Brazil. As you were talking, I was wondering if this is related to the fact that we have relatively abundant renewable energy available, which we know is not exactly true, nor free of environmental and social impacts. While some isolated communities in Brazil do use diesel generators as their main source of energy, this scenario represents a very small share of our energy matrix. Noise pollution is a big issue with wind power plants, though; there are many complaints about it, including in connection with data centers. But could diesel generators be more of a problem in places where renewable energy is not as readily available?
TK: Well, it’s interesting, because even in California, where we have data centers that are, in theory, greener, there still are usually backup generators. Before the political situation in the U.S. shifted, a lot of Bitcoin companies would talk about their relationship to renewable energy, saying that as energy demand grows, having these infrastructures in place will help incentivize the growth of renewable energy. But what we’re seeing is that the energy needs are so great that even in places where there should be more renewable energy, we’re also seeing coal plants being kept open when they were supposed to be sunsetted. The clean energy conversations in the U.S. completely fell away and are now basically forbidden. But the behaviors of the companies themselves haven’t really changed.
YC: Yeah, that makes sense. So, moving on further into environmental and social impact. What would you say are the most significant environmental impact factors when it comes to AI and data centers?
TK: The energy problem has probably gotten the most attention, because it is a source of anxiety at a time when we’re supposed to be transitioning away from fossil fuels. But I think on a local level, things like the water consumption can matter more. If tech companies move into rural areas in Mexico and use up all of their water, basically preventing people in the town from having access, that is incredibly problematic. In water-stressed areas and areas where the people living in a place don’t have as much negotiating power, don’t have as much political power, and especially if places are already treated as sacrifice zones, which we’ve seen repeatedly with Indigenous land in particular, the consequences may go far beyond just the immediate energy-related problems. It is kind of place-specific and context-specific. That is why having a sense of the entire AI supply chain is really helpful. Even if you’re, in theory, using renewable energy to build a data center, you still are relying on a lot of other materials, including chips and minerals, that are possibly going to be harming communities and causing environmental disruption. The idea of having to curb demand and consumption overall, instead of endless corporate growth, is a conversation that most companies are not gonna be open to.
YC: Earlier this week, we talked to an Indigenous leader, Cacique Roberto Anacé. He’s trying to, if not stop the project, at least have a conversation with the government, because TikTok is building a huge data center on their land. The local impact will be enormous, and this kind of thing will become more and more of a problem here in Brazil.
TK: I read about this TikTok data center, which is now owned by even more nefarious people like Larry Ellison, the notorious founder of Oracle (laughs). In the US, in South Memphis, there’s a data center connected to xAI. So you have this platform that is so racist and so incredibly harmful to Black communities, and then has the audacity to actually pollute their land and air even more. The way of framing particular kinds of harm is important. It’s not just about people’s energy bills going up, or quantifying the energy or water use, but really thinking about the relationship between those social harms and algorithmic harms and the environmental racism and other forms of embodied harms that communities are dealing with on that hyper-local level.
YC: I watched a lecture you gave some weeks ago where you mentioned the South Memphis case as an informative story on environmental racism.
TK: South Memphis has become one of the major sites, particularly because a lot of local environmental justice organizations and civil rights organizations got involved. What you have, basically, are parts of cities that have historically been Black, that have been disenfranchised, that have dealt with other waves of industrial development and pollution. If an oil and gas company comes to town and is polluting everything with petrochemicals, this is how you get what we call “Cancer Alley” in Louisiana. The question is, what are the places where people are not deemed worthy of protection? Who’s expendable? In this country [the United States], with its history of white supremacy and general lack of civil rights, a lot of the places where Black communities have traditionally been tend to be the ones sacrificed for various types of development, like putting up interstates or Amazon warehouses. Data centers are just a continuation of what was already happening.
Then you have crooked deals on the local level, where a mayor and other local officials think that they’re getting something economically of value from making a deal with a data center operator. They meet behind closed doors, make decisions in their best interest, sign non-disclosure agreements, and don’t tell their constituents, not giving them any time to fight back. Development happens pretty quickly. In certain places in the U.S., there are way fewer environmental protections. We have a patchwork system. In South Memphis, Elon Musk was building these data centers that were flagrantly going against the EPA’s air permitting rules, but they’re willing to do whatever. They’ll take the fine, or count on local officials to look the other way because there’s so much corruption. The only way to fight them is through the law; everything ends up becoming a lawsuit. Unfortunately, you need very specific forms of evidence and data for a lawsuit to be viable. There have been cases where community-gathered data was not considered to be legal evidence. It’s exhausting for communities that are already dealing with these violent histories to have the burden of proving when they’ve been harmed.
Damny Laya: It is impossible not to notice a pattern here, because what you are describing is the same behavior from this industry that we saw in Chile, in Uruguay, in Mexico, and now here in Brazil. The pattern is the same. That’s why we need strong policies, social participation, and political power for the affected communities. There’s no way forward without these.
TK: Yeah, definitely. We could probably look at all the different scenarios and reverse engineer whatever their playbook is. It would be helpful to put out something on what exactly all the lies are that you will hear. The strategy does seem to be pretty much the same everywhere, especially relying on officials who are willing to sell people out.
YC: Xenophobia also plays a part in how environmental racism works. The case of China is remarkable. Can you talk about how sinophobia is affecting the way companies in the U.S. and in Europe are thinking about and regulating AI?
TK: It certainly cuts across political lines here; it’s not unique to the Trump administration. It echoes earlier versions. When Japan was the tech capital in the Clinton era, the “information superhighway” was considered to be a matter of national security, and earlier, the origin of the internet was about Cold War fears regarding the Soviet Union. It’s always about American competition and innovation. Europe’s paranoia is always that they’re regulating themselves out of being able to innovate in the same way the US or China can. They’ll produce strict regulations, then walk it back, saying they don’t want to prevent innovation.
Right now, we’re seeing fears over China’s dominance. We are in a completely incomprehensible, irrational place here in the U.S. A lot of what the Biden administration was doing around the CHIPS Act, to bring chip manufacturing to states like Arizona and Ohio, was specifically to fight Chinese dominance and preempt fears about China invading Taiwan. Now, the plans from the Biden administration are being countered by the Trump administration’s actions, like tariffs, which make it harder to do anything here. China has become the new focus, and it is incredibly racist. The way it’s playing out is also a focus on critical minerals, vying for dominance there. The idea that we’re going to be in a good position to have access to critical minerals without bullying other countries is why there’s now a focus on critical mineral mining operations in the U.S., even on tribal lands.
YC: Yeah, I think the remark you made about this not being a Trump administration thing is very important. If you look at the way AI has developed in China, that’s a direct consequence of politics and policies made by the Biden administration. It’s a very complicated thing.
TK: Definitely, and everyone freaked out about DeepSeek. If you’re worried about competing with China on AI innovation and research, but you dismantle the NSF and academia — which is where innovation happens here, not at private companies — you’re shooting yourself in the foot. This is what happens when people don’t pay attention to history.

YC: Moving to a different topic: the big tech companies tend to publish detailed sustainability plans and promise to achieve carbon neutrality. Google, for instance, says it will achieve carbon neutrality for its data centers within a few years, which seems crazy to me considering its emissions are actually rising, year after year. How much trust should we place in these promises, and what are the most common pitfalls or instances of greenwashing to look out for?
TK: Tech companies have used “net-zero”¹ creatively. Even when things were more progressive, they used creative accounting mechanisms, carbon offsets, and renewable energy certificates to arrive at “carbon negative” status. We got crazy stories during the Web3 era of tech people going to Brazil, saying they were going to tokenize the rainforest and help companies achieve carbon neutral status by planting trees or keeping trees from being cut down. The numbers were always questionable. Now, companies are like, we don’t have to pretend we care about that anymore because we have AI, and we can tell people that AI will magically help fix the climate crisis. At NYC Climate Week, I was on a panel where a director of sustainability at Microsoft had a presentation about how AI was going to help companies reach their net-zero goals. It was very obvious greenwashing, but she published a paper in Nature, and people in the room were taking it seriously because people want to believe corporations are trying to do the right thing. There’s still a sense that tech companies are good companies with progressive values. Now, there’s been pushback around the emissions tied to generative AI. So companies are releasing metrics through selective disclosure, putting out numbers that minimize their apparent impact. Then you have people trying to counter that, pointing out how their math is wrong. I think we’re in a moment where people are no longer buying the corporate tech line.
YC: At the same time, it seems like the Trump administration had this effect on companies. They don’t have to care that much now; well, they don’t have to pretend they care that much.
TK: Yeah, they could just say whatever they want.
YC: Even so, organized resistance still plays an important role in this scenario. One of the most interesting aspects of your work, in my opinion, is precisely its attention to how local communities and organized groups are trying to resist the imposition of data centers. Can you describe how you see social resistance against data centers in the US right now?
TK: There are places in the US where data centers have been in high concentration for a long time, like Northern Virginia. There is a more developed, policy-oriented resistance there because data centers are everywhere; you can’t ignore them. People are annoyed about electricity bills going up. Organizers dealing with new data center development in other parts of the U.S. are looking at Virginia to see what worked and what didn’t. It looks different everywhere. In Tucson, Arizona, environmental justice groups and the community were able to stop a data center from being built. We’ve seen other, smaller data center moratoriums in different counties.
The problem is that every town and county has different regulations. What works in one place might not work in another. Data center expansion is happening so quickly, and the process where local officials are promised economic boons and jobs happens over and over. The community may not have time to organize or even know it’s happening. People might stop a data center in one location, but it’s like whack-a-mole; they’ll just move to the next town. Even if your town says no data centers, you might still be dealing with the transmission lines from data centers in a neighboring town or state. Energy isn’t bounded by municipal boundaries.
Another issue is that if the U.S. had a strong federal-level data center policy — which won’t happen — or if there’s enough resistance to stop them, they’re just going to move somewhere else, to the Global South, accelerating what we’re already seeing. Organizing against big tech is not only on a hyper-local level, but also a much larger political fight about who gets to determine what happens to the Earth’s resources.
YC: I do agree with you. Brazil’s scenario is a very complicated one. Not only will we not have a national policy trying to stop data centers, but our government is building a national policy to attract them. And considering the political climate with the U.S. at the moment, it is hard not to think this policy for attracting data centers is somewhat related to building leverage to negotiate the tariffs. This brings me to my next question, which you have kind of already answered, but I want to ask you directly: do you think of this movement of data centers being rejected in the U.S. and moving to the Global South as a new form of technological colonialism?
TK: Yeah, I mean, definitely. Thinking about different eras of globalization and outsourcing, tracking the shift of electronics manufacturing from Silicon Valley to places like Taiwan… The shifting of environmental burdens, the search for cheap labor, the search for places where the fight to do what you want without regulation won’t be as difficult. It’s absolutely a form of colonialism.
YC: That is unfortunate, to say the least. Before we finish the interview, I’d like to talk about labor. How is the AI industry changing infrastructure and impacting workers? And how is AI changing the way we think about the future of work?
TK: I had written before about the AI supply chain and labor, and scholars who’ve done work on cross-class solidarity. In the US, groups like the Tech Workers Coalition were started by software engineers and office workers who felt solidarity with janitors and cafeteria workers trying to unionize on tech campuses. We’ve seen this with Amazon Employees for Climate Justice, organizing around delivery drivers and warehouse workers. But things have shifted. The labor dynamics within tech companies are not the same. The economy has collapsed for a lot of tech workers. There’s a lot of anxiety, especially for workers on visas. Everyone is feeling expendable. AI is being shoved in people’s faces; they’re asked to use Copilot, told to increase productivity. Junior developers are being replaced by AI; companies aren’t hiring and are laying people off. The idea that software engineers are special and privileged and don’t have to organize has shifted. Now they can be serfs, too. It’s all of us versus a handful of billionaires. As terrible as it is, there’s more opportunity to bring in people who maybe were not willing to organize before.
YC: It is very interesting how new solidarities and forms of unionization can bloom from chaos. I don’t wanna hold you much longer, so let’s head to our last question. You come from a background in feminist Science and Technology Studies, as I do. What feminist STS insights do you think could inform a reflection on AI and open up a more political interpretation of it? I mean, beyond gender difference, on a theoretical and political level.
TK: I’m inspired by people like Lucy Suchman, who started out critiquing the tech industry from within and has been willing to talk about the relationship between AI and genocide in a way a lot of people in academia have not. The idea of what counts as evidence is also really important from a feminist STS perspective. Whose stories and experiences are taken up as part of the official record, the archive, or make their way into policy? How do you ensure you’re not leaving out those messy stories that are not quantifiable? Then I think also opening up space for imagining technology that people do want. Generative AI notwithstanding, technology can be useful for communities dealing with climate change, like climate modeling. The role of feminist STS is to not think about “AI for good” projects, but to radically think about what we mean by technology. An IUD is a form of technology. There are a lot of forms of technology that are useful, valuable, and feminist in nature that have nothing to do with big tech companies.
YC: It is interesting you mentioned reproductive technologies, because if you think about the work of scholars like Donna Haraway, I think her classic article “The Virtual Speculum” has some hints for us on how to react to AI technologies, including going back to your reflection on the cloud metaphor. There’s so much to think about, so much to do. Anyways, that’s all the questions I have for you today, Tamara. Thank you so much for your time and for your work.
TK: Great, I’m excited. I’m also very excited that this will be released in Portuguese too. I’ll share it with all of my Brazilian friends. Bye, thank you so much, have a good night.
Yama Chiodi is an anthropologist and science journalist. He holds a PhD in Social Sciences (Unicamp) and a Master’s in Scientific Communication (Unicamp). His research focuses on climate change and the socio-environmental impacts of technology.
Damny Laya is a social scientist and science journalist. He holds a PhD in Science and Technology Policy (Unicamp) and a Master’s in Social Studies of Science (IVIC – Venezuela). His research investigates internet governance and the intersections between internet infrastructure and sovereignty, as well as its socio-environmental impacts.
1. Net-zero carbon refers to the state in which the amount of greenhouse gases (like CO2) emitted into the atmosphere is supposedly balanced by the amount removed from it.