Author Archives: Gerry McGovern

The anti-Nature Valley

It worked, and like a magic trick, the digital warmonger was born and boomed as something greener, something softer. However, impressions cause reality distortions. By keeping things low and out of sight, urban sprawl spread faster than anywhere else in the United States. Those low-lying ‘campuses’ grew everywhere as the Valley experienced almost exponential growth, snarling traffic and putting intense pressure on housing, particularly housing for ordinary working-class people, and for those poor, stressed, unhealthy female migrant workers who were still so unfortunately needed in this brave, bright green, clean-tech world. “By 1970,” writer and philosopher Aaron Sachs wrote, “San Jose had only 3.2 hectares of open space per 1,000 residents, half of which consisted of school playgrounds, compared to 14.2 hectares per 1,000 people in San Francisco and 28.7 per 1,000 in Washington, DC.” The greenwashing of the Valley was in full flow.

The reality distortions would persist and grow. Behind the scenes of the Valley film set lay a region full of stressed people stuck in traffic, struggling to get by. The land and water were poisoned, and house prices were crazy. Long, long commutes, particularly for the poorest workers. Rising homelessness, and starker and starker income divides, as the average working wage declined while the tech bros cluttered the stars with their satellites and reaped the richest of dividends. By the 2020s, life for those outside the elite was a real struggle in the Valley. Many were saying they’d leave if they could.

The working class. They couldn’t get rid of them. While manufacturing was declining in much of the US from the 1960s onwards, it was growing rapidly in the Valley. Female factory workers. “In 1970, 70 percent of the production workers in the electronics industry were women, about half of whom were minorities, mostly Mexican-Americans and Filipino-Americans,” Sachs wrote. Cheap and disposable. Headaches, miscarriages and cancers were common from inhaling the hydrocarbon solvents used to clean semiconductors. Luckily, these poor women were easy to discard without worrying too much about lawsuits. This culture of extreme worker disposability would become a core characteristic of the new Valley. Even the white programmers were not immune. It was made brutally clear to programmers that over forty was over the hill and out of the Valley. Few got rich quick. Many died trying. Making toxic products in a macho, toxic, no-worker-rights culture would be refined and later marketed as the liberating bright green “gig economy.” Meanwhile, the tech executives were ever wary. One of their favorite pastimes—yoga and meditation included—was weekend retreats focused on how to bust and crush unions.

Silicon Valley “is fundamentally misleading and ahistorical in its approach,” Aaron Sachs wrote. It doesn’t have a past that it hasn’t buried in some underground leaking chemical tank, and its present and future are constantly being reinvented by bright green marketers and branders, who excel at telling compelling stories of fantasy convenience, innovation and efficiency. Hardcore accelerated innovation is the religion and every traffic-snarled road leads to some vaunted progress. “Perhaps most dangerous is the seemingly concerted attempt of high-tech boosters to inspire scorn for the actual, physical world,” Sachs wrote in 1999. He mentioned futurists like Gregory Stock, who celebrated “comfortable indoor environments.” These tech bros deliberately set out to weaken and ultimately eliminate “the emotional links between humans and the ‘natural’ environment.” Why? Because the physical environment is the key competitor of technology. The more time you spend online, away from Nature, the more money they make and power over you they acquire.

Silicon Valley: designing for invisibility

“A lot of that design was about deliberately placing industrial infrastructure out of sight,” scientist Josh Lepawsky explained to me. “Literally putting it underground. Things like chemical storage tanks needed to store the chemicals for the manufacturing process. So, it was a deliberate urban design process, and I think it has been with us since at least the 1950s. Why does it all matter? One of the ways that it matters is that it is very useful for the marketing and the industrial interests out of which digital technologies emerge, that they can trade on these images of being light, green.

“Think of all of the metaphors that go with the digital technologies we use, like ‘The Cloud’, for example. This myth of digital as ethereal can be very useful as a way to divert attention from the many classic problems that come along with industrial production, that is the use of energy and materials and the pollution that pretty much always results.” Put the chemical storage tanks and pipes underground. Think about that for a moment. Underground, these tanks were much harder to maintain. They were much more likely to leak, and the chemicals they leaked were some of the most toxic known to man. It didn’t matter. The health of people didn’t matter. The soil and water didn’t matter. What mattered was the branding.

Silicon Valley had other secrets. Like the overall computer industry, it was born from war. Its first design challenge was how to kill more efficiently. The cold and warm wars nurtured the digital seeds, with the Second World War giving birth to the semiconductor industry. It grew quickly, fed by military contracts. Early invention and innovation were focused on improving guidance systems for missiles to make them more effective killing machines. Sputnik and the arms race were the seeds from which the Internet would blossom as a result of generous military research grants, with the objective of creating a network that would be robust enough to withstand multiple nuclear strikes.

In the nascent Valley, the bright green new world order needed to “attract a better class of workers,” as Stanford University business manager Alf E. Brandin stated in 1956. They wanted engineers and thinkers, marketers and branders, and fewer of those unkempt working-class riffraff, those illiterate and disposable female migrant workers coming from Mexico and the Philippines to steal the jobs nobody else wanted to do. Or at least that was the impression Alf E. Brandin and his modern marketers wanted to present, and as we all know, when it comes to marketing, impressions count more than reality. Thus, the Stanford Industrial Park got renamed the Stanford Research Park. Companies that located there “had to follow strict building codes, which included ‘complete concealment’ of things like smokestacks, generators, transformers, ducts, storage tanks, and air conditioning equipment,” writer and philosopher Aaron Sachs explained. The buildings were kept low lying, hugging the ground, blending into the landscape, becoming the landscape.

The greenwashing of Silicon Valley

It wasn’t always known as the Valley of Pimps and Pushers. Once upon a time, they called it the Valley of Heart’s Delight. From far and near, families would come on Springtime pilgrimages to participate in the famed “blossom tours” in Santa Clara Valley, California. “Miles and miles of fragrant orchards, spreading in a vista of never-ending loveliness under sunny Spring skies, hold promises of rich treats to come,” a promotional video from the 1940s stated. Up the road, in San Francisco Bay, they found rich harvests of the freshest oysters, while the soil of nearby San Jose was famous for its fertility, overflowing with prunes, apricots, cherries and apples.

Those times would pass. The valley of fruits would become the valley of silicon, while the man who used an apple for his company’s logo would keep a private orchard next to his house to reminisce on Nature and those pleasant times gone by. The soil and water would be soaked in a multitude of chemicals and heavy metals used in the making of silicon chips and other electronics. The e-waste sites would proliferate. Chemicals, ethers and gases used in the chip “clean rooms” would cause all sorts of health issues, particularly for the reproductive health of the mainly migrant female workforce. The high-tech sewage dumped freely in public drains and the leaky underground chemical tanks would lace the environment with cadmium, nickel, and lead, while Green Revolution agriculture added its nitrogen and phosphorus runoff. It would all add up. There would be a rash of deaths from food poisoning. The surge of oyster restaurants that had opened to satisfy the refined tastes of the tech bros of San Francisco could not meet their needs from local Bay-fished oysters anymore. For thousands of years, the local Indigenous peoples had grown healthy and strong eating this abundant seafood. No more. Santa Clara County would move from being known as the Valley of Heart’s Delight to the Valley of Superfund Sites, as it attained the notorious reputation of having more toxic dumpsites than any other county in the USA. Luckily, the Valley had excellent marketing and branding. The pain and suffering of the female migrant workforce would remain well hidden, as would all the other stories of environmental degradation, well covered up by the brilliant shine of the ethereal Valley’s clean and bright green Big Tech brands.

As early as the 1950s, Big Tech began to master the fine arts of greenwashing. It would take an actual green valley and turn it toxic brown while branding it bright green. This new Valley would reflect a clean break from the dirty and polluting smokestack industries of the past. It was to be a digital world, insubstantial, almost invisible, light as a cloud. Its planners eagerly embraced the architecture of seemingly open spaces, campuses and parks, low-rise university-style buildings, soft, green landscaping, evoking sustainability and renewability. Underneath the slick branding, the chemicals bubbled in carefully hidden underground tanks, the gases rose, and the heavy metals stirred. The working environments of the female migrant workers were cruelly filled with cancer-causing gases.

It takes coal and charcoal to make silicon

That silicon stuff. Sand, right? Right. It takes coal and charcoal, multiple metals and materials, and over 400 toxic chemicals, to make a silicon chip—the foundation of everything that happens in clean tech. “The environmental impact of chip making is huge,” Ian Williams, professor of applied environmental science at the University of Southampton, told Mongabay. “Large quantities of natural resources and energy are used to make chips. And each new generation requires more energy and water and generates more greenhouse gases than the previous generation.”

A computer chip is composed of silicon, copper, plastics, aluminum, silver, gold, arsenic, boron, phosphorus, and other metals and materials. Silicon is the primary component. Mining silicon is hazardous. Silicon dust irritates the skin and eyes, and miners are known to get lung cancer. Making one ton of silicon releases five tons of CO2, as it takes tremendous heat to turn the raw ore into pure silicon—up to 3,000 degrees Celsius (5,400 Fahrenheit). It’s like “working in a volcano,” one worker said. Achieving such intense heat requires petroleum coke, coal, and high-grade charcoal.

Making charcoal is an inefficient and wasteful process, with up to 75% of the original wood being lost as CO2, smoke and wasted heat. Its production is a key driver of deforestation, particularly in African and Amazonian tropical areas. It’s an accelerator of wildfires. In the Amazon, large areas of forest have been cut down to produce it, particularly for the steel industry, and there is a long history of illegality, environmental and worker abuse. In Myanmar, an investigation by Emmanuel Freudenthal for Mongabay found that, every year, 14,000 football fields’ worth of forests were being cut down to produce charcoal for the Chinese silicon smelting industries. Charcoal made from tropical hardwood is preferred because of its burning intensity.

Charcoal burns with greater constancy and intensity than wood. It is a great example of the efficiency paradox: to achieve efficiency in one area, waste and inefficiency occur in another. Charcoal is much lighter than wood and takes up much less volume because the excess weight of the wood has been burned off. Its lightness means you can more efficiently transport it over longer distances and still make a nice profit. So, measured economically, it is efficient. You are robbing Peter’s forest to pay Paul’s oven. Paul gets a benefit of 20, and the forest and our environment lose 80. Because, to an economist, the only good tree is a dead tree, only the benefit to Paul’s oven gets measured, and our environment takes on another burdensome debt as it edges closer to bankruptcy and collapse. Big Tech says that using charcoal will help it create “carbon neutral silicon,” which will support it in reaching “zero carbon dioxide emissions in silicon production.” Figure that one out.

Down through the rough road of civilization, as James Scott explained in his groundbreaking book on early states, Against The Grain, the way charcoal was used became a sign of impending state collapse. When it started being used for heating—instead of its more typical use as a fuel for cooking—that was a key sign the end was coming. To a dying state that had cut down all the forests close by, whose rivers were silting up because its emaciated soil was flooding into them, charcoal had one last desperate efficiency. Since its heat-to-weight ratio was much higher than that of wood, it still made some economic sense to transport it over long distances and use it for heating, as well as cooking. Such pillaging of distant forests couldn’t go on indefinitely, as the people at the periphery either fled or rebelled, rather than be unwilling witnesses as all around them was sucked into the dying fires of a voracious state.

The three chip problem

They like their chips well engineered in the USA. Long, straight and thin. Good looking. To get such handsome chips requires large, round, smooth potatoes, and to grow such potatoes requires soft, loamy, sandy soil. Such soil is thirsty. In such thirsty soil, water seeps through like a sieve. They could choose firmer soil and use half the water. The potatoes wouldn’t be as straight, though. They’d taste the same, have the same nutritional value. Not good enough. Must be straight and thin and well engineered, nice on the eye.

As some of the worst droughts on record gripped Minnesota in the 2020s, the chip farmers were not too worried. They cranked up their massive sprinkler equipment, hooked up to deep, deep wells and sprayed billions of gallons extra on their thirsty crop, blowing through limits that they knew would not be policed, in the land of the free water. Limits that were there to protect stressed aquifers that have been in constant decline.

Pumping water like there’s no tomorrow requires great engineering intelligence and innovation. Intelligence focused on designing the best sprinklers, wells, and water pumps. All this intelligence is quickly draining groundwater throughout the United States. “The practice threatens not only drinking water supplies for millions of Americans but also the nation’s status as a leading exporter of food,” Dionne Searcey and Mira Rojanasakul wrote for the New York Times. The land is literally sagging, as the drilling equipment drills deeper and the sprinklers rotate faster. It’s a race to the bottom in the pursuit of straight-looking chips and other such vanities. This is the story of the intelligent human.

It takes an awful lot of water to make straight, clean semiconductor chips, even though “a minimum of 40 per cent of all existing semiconductor manufacturing plants are located in watersheds that are anticipated to experience high or extremely high water stress risk by 2030,” according to scientist Josh Lepawsky. Potato chips and semiconductor chips—very water intensive.

The only difference between Silicon Valley and Las Vegas is in the chips they use. Silicon Valley and Las Vegas are physically, philosophically and emotionally twins. They are both gambling meccas, full of grifters, scammers and con artists, betting chips in the hope of fast bucks. Silicon Valley is a financialized hype market gambling house, and in such markets of speculation it is imperative to move at the fastest speeds possible, cashing in while staying ahead of the curve of reality. The gullible must have their eyes fixed firmly on the next big thing. Otherwise, they might start thinking about the previous big thing and how it didn’t pan out the way the tech bro boy sun god grifters promised.

Cut the crap

We’ve never had more data, and yet we’ve never had less information architecture skill. Organizations don’t want to invest in the hard and vital work of professionally organizing and managing data. AI is making things worse because it feeds the idea that humans no longer need to worry about how we create and organize our data—that AI will look after all that. It won’t. AI is a great big lying, great big crap-producing machine.

Teachers are finding that students, brought up on Google search, don’t even know what the concept of a file is, let alone where it is saved or how to organize it in a classification hierarchy with other files. To the Google generation, “the concept of file folders and directories, essential to previous generations’ understanding of computers, is gibberish to many modern students,” one professor stated.

Archiving data can significantly reduce overall data pollution because the most important decision in archiving is what not to keep. Bob Clark, director of archives at the US Rockefeller Archive Center, has stated that less than 5% of stuff is worth saving in any situation, while a representative from Library and Archives Canada told me that only 1% to 3% of information in any department has archival or historical value. And yet archiving is not worth the effort, according to most managers.

“Don’t make me think” has been a mantra of modern design and user experience. This philosophy is equally pervasive when it comes to technology in general. Buy this technology, the pitch goes, it does the thinking for you, it does the hard work for you. And it’s always on, always available. Store everything and no matter what time of day or night it is, you can get exactly what you want instantly. In the data center industry, they call it 99.99% uptime. It comes at the same cost to the environment as making silicon 99.99% pure. Huge costs. Massive costs.
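The “99.99% uptime” in that pitch translates into a concrete engineering budget. A quick back-of-the-envelope calculation (standard uptime arithmetic, not a figure from the text):

```python
# Allowed downtime per year for a given uptime guarantee.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

def downtime_minutes(uptime: float) -> float:
    """Minutes of downtime per year permitted by an uptime fraction."""
    return (1 - uptime) * MINUTES_PER_YEAR

for nines in (0.99, 0.999, 0.9999):
    print(f"{nines:.2%} uptime -> {downtime_minutes(nines):,.1f} min/year down")
# "Four nines" leaves roughly 52.6 minutes of downtime per year.
# Everything beyond that minute budget is redundancy idling on standby.
```

Those 52-odd minutes a year are what the chains of redundant fail-safes exist to prevent.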

In a typical data center, “only 6 to 12 percent of energy consumed is devoted to active computational processes,” Steven Gonzalez Monserrate has estimated. “The remainder is allocated to cooling and maintaining chains upon chains of redundant fail-safes to prevent costly downtime.” Perhaps this has changed somewhat because of the voracious processing demand from AI. However, the basic point remains true. Guaranteeing your convenience, and your access to all that crap data you’re never going to look at again, costs 90% more mining, 90% more materials, 90% more electricity, 90% more water, 90% more waste. All so that you can potentially access that photo or file that there is a 99.99% chance you will never access. A data center is like the starting grid before a Formula 1 race: all these high-performance, energy-intense cars revving and revving for a race most of them will never run. Here we are. This is us. This is civilization, modernity, progress, innovation. Spending so much energy to create and store crap.

Digital will sink in its own crap

It’s not simply crap content. Computer code bloat is everywhere. For starters, most software, most features, serve no useful function. A pile of crap software is launched and then either it dies a death, or else for years badly designed fixes are made to try to get it to work in the most basic manner, while making it even more complex and bloated. Most commercial software apps hardly even get downloaded. Those that do, hardly ever get used. Thirty days after news apps, shopping apps, entertainment apps, education apps, have been downloaded, most will have lost over 90% of their users.

Big Tech laughs about all this crap production. This is how Big Tech makes so much money in its data centers. It sells them as plush five-star hotels for superior VIP data, when in reality it’s running a datafill, a data dump. If most of the data stored in a typical data center was processed and accessed every day, then everything would explode, the servers would fry. The demand would crash everything. The data center business case is dependent on most people never accessing the data they’ve stored. You’re paying for a data graveyard.

To protect their profits, Big Tech has historically made it very hard for you to delete. Artist Honor Ash observed: “Initially, Gmail didn’t even include a delete button. Not only was it no longer necessary to delete emails regularly to make space in your inbox, but it was actually not even possible. This shift broke everyone’s deletion habit—it ended the ritualistic appraisal of what should be kept, and ushered in a default in which literally everything should.”

In almost thirty years of professional content management work, I’ve had to deal with the argument that we have to keep everything because you never know what will be important in the future. This argument was made amid intranets, websites and computer systems where nobody could find anything, because of terrible search design and awful information architecture. And what people did find was usually some sort of dodgy draft, some copy of a copy, or something that was way out of date, inaccurate or useless. Keeping all this junk data does not simply reduce findability, it also increases cybersecurity risk. Huge quantities of poorly structured and badly maintained data are an invitation to hacking and other risks.

Even if we could put a data center in every town and village in the world, we couldn’t keep everything. There is too much data being produced, vastly too much. We are now into the era of zettabytes. As my previous book, World Wide Waste, explained:

A zettabyte is 1,000,000,000,000,000 MB, or one quadrillion MB. If a zettabyte was printed out in 100,000-word books, with a few images thrown in, then we would have one quadrillion books. It would take 20,000,000,000,000 (20 trillion) trees’ worth of paper to print these books. It is estimated that there are currently three trillion trees on the planet. To print a zettabyte of data would thus require almost seven times the number of trees that currently exist to be cut down and turned into paper.
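The arithmetic in that passage checks out. A quick sketch, where the one-megabyte-per-book and fifty-books-per-tree figures are back-derived from the passage itself (assumptions for illustration, not measured values):

```python
# Sanity check of the zettabyte-to-trees arithmetic.
ZETTABYTE_MB = 10**15        # 1 ZB expressed in megabytes (one quadrillion MB)
MB_PER_BOOK = 1              # assumption: a 100,000-word book with a few images ≈ 1 MB
BOOKS_PER_TREE = 50          # implied by the passage: 10**15 books / 20 trillion trees

books = ZETTABYTE_MB // MB_PER_BOOK      # one quadrillion books
trees = books // BOOKS_PER_TREE          # 20 trillion trees' worth of paper
trees_on_earth = 3 * 10**12              # ~3 trillion trees on the planet

ratio = trees / trees_on_earth           # ≈ 6.7, i.e. "almost seven times"
print(f"{trees:,} trees needed, {ratio:.1f}x the trees on Earth")
```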

Soon, we will be producing thousands of zettabytes a year. It’s a tsunami of data every day, every hour of every day, every minute of every hour, every second of every minute. As a result, important data that definitely does need storing is getting lost. In relation to research, for example, we are flooding our research environments with low-quality—often AI-produced—research paper garbage. It is becoming more and more expensive to store all this stuff. Research repositories are thus disappearing and lots of good research is being lost. In a thousand years, there may be more quality data artifacts on the Maya and Inca than on our digital generation. Digital is fragile, transient, and it will sink in its own crap.

Crap data everywhere

We need to talk about data. Crap data. We’re destroying our environment to create and store trillions of blurred photos and cat videos, binge watch Netflix, and ask ChatGPT inane questions and get instant wrong answers. We’re destroying our environment to store copies of copies of copies of stuff we have no intention of ever looking at again. We’re destroying our environment to take 1.4 trillion photos every year. That’s more photos taken in one single year in the 2020s than were taken in the entire 20th century. 10 trillion photos and growing, stored in the Cloud, the vast majority of which will never be viewed again. Exactly as Big Tech wants it. 

I have spent almost thirty years working with some of the largest organizations in the world, trying to help them to better manage their content and data. 90% plus of commercial or government data is crap, total absolute crap. Period. It should never have been created. It certainly should never have been stored. The rise of digital saw the explosion of data crap production. Content management systems were like giving staff diesel-fueled diggers, whereas before they only had data shovels. I remember around 2010 being in conversation with a Microsoft manager, who estimated that there were about 14 million pages on Microsoft.com at that stage, and that four million of them had never been visited. Four million, I thought. That’s basically the population of Ireland, in pages that nobody has ever visited. Why were they created? All the time and effort and energy and waste that went into all these pages that nobody had ever read. We are destroying our environment to create crap.

Everywhere I went it was nothing but the same old story. Data crap everywhere. Distributed publishing that allowed basically anyone to publish anything they wanted on the intranet. And nobody maintaining anything. When Kyndryl, the world’s largest provider of IT infrastructure services, was spun off by its parent, IBM, it found it had data scattered across 100 disparate data warehouses. Multiple teams had multiple copies of the same data. After cleanup, it had deleted 90% of the data. There are 10 million stories like this.

Scottish Enterprise had 753 pages on its website. 47 pages got 80% of visits. A large organization I worked for had 100 million visits a year to its website, with 5% of pages getting 80% of visits. 100,000 of its pages had not been reviewed in 10 years. “A huge percentage of the data that gets processed is less than 24 hours old,” computer engineer Jordan Tigani stated. “By the time data gets to be a week old, it is probably 20 times less likely to be queried than from the most recent day. After a month, data mostly just sits there.” The Southampton University public website found that 0.2% of pages got 90% of visits. Only 4% of pages were ever visited. So, 96% of the roughly four million pages were NOT visited. One organization had 1,500 terabytes of data, with less than 2% ever having been accessed after it was first stored. There are 20 million more stories like these.

Most organizations have no clue what content they have. It’s worse. Most organizations don’t even know where all their data is stored. It’s even worse. Most organizations don’t even know how many computers they have. At least 50% of data in a particular organization is sitting on some server and nobody in management knows it even exists. The average organization has hundreds of unsanctioned third-party app subscriptions being paid for by some manager’s credit card, storing everything from project chats to draft reports to product prototypes.

The Cloud made crap data infinitely worse. The Cloud is what happens when the cost of storing data is less than the cost of figuring out what to do with the crap. One study found that data stored by the engineering and construction industry had risen from an average of 3 terabytes in 2018 to 26 terabytes in 2023, a compound annual growth rate of 50%! That sort of crap data explosion happened—and is happening—everywhere. And this is what AI is being trained on. Crap data.
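The study’s growth figure can be sanity-checked with the standard compound-annual-growth-rate formula (the exact compounding window is an assumption; taking five years from 2018 to 2023 gives a figure slightly above the ~50% quoted):

```python
# Compound annual growth rate (CAGR) check for the 3 TB -> 26 TB figure.
def cagr(start: float, end: float, years: int) -> float:
    """Annualized growth rate implied by a start value, end value, and span."""
    return (end / start) ** (1 / years) - 1

growth = cagr(3, 26, 2023 - 2018)
print(f"CAGR: {growth:.1%}")  # about 54% per year, in the ballpark of the ~50% cited
```

Either way, the point stands: storage per firm grew almost ninefold in five years.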

Extreme secrecy of data centers

As soon as Lars Ruiter stepped out of his car, he was confronted by a Microsoft security guard seething with anger, Morgan Meaker recounted for Wired. Ruiter, a local Dutch councilor, had parked in the rain outside a half-finished Microsoft data center that rose out of the flat North Holland farmland. The guard had seen Ruiter before and did not like him snooping around. Suddenly, the guard had his hands around Ruiter’s throat.

Is there a more secretive empire in the world than Big Tech and its data centers? Big Tech realizes the power of data, the power it has over us when it has our data. Naturally, Big Tech—being full of intelligent people—realizes that if we had the same level of data on it as it has on us, then perhaps we might seek to control Big Tech in similar ways that Big Tech now controls our lives.

There’s an old saying in Big Tech when its tech bros have to respond to people who worry about all the data that Big Tech is sucking up about them: “If you’ve got nothing to hide, you’ve got nothing to worry about.” It’s such a disarming and innocent-sounding phrase. So comforting. However, does this mean that Big Tech, which hides its own data with a fanatical religiosity, has something very big to hide? Yes, of course it does. Big Tech has an awful lot to hide. It’s royally screwing us and the environment to build and maintain its empire.

Big Tech goes to huge efforts to deny academic institutions or other research bodies access to data that would help highlight the harm Big Tech does. “Without better transparency and more reporting on the issue, it’s impossible to track the real environmental impacts of AI models,” Kate Crawford, a research professor at USC Annenberg, who specializes in the societal impacts of AI, told the Financial Times. According to Julia Velkova, an associate professor at the University of Helsinki, “Big Tech corporations keep their operation secret. These companies are unapproachable and largely disconnected from the places in which they are built. They refuse to talk to researchers or the public but instead communicate through press releases and YouTube videos, through the platforms that they own.”

Data center secrecy is rampant, deliberate and consistent. “When it comes to Google, what’s really striking is the lack of transparency and information when it comes to these projects,” Sebastián Lehuedé, an expert in AI, has stated. How much water do US data centers use? “We don’t really know,” Lawrence Berkeley National Laboratory research scientist Dr. Arman Shehabi explained. “I never thought it could be worse transparency than on the energy side, but we actually know less.” According to Sebastian Moss, writing for Data Center Dynamics, “We don’t know how much water data centers use. We just know it’s a lot.” And Philip Boucher-Hayes, a journalist with the Irish national broadcaster, RTE, stated, “We have been really bad at reporting data centres accurately, largely because the data centres refuse to be transparent. I spent months trying to get interviews with some of the hyperscale operators here. They refused.”

Data centers are noisy as hell

You do not want to live close to a data center. Having one near your home is like having a lawn mower running in your living room 24/7, as one local resident described it. Residents talked about low-pitched roars interspersed with high-frequency screeches, as the whir of loud fans echoed through the air. A growing body of research shows that the type of chronic noise emitted by data centers is a hidden health threat that increases the risk of hypertension, stroke and heart attacks.

As Zac Amos, writing for HackerNoon, explains:

“Many data centers have on-site generators. Their cooling systems—essential for keeping hardware operational—contain intake and exhaust fans, which are objectionably loud. They produce between 55 and 85 dB typically. The noise is even more noticeable in rural areas where massive, nondescript buildings replace spaces that used to be forests or farmland.”

“Are data centers noisy at night? Most are since they run around the clock. Even if their volume doesn’t increase after hours, their loudness is more noticeable when it gets quiet. People often describe the noise as a buzzing, tinny whining or low-pitched roar. Even 60 dB—the low end of the typical spectrum—sounds like overlapping conversations or background music.”

According to Christopher Tozzi, writing for Data Center Knowledge:

“Hundreds of servers operating in a small space can create noise levels of up to 96 dB(A). At the same time, the ancillary equipment that data centers depend on, like the HVAC systems that cool servers or generators that serve as backup power sources, add to the noise. These systems can be especially noisy on the outside of a data center, contributing to noise pollution in the neighborhoods where data centers are located. Data centers have long been noisy places, but they’re becoming even noisier as businesses find ways to pack ever-greater densities of equipment into data centers, and as they expand the power and cooling systems necessary to support that equipment.”

Antonio Olivo, writing for The Washington Post, told a story about Carlos Yanes and his family:

“Carlos Yanes believes he can tell when the world’s internet activity spikes most nights. It’s when he hears the sounds of revving machinery, followed by a whirring peal of exhaust fans that are part of the computer equipment cooling system inside an Amazon Web Services data center about 600 feet from his house. The sound keeps him awake, usually while nursing a headache brought on by the noise, and has largely driven his family out of the upstairs portion of their Great Oaks home, where the sound is loudest.”

With the evil bitcoin data centers, it’s even worse. “Residents of the small town of Granbury, Texas, say bitcoin is more than just a figurative headache,” Andrew Chow wrote for Science Friday. “Soon after a company opened up a bitcoin mine there a couple years ago, locals started experiencing excruciating migraines, hearing loss, nausea, panic attacks, and more. Several people even ended up in the emergency room. The culprit? Noise from the mine’s cooling fans.”