Author Archives: Gerry McGovern

Data centers are noisy and smelly

You do not want to live close to a data center. Having one near your home is like having a lawnmower running in your living room 24/7, as one local resident described it. Residents talked about low-pitched roars interspersed with high-frequency screeches, as the whir of loud fans echoed through the air. A growing body of research shows that the type of chronic noise emitted by data centers is a hidden health threat, increasing the risk of hypertension, stroke and heart attack. As Zac Amos, writing for HackerNoon, explained:

“Many data centers have on-site generators. Their cooling systems—essential for keeping hardware operational—contain intake and exhaust fans, which are objectionably loud. They produce between 55 and 85 dB typically. The noise is even more noticeable in rural areas where massive, nondescript buildings replace spaces that used to be forests or farmland. Are data centers noisy at night? Most are since they run around the clock. Even if their volume doesn’t increase after hours, their loudness is more noticeable when it gets quiet. People often describe the noise as a buzzing, tinny whining or low-pitched roar.”

According to Christopher Tozzi, writing for Data Center Knowledge:

“Hundreds of servers operating in a small space can create noise levels of up to 96 db(A). At the same time, the ancillary equipment that data centers depend on, like the HVAC systems that cool servers or generators that serve as backup power sources, add to the noise. These systems can be especially noisy on the outside of a data center, contributing to noise pollution in the neighborhoods where data centers are located … They’re becoming even noisier as businesses find ways to pack ever-greater densities of equipment into data centers, and as they expand the power and cooling systems necessary to support that equipment.”

Antonio Olivo, writing for The Washington Post, told a story about Carlos Yanes and his family:

“Carlos Yanes believes he can tell when the world’s internet activity spikes most nights. It’s when he hears the sounds of revving machinery, followed by a whirring peal of exhaust fans that are part of the computer equipment cooling system inside an Amazon Web Services data center about 600 feet from his house. The sound keeps him awake, usually while nursing a headache brought on by the noise, and has largely driven his family out of the upstairs portion of their Great Oaks home, where the sound is loudest.”

With the evil bitcoin data centers, it’s even worse. Andrew Chow wrote for Science Friday:

“Residents of the small town of Granbury, Texas, say bitcoin is more than just a figurative headache. Soon after a company opened up a bitcoin mine there a couple years ago, locals started experiencing excruciating migraines, hearing loss, nausea, panic attacks, and more. Several people even ended up in the emergency room. The culprit? Noise from the mine’s cooling fans.”

Air pollution from power plants and backup diesel generators that supply electricity to data centers is “expected to result in as many as 1,300 premature deaths a year by 2030 in the United States,” a study by University of California and Caltech scientists found. “Total public health costs from cancers, asthma, other diseases, and missed work and school days are approaching an estimated $20 billion a year.”

Data center energy surge

Big Tech’s vaunted data center energy efficiency gains were not what they seemed. After substantial progress between 2007 and 2018, energy efficiency didn’t actually improve much. Amazon, Google, Microsoft and Meta more than doubled their energy demand between 2017 and 2021. Driven by AI, data center emissions were expected to double, treble or quadruple by 2030. Were AI-driven data centers adding a Germany’s worth or a Japan’s worth of electricity demand every couple of years? Was one data center using as much electricity as three million people? Did new data centers come with nuclear power plants attached?

“Something unusual is happening in America,” The New York Times reported. “Demand for electricity, which has stayed largely flat for two decades, has begun to surge.” In little old Ireland, by the early 2020s, data centers were consuming over 20% of Irish electricity, more than every city, town and village in the country.

Data centers often have an even bigger impact on a region’s electricity supply than their actual consumption would suggest, because of a practice called ‘air bookings’, whereby Big Tech locks up future grid capacity just in case it needs it. This blocks development in the local area. In Skåne, Sweden, for example, “Microsoft booked so much electricity from the grid in the Malmö region that the local Swedish bread company Pågen could no longer build a bread-baking factory in the area and had to expand elsewhere,” said Julia Velkova, assistant professor at the University of Helsinki.

All this surge in data center electricity demand meant that coal plants were being kept in service longer, while oil, gas and nuclear stocks were booming. In the USA, “residents in the low-income, largely minority neighborhood of North Omaha celebrated when they learned a 1950s-era power plant nearby would finally stop burning coal,” Evan Halper wrote for The Washington Post. The good news didn’t last long because of the explosive growth of data centers. The same happened in Georgia, where “one of the country’s largest electric utilities, Southern Company, made a splash when it announced it would retire most of its coal-fired power plants,” Emily Jones wrote for Grist. However, because of an “extraordinary spike in demand for electricity” driven by data center growth, those retirement plans too were cancelled.

Even the data centers that claimed to use wind, solar or hydro energy had ‘backup’ diesel generators that were getting used more and more as their frenzied growth maxed out local electricity grids. Some were also questioning how Big Tech was monopolizing wind, solar, etc., that could instead be used to help homes and local businesses move away from fossil fuels. “For a lot of individuals and politicians, the fact that we use energy from newly constructed wind parks for the benefit of hyper-scale data centers feels out of balance,” explained Julia Krauwer, a technology analyst at Dutch bank ABN Amro.

Data center energy scam

For years, energy efficiency was the great big shining bright green fabulously good spinning story of Big Tech’s love of and care for our environment. This was quite a feat of PR spinning, when you consider that a small area of a 2020s data center holds more computing power than all the computing that existed around 1980, and that a data center’s electricity demand per square meter can be 50 to 100 times greater than that of a normal office building. Despite this, the ‘efficiency’ story spun from the early 2000s was that while data was exploding, the energy required to run these data centers was not growing at anywhere near the same pace. Added to that, whatever energy was needed was ‘renewable’, and we all know that renewable energy has zero cost to the environment, don’t we? So all’s good in techland. The Efficiency Bros have come riding to the rescue yet again. Data centers were energy efficient, and getting more efficient every year, to the point where they would soon dematerialize and become invisible, requiring nothing to run on other than fresh air. Woo-hoo!

Behind the scenes, in reality, Big Tech worked very hard to manipulate, lie, greenwash, scam and spin like a trumpista, as it piled on the creative, innovative accounting. “Data center emissions were probably 662% higher than big tech claims,” Isabel O’Brien wrote for The Guardian.

“Meta, for example, reports its official scope 2 emissions for 2022 as 273 metric tons CO2 equivalent—all of that attributable to data centers. Under the location-based accounting system, that number jumps to more than 3.8m metric tons of CO2 equivalent for data centers alone—a more than 19,000 times increase.”

Data centers measure their energy efficiency through what is called Power Usage Effectiveness (PUE): the closer the PUE is to 1, the more ‘efficient’ the data center. Let us park for a moment the awkward truth that improvements in technological efficiency have rarely if ever led to overall reductions in energy use. Quite the opposite, actually. Park that nasty thought. Here’s another nasty thought to park: much of the PUE efficiency gain was achieved by massively increasing the use of water for cooling. Another route was churning through computer servers and creating mountains of e-waste. Energy use was “being reduced by just throwing more material at the problem,” Johann Boedecker, founder of the circular economy consultancy Pentatonic, told the Financial Times. Data centers became a prime source of e-waste as they dumped perfectly good working servers, chasing the new server with the slightest of slight gains in energy efficiency. Again, we see how vital it is not to let the focus rest on a single metric. We must calculate the true and total long-term costs to our environment: to the water, the air, the soil, the biodiversity, the climate.
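PUE itself is just a ratio: the total energy a facility draws divided by the energy that reaches the IT equipment. A minimal sketch with illustrative numbers, not drawn from any real facility:

```python
def pue(total_facility_kwh, it_equipment_kwh):
    """Power Usage Effectiveness: total facility energy / IT equipment energy.

    A PUE of 1.0 would mean every watt goes to computing;
    real facilities always come in higher.
    """
    return total_facility_kwh / it_equipment_kwh

# Hypothetical year: 10 GWh to the servers, plus 4 GWh to cooling,
# power distribution and lighting.
print(pue(14_000_000, 10_000_000))  # 1.4
```

Note what the ratio leaves out: the water evaporated in cooling towers and the e-waste from server churn never appear in it, which is exactly how the metric can ‘improve’ while the total environmental cost rises.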

It’s not drought, it’s pillage: data centers take the water

Using a slew of aliases to buy land and make sweet deals, getting secret, historic tax breaks, getting electricity at less than half of what ordinary people pay, being sold public land for less than half the market value, slurping down the cheapest of cheap water like there’s no tomorrow, this is how Big Tech rolls. “Google faced criticism for its plans to build a massive data center in Mesa, Arizona, after it was revealed that the company would pay a lower water rate than most residents,” Eric Olson, Anne Grau and Taylor Tipton reported for the University of Tulsa. “The deal, negotiated with the city, allowed Google to pay $6.08 per 1,000 gallons of water, while residents paid $10.80 per 1,000 gallons.”

In Uruguay, the people thought their fresh water was so plentiful they wrote it into their constitution as a basic citizen right. Along came climate change. In 2022, “Montevideo was the first capital in the world to arrive at ‘day zero’ and run out of potable water,” environmentalist Eduardo Gudynas told Mongabay. The citizens were forced to drink salty water from the Río de la Plata. Meanwhile, Google had plans for Uruguay’s fresh water and, as usual, it wasn’t telling anyone. Citizens had to go to court to force Google to disclose that its cooling towers would need 7.6 million liters (2.0 million gallons) of fresh water a day. “No es sequía, es saqueo!”—“It’s not drought, it’s pillage,” the citizens said. While over in Chile, they were asking, “With Google as my neighbor, will there still be water?”

When it comes down to a choice between a poor person drinking and the data drinking, it’s clear that the data comes first. “When Hurricane Maria and Hurricane Irma devastated Puerto Rico, the data centers on the island did not go hungry for power or thirsty for water,” researcher Steven Gonzalez, whose family is from the island, told me. The data drank its fill, even as the citizens went without access to fresh water for months. In Nigeria, they had to “buy water to cook,” Felix Adebayo, a Lagos resident, told Abdallah Taha and Alfred Olufemi. Meanwhile, close by, at least ten data centers drank their fill.

We are only at the beginning of the Big Tech war for water and other resources. It’s going to get much worse. The Big Tech Nazis are waving their arms and telling us who they truly are. We should believe them. The Big Tech Nazis would think nothing of wiping out half the planet if it helped them build their rocket to Mars, or whatever technology they’re building that will help them consolidate power.

There is a global water crisis. Lands are desertifying at frightening rates as soils collapse due to the stresses of overconsumption driven by Big Tech. We have pumped so much groundwater in the last 50 years that we have altered the Earth’s tilt. And now along comes ravenous AI, its thirst doubling and doubling again. Until recently, most data centers have been getting water so cheap they haven’t even bothered managing it. Only constant public pressure has any chance of reining in the excesses of Big Tech.

Big Tech’s water use is 100 times bigger than expected

The total amount of water consumed by Big Tech could be much, much higher than what it nominally discloses. “When it comes to water, Big Tech only shows its direct water consumption, while hiding its real water footprint,” Shaolei Ren, an associate professor of electrical and computer engineering at the University of California, told me. Based on his research, “Apple’s real water footprint is 100 times what it shows for its direct water consumption. By some calculations, I found that Apple’s real water footprint was about 300-600 billion liters in 2023, which is comparable to Coca Cola’s overall annual water footprint.” Shaolei went on to tell me that:

“More importantly, water footprint has “colors”: Coca Cola’s water footprint is largely “green water footprint” (i.e., water contained in soils and only usable for plants). On the other hand, Big Tech’s water footprint is mostly blue water footprint (i.e., water in surface water and groundwater that is directly usable for humans). The real issue with AI’s water footprint is a lot more (10-100 times) serious. Each Big Tech is a hidden “Coca Cola” in terms of the water footprint.”

In a town called The Dalles in Oregon, USA, local people were worried that Google’s water use was soaring. As is so often the case, the city officials, who had given Google millions in tax breaks, had no intention of letting anyone know how much water Google was using. It was up to a regional paper, the Oregonian, to try to find out, and the dispute ended up in court. City officials were ordered by Google to argue in court that Google’s use of scarce public water was a “trade secret”. After more than a year of proceedings, city officials were forced to tell their own citizens how much public water Google was using.

“But most troubling in the affair,” Binoy Kampmark wrote for Scoop, “leaving aside the lamentable conduct of public officials, was the willingness of a private company to bankroll a state entity in preventing access to public records.” Actually, it was even worse than that, as Erin Kissane, a respected technology writer, informed me. “Rather than the Oregonian suing for access, the city of The Dalles actually sued the Oregonian in a ‘reverse public records lawsuit’ to prevent the paper from disclosing the data, despite their county district attorney having already ruled that the information should be disclosed. Google funded the suit until the press got too bad and then pulled out, so the city settled.”

In another story of “trade secrets”, David Wren, writing for the Post and Courier in Dorchester County, USA, warned that the amount of public water Google was demanding “is a closely guarded ‘trade secret.’” Imperious Google had imposed a gag order on Dorchester County officials, warning them that they must not tell the public anything about the Google project, particularly how much public water Google was slurping. Again, dragged kicking and screaming, KGB Google was finally forced to tell the public of Dorchester how much public water it was using. “After fighting its disclosure for more than a year, Dorchester County has agreed to publicize the amount of water used at a data center Google is building, reversing its previous stance that the information is a closely guarded trade secret that shouldn’t be shared with the public,” David Wren wrote for the Post and Courier. A victory of sorts for the community? Except that the community would also learn from the disclosure that Google had demanded that, in the event of a natural emergency, its data center would have priority on the water. Like everywhere else, Big Tech demands that its data drink its fill before people get to drink.

Big Tech lies about its water use

There is a reason why Big Tech has been so super-secretive about its water use: it was getting water so cheap that it wasn’t even worth measuring. Water was an invisible externality, and because it was invisible to the public, Big Tech could keep telling the story of how it was becoming more and more efficient, more and more green, more and more “in the Cloud.” The cloud is on the ground. Big Tech abuses and misuses water at an alarming rate, treating it as an essentially free resource, and to keep on doing that, it was essential that the public should not know what was happening.

“Water consumption in data centers is super embarrassing,” a data center designer stated as far back as 2009. “It just doesn’t feel responsible.” Survey after survey showed that about 60% of data centers saw “no business justification for collecting water usage data.” Think about that. They were getting the water so cheap they didn’t even bother metering it. At a Microsoft data center in San Antonio, Texas, it was found that the actual cost of water should have been 11 times higher than what the company was paying. In the drought-stricken Netherlands, a Microsoft data center slurped 84 million liters of drinking water in one year, when the local authority had said the facility would only need 12 to 20 million liters.

Data centers know they have been hugely abusive of their water use and thus are desperate to hide usage figures from the general public. “There were actually water documents tracking how much this data-center campus was using,” journalist Karen Ho explained about a Microsoft data center. “But when the city came back with documents about that, everything was blacked out. They said that it was proprietary to Microsoft and therefore they couldn’t provide that information.” Public water use is proprietary to Microsoft. Sounds about right.

“The reason there’s not a lot of transparency, simply put, I think most companies don’t have a good story here,” stated Kyle Myers, a senior manager at a data center company. Data centers have a choice. They can either consume less water and use more electricity. Or they can use less energy and consume more water. “Water is super cheap,” Myers said. “And so people make the financial decision that it makes sense to consume water.”

In 2023, Bluefield Research estimated that, on a global basis, data centers were using more than 360 billion liters of water a year, including water used in energy generation. It predicted this figure would rise to over 600 billion liters a year by 2030. However, China Water Risk estimated that in 2024 China’s data centers alone used about 1.3 trillion liters of water, which is the equivalent of what 26 million people need. This amount would grow to more than 3 trillion liters a year by 2030, due to the explosive growth of AI. Which figures are correct? We don’t know.
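We may not know which estimate is correct, but the per-person equivalence in the China Water Risk figure can at least be sanity-checked with rough arithmetic, using the 1.3 trillion liter and 26 million person numbers cited above:

```python
china_dc_liters_2024 = 1.3e12    # China Water Risk estimate for 2024
people_equivalent = 26e6         # people whose needs that water would meet

per_person_year = china_dc_liters_2024 / people_equivalent
per_person_day = per_person_year / 365

print(per_person_year)        # 50000.0 liters per person per year
print(round(per_person_day))  # 137 liters per person per day
```

That works out at about 137 liters per person per day, in line with typical domestic water use, so the two cited figures are at least internally consistent.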

Why do data centers love deserts?

In so many ways, data center water use is more intensive than the way an ordinary person uses water, as Shaolei Ren explained to Reece Rogers for Wired:

“The water that is available for people to use is very limited. It’s just the fresh surface water and groundwater. Those data centers, they’re just evaporating water into the air. They’re different from normal, residential users. When we get the water from the utility, and then we discharge the water back to the sewage immediately, we are just withdrawing water—we’re not consuming water. A data center takes the water from this utility, and they evaporate the water into the sky, into the atmosphere, where it may not return to the earth’s surface until a year later.”

Data centers love to be efficient. That’s why they love deserts, because dry air reduces the risk of damage and corrosion to their sensitive servers and other electrical equipment, and thus helps them to run more cost effectively and efficiently. As Steven Gonzalez explained to me:

“If you have access to cheap fresh water, deserts are a great place for data centers because they are so dry—and computers hate moisture and high humidity. That’s why there are so many data centers in Arizona. It’s almost like the goldrush. It’s a water-rush. All these companies are clustering to get this cheap water. But it’s doomed. We see how communities are struggling to pay their water bills, while data centers and other industries are getting water at a much cheaper rate. There are farmers who are directly competing with data centers to grow food. Indigenous communities are also having difficulties accessing water. The draining of the Colorado river is affecting the migration patterns of salmon and other fish, which are really important to their lifecycles.”

Of course, water evaporation in a desert environment is going to be very intense. Nicolas Dubé of Hewlett Packard Enterprise described it as criminal activity.

“Some hyperscalers, I’m not going to name them, built large datacenters in Arizona, New Mexico, and very dry countries. You build datacenters there, and if you use evaporative cooling, you’re going to have spectacular PUE (Power Use Efficiency). However, you’re going to consume a resource that’s way more important to that community than optimizing for a few percent of the energy consumption. I think that’s criminal. I think they should be jailed for doing that.”

Big Tech’s capitalist pursuit of efficiency and lowest costs is not simply ruinous to local water and to freshwater supply in general. “There is more moisture in a warming sky, a 7% increase for every degree Celsius of warming,” Kate Marvel wrote in The Climate Book. Data center evaporative cooling sends even more water up into the atmosphere, accelerating global warming and super storms. Historically, this water was used by plants and animals, or stored in soils or underground in aquifers. Now it’s up in the skies, accelerating droughts and super storms, where it can rain more in one day than it usually rains in a year. Droughts and floods are terrible twins.

Data center water scam

Data gets hot. Computer servers get hot. They need to be cooled. There is a direct correlation between the amount of energy a server or data center is using and the amount of water required to keep it cool. “On average, between one to nine liters of water are evaporated during the cooling process of one kilowatt hour (kWh), measuring the amount of energy needed to power one machine that consumes 1,000 watts for one hour,” Maxim Melamedov wrote for Data Center Dynamics in 2024.
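Melamedov’s one-to-nine liters per kWh range lets you bound a facility’s evaporative water draw from its electricity use alone. A rough sketch for a hypothetical 5 MW facility running around the clock (the wattage is an assumption for illustration, not a real facility):

```python
facility_mw = 5.0                               # hypothetical mid-size facility
kwh_per_year = facility_mw * 1000 * 24 * 365    # 43,800,000 kWh per year

low_l_per_kwh, high_l_per_kwh = 1.0, 9.0        # Melamedov's 1-9 L/kWh range

low_liters = kwh_per_year * low_l_per_kwh       # ~43.8 million liters/year
high_liters = kwh_per_year * high_l_per_kwh     # ~394 million liters/year
print(f"{low_liters / 1e6:.1f}M to {high_liters / 1e6:.1f}M liters per year")
```

Even at the bottom of the range, a single mid-size facility evaporates tens of millions of liters of water a year.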

With AI, this digital thirst is surging. Our data is drinking more water than we do. Why does data need to drink so much water? Because, despite there being a global freshwater crisis, water is sold to Big Tech at super-cheap rates. Thus, it can be up to 10 times cheaper for Big Tech to choose water as a cooling agent versus other options.

If you look at the back of some fridge freezers you will find piping running back and forth. A server rack can have the same type of intricate and extensive piping, through which water—or other liquids—flows. As the servers hum, their heat is transferred through these pipes into the water. When the water reaches a certain temperature, it moves along the piping out of the server room and into what is called a cooling tower. There, the pipes are sprayed with other water to cool them down. So, there are two sets of water involved, one in the pipes and one in the cooling tower, and they don’t mix. As the water is sprayed on the pipes in the cooling tower, much of it evaporates. This is what is called evaporative cooling. As the water evaporates, it leaves behind a waste made up of minerals, metals and salts. Historically, data center water systems have been known as a source of Legionnaires’ disease. To avoid such infectious risks, chlorine and bromine-based chemicals and disinfectants are added. Bromine is highly toxic to living systems, targeting the nervous system and brain. Even with chemical treatment, the water in these pipes cannot be used endlessly, as data center expert Steven Gonzalez explained to me:

“As water is being warmed and flowing through these data centers, microorganisms flourish in these conditions. That is one reason why data centers turn to drinking water because that water has already to some degree been treated, so there is less of a risk of these microbial blooms happening. For the same microbial reason, the water can’t be endlessly recycled. It has to be dumped or returned to the sewers because even with reverse-osmosis filters and other techniques, these microbes will flourish.”

It is important to understand that we are only scratching the surface of data center demand for water, energy and other materials. A tsunami would not sate the big mouth of Big Tech. And be certain that, as water gets scarcer, Big Tech will be first in line to drink its fill. We are only at the beginning of the age of Big Tech’s hunger for atoms.

Anatomy of a data center

A data center moving into a community is like a prison setting up shop. Only worse. Super-high, aggressive security; ugly warehouse buildings. A prison will at least bring a decent number of jobs. Data centers bring hardly any. What’s more, a data center will consume massively more water and electricity than a prison, while causing far more noise and toxic e-waste.

Data centers are the new mines. They’re there to mine data, to mine electricity, to mine water, to mine us, to make us slaves to the data, data slaves, imprisoned and exploited by our own data. We are the new ore that the old imperialists and colonizers have come to extract from and then discard. For most data is not for the common good. It is for the sale of goods. Particularly for the sale of goods we don’t need and that are bad for the environment, for from these types of goods and services are the maximum profits made. Data centers are the hubs of surveillance-capitalist planned-obsolescence overconsumption.

Data is physical. For every byte that exists, energy, water and materials are required. Data exists on a machine in a building. A typical data center might be around 9,000 square meters (about 100,000 square feet). That’s about the size of a large supermarket. A data center tends to get classified as “hyper-scale” when it has more than 5,000 computer servers and is more than 10,000 square meters. At about 1.6 million square meters, the CITADEL, in Nevada—the driest state in the USA—is one of the largest data centers. It’s about 175 supermarkets in size. Very physical.

Inside these buildings, we mainly find what are called server racks or server cabinets: metal cabinets that hold the computer servers. A typical server rack is 42U (U being one rack unit, 4.45 cm or 1.75 in). The rack is about 200 cm (78.5 in) high, 48 cm (19 in) wide, and 80 cm (31.5 in) or 107 cm (42 in) deep, making it about the size of a large family fridge freezer. In a non-AI rack, you might find about 30 servers.

Before the 2020s, each computer server might have consumed about 200 watts of electricity, with the total for a rack of 30 such servers coming to about 5,000 to 6,000 watts. This began to change rapidly as the 2020s progressed, with some non-AI servers consuming as much as 700 watts, and the average around 400 watts. So, an early to mid-2020s non-AI server rack with 30 servers might be consuming about 12,000 watts, and potentially as much as 21,000 watts.

AI radically changed those calculations. AI servers can be four to five times taller than a traditional server, containing as many as eight graphics processing units (GPUs). These are the intense processors that made NVIDIA one of the most valuable companies in the world. Each GPU can demand between 700 and 1,000 watts. Assuming an 800-watt average, an AI server could be demanding about 6,400 watts. That means a single AI server was using what a whole rack used to use in 2020.

You might get about eight AI servers in a rack, meaning that an AI rack could easily have a demand of 50,000 watts or more. That’s 10 times what a 2020 non-AI server rack would have demanded. The fridge freezer that’s about the same size as a server rack consumes about 200 watts, so an AI server rack is demanding 250 times or more the energy of the fridge freezer. In the good old days—that’s pre-2020—a decent-sized data center might be classified as five megawatts, which meant it had about 1,000 server racks. By the middle of the 2020s, data centers were not talking megawatts anymore. It was now gigawatts. One gigawatt is a billion watts. That’s enough electricity for about two million people. Some super-mega data centers were demanding up to three gigawatts. There were plans for certain data centers to have their very own nuclear power plants. It’s not sustainable.
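The rack arithmetic above can be laid out explicitly. The per-server wattages are the approximate figures from the text, not measurements:

```python
servers_per_rack = 30

pre2020_rack_w = servers_per_rack * 200     # ~6,000 W (roughly the old ~5 kW rack)
avg_2020s_rack_w = servers_per_rack * 400   # ~12,000 W, mid-2020s non-AI average
peak_2020s_rack_w = servers_per_rack * 700  # up to ~21,000 W

ai_server_w = 8 * 800        # 8 GPUs at ~800 W each = 6,400 W per AI server
ai_rack_w = 8 * ai_server_w  # ~8 AI servers per rack = 51,200 W

fridge_w = 200
print(ai_rack_w / fridge_w)  # 256.0 -- the "250 times or more" a fridge freezer
print(ai_rack_w / 5000)      # ~10x a 2020-era 5 kW rack
```

The striking step is the middle one: a single AI server now draws more than an entire pre-2020 rack did.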

Moore’s Law is breaking down

Chips and storage would get cheaper, faster and more powerful forever, they said. No limits. Every couple of years, Santa would arrive with twice the power at half the price. They called it Moore’s Law. For what seemed like an eternity—from the 1960s—chips were getting faster, smaller and cheaper. Then in the 2010s, things began to slow and the limits of physics came into sight. This changed a lot. For one thing, it meant even more data centers would be needed: if servers weren’t getting much more powerful and cheaper every year or so, you were going to need more of them to handle all that exploding data.

They called it Dennard scaling: with each new technology generation, transistors shrank and their power requirements fell in proportion, so you could double the number of transistors while maintaining the same power consumption. From about 2005, Dennard scaling stopped scaling quite like it used to. It seems things can only get so small before running up against those basic laws of physics. When the walls in the chip architecture get to a certain level of thinness, stuff begins to leak through. Inefficiencies and constraints emerge.

They called it Koomey’s Law. It proclaimed that “at a fixed computing load, the amount of battery you need will fall by a factor of two roughly every eighteen months.” Great news for battery-driven devices getting smaller and more powerful. Except that after 2000, this law too began to show stress. It was negatively affected, of course, by the slowing of Moore’s Law and Dennard scaling. By those awkward laws of physics.
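To see why the slowdown matters, it helps to compound the rates these laws promised. A sketch of what “a factor of two roughly every eighteen months” would deliver over 30 years if it held (which it did not):

```python
years = 30
doubling_period = 1.5          # eighteen months, per Koomey's formulation

doublings = years / doubling_period
improvement = 2 ** doublings   # efficiency multiple if the law held throughout

print(doublings)    # 20.0 doublings
print(improvement)  # 1048576.0 -- a million-fold gain
```

A million-fold gain in thirty years is what the industry planned around; when the doubling period stretches, the shortfall compounds just as fast.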

Toxic optimism is the fevered faith of Silicon Valley, as many still claimed that Moore’s Law was the law of the land. Acceleration forever. Impossible is nothing. Fake it till you make it. There is always a breakthrough waiting around the corner if you throw enough money, materials, energy, water, hype and dodgy practices at it. So, the tech faithful believed like the faithful at a Las Vegas roulette table believed. The frenzy of progress, with things doubling and halving every couple of years, the sheer rush of innovation that was churning up so much material and spitting out so much waste. No time for wisdom, for the slightest reflection or doubt. Caught up in the frenzy of intelligent design. Just doing it. Dragged along by the relentless speed of things.

It all ends as toxic waste. Almost all chips are discarded long before they ever even wear out. Silicon chips are designed from a single-use perspective focused on maximizing speed and functionality for a specific purpose at the lowest possible cost. Chips are expected to have a very short life and nobody thinks about value after that life has expired. Refurbishment, reuse or recycling are not considerations. The old chip is dead. Dump it. Burn it. Bury it. Long live the new chip. Driven by cost reduction concerns, generation after generation, the value of the materials in a chip declines. There is no money in recycling silicon. It is the ultimate throwaway toxic waste.