“To the best of our knowledge, it is the largest data center — we think of it as a campus — in the world,” OpenAI’s chief global affairs officer Chris Lehane told The Associated Press last week. “It generates, roughly and depending how you count, about a gigawatt of energy.”
Why is this guy saying a datacenter generates energy? It does literally the exact opposite. I guess you don’t need to actually know anything to get a leadership role at OpenAI, as long as you can say lots of words.
Why is this guy saying a datacenter generates energy?
It’s less absurd than it sounds, but it requires understanding how the modern data center facilities being deployed by big tech actually work at a facility-wide, systemic level. They do generate this energy; they just proceed to use it themselves. Notice he says roughly a gigawatt of energy, which is nowhere near the facility’s gross need as per the article.
Most modern data centers built in the past few years, especially those that are “campuses” as described, have on-site power generation solutions. Sometimes this means classic oil/coal/gas generators on the property, sometimes it means more involved and nuanced setups. What Lehane is telling the AP here is that, of the energy consumed by the new data center as a whole, “roughly and depending how you count,” 1 gigawatt comes from such sources. The article states the center is set to deploy at 1.8 gigawatts of consumption, scaling up to 10 gigawatts over the lifespan of the facility. Presumably these figures are on the same time scales. Frankly, for an AP article this was written quite poorly, and the exact meaning of most of this information isn’t very clear. I don’t think that’s Lehane’s fault, necessarily. Just seems like bad reporting.
People have this image in their heads of these big data centers opening up and just, like, sucking up all the power from the local grid, and that this demand is what causes things such as blackouts. This is mildly incorrect. The negative effects of these data centers’ power demands have less to do with them “overloading” public grids and more to do with the market economy of energy. All the energy they can’t generate themselves on-site must be acquired somewhere else. They can walk up to the local power companies and buy energy just like any private citizen can. They often get discounted rates compared to the plebes, too. You end up with blackouts because the energy companies don’t give a shit who they sell their product to, they just care that it sells. When companies like Microsoft, Nvidia, or OpenAI roll up with significantly more capital and resources than anyone else in the local economy, they’re easily able to out-compete even the entirety of the local domestic power demand. That’s what causes blackouts.
No one wants to talk about this because it’s easier to just say braindead shit like “fuck datacenters/AI/big-tech/fuckingwhateveritis” so you can feel like you’re “on the right side” than it is to acknowledge the long line of people in both the public and private sectors who had to rubber-stamp personally fucking over the average person for us to even get to this point. Does big tech suck absolutely fat, stinking donkey balls? For fucking sure. Are they anything more than a symptom of a much more entrenched societal rot? Nope.
It’s exactly what they want. He is avoiding use of clear, concise sentences to mislead the layman. The data center is generating a gigawatt of energy, for their use. The Wyoming power grid is their battery.
Jfc, you almost gave me a heart attack and I had to go re-read; it’s Wyoming. No data center here yet, though Microsoft was proposing one in Racine. It’s only a matter of time with all the water here.
But this proposed data center is so big, it would have its own dedicated energy from gas generation and renewable sources, according to Collins and company officials.
The “depending on how you count” probably refers to the renewables.
Yeah, that language is pure corporate BS - data centers CONSUME energy at massive scales (up to 1 gigawatt in this case, which is insane); they’re literally just giant heaters that occasionally produce AI outputs as a byproduct of all that wasted electricity.
1 gigawatt is not that insane, and I doubt it’s what the datacenter consumes. A rack can easily get into double or even triple digits of kW for GPU-heavy setups. So let’s say 10 racks per megawatt. I’m sure such a datacenter has more than 10,000 racks. Plus A/C and all other “ancillary” uses. A normal datacenter can get close to 1 GW; this thing might be in the double digits, but I doubt they will publish exact numbers.
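To sanity-check that rack math, here’s a quick back-of-envelope sketch. All the figures are the assumptions above (100 kW per rack, 10,000 racks) plus an assumed cooling overhead; nothing here is a published number for this facility:

```python
# Back-of-envelope datacenter power estimate (assumed figures only).
kw_per_rack = 100                 # a GPU-heavy rack, i.e. 10 racks per megawatt
racks = 10_000                    # guessed rack count for a campus this size
it_load_mw = kw_per_rack * racks / 1000   # IT load in megawatts

pue = 1.3                         # assumed power usage effectiveness (A/C, ancillary)
total_mw = it_load_mw * pue       # facility draw including overhead

print(f"IT load: {it_load_mw:.0f} MW; with overhead: {total_mw:.0f} MW")
```

At those assumptions the IT load alone reaches 1 GW before cooling is even counted, which is the point being made.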
Every square kilometer of land (0.38 Sq miles in freedom units) gets about a GW of heat from the sun (depending on latitude). I doubt one datacenter will contribute that much…
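The square-kilometre comparison checks out at the commonly cited peak surface irradiance of roughly 1 kW/m² (clear sky, sun near overhead; actual values vary with latitude and weather):

```python
# Sunlight falling on one square kilometre at peak irradiance (rough figure).
irradiance_w_m2 = 1000        # ~1 kW/m² at the surface, clear-sky peak
area_m2 = 1_000 * 1_000       # one square kilometre in square metres

solar_gw = irradiance_w_m2 * area_m2 / 1e9   # convert watts to gigawatts
print(solar_gw)  # 1.0 GW
```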
I don’t know, I’ve been in some hot places but massive cooling towers tend to radiate a bit more (now I know what I’m reading about today) and a data center without the ability to pump heat outside isn’t going to make it a whole day before it’s toast.
Not necessarily disagreeing, just curious about how much heat is dispersed by the ones here.
Some of these facilities do generate a significant portion of their own electricity via various means. It’s not like that amount of energy is just sitting out there on the grid waiting to be used. Somebody has to generate it and if you’re already investing millions in rectifiers, batteries, and other data center power systems, why wouldn’t you consider taking it a step further?
Tell me it’s not gonna be generating power with “portable” generators that narrowly avoid stricter regulation thanks to the guy who bought Twitter pushing them around the data center parking lot every few months.
My data center has 35MW of generators onsite. No modern DC is designed nor built without backup generators to allow continuous operation during any utility power outages.
I guess they could say they are generating 1GW of computing power
More like a GW of heat… Thankfully I’m sure that will counteract whatever has caused it to be over 80 degrees on my way to work before 0700.
I read something about natural gas powered power plants; not sure if it’s this one specifically.
Because unfortunately this is not the only gigantic climate destroyer AI thingy planned/built.
AI data centers noticeably fuck with the grid. As a result, they are facing internal and external pressure to generate more of their own power. Microsoft is opening a nuclear power plant. I would not be shocked to learn that, through solar, wind, and coal, they provide the majority of their own power.
the only thing that makes sense is heat
Is that true, though?