@danjones000 @Matt_Noyes read the rest of the thread. Adding in training gets us to about 0.0065kg CO2eq, although it's not clear if they count that in the original estimate.
@evan@cosocial.ca The article that @Matt_Noyes@social.coop posted in the thread pretty clearly laid out the massive amount of electricity that AI data centers are using. That article isn't the only one saying the same thing. AI is using massive amounts of power.
Traditional server racks consume 5-15 kW, while AI-optimized racks with high-performance GPUs require 40-60+ kW. Some cutting-edge AI training facilities are pushing individual racks to 100+ kW, fundamentally changing data center design and cooling requirements. (ref)
It doesn't really matter if the power is coming from requests to the API, running the models, training the models, or making ritual sacrifices to Baphomet in hopes of making the models sentient.
If someone is using AI, they are indirectly contributing to that power usage. If you can acknowledge that and make peace with it, fine. But saying the energy cost is minimal in light of this is ignoring reality.
@danjones000 @Matt_Noyes electricity is only about 1/3 of global emissions. All data centers, including AI data centers, account for only about 1% of electricity usage. That comes out to roughly 0.3% of total emissions.
Much more emissions are due to cars, meat, cement production and rice cultivation.
I recognize that AI uses a lot of electricity; it is still nothing compared to the other things you do with your time.
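For what it's worth, the 0.3% figure above is just the product of the two shares stated in the post (both are rough, contested estimates that vary by source and year):

```python
# Back-of-envelope check of the data-center share of global emissions,
# using the figures as stated in the post (not authoritative values).

electricity_share_of_emissions = 1 / 3   # electricity ~ 1/3 of global emissions
datacenter_share_of_electricity = 0.01   # all data centers ~ 1% of electricity use

datacenter_share_of_emissions = (
    electricity_share_of_emissions * datacenter_share_of_electricity
)
print(f"{datacenter_share_of_emissions:.2%}")  # prints 0.33%
```

So even if the 1% figure understates AI's growth, the result stays well under a percent of total emissions.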
@danjones000 @Matt_Noyes It does not hurt to try to reduce those emissions; reducing any emissions is good.
But shaming people for using AI as if they are solely responsible for climate change is intellectually dishonest.
There are plenty of other problems with AI; "burning up the planet" is not a convincing one.