@evan@cosocial.ca The article that @Matt_Noyes@social.coop posted in the thread laid out pretty clearly the massive amount of electricity that AI data centers are using, and it isn't the only one making that point. AI is using massive amounts of power.
Traditional server racks consume 5-15 kW, while AI-optimized racks with high-performance GPUs require 40-60+ kW. Some cutting-edge AI training facilities are pushing individual racks to 100+ kW, fundamentally changing data center design and cooling requirements. (ref)
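To put those figures in perspective, here's a rough back-of-the-envelope sketch (my numbers, not from the article): taking the midpoints above and assuming continuous draw, the annual energy per rack works out like this:

```python
# Back-of-the-envelope annual energy per rack, assuming continuous draw
# at the midpoint of the ranges quoted above. Real utilization varies.
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def annual_mwh(rack_kw: float) -> float:
    """Annual energy in MWh for a rack drawing rack_kw continuously."""
    return rack_kw * HOURS_PER_YEAR / 1000

traditional = annual_mwh(10)  # mid-range traditional rack: ~88 MWh/yr
ai_rack = annual_mwh(50)      # mid-range AI-optimized rack: ~438 MWh/yr
print(f"traditional: {traditional:.0f} MWh/yr, AI: {ai_rack:.0f} MWh/yr")
```

Roughly a 5x jump per rack before you even count the extra cooling load.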
It doesn't really matter if the power is coming from requests to the API, running the models, training the models, or making ritual sacrifices to Baphomet in hopes of making the models sentient.
If someone is using AI, they are indirectly contributing to that power usage. If you can acknowledge that and make peace with it, fine. But claiming the energy cost is minimal in light of this is ignoring reality.