Saying it uses less power than a toaster isn’t saying much. Yes, it uses less power than a thing that literally turns electricity into pure heat… but that’s sort of a requirement for toast. That’s still a LOT of electricity. And it’s not required. People don’t need to burn down a rainforest to summarize a meeting. Just use your earballs.
Saying it uses less power than a toaster isn’t saying much
Yeah but we’re talking a fraction of 1%. A toaster uses 800-1500 watts for minutes; a local LLM uses <300 watts for seconds. I toast something almost every day. I’d need to prompt a local LLM literally hundreds of times per day for AI to have a higher environmental impact than my breakfast, counting the toasting alone. I make probably a dozen-ish prompts per week on average.
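For anyone who wants to sanity-check that claim, here’s the back-of-envelope math as a short Python sketch. The wattages come from the comment above; the toasting time and seconds-per-prompt are my own assumptions, so treat the output as an order-of-magnitude estimate, not a measurement.

```python
# Rough energy comparison: one toasting session vs. one local LLM prompt.
# Wattages are from the comment; durations are assumptions.

TOASTER_WATTS = 1100     # midpoint of the quoted 800-1500 W range
TOAST_SECONDS = 180      # assume ~3 minutes per toasting session
LLM_WATTS = 300          # upper bound quoted for a local LLM
PROMPT_SECONDS = 10      # assume ~10 s of generation per prompt

toast_wh = TOASTER_WATTS * TOAST_SECONDS / 3600   # watt-hours per toast
prompt_wh = LLM_WATTS * PROMPT_SECONDS / 3600     # watt-hours per prompt

print(f"One toast:  {toast_wh:.1f} Wh")                  # ~55 Wh
print(f"One prompt: {prompt_wh:.2f} Wh")                 # ~0.83 Wh
print(f"Prompts per toast: {toast_wh / prompt_wh:.0f}")  # ~66
```

Under these assumptions one toasting session equals roughly 66 prompts; push the inputs to the other ends of the quoted ranges (a 4-minute toast, 5-second generations) and it’s over 200. Either way, “hundreds of prompts per day to match breakfast” is the right ballpark.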
That’s still a LOT of electricity.
That’s exactly my point, thanks. All kinds of appliances use loads more power than AI. We run them without thinking twice, and there’s no anti-toaster movement on the internet claiming there is no ethical toast, no exceptions, and that you’re an asshole for making toast. If a toaster uses a ton of electricity and is acceptable, while a local LLM uses less than 1% of that, then there is no argument to be made against local LLMs on the basis of electricity use.
Your argument just doesn’t hold up and could be applied to literally anything that isn’t “required”. Toast isn’t required, you just want it. People could just stop playing video games to save more electricity, video games aren’t required. People could stop using social media to save more electricity, TikTok and YouTube’s servers aren’t required.
People don’t need to burn down a rainforest to summarize a meeting.
I won’t call your point a strawman, but you’re ignoring the actual parts of LLMs that have high resource costs in order to push a narrative that doesn’t reflect the full picture. These discussions need to include the initial cost of gathering the dataset and, most importantly, the cost of training the model.
Sure, post-training energy costs aren’t worth worrying about, but I don’t think people who are aware of how LLMs work were worried about that part.
It’s also ignoring the absurd fucking AI datacenters that are being built with more methane turbines than they were approved for, and without any of the legally required pollution capture technology on the stacks. At least one of these datacenters is already measurably causing illness in the surrounding area.
These aren’t abstract environmental damages from energy use that could potentially come from green power sources, and they aren’t “fraction of a toast” energy costs caused only by people running queries, either.
Nope, I’m not ignoring them, but the post is specifically about exceptions. The OOP claims there are no exceptions and there is no ethical generative AI, which is false. Your comment applies only to the massive LLMs hosted by massive corporations, which are admittedly the majority.
The CommonCorpus dataset is less than 8TB, so it fits on a single hard drive, not a data center, and contains 2 trillion tokens, which is in the same ballpark as what small local LLMs are typically trained on (OLMo 2 7B and 13B were trained on 5 trillion tokens).
These local LLMs don’t require a massive data center to train, and they don’t have a massive environmental impact. The energy cost of training is nontrivial, but it’s nothing like GPT-4, and it’s a one-time cost anyway.
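To put a rough number on “nothing like GPT-4,” here’s a sketch using the widely cited ~6 × parameters × tokens approximation for dense-transformer training compute. The parameter and token counts are the OLMo 2 figures above; the per-GPU throughput and power draw are my assumptions, so the result is an order-of-magnitude estimate only.

```python
# Back-of-envelope training energy for a small local model (7B params,
# 5T tokens, per the OLMo 2 figures above), using the common
# C ~= 6 * N * D approximation for transformer training FLOPs.
# The GPU throughput and power numbers are assumptions, not specs.

params = 7e9             # N: 7B parameters
tokens = 5e12            # D: 5T training tokens

train_flops = 6 * params * tokens          # ~2.1e23 FLOPs

gpu_flops_per_s = 4e14   # assume ~400 TFLOP/s sustained per accelerator
gpu_watts = 700          # assume ~700 W per accelerator under load

gpu_seconds = train_flops / gpu_flops_per_s
energy_kwh = gpu_seconds * gpu_watts / 3.6e6   # joules -> kWh

print(f"Training compute: {train_flops:.1e} FLOPs")
print(f"Energy: roughly {energy_kwh:,.0f} kWh")   # order of 1e5 kWh
```

That lands around 10^5 kWh: a real, measurable cost, but a one-time one, and orders of magnitude below the figures floated for frontier-scale training runs.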
So the OOP is wrong: there is ethical generative AI, trained only on data available in the public domain and without a high environmental impact.
Yeah man, guess how much energy it would take to draw the 4k graphics on your phone screen in 1995?
Strawman
That’s nothing. People aren’t required to eat so much meat, or even eat so much food.
I also don’t like this energy argument from the anti-AI side, when everything else in our lives already consumes so much.
Valid
You can easily use less power in other ways, too; it’s not one or the other. Let’s do both.