• 1 Post
  • 386 Comments
Joined 1 year ago
Cake day: June 13th, 2024


  • Nope, I’m not ignoring them, but the post is specifically about exceptions. The OOP claims there are no exceptions and that there is no ethical generative AI, which is false. Your comment applies only to the massive LLMs hosted by large corporations, which are the majority.

    The CommonCorpus dataset is less than 8 TB, so it fits on a single hard drive rather than a data center, and contains 2 trillion tokens, which is comparable to the amounts small local LLMs are typically trained on (OLMo 2 7B and 13B were trained on 5 trillion tokens).
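    As a quick sanity check, the two figures are consistent with each other. The ~4 bytes-per-token ratio below is an assumption (a common rough average for BPE-style tokenizers on English text), not a number from the comment:

    ```python
    # Does 2 trillion tokens plausibly fit on a single ~8 TB drive?
    # Assumes ~4 bytes of raw text per token on average (rough figure for
    # BPE-style tokenizers on English text; the real ratio varies by corpus).
    tokens = 2e12
    bytes_per_token = 4
    size_tb = tokens * bytes_per_token / 1e12  # terabytes (decimal)
    print(f"~{size_tb:.0f} TB")  # ~8 TB
    ```

    So a 2-trillion-token text corpus at that ratio lands right around the stated sub-8 TB size.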

    These local LLMs don’t require a massive data center to train, and their electricity use and environmental impact are modest. The energy cost of training is real, but nothing like GPT-4’s, and it’s a one-time cost anyway.

    So, the OOP is wrong: there is ethical generative AI, trained only on data available in the public domain, and without a high environmental impact.

  • theunknownmuncher@lemmy.world to Fuck AI@lemmy.world · On Exceptions (+7/−8, edited, 8 hours ago)

    Saying it uses less power than a toaster is not much

    Yeah but we’re talking a fraction of 1%. A toaster uses 800-1500 watts for minutes, a local LLM uses <300 watts for seconds. I toast something almost every day. I’d need to prompt a local LLM literally hundreds of times per day for AI to have a higher impact on the environment than my breakfast, considering the toasting alone. I make probably around a dozen prompts per week on average.
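    A back-of-envelope version of that comparison (the specific wattages and durations are illustrative assumptions within the ranges the comment gives, not measurements):

    ```python
    # Compare one toasting session against one local-LLM prompt.
    def energy_wh(watts: float, seconds: float) -> float:
        """Energy in watt-hours = power (W) * time (s) / 3600."""
        return watts * seconds / 3600

    toaster = energy_wh(1200, 3 * 60)  # assume ~1200 W for ~3 minutes
    prompt = energy_wh(300, 5)         # assume ~300 W for ~5 seconds

    print(f"toaster:    {toaster:.2f} Wh")   # 60.00 Wh
    print(f"one prompt: {prompt:.3f} Wh")    # 0.417 Wh
    print(f"prompts per toasting: {toaster / prompt:.0f}")  # 144
    ```

    Under these assumptions one prompt is well under 1% of one toasting, so it takes on the order of 144 prompts to match a single slice of toast.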

    That’s still a LOT of electricity.

    That’s exactly my point, thanks. All kinds of appliances use loads more power than AI. We run them without thinking twice, and there’s no anti-toaster movement on the internet claiming there is no ethical toast and you’re an asshole for making toast without exception. If a toaster uses a ton of electricity and is acceptable, while a local LLM uses less than 1% of that, then there is no argument to be made against local LLMs on the basis of electricity use.

    Your argument just doesn’t hold up and could be applied to literally anything that isn’t “required”. Toast isn’t required, you just want it. People could just stop playing video games to save more electricity, video games aren’t required. People could stop using social media to save more electricity, TikTok and YouTube’s servers aren’t required.

    People don’t need to burn down a rainforest to summarize a meeting.

    Strawman


  • theunknownmuncher@lemmy.world to Fuck AI@lemmy.world · On Exceptions (+19/−9, edited, 9 hours ago)

    No, and that’s irrelevant. Their post is explicitly not about the majority, but about exceptions/edge cases.

    I am responding to what they posted (I even quoted them), showing that the position that “there is no ethical use for generative AI” and that there are no exceptions is provably false.

    I didn’t think it needed to be said because it’s not relevant to this discussion, but: the majority of AI sucks on all fronts. It’s bad for intellectual property, it’s bad for the environment, it’s bad for privacy, it’s bad for people’s brains, and it’s bad at what it’s used for.

    None of these problems are inherent to AI itself; they are problems with the massive, short-term-profit-seeking corporations flush with unimaginable amounts of investor cash (read: unimaginable expectations and promises that they can’t meet) that control the majority of AI. Once again capitalism is the real culprit, and fools like the OOP will do these strawman mental gymnastics and spread misinformation to defend capitalism at all costs.



  • theunknownmuncher@lemmy.world to Fuck AI@lemmy.world · On Exceptions (+32/−20, edited, 16 hours ago)

    the fact that it is theft

    There are LLMs trained using fully open datasets that do not contain proprietary material… (CommonCorpus dataset, OLMo)

    the fact that it is environmentally harmful

    There are LLMs trained with minimal power (typically the same ones as above, since these projects can’t afford as many resources), and local LLMs use significantly less power than a toaster or microwave…

    the fact that it cuts back on critical, active thought

    This is a use-case problem. LLMs aren’t suitable for critical-thinking or decision-making tasks, so if it’s cutting back on your “critical, active thought” you’re just using it wrong anyway…

    The OOP genuinely doesn’t know what they’re talking about and is just reacting to sensationalized rage bait on the internet lmao