Experts warn that average individuals can now experience the same sycophant-induced delusions as billionaires

  • 🍉 Albert 🍉@lemmy.world · 3 days ago

    reminder not to blame desperate, vulnerable people who are in dire need of help that is often unaffordable or unavailable.

    blame the corporations that provide this for profit, with little to no safeguards.

    • Angry_Autist (he/him)@lemmy.world · 3 days ago

      Depends on the user. I used DungeonAI back in the day to get over losing a family member, and it was fine.

      On the other hand, my neighbor is convinced that some Character AI loves him sincerely and that he birthed the first-ever conscious AI. He barely talks to his wife of 20 years anymore.

  • ExtremeDullard@lemmy.sdf.org · 3 days ago

    Human shrinks, just like AI chatbots, are experts at slick-talking BS and know how to manipulate people.

    The difference is, most human shrinks mean well and do try to help, while most AI chatbots are run by greedy monopolistic Big Data for-profits whose sole purpose is to “increase engagement”.

    • chirospasm@lemmy.ml · 3 days ago

      I would suggest that counselors / therapists do, in fact, have backgrounds – educational and experiential – behind the ‘slick-talking BS’ you describe, but it is only slick-talking BS if you aren’t willing to consider the benefit you get from relating to them in the way they were trained to relate.

      This is important because the ‘relating’ has more of an impact on you socially than the ‘slick talk.’ It’s the ‘human-to-human’ part that sticks with us longer than self-help books, prompts us to be open and ready for change, and even supports our eventual ability to understand ourselves a little better.

      There is no ‘relating’ to an LLM. That LLM is weighted, in fact, to provide positive responses that satisfy the request in your text-based prompt.

      If, in an LLM therapy session, I suddenly flip the script and write, ‘Now pretend you are a far less confrontational therapist who understands my feelings on X or Y that we’ve been talking about, and who doesn’t want to press me on it as much,’ then I am no longer even superficially attempting to ‘relate.’ The cosplay of therapy is ripped away.

      The ‘relationship’ part of therapy cannot happen authentically with an LLM if I can still control the outcome.