• I saw a video of a woman in China talking to an AI recreation of her deceased child through VR.

    I felt so creeped out by it. Like wtf, if I die, I want my mom to remember me, not talking to a fucking AI IMPOSTER.

    • Datz@szmer.info
      2 days ago

      There’s a whole company/LLM dedicated to doing exactly that, whose CEO gave a TED talk about it.

      https://m.youtube.com/watch?v=-w4JrIxFZRA

      After that, I actually had a pretty wild idea about someone using it to replace dead/missing people in chats. Imagine the horror of finding out your friend died months ago, or got kidnapped. Horribly impractical, but it sounds like a good novel.

      • Avicenna@programming.dev
        2 days ago

        “watch me talk about how I get rich off of exploiting people’s emotional fragilities and try to pass it off as providing closure and community service”

      • Windex007@lemmy.world
        2 days ago

        If someone wants an AI companion, fine. If it’s a crazy good one, fine.

        But it’s strictly predatory for it to be designed to make someone feel like it’s a real person who actually existed, ESPECIALLY someone dealing with that kind of grief.

        You had to boot the mom out of the painting. There was no ambiguity on that one.