One Million People Weekly Use ChatGPT To Discuss Suicide

A startling new report from OpenAI has quantified the profound role its AI plays in users’ personal lives, revealing that over one million people each week turn to ChatGPT with conversations containing explicit indicators of suicidal intent. This figure, representing 0.15 percent of its weekly active users, exposes a massive, unplanned social experiment unfolding in real time. For the first time in history, immense numbers of people are regularly confiding their most acute psychological struggles not to another human, but to a machine intelligence system. The growing reliance on AI for crisis support creates both a need for regulation and a monumental responsibility that the company is still learning to navigate.

  • MelianPretext@lemmygrad.ml · 12 points · 2 days ago

    “What kind of society is it, indeed, where one finds the profoundest solitude in the midst of millions; where one can be overwhelmed by an irrepressible desire to kill oneself without anybody being aware of it? This society is no society, it is, as Rousseau says, a desert inhabited by wild animals.” - Peuchet, quoted by Marx

  • big_spoon@lemmygrad.ml · 6 points · 2 days ago

    that’s pretty sad, if my plan was talking to someone else that at the end of the conversation would tell me “kill yourself” i’d go to an imageboard, at least there’s (mostly) real people

  • miz@lemmygrad.ml · 17 points · 2 days ago

    gee why don’t people believe that the system values their lives? anyway, back to scrolling through social media filled with starving children torn apart by zionist airstrikes

  • knfrmity@lemmygrad.ml · 20 points · 3 days ago

    Of course no mention of the primary root causes of such thoughts. Of course the proposed solution is more regulation and LLM “training.”