Reddit’s conversational AI product, Reddit Answers, suggested that users interested in pain management try heroin and kratom, yet another extreme example of dangerous advice from a chatbot, even one trained on Reddit’s highly coveted trove of user-generated data.

https://en.wikipedia.org/wiki/Bromism

In 2025, a man was poisoned after ChatGPT suggested he replace the sodium chloride in his diet with sodium bromide; sodium bromide is a safe replacement only for non-nutritional purposes, such as cleaning.[3][4][5]

  • ToastedPlanet@lemmy.blahaj.zoneOP · 1 day ago

    This is why I’m glad I’m on lemmy and not reddit. And why we should not allow AI-generated messages on this website in lieu of comments and posts from other people.