Reddit’s conversational AI product, Reddit Answers, suggested that users interested in pain management try heroin and kratom, yet another extreme example of dangerous advice from a chatbot, even one trained on Reddit’s highly coveted trove of user-generated data.

https://en.wikipedia.org/wiki/Bromism

A man was poisoned in 2025 after ChatGPT suggested replacing the sodium chloride in his diet with sodium bromide; sodium bromide is a safe substitute only for non-dietary purposes, e.g., cleaning.[3][4][5]

  • brbposting@sh.itjust.works · 12 points · 3 days ago

    Alternate link to somewhat prevent Google from interlinking us with you quite so tightly

    Original reddit link:

    https://old.reddit.com/r/BORUpdates/comments/16223aj/updatesaga_the_emotional_saga_of_spontaneoush_the/
    

    Automated summary: