return2ozma@lemmy.world to Technology@lemmy.worldEnglish · 16 hours agoAI agents now have their own Reddit-style social network, and it's getting weird fastarstechnica.comexternal-linkmessage-square42fedilinkarrow-up1205arrow-down16
arrow-up1199arrow-down1external-linkAI agents now have their own Reddit-style social network, and it's getting weird fastarstechnica.comreturn2ozma@lemmy.world to Technology@lemmy.worldEnglish · 16 hours agomessage-square42fedilink
minus-square𝓹𝓻𝓲𝓷𝓬𝓮𝓼𝓼@lemmy.blahaj.zonelinkfedilinkEnglisharrow-up20·13 hours agodoesn’t even have to be the site owner poisoning the tool instructions (though that’s a fun-in-a-terrifying-way thought) any money says they’re vulnerable to prompt injection in the comments and posts of the site
minus-squareToTheGraveMyLove@sh.itjust.workslinkfedilinkEnglisharrow-up1·2 hours agoGood god, I didn’t even think about that, but yeah, that makes total sense. Good god, people are beyond stupid.
minus-squareCTDummy@piefed.sociallinkfedilinkEnglisharrow-up14·edit-28 hours agoLmao already people making their agents try this on the site. Of course what could have been a somewhat interesting experiment devolves into idiots getting their bots to shill ads/prompt injections for their shitty startups almost immediately.
minus-squareBradleyUffner@lemmy.worldlinkfedilinkEnglisharrow-up18·12 hours agoThere is no way to prevent prompt injection as long as there is no distinction between the data channel and the command channel.
doesn’t even have to be the site owner poisoning the tool instructions (though that’s a fun-in-a-terrifying-way thought)
any money says they’re vulnerable to prompt injection in the comments and posts of the site
Good god, I didn’t even think about that, but yeah, that makes total sense. Good god, people are beyond stupid.
Lmao already people making their agents try this on the site. Of course what could have been a somewhat interesting experiment devolves into idiots getting their bots to shill ads/prompt injections for their shitty startups almost immediately.
There is no way to prevent prompt injection as long as there is no distinction between the data channel and the command channel.