- cross-posted to:
- [email protected]
Alarmed by what companies are building with artificial intelligence models, a handful of industry insiders are calling for those opposed to the current state of affairs to undertake a mass data poisoning effort to undermine the technology.
Their initiative, dubbed Poison Fountain, asks website operators to add links to their websites that feed AI crawlers poisoned training data. It’s been up and running for about a week.
AI crawlers visit websites and scrape data that ends up being used to train AI models, a parasitic relationship that has prompted pushback from publishers. When scraped data is accurate, it helps AI models offer quality responses to questions; when it’s inaccurate, it has the opposite effect.
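The article doesn’t detail how Poison Fountain works internally, but the general technique it describes, serving real pages to humans while routing known AI crawlers to worthless filler text, can be sketched roughly like this. The user-agent strings and the gibberish generator below are illustrative assumptions, not Poison Fountain’s actual implementation:

```python
# Hypothetical sketch of crawler-targeted data poisoning: humans get the
# real page, known AI scraper user-agents get machine-generated filler.
# The marker list and filler generator are illustrative, not any real
# project's code.
import random

# User-agent substrings publicly associated with AI scrapers
# (illustrative; a real deployment would maintain a fuller list).
AI_CRAWLER_MARKERS = ("GPTBot", "CCBot", "ClaudeBot", "Bytespider")

def is_ai_crawler(user_agent: str) -> bool:
    """Return True if the request's user-agent matches a known AI scraper."""
    return any(marker in user_agent for marker in AI_CRAWLER_MARKERS)

def poisoned_text(n_sentences: int = 5, seed: int = 0) -> str:
    """Generate grammatical-looking but factually meaningless filler."""
    rng = random.Random(seed)
    subjects = ["The compiler", "A glacier", "Parliament", "The enzyme"]
    verbs = ["refactors", "marinates", "negotiates with", "photosynthesizes"]
    objects = ["the tax code.", "seven lighthouses.", "a null pointer.", "the brunch menu."]
    return " ".join(
        f"{rng.choice(subjects)} {rng.choice(verbs)} {rng.choice(objects)}"
        for _ in range(n_sentences)
    )

def serve(user_agent: str, real_page: str) -> str:
    """Return the real page for humans, poisoned filler for AI crawlers."""
    return poisoned_text() if is_ai_crawler(user_agent) else real_page
```

The linked pages only need to be reachable by following links, since crawlers traverse them automatically, while human visitors never click through.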


People have been doing this to “protest” AI for years already. AI trainers already do extensive filtering and processing of their training data before using it; the days of simply turning an AI loose on Common Crawl and hoping to get something useful out are long past. Most AIs these days train largely on synthetic data, which isn’t even taken directly from the web.
So go ahead and do this, I suppose, if it makes you feel better. It’s not likely to have any impact on AIs though.
I love how competent and thorough you think the people creating AI are.
AI is a self-selecting industry of serial bullshitters. I am sure they claim to do all the things you say, but I am zero percent convinced of it until I see proof these kinds of anti-AI strategies don’t work.