Projects like Anubis use a web-based proof of work to slow down and potentially stop bot traffic. The idea is that the proof of work makes the client spend some compute resources before accessing the page, so that it isn't as computationally cheap to abuse public websites.

However, doing all of this as a web service seems inefficient, since there is always a performance penalty tied to running it inside the page. My idea is that there could be a special HTTP protocol addition that would require the client to do a proof of work. Doing it at the browser/scraper level would be much more efficient, since the developer of the browser could tailor the code to the platform. It would also make it possible for bots to do it, which would still allow scraping, but in a way that is less demanding on the server.
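For illustration, here is a minimal sketch of the hashcash-style exchange such an HTTP extension could carry: the server issues a random challenge and a difficulty target, the client burns CPU finding a nonce whose hash meets the target, and the server verifies the answer with a single hash. The challenge format, difficulty value, and function names are my own assumptions, not an existing standard or Anubis's actual wire format.

```python
import hashlib
import itertools
import os

def make_challenge(difficulty: int = 20) -> dict:
    # Server side: random challenge plus difficulty in leading zero bits.
    # (Hypothetically this would travel in a response header.)
    return {"challenge": os.urandom(16).hex(), "difficulty": difficulty}

def leading_zero_bits(digest: bytes) -> int:
    # Count how many leading bits of the digest are zero.
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
            continue
        bits += 8 - byte.bit_length()
        break
    return bits

def solve(challenge: str, difficulty: int) -> int:
    # Client side: brute-force a nonce; cost grows ~2^difficulty hashes.
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
        if leading_zero_bits(digest) >= difficulty:
            return nonce

def verify(challenge: str, difficulty: int, nonce: int) -> bool:
    # Server side: verification is one hash, cheap compared with solving.
    digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
    return leading_zero_bits(digest) >= difficulty

if __name__ == "__main__":
    c = make_challenge(difficulty=18)
    n = solve(c["challenge"], c["difficulty"])
    assert verify(c["challenge"], c["difficulty"], n)
    print(f"nonce {n} satisfies the challenge")
```

The asymmetry is the whole point: the server does one hash per request while the client does thousands, so a native browser or bot implementation could solve it efficiently on its own platform without the overhead of running the solver as JavaScript in the page.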

  • CameronDev@programming.dev

It's never just "a little extra code". Each browser would have to implement it itself (although it could possibly be done in Chromium and everyone else inherits it by default), and each would run the feature through the standard debates around support, necessity, correctness, side-channel security issues, etc. Firefox might drag its feet, Chrome might implement it differently, Edge might strip it out because it hurts their scraper. Five years later it might get useful.