Projects like Anubis use a web-based proof of work to slow down and potentially stop bot traffic. The idea is that the proof of work makes the client spend some compute resources to access the page, so it isn't as computationally feasible to abuse public websites.
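For concreteness, a hashcash-style puzzle of the kind these tools use boils down to something like the sketch below (the function name and difficulty value are illustrative, not Anubis's actual parameters):

```python
import hashlib
import itertools

def solve_pow(challenge: str, difficulty_bits: int) -> int:
    """Brute-force a nonce so that SHA-256(challenge + nonce)
    starts with at least `difficulty_bits` zero bits."""
    target = 1 << (256 - difficulty_bits)
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

# The server verifies a claimed nonce with a single hash, while the
# client has to do about 2**difficulty_bits hashes on average.
print(solve_pow("example-challenge", 20))
```

The asymmetry is the whole point: verification is one hash, solving is roughly a million at 20 bits, and each extra bit doubles the client's work.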
However, doing this all as a web service seems inefficient, since there is always a performance penalty tied to running it in a web page. My idea is that there could be a special HTTP protocol addition that would require the client to do a proof of work. Doing it at the browser/scraper level would be much more efficient, since the developer of the browser could tailor the code to the platform. It would also make it possible for bots to participate, which would still allow scraping, but in a way that is less demanding on the server.
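As a rough sketch of what that handshake could look like, here is a client flow assuming invented header names (PoW-Challenge, PoW-Difficulty, PoW-Response; nothing like this exists in the HTTP spec today) and reusing solve_pow from the snippet above:

```python
import urllib.error
import urllib.request

def fetch_with_pow(url: str) -> bytes:
    """Hypothetical flow: if the server answers 429 with a
    PoW-Challenge / PoW-Difficulty pair (invented headers),
    solve the puzzle natively and retry with the solution."""
    try:
        return urllib.request.urlopen(url).read()
    except urllib.error.HTTPError as err:
        if err.code != 429 or "PoW-Challenge" not in err.headers:
            raise
        challenge = err.headers["PoW-Challenge"]
        bits = int(err.headers["PoW-Difficulty"])
        nonce = solve_pow(challenge, bits)
        retry = urllib.request.Request(
            url, headers={"PoW-Response": f"{challenge}:{nonce}"}
        )
        return urllib.request.urlopen(retry).read()
```

Because the solver would live in the HTTP client itself rather than in page JavaScript, each browser or scraper could use an optimized native implementation.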
I want to say no, purely on the grounds that server code being executed in a local browser would make sandbox-escape or RCE bugs far more dangerous.
If a site wants work done, they should pay for it with their hosting.