I'm interested to know how it would play out. The author has a point that we shouldn't be submitting a tx for every search query or visit to a website, and it would slow surfing the web to a crawl (2-10 seconds to verify the tx with current chains, as opposed to milliseconds with direct communication to the server). And even if it were a random check (once every 10-100 queries), bots would still be able to mass-produce wallets with dust in them and proceed as if nothing happened.
I'm not saying it's impossible, I'm just interested in knowing how you'd see it play out?
He is hosting it in his home, which is fine. I personally think web3 is the way. Not everything in web3 has to be about charging the end user. He should check out Presearch: users get PRE to search, and PRE is used to pay for ads. His home search engine would be better off distributed on web3 rather than running just in his basement. Plus, like AOU has said, there are other ways to stop the bots.
The author has a point, that we shouldn't be submitting a tx for every search query or visit to a website, and it would slow down surfing the web to a crawl (2-10 seconds to verify the tx with current chains, opposed to milliseconds with direct communication to the server).
IMO the idea would be to build a chain that is low-cost and extremely fast specifically for this problem, rather than using existing chains that can't cope with the speed requirements or demand.
I think there could be some back and forth (e.g., if you're providing value to a website, it costs less to verify you), and that would also help screw over bot traffic. But how to design and implement it is of course an interesting and probably extremely difficult problem.
We've already seen dust attacks on several chains that completely overload the nodes because the chain had low transaction fees. Even if we were to implement a trust system, where your first engagement with a site costs 1 token, the next 0.5 tokens, the next 0.25, etc., I don't think it would solve the bot problem. Bots can get wicked complicated and do things we haven't even dreamed of yet. I run bots on a few crypto faucets and other sites using BrowserAutomationStudio - anything a human can do, BAS can do as well, and there is very little you can do to detect them.
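A minimal sketch of the halving-fee trust system described above (function and parameter names are hypothetical, not from any real chain). It also shows why it doesn't stop bots: the fees form a geometric series, so the lifetime cost of any single identity is capped at twice the first fee, and mass-produced wallets just pay that cap in parallel.

```python
def verification_fee(prior_engagements, base_fee=1.0):
    """Fee (in tokens) for the next engagement with a site.

    The fee halves with every prior verified engagement:
    1 token, then 0.5, then 0.25, and so on.
    """
    return base_fee / (2 ** prior_engagements)

# Fee schedule for the first four engagements.
schedule = [verification_fee(n) for n in range(4)]
print(schedule)  # [1.0, 0.5, 0.25, 0.125]

# Total cost over many engagements converges to 2 * base_fee,
# so a bot farm's per-wallet cost is bounded no matter how much it spams.
total = sum(verification_fee(n) for n in range(50))
print(round(total, 6))  # approaches 2.0
```

The geometric cap is the core problem: a reputation discount that only ever decreases rewards each identity for sticking around, but does nothing against an attacker who can mint fresh identities cheaply.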
if you are providing value to a website, it makes it cost less to verify you
That could work on some sites, sure. But with stuff like GPT-3 (and soon 4/5), the sites that rely on text/image content will be overrun, even with web3 verification/reputation.
Not trying to be a downer - I'm just having a really hard time seeing how we can possibly beat the bots with web3 (I'd love it if we could and I was proven wrong).