What makes LLMs work isn't deep neural networks or attention mechanisms or vector databases or anything like that.

What makes LLMs work is our tendency to see faces on toast.

What if instead of proof-of-work we made proof-of-leisure?

Scrapers click on everything they see, without any sense or reason. They won't wait any amount of time when they land on a website - they'll start clicking every link in sight until the whole bandwidth is wasted on their stupid bullshit.

What if, when a website is opened for the first time from a given IP/user-agent combination, it serves a page that says "do nothing for three seconds and don't click any of these links", with 5 faux links that get you banned immediately. After 3 seconds the links disappear and the originally requested page opens.

Is this something? Maybe even 1s is enough.
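
Rough sketch of what I mean, in Python/Flask with in-memory state (the /trap route name, the half-second slack, and the ban logic are all just placeholder choices, not a real implementation):

```python
import time

from flask import Flask, abort, request

app = Flask(__name__)

WAIT_SECONDS = 3          # maybe even 1 is enough
seen = {}                 # (ip, user-agent) -> time of first visit
banned = set()            # (ip, user-agent) pairs that failed the test


def client_key():
    # Identify visitors by IP + User-Agent, as described above.
    return (request.remote_addr, request.headers.get("User-Agent", ""))


@app.route("/trap/<int:n>")
def trap(n):
    # Only a link-clicking scraper ends up here: ban it on the spot.
    banned.add(client_key())
    abort(403)


@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def gate(path):
    key = client_key()
    if key in banned:
        abort(403)
    first_seen = seen.get(key)
    if first_seen is None:
        # First visit: serve the "do nothing" page with 5 faux links,
        # then auto-load the requested page after the wait.
        seen[key] = time.time()
        links = "".join(f'<a href="/trap/{i}">link</a> ' for i in range(5))
        return (
            f'<meta http-equiv="refresh" content="{WAIT_SECONDS};url=/{path}">'
            f"<p>Do nothing for {WAIT_SECONDS} seconds and "
            f"don't click any of these links.</p>{links}"
        )
    if time.time() - first_seen < WAIT_SECONDS - 0.5:
        # Came back too early - scrapers don't wait. (Slack for timer jitter.)
        banned.add(key)
        abort(403)
    return f"the page you actually asked for: /{path}"
```

A real browser just sits on the interstitial until the refresh fires; a scraper either follows a faux link straight into the trap or re-requests the page too early, and gets banned either way.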