Websites have no way to differentiate between bots and real human beings. A website has to serve content to bots and implement increasingly complex, resource-hungry or expensive defences that are ultimately ineffective. Bandwidth is being used up, legitimate clients are getting blocked, and the user experience for real people keeps deteriorating.
[snippet from article]
This is my blog post on how we can fix the internet by breaking it again, without needing to wait for so-called "solutions" from the same people who broke the internet in the first place...
namely: advertisers, big tech, AI companies, social media, etc.
Good point, mrkeen.
I'm trying to solve many interconnected problems at once and generating new ones... standard.
Let's say you have a public digital identity... granted, that would indeed make it possible to tie activity back to that identity.
But say you add a layer of trust on top, whereby the server only ever sees a random public key. The reputation system confirms that key is linked to some real identity, but doesn't tell the server which one.
Something like that... the point is that it's possible to enhance privacy for browsing clients by building on top of this idea.
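To make that concrete, here's a minimal sketch of the flow I have in mind, using Ed25519 via Python's `cryptography` package. The names and the `attest` step are hypothetical, and the real identity check and rate limiting at the reputation service are waved away:

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.hazmat.primitives import serialization
from cryptography.exceptions import InvalidSignature

def raw(pub):
    """Serialize a public key to its raw 32-byte form."""
    return pub.public_bytes(serialization.Encoding.Raw,
                            serialization.PublicFormat.Raw)

# --- Reputation service: knows real identities, signs blind attestations ---
service_key = Ed25519PrivateKey.generate()
SERVICE_PUB = service_key.public_key()   # published, known to every website

def attest(ephemeral_pub_bytes):
    # In a real system the service would first verify the *identity* behind
    # this request (login, proof of personhood, etc.) and rate-limit it.
    # The attestation only says "some verified identity owns this key".
    return service_key.sign(b"attest:" + ephemeral_pub_bytes)

# --- Client: generates a fresh key per site/session, gets it attested ---
client_key = Ed25519PrivateKey.generate()
client_pub = raw(client_key.public_key())
token = attest(client_pub)               # a network call in reality

# --- Website: verifies the attestation, never learns who the client is ---
def is_vouched_for(pub_bytes, token):
    try:
        SERVICE_PUB.verify(token, b"attest:" + pub_bytes)
        return True
    except InvalidSignature:
        return False

print(is_vouched_for(client_pub, token))  # True
```

The website only ever handles the random ephemeral key and the service's signature over it; the mapping from key to identity stays inside the reputation service.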
Also, certificate authorities are largely opaque corporations, and there is no community verification of trust among the actual people sharing content with each other.
It is not unheard of for a CA to get hacked or issue fake certificates to aid hackers.
A malicious reputation server could also potentially OK bad requests if that aspect of the system isn't resilient, so reputation servers would themselves need a reputation system of some kind... and down the rabbit hole we go again.
If you replace "reputation service" with "certificate authority", isn't this basically SSL with both certificates validated? That would avoid the need for a user website. Or am I missing something?
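By "both certificates validated" I mean mutual TLS, roughly this (Python's standard ssl module, with hypothetical file paths):

```python
import socket, ssl

# Server side: present our own certificate AND demand a client certificate
# signed by a CA we trust, so both ends of the connection are validated.
ctx = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
ctx.load_cert_chain(certfile="server.pem", keyfile="server.key")
ctx.load_verify_locations(cafile="trusted-client-ca.pem")
ctx.verify_mode = ssl.CERT_REQUIRED   # reject clients with no valid cert

with socket.create_server(("0.0.0.0", 8443)) as srv:
    with ctx.wrap_socket(srv, server_side=True) as tls_srv:
        conn, addr = tls_srv.accept()         # handshake verifies the client
        print(conn.getpeercert()["subject"])  # the validated client identity
```

The handshake itself proves the client holds a CA-issued certificate, with no application-level reputation call needed.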
> When an actual human being wants to interact with a website they include their own public key.
Move over, cookies, IP addresses and tracking pixels: now we provide advertisers with cryptographic proof that we are interested in something.
No, beadyw, because SSL does not provide protection from AI bot scrapers or DDoS attacks... or manage federated digital identity.
I've updated the blog again with some tweaks.
Another issue I thought someone might point out with this idea is...
If you need to interact with a reputation service of some kind for every new client connecting, that requires network calls and traffic in proportion to the number of new inbound requests.
So you could DDoS a website by sending requests with junk public keys, causing the server to burn resources checking whether those requests are genuine.
As a defence against scrapers and bots I think the idea has some merit. It may just be trading one form of DDoS for another, though, if there is a targeted attack against the authentication step.
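One partial mitigation, purely my own sketch rather than anything the blog specifies: require the client to sign each request with the key it presents. Junk keys then fail one cheap local signature check before any network call is made, and keys the reputation service has already vouched for can be cached:

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey, Ed25519PublicKey)
from cryptography.hazmat.primitives import serialization
from cryptography.exceptions import InvalidSignature

def ask_reputation_service(pub_bytes):
    """Stand-in for the real network call to the reputation service."""
    return True          # assume the service vouches for this key

def serve(request_body):
    return "served"      # stand-in for the real request handler

vouched = set()          # keys the reputation service already confirmed

def handle(pub_bytes, signature, request_body):
    # 1. Cheap local check: does the sender actually hold this key?
    #    Junk keys and random bytes fail here with zero network traffic.
    try:
        Ed25519PublicKey.from_public_bytes(pub_bytes).verify(
            signature, request_body)
    except (InvalidSignature, ValueError):
        return "rejected locally"
    # 2. Only now pay for the reputation lookup, and only once per key.
    if pub_bytes not in vouched:
        if not ask_reputation_service(pub_bytes):
            return "rejected by reputation service"
        vouched.add(pub_bytes)
    return serve(request_body)

key = Ed25519PrivateKey.generate()
pub = key.public_key().public_bytes(serialization.Encoding.Raw,
                                    serialization.PublicFormat.Raw)
print(handle(pub, key.sign(b"GET /"), b"GET /"))    # served
print(handle(b"\x00" * 32, b"\x00" * 64, b"GET /"))  # rejected locally
```

This only raises the attacker's cost rather than eliminating the attack: nothing stops them minting real keypairs and signing with them, so the reputation lookup itself would still need rate limiting.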