New Web Order

The World Wide Web is undergoing a restless campaign of enshittification: worse design, worse maintenance, worse technical implementations, and so on. Web pages, for example, were designed to be roughly 85% content worth reading, 10% HTML, and 5% CSS. Nowadays, we get the inverse: megabytes of scripts, trackers, ads, and cookie banners wrapped around a sliver of actual content.

Of course, these issues are only the content side. If we analyze the core technology that powers the modern web, we quickly realize that the "clearnet" is built on fundamentally centralized infrastructure; at its core, HTTP (Hypertext Transfer Protocol) is the backbone of the web, and it sits atop a hierarchical and bureaucratic system that enforces control at multiple levels, such as:

  1. DNS and the domain registries, overseen by ICANN, which decide who may hold a name on the web.
  2. Certificate authorities, which decide which HTTPS connections browsers will trust.
  3. Hosting providers and CDNs, which can take content offline at will.

Despite occasional attempts at concocting decentralized protocols, the vast majority of internet activity still occurs within this centralized framework, and most proposed alternatives fail due to both poor user adoption and fundamental design flaws. Rather than ignoring the fact that shitty technology exists and doing nothing about it in the name of ever-proliferating "competing standards," this article reimagines the most important aspects of the web as a roadmap towards something close to an ideal solution.

How shitty standards stick around (an inappropriate application of xkcd #927).

Solution ~

The web has two core technical problems:

  1. The Protocol (i.e., the clearnet)
  2. The Clients (i.e., web browsers)

1. The Protocol ~

As highlighted above, multiple decentralized protocols have already been built to reinvent the wheel. All of them currently have major defects, but the closest existing model of how things should work on the web is a blend of I2P and BitTorrent, closely resembling Hyphanet.

The protocol I have in mind doesn't deal in websites, only in data transmission. There is no fixed way to display content (as web browsers do with HTML); instead, different clients interpret the data they receive however they see fit. One client might show a chat, another a video, another a forum post, as will be seen below, keeping things flexible and decentralized.
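As a toy illustration of that flexibility (nothing here is a real wire format; the type tag and layout are invented), a payload could carry nothing but a self-describing kind and raw bytes, leaving presentation entirely to whichever client receives it:

```python
# Hypothetical payload format: a JSON header naming the kind of data,
# followed by raw bytes. The protocol moves these payloads around; it
# never dictates how they must be displayed.
import json

def make_payload(kind: str, body: bytes) -> bytes:
    header = json.dumps({"kind": kind, "length": len(body)}).encode()
    return header + b"\n" + body

def parse_payload(raw: bytes) -> tuple[str, bytes]:
    header, body = raw.split(b"\n", 1)
    return json.loads(header)["kind"], body

# Each client decides for itself what to do with a given kind of data:
# a chat client might render "message" payloads and ignore all others.
raw = make_payload("message", b"hello from a peer")
kind, body = parse_payload(raw)
if kind == "message":
    print(body.decode())  # one client's choice of presentation
```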

For transmission, the protocol acts as an anonymizing overlay network: data is encrypted and relayed from peer to peer through a dynamic, intelligently randomized set of volunteer-maintained nodes. Nodes earn different priorities based on their historical performance, so more efficient relays carry more traffic, balancing anonymity and speed across the network.
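A minimal sketch of what such performance-weighted path selection could look like, assuming a made-up scoring of relays by success rate and latency:

```python
# Relays are picked at random, but weighted by historical performance,
# so paths stay unpredictable (anonymity) while leaning towards fast,
# reliable nodes (speed). The scoring formula is purely illustrative.
import random
from dataclasses import dataclass

@dataclass
class Node:
    addr: str
    success_rate: float    # fraction of past relays that completed
    avg_latency_ms: float  # historical average latency

def weight(node: Node) -> float:
    # A small floor keeps every node selectable, so no relay is ever
    # entirely predictable out of the path.
    return 0.1 + node.success_rate / (1.0 + node.avg_latency_ms / 100.0)

def pick_path(nodes: list[Node], hops: int = 3) -> list[Node]:
    # Sample distinct nodes, weighted by performance, for one circuit.
    pool, path = list(nodes), []
    for _ in range(hops):
        chosen = random.choices(pool, weights=[weight(n) for n in pool])[0]
        path.append(chosen)
        pool.remove(chosen)
    return path

nodes = [Node("peer-a", 0.99, 40), Node("peer-b", 0.80, 200), Node("peer-c", 0.95, 90)]
print([n.addr for n in pick_path(nodes)])
```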

In addition, to get rid of monopolization (such as the clearnet's monopoly over site domain names), discoverability of content on the protocol would be decentralized: every piece of data is identified by a cryptographic hash of its contents. Discovery is left to the people willing to share hashes, through manual exchange, search engines, forums, trackers, etc.; it is not managed by the protocol itself.
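Content addressing itself is simple to demonstrate; this sketch uses SHA-256, though any strong hash would do:

```python
# The identifier is derived from the data itself, so anyone holding the
# hash can verify what they received. There is no naming registry to
# monopolize; hashes are simply shared out of band.
import hashlib

def content_id(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def verify(expected_id: str, data: bytes) -> bool:
    # A recipient checks the bytes against the hash they were given,
    # e.g. from a forum post, a tracker, or a friend.
    return content_id(data) == expected_id

article = b"some piece of content shared on the network"
cid = content_id(article)          # this is what people pass around
print(cid, verify(cid, article))   # True: the data matches its identifier
```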

Moreover, the proposed protocol relies on decentralized (i.e., peer-to-peer) data distribution: any device holding a given piece of data can help distribute it, provided both the sender(s) and the recipient are online, much like torrenting. To improve availability, peers should also be able to designate and manage external servers, whether centralized or decentralized, to host data 24/7.
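A rough sketch of this availability model, with invented names, combining an ephemeral swarm of peers with peer-designated always-on hosts:

```python
# Data is served by whichever peers happen to hold it, plus optional
# "pinned" external hosts that a peer has designated to serve it 24/7.
import random

def is_online(peer: str) -> bool:
    return random.random() > 0.5  # stand-in for a real liveness check

class Swarm:
    def __init__(self) -> None:
        self.holders: dict[str, set[str]] = {}  # content id -> peer addrs
        self.pinned: dict[str, set[str]] = {}   # content id -> 24/7 hosts

    def announce(self, cid: str, peer: str) -> None:
        self.holders.setdefault(cid, set()).add(peer)

    def pin(self, cid: str, host: str) -> None:
        # A peer delegates hosting to an external server (centralized or
        # decentralized) so the data survives when they go offline.
        self.pinned.setdefault(cid, set()).add(host)

    def sources(self, cid: str) -> list[str]:
        # Pinned hosts are always candidates; online peers are a bonus.
        online = {p for p in self.holders.get(cid, set()) if is_online(p)}
        return list(self.pinned.get(cid, set()) | online)

swarm = Swarm()
swarm.announce("abc123", "laptop-peer")
swarm.pin("abc123", "my-vps.example")
print(swarm.sources("abc123"))  # the VPS answers even if the laptop doesn't
```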

2. The Clients ~

Instead of a "web browser" through which everything on the web is accessed directly, different programs (or clients) should present different aspects of the data on the network, each with its own implementation, for example:

  1. A chat client that renders message data.
  2. A video player that streams media data.
  3. A forum reader that presents threads and posts.
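To make that division of labor concrete, here is a hypothetical registry mapping kinds of data to single-purpose clients, much as desktop systems map MIME types to applications (the client names are invented):

```python
# The network just delivers typed payloads; the local machine decides
# which lean, single-purpose client gets to present each kind.
HANDLERS: dict[str, str] = {}  # payload kind -> client program

def register(kind: str, program: str) -> None:
    HANDLERS[kind] = program

def dispatch(kind: str) -> str:
    return HANDLERS.get(kind, "no handler installed")

register("message", "tinychat")
register("video/chunk", "plainplayer")
register("post", "forumreader")
print(dispatch("video/chunk"))  # -> plainplayer
```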

Of course, I reserve the right to extend my hatred of bloatware here (see Things I Dislike > Bloatware).