It’s high time to fight against the dominance of the internet giants. In Europe, several regulations have been announced that aim to force these giants to respect healthier rules of the game and to be more protective of users’ rights and of competition. Some even brandish the threat of dismantling some of the tech giants, a weapon of mass destruction rarely used in history.
Is an alternative path of a truly decentralized internet possible?
A number of companies hold a virtual monopoly within the internet in critical areas of services (search engines, email, etc.), infrastructures (global transit, content distribution networks, cloud computing services, etc.) and even, to some extent, internet standardization (IETF, ICANN/IANA, W3C, etc.). The situation is unprecedented, and their position has become virtually unassailable.
The now-famous “network effect” explains the genesis of the current domination: The bigger a web player is, the bigger it gets. The more users it has, the more interesting it becomes for the subsequent users to join that player and not another one. The services offered are all the more attractive, as they appear to be “free,” but they come at the price of the commoditization (and sometimes the violation) of users’ privacy.
The internet giants have also massively invested in their own “pipelines” (notably, submarine cables) in a bid to bring their content as close as possible to the user. Five years ago, these “priority access paths” represented 25% of the world’s web traffic. Today, they account for 64%.
This is reflected in the quality of service offered by the internet giants: latency greatly reduced compared with that of their (potential) competitors. Imagine a platform that wanted to compete with YouTube or Netflix but whose loading times were 10 times longer.
In the end, we have all become dependent on a small group of all-powerful service providers.
Decentralizing the internet has become a holy grail, and several projects have emerged to meet the challenge (e.g., Filecoin, ThreeFold, Solid and Dfinity).
These projects generally have the same goals:
• To “distribute” the cloud and offer an alternative to hyper-concentrated data centers and centralized cloud providers.
• To guarantee better protection of user privacy and “data sovereignty.”
• To allow applications to be deployed with a level of quality and scalability similar to what the internet offers.
The technical challenge is immense, as is the challenge of achieving mass adoption among users accustomed to the services offered by GAFA (Google, Apple, Facebook and Amazon).
However, the means to achieve these objectives differ from project to project.
Solid is a specification that lets people store their data securely in decentralized data stores called pods. Pods are secure, personal web servers for data. When data is stored in someone’s pod, they control which people and applications can access it. The user can get a pod from selected pod providers (some being hosted by Amazon), or the user may choose to self-host a pod to be more autonomous.
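The pod model above can be illustrated with a minimal sketch. This is not the actual Solid specification or API (which is based on linked data and WebID); it is only a toy model, with invented class and agent names, of the core idea that the pod owner, not the application, decides who can read each resource.

```python
# Toy model of Solid-style pod access control -- agent names and the
# Pod class are illustrative assumptions, not the real Solid API.

class Pod:
    """A personal data store whose owner controls all access grants."""

    def __init__(self, owner):
        self.owner = owner
        self._data = {}  # resource path -> content
        self._acl = {}   # resource path -> set of agents allowed to read

    def write(self, agent, path, content):
        if agent != self.owner:
            raise PermissionError(f"{agent} cannot write to {self.owner}'s pod")
        self._data[path] = content
        self._acl.setdefault(path, {self.owner})

    def grant(self, granter, path, agent):
        if granter != self.owner:
            raise PermissionError("only the pod owner can grant access")
        self._acl[path].add(agent)

    def read(self, agent, path):
        if agent not in self._acl.get(path, set()):
            raise PermissionError(f"{agent} has no access to {path}")
        return self._data[path]


pod = Pod("alice")
pod.write("alice", "/profile/email", "alice@example.org")
pod.grant("alice", "/profile/email", "mail-app")  # Alice decides who reads
print(pod.read("mail-app", "/profile/email"))     # prints alice@example.org
```

The point of the design is the inversion of control: the application asks the pod for data and must have been granted access by the user, rather than the user handing the data over to the application's own servers.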
Dfinity proposes the Internet Computer Protocol, or ICP, which the project describes as “extending the internet with serverless cloud functionality, enabling secure software and a new breed of open internet services.” This ICP is provided by a global network of independent data centers.
ThreeFold deploys a peer-to-peer (P2P) grid formed by a global network of independent farmers. What differentiates ThreeFold from the other serverless clouds is that it started from scratch and built a new infrastructure from the ground up. The main benefits of the ThreeFold Grid are:
• Privacy: A P2P environment means no middlemen or intermediaries — data travels directly between people and is stored on the nodes of their choice, rather than being sent through and stored by a third party.
• Security: Data stored in data centers is susceptible to security breaches. In bypassing data centers and exchanging data directly between peers, greater security can be achieved, as it reduces code and back doors significantly.
• Scalability: In a many-to-many system, scale is essentially unlimited. Hardware (nodes) can be added at ease in any home or office by anyone, which is not the case with the current data center model.
• Cost-efficiency and sustainability: End-to-end (direct) connection between peers means that the system will define the most efficient path for data. This leads to much more energy and cost efficiency, compared to the centralized data center model.
In both the Dfinity and ThreeFold projects, users must buy utility tokens that act as “gas” to reserve sovereign capacity and store data.
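The token-as-gas model can be sketched as follows. The class name, price and units are invented for illustration; real networks price capacity through their own token economics.

```python
# Hypothetical sketch of the "utility token as gas" model: tokens are
# spent to reserve storage capacity on a grid. The price per GB-month
# is an assumed figure, not a quote from any real network.

class CapacityMarket:
    PRICE_PER_GB_MONTH = 2.0  # tokens per GB per month (assumed)

    def __init__(self):
        self.balances = {}      # user -> token balance
        self.reservations = []  # (user, gb, months) tuples

    def fund(self, user, tokens):
        self.balances[user] = self.balances.get(user, 0.0) + tokens

    def reserve(self, user, gb, months):
        cost = gb * months * self.PRICE_PER_GB_MONTH
        if self.balances.get(user, 0.0) < cost:
            raise ValueError("insufficient tokens to reserve capacity")
        self.balances[user] -= cost  # tokens are consumed as "gas"
        self.reservations.append((user, gb, months))
        return cost


market = CapacityMarket()
market.fund("alice", 100.0)
cost = market.reserve("alice", 10, 3)  # 10 GB for 3 months -> 60.0 tokens
print(cost, market.balances["alice"])  # prints 60.0 40.0
```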
The Internet of Universal Resources
The next level may be an actual merger of the existing internet protocol (TCP/IP) with blockchain technology. The result would be an internet capable of carrying not only packets of data but also services in a decentralized manner. This “merger” would foster a more open, resilient and plural internet that is capable of natively offering essential services such as information search, decentralized domain name management, digital identity, electronic messaging, data storage, computing power (artificial intelligence), confidentiality, traceability and electronic signature.
These services have become universal resources of the internet and, as such, should be natively provided by the network and managed as commons.
In technical terms, the challenge is to combine the data packet transport (TCP/IP) functionality with a certain “intelligence” that allows packets to encapsulate a service marker. This service marker will be read and interpreted by all components of the network infrastructure (routers, switches, servers).
In doing so, services — universal or critical — are brought back to the protocol level of the internet. Indeed, the packet (routed according to the rules of the protocol) “activates” access to these services from a dedicated node, or server.
This node is part of a decentralized network of nodes. The operators of these nodes can be either existing internet service providers, specialized companies (software publishers, data centers, etc.), or public authorities. Ownership of these nodes could also be hybrid, shared between these different actors.
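The service-marker mechanism described above can be sketched in a few lines. The marker values, field layout and node names here are invented for illustration; no real standard defines them.

```python
# Illustrative sketch of the "service marker" idea: a packet carries,
# alongside its payload, a marker that any node on the path can read
# to route the packet to a dedicated service node. Marker values and
# node names are hypothetical, not from any real protocol.

import struct

# Hypothetical service identifiers
SVC_PLAIN, SVC_SEARCH, SVC_DNS, SVC_IDENTITY = 0, 1, 2, 3


def encode_packet(service: int, payload: bytes) -> bytes:
    # 1-byte service marker + 2-byte payload length, network byte order
    return struct.pack("!BH", service, len(payload)) + payload


def decode_packet(raw: bytes):
    service, length = struct.unpack("!BH", raw[:3])
    return service, raw[3:3 + length]


def route(raw: bytes) -> str:
    """A node inspects only the marker to pick a destination."""
    service, _ = decode_packet(raw)
    return {SVC_SEARCH: "search-node",
            SVC_DNS: "name-node",
            SVC_IDENTITY: "identity-node"}.get(service, "default-route")


pkt = encode_packet(SVC_SEARCH, b"decentralized internet")
print(route(pkt))  # prints search-node
```

The key property is that the infrastructure needs to read only the small marker, not the payload, to “activate” the corresponding service.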
Belgian public utility foundation IOUR Foundation promotes this type of approach and presents a suite of protocols that brings the native services down to the lower layer of the internet. A proposal like this has fundamental implications for the shape of the internet, notably: decentralized governance, interoperability of services, native traceability and confidentiality.
A decentralized, native search engine
No internet service is more concentrated than search: 63% of all searches and 94% of all mobile and tablet search traffic come from Google.
This essential function could be offered by the internet network itself (via its augmented protocol), which would result in a more objective, more complete and more privacy-friendly search engine, as all search data would be stored by the network in a decentralized way rather than centralized on private servers. In addition, users would be able to decide whether or not to anonymize their searches.
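One way search data could be spread across a network rather than held by a single provider is keyword sharding: each keyword is assigned to a node by hashing, so no single node holds the whole index. This is a minimal sketch of that idea only; the node names and hashing scheme are illustrative assumptions, not how any of the projects above actually works.

```python
# Sketch of a decentralized keyword index: each keyword lives on the
# node its hash points to, so the index (and the query traffic) is
# spread across the network. Node names are invented for illustration.

import hashlib

NODES = ["node-a", "node-b", "node-c", "node-d"]


def node_for(keyword: str) -> str:
    digest = hashlib.sha256(keyword.encode()).digest()
    return NODES[digest[0] % len(NODES)]


class DistributedIndex:
    def __init__(self):
        # each node stores only the keywords that hash to it
        self.shards = {node: {} for node in NODES}

    def add(self, keyword, url):
        self.shards[node_for(keyword)].setdefault(keyword, []).append(url)

    def search(self, keyword):
        # a query touches only the node responsible for this keyword
        return self.shards[node_for(keyword)].get(keyword, [])


index = DistributedIndex()
index.add("decentralization", "https://example.org/p2p")
print(index.search("decentralization"))  # prints ['https://example.org/p2p']
```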
It is essential to promote active collaboration and complementarity among all the aforementioned projects (and others) that pursue the same objectives.
Synergies are not only possible, they are obvious. The ThreeFold Grid, for example, can add tangible value to Dfinity or Solid and other similar projects if they want to benefit from a truly decentralized and sovereign infrastructure, instead of relying on current data center models. The future IOUR infrastructure could — and should — also rely on such a grid to deploy the nodes that are necessary to make the internet capable of providing “native” services.
Cooperation is of the essence in the new world we want to build.