The internet is arguably the most significant technological revolution in human history. Yet despite the enormous growth the industry has seen since its inception, it is still in its infancy and needs major improvements.
Tim Berners-Lee’s web was meant to be “a collaborative medium, a place where we [could] all gather and read and write.” AOL, CompuServe, and early Yahoo quickly came to dominate an interconnected computer system that allowed users to do just that. These online services were the portals to Web 1.0, where individuals, businesses, and governments could consume and sometimes post content. In 1994 Netscape launched its web browser, triggering the dot-com boom and the beginning of the browser wars.
Web 2.0 is a departure from Web 1.0, in which, as Graham Cormode and Balachander Krishnamurthy observed, “content creators were few… with most users acting as consumers of content.” According to John Battelle and Tim O’Reilly, Web 2.0 introduced the “Web as Platform,” where software applications are built upon the Web rather than upon the desktop.
This allowed millions of people to create content on social networks, blogs, and sharing sites. Social media platforms and search engines driven by user-generated material have disrupted the media, advertising, and retail industries. Publishing and retail companies that didn’t adapt have either closed their doors or are struggling to survive.
Web 2.0’s business model is built on user participation: users generate content and profile data, which platforms sell to third parties who then market back to those same users. The internet has evolved into a huge app store dominated by centralized applications such as Amazon, Google, and Facebook. Everyone is trying to build an audience, collect data, and monetize it through targeted advertising. In short, the model rests on centralizing data and exploiting it, often without users’ meaningful consent.
What is Web 3.0?
It was believed that the Semantic Web would be the next generation of the internet. Berners-Lee coined the term “Semantic Web” to describe a web where machines would process content in a humanlike manner (i.e., all data would be connected and understood both conceptually and contextually).
There were many reasons why the Semantic Web didn’t materialize. The main one was the difficulty of implementing true machine understanding on top of technologies such as RDF (the Resource Description Framework), a data model for describing resources on the web. How does a machine distinguish between a Jaguar (the vehicle) and a jaguar (the animal)? Understanding the context is the only way to tell the difference.
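The disambiguation problem can be made concrete with a toy sketch of RDF-style triples. This is plain Python, not a real RDF library, and the entity names and types are illustrative assumptions: once an ambiguous label is tied to typed entities, the type triple supplies the missing context.

```python
# Toy knowledge base of (subject, predicate, object) triples, in the
# spirit of RDF. Entity and type names here are made up for illustration.
TRIPLES = [
    ("JaguarCars",   "rdf:type",        "Automaker"),
    ("JaguarCars",   "headquarteredIn", "Coventry"),
    ("JaguarAnimal", "rdf:type",        "BigCat"),
    ("JaguarAnimal", "livesIn",         "SouthAmerica"),
]

def entities_of_type(kb, wanted_type):
    """Resolve the ambiguous label 'jaguar' by filtering on rdf:type triples."""
    return [s for (s, p, o) in kb if p == "rdf:type" and o == wanted_type]

print(entities_of_type(TRIPLES, "Automaker"))  # ['JaguarCars']
print(entities_of_type(TRIPLES, "BigCat"))     # ['JaguarAnimal']
```

The hard part the Semantic Web never solved at scale is not querying such triples but producing them: someone, human or machine, must first decide which “jaguar” each page means.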
It is a monumental task to connect concepts and build taxonomies for every word. It is so difficult that even IBM, after spending billions on Watson, never brought it to fruition.
Web 3.0, although not the Semantic Web Berners-Lee envisioned, is in many ways a return to his original vision: “There is no central controlling node, and so no single point of failure… and no ‘kill switch.’”
Technologies such as distributed ledgers and blockchain-based storage will enable data decentralization. This will make Web 3.0 more transparent, more secure, and less centralized. Individuals will own their data and will be able to replace centralized tech giants with decentralized infrastructure and platform applications.
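To give a rough sense of why blockchain-style ledgers resist silent tampering, here is a minimal hash-chained ledger in Python. This is a simplified sketch, not any production protocol: real distributed ledgers add consensus and replication across many independent nodes, which is what removes the single point of failure.

```python
import hashlib

def block_hash(prev_hash, data):
    """Hash a block's data together with the previous block's hash."""
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

def build_chain(entries):
    """Build a chain where each block commits to every block before it."""
    chain, prev = [], "0" * 64  # genesis: all-zero previous hash
    for data in entries:
        h = block_hash(prev, data)
        chain.append({"data": data, "prev": prev, "hash": h})
        prev = h
    return chain

def verify(chain):
    """Any edit to an earlier block invalidates every hash after it."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev or block["hash"] != block_hash(prev, block["data"]):
            return False
        prev = block["hash"]
    return True

chain = build_chain(["alice pays bob 5", "bob pays carol 2"])
print(verify(chain))                     # True
chain[0]["data"] = "alice pays bob 500"  # tamper with history
print(verify(chain))                     # False
```

Because each block’s hash depends on its predecessor’s, rewriting history means recomputing every later block, and, in a real network, convincing a majority of nodes to accept the rewrite.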
Berners-Lee didn’t foresee the rise of internet giants and their ability to profit from our data. Web 3.0 will end the constant interruptions that have become the norm under Web 2.0, as decentralization makes communication transparent, opt-in, and peer-to-peer, allowing individuals to own their time.
Web 3.0 will make the internet fairer by allowing individuals to be sovereign. True sovereignty means being in control of your time and your information. Web 3.0 will let individuals connect to an internet where they own their data and receive proper compensation for their content, replacing an unjust and exploitative web owned by only a few.