While discussions about tech bubbles have been heated, few commentators are targeting their invective at the real underlying bubble: the World Wide Web itself. Like any outmoded technology, the Web is rapidly losing users as it fails to adapt to disruption from mobile apps, and it continues to perform poorly – despite incredible optimization efforts – due to a bloated software architecture built of hacks on top of hacks. It had an unbelievable 25-year run, but I think it’s time to admit that the product is in its final throes.
To be clear about what we are discussing: the Web is a collection of protocols (namely, HTTP) and hyperlinked documents (built using HTML) that allow users to easily produce and consume content. Since HTTP is a standardized protocol and HTML is a markup language, the Web is platform-agnostic and usable on any device that can connect to the Internet. A key result of this design is unprecedented openness – through hyperlinks, users can connect their content to any other page without seeking permission.
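That permissionless quality is worth making concrete: a page can point at any other page with nothing more than an anchor tag – no registration, no API key, no approval from the site being linked to. A minimal sketch (the domain and path below are placeholders, not real pages):

```html
<!-- A hyperlink needs no permission from the site it points to. -->
<!-- example.com and the path below are placeholder URLs. -->
<p>
  For background, see
  <a href="https://example.com/articles/history-of-http">this history of HTTP</a>.
</p>
```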
Beginning in the early 1990s, this system transformed the world over the next 15 years, becoming the key vehicle for information and content dissemination across the globe. But as demands increased for quality, security, and control, the Web started to buckle. Its incredible growth forced it to expand far beyond the designs of its technical specifications, into areas like asynchronous server communication and local data storage. When smart devices arrived at the end of the last decade, it became increasingly clear that the Web had found its competitor.
And then it lost.
Here is a startling fact: for all but the most mundane applications, it is easier today to create a rich application in Xcode or Eclipse than it is to develop a comparable app on the Web. With the libraries offered by iOS and Android, software engineers have extensive standardized resources for building great user experiences, and both platforms have matured to the point that documentation is plentiful and APIs are fairly consistent.
The Web has tried to compete with the concept of the “mobile web,” but like so many responses to technological disruption, this one seems too little, too late. Building an engaging application with HTML5 on mobile is unbelievably challenging, even with a host of libraries downloaded from GitHub to simplify the process. Mozilla’s expansion into the space through Firefox OS and Open Web Apps is a decent start, but with Americans already spending more time on their smartphones than on the Web through a PC, such efforts are becoming moot.
It’s truly a sad moment, given that we are sacrificing so much of the Web’s best qualities for proprietary native apps. There is no way to construct URLs to apps, nor any method to hyperlink to specific content within an app container – a concept called “deep linking.” New libraries are harder to build given the closed nature of the iOS platform, and Android’s openness has slowly faded as well. That means source code for apps isn’t visible for modification or improvement, deeply cutting down on the speed at which new techniques propagate across the development community. In short, independent developers are being harmed in the race to ensure that the largest enterprises can control their brand’s online experience.
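The contrast is stark. On the Web, deep linking comes for free: any page, and any anchor within it, is addressable by anyone. In a native app, content is reachable only if the developer registers a custom URL scheme and defines routes for it. A sketch of the difference (the `myapp://` scheme and both URLs are hypothetical examples, not a real app’s API):

```html
<!-- Web: any page, and any fragment within it, is addressable by anyone. -->
<a href="https://example.com/report#section-3">Jump straight to section 3</a>

<!-- Native: content is reachable only if the app's developer registered
     a custom scheme such as myapp:// and wired up routes for it. -->
<a href="myapp://articles/42">Open article 42 in the app (if installed)</a>
```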
Openness could have been the Web’s key competitive advantage. Yet the stories over the past two years about the NSA’s Internet surveillance programs have completely undermined that argument for consumers. Now, with the FCC potentially gutting its net neutrality policy, even the ability to access information online on equal terms is at risk. To a degree, we only have ourselves to blame. When websites started blocking inbound links to their content and companies began walling off more of their data from non-members, the development community became instrumental in reducing openness to an empty phrase.
While the Web may be dying, its core objectives live on. I remember when my family bought our first modem. It was 28.8 kbps, if I recall correctly, and the Internet back then was deeply confusing. We bought a book that listed all the major websites, since search engines were still embryonic. It was a simpler time. HTML table tags were the only means of laying out webpages, as CSS level 1 wouldn’t arrive until late 1996. Given the speed of the modems back then, the Internet was mostly textual, with a design aesthetic that was creatively chaotic if not especially usable. It was inviting and open.
We need to return to that kind of world, and the only way we are going to get there is to rebuild our stack from the bottom up. In short, we need a literal “Web 2.0” – a new edition that brings back some of the critical features we stripped away in our race to build better Internet applications.
What would a new Web look like? For one, it would make very different assumptions about users and their habits. It would assume multiple devices and a cloud-based infrastructure, so synchronization would be handled as a fundamental part of its protocols. It would assume two-way communication between client and server, and thus could handle push natively. And it would offer better facilities for handling identity online while also providing stronger anonymity.
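Today, by contrast, two-way communication has to be bolted onto HTTP’s one-way request-response model, typically via the WebSocket API. A minimal sketch of what that bolt-on looks like in a page (the `wss://` endpoint and the channel name are placeholders, not a real service):

```html
<!-- Push is an add-on today: the page must open a separate WebSocket
     connection, because plain HTTP cannot deliver unsolicited
     server-to-client messages. -->
<script>
  // Placeholder endpoint, not a real service.
  const socket = new WebSocket("wss://example.com/updates");

  socket.onopen = () => {
    // Subscribe to a hypothetical channel once the connection is up.
    socket.send(JSON.stringify({ subscribe: "news" }));
  };

  socket.onmessage = (event) => {
    // From here on, the server can push data at any time.
    console.log("pushed from server:", event.data);
  };
</script>
```

In the Web imagined above, this kind of channel would be part of the protocol itself rather than a separate connection each application has to manage.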
Like the westward expansion of the United States, the Web started as a world with an open mindset and local, flexible rules. Over time, fences appeared, property was divvied up, and society became more process-driven – protecting the property people already had rather than ensuring the best possible development of the future. For the Internet to evolve, we need to move away from the technologies that are slowly degrading and infantilizing our experience, and strike a new path toward a world where the Internet is once again open and free.