‘This is for Everyone’ by Tim Berners-Lee: Inventor explores Net gain and loss

We are truly living in the digital age, powered by the Internet, networking, supercomputers, mega data centres, digital devices, digital services, and now Artificial Intelligence. Each is the combined result of hundreds of inventions and innovations since the development of early computers in the middle of the last century. The technologies that power the digital age have been developed by scientists, engineers and innovators in research institutions and universities, and commercialised by corporations in Silicon Valley and other technology hubs.
The Internet, for instance, is the result of a series of developments: computer time-sharing, the ARPANET (the Advanced Research Projects Agency Network), packet switching, the Transmission Control Protocol/Internet Protocol (TCP/IP), the World Wide Web, the Hypertext Transfer Protocol (HTTP), Internet Protocol addressing, Public Key Infrastructure, email protocols like POP3, host server systems, and more. And behind each of them is an interesting story of innovation, determination and collaboration, as well as trials and tribulations.
With the advent of the age of Personal Computers (PCs) in the 1980s, the interconnection of computers through Local Area Networks (LANs) became possible. The LAN in one building was then connected to the LAN in another through phone lines, and eventually local networks were connected to overseas computers through undersea fibre optic cables to become a network of networks — the Inter-net.
Tim Berners-Lee, the inventor of the World Wide Web, was at the centre of these revolutionary changes in the 1980s and 1990s. Working in the Computing and Networking Division at CERN, the European particle physics laboratory near Geneva, he conceptualised the idea of a wide web for information flow and exchange and called it ‘Mesh’.
He finally chose the catchier WWW for his invention after considering other options like Mine of Information and The Information Mine (TIM).
Before the web, the mode of transmitting information from one computer to another was FTP, or File Transfer Protocol, but it was slow and cumbersome. Berners-Lee designed a new protocol called HTTP and a system of identifying a specific set of data through Uniform Resource Locators, or URLs, which turned out to be the most crucial innovation of the web.
He also designed the Hypertext Markup Language, or HTML, which, together with the URL and HTTP, laid the foundation of the web as we know it today.
The web was designed to break the hierarchical flow of information, as scientists at CERN preferred to make lateral connections irrespective of job titles and divisional structures. Drawing parallels with urban planning, Berners-Lee writes, “In the modernist era, urban planners like Le Corbusier designed ‘rational’ cities, which segmented neighbourhoods by function and stripped buildings of detail and ornamentation. In so doing, they did irreparable damage to the human spirit. I explicitly conceived of the web to be fractal, thumbing my nose at this kind of false ‘rationality’.”
Berners-Lee did not want to enforce any particular structure, so things on the web could take any shape or size, growing somewhat like an anarchist jungle. Though it was designed at CERN, Berners-Lee made it clear that “it was not just for scientists, not just for academics, but this is for everyone”. In keeping with this philosophy, CERN relinquished all intellectual property rights to the source and binary code of the WWW and permitted people to use, duplicate, modify and distribute it.
From two million people who used the Internet in 1991, the number shot to one billion within 10 years. This was one of the most rapid adoptions of any technology in history. However, with such rapid growth came the challenges of inequitable access due to location, poverty and disability.
Berners-Lee made accessibility a priority of the World Wide Web Consortium (W3C), which he formed early on to promote the web as an open and independent platform.
Berners-Lee laments the manner in which the web has been commercialised by exploiting user data. “It had never been my intention to have any data collected about the user at all, and the original web tools I wrote contained no mechanisms for doing so,” he writes. He is also disappointed at the web becoming divisive, polarising and toxic, as opposed to his idea of it as a tool of creativity, collaboration and compassion. Here, Berners-Lee almost echoes what Oppenheimer came to feel about the atomic bomb, the creation he is most remembered for.
The book is more than a memoir. It is a racy and highly readable treatise on a range of topics — early web culture, birth of e-commerce, browser wars, progression of search engines, dot-com bubble, the spread of social media, transition of the web from PC to mobile, and the future with the emergence of machine learning and AI.
The author believes “there is still time to build machines that serve humans, rather than the other way round”. It is a point worth pondering over as we enter the age of AI.
— The reviewer is a science commentator
