SAN FRANCISCO — Most people who search on Google, share on Facebook and shop on Amazon have never heard of Sir Tim Berners-Lee. But they might not be doing any of those things had he not invented the World Wide Web.
Berners-Lee, 61, is this year’s recipient of the A.M. Turing Award, computing’s version of the Nobel Prize.
The award, announced Tuesday by the Association for Computing Machinery, marks another pinnacle for the British native, who has already been knighted by Queen Elizabeth II and named one of the 100 most important people of the 20th century by Time magazine.
“It’s a crowning achievement,” Berners-Lee said in an interview with The Associated Press. “But I think the award is for the Web as a project, and the massive international collaborative spirit of all that have joined me to help.”
The honor comes with a $1 million prize funded by Google, one of many companies that made a fortune as a result of Berners-Lee’s efforts to make the internet more accessible. He managed that largely by figuring out a simple way to post documents, pictures and video — everything, really, beyond plain text — online.
Spinning the web
Starting in 1989, Berners-Lee began working on ways digital objects could be identified and retrieved through browser software capable of rendering graphics and other media. In August 1991, he launched the world’s first website, http://info.cern.ch.
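That retrieval model — a client resolves a URL, connects to the server it names, and asks for the document — is easy to sketch in a few lines. The example below, a minimal illustration in Python using only the standard library, fetches the address of that first website; today’s server will answer with modern headers and redirects rather than the 1991 response, so treat it as a demonstration of the idea, not a recreation of the original exchange.

```python
# Minimal sketch of the web's retrieve-by-URL model: issue an HTTP GET
# for the address of the first website. Uses only Python's standard
# library; the modern server's response will differ from the 1991 one.
from urllib.request import urlopen

with urlopen("http://info.cern.ch/") as response:
    print(response.status)                    # e.g. 200 once any redirects resolve
    print(response.headers["Content-Type"])   # the media type of the document
    html = response.read().decode("utf-8", errors="replace")
    print(html[:200])                         # the first characters of the page
```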
Besides coming up with the web’s technical specifications, Berners-Lee “offered a coherent vision of how each of these elements would work together as part of an integrated whole,” said Vicki Hanson, president of the Association for Computing Machinery.
In an even more significant move, Berners-Lee decided against patenting his technology and instead offered it as royalty-free software. That allowed other programmers to build upon the foundation he’d laid, spawning the more than 1 billion websites that exist today and helping lure more than 3 billion people online.
Caught in the web
The web’s widespread appeal gratifies Berners-Lee, who now splits his time between the U.S. and Britain as a professor at the Massachusetts Institute of Technology and the University of Oxford.
But he fears parts of the web will become less accessible in the U.S. if the Federal Communications Commission dismantles regulations protecting “net neutrality.” That’s the principle that internet service providers should treat all websites equally instead of favoring some destinations that might be willing to pay for special treatment.
If the Trump administration tries to dump net neutrality, “it’s going to have a fight on its hands because I think the American people realize it’s important,” Berners-Lee said. “It allowed America to benefit from a thriving internet market for connectivity and content. It has become part of the spirit of America.”
Berners-Lee also worries about governments around the world using the internet as a surveillance tool, calling it a “recurrent threat.” He admits that preserving personal privacy as technology advances remains a thorny problem, one that he doesn’t have a ready solution for. But figuring that out is “really important to the future of society,” he says.
“As an individual, I should be able to keep my own notes, keep my own journal and not share it with anybody. That is just part of being a person.”
Beyond the web
Like several other prominent figures in technology, Berners-Lee isn’t sure whether humanity will be better or worse off as artificial intelligence enables computers to think more like people.
“Computing has grown exponentially more powerful, so it’s only logical that it will get to the point when computers will become smarter than us,” Berners-Lee said. “So, yes, we should logically think about those consequences.”
This is the 50th anniversary of the A.M. Turing Award, named after English computer scientist Alan Turing, whose revolutionary work with early computers and artificial intelligence helped crack Nazi Germany’s codes during World War II. Previous award winners include Vint Cerf and Robert Kahn, who did some of the pioneering work on the internet that Berners-Lee spun into the World Wide Web. –Michael Liedtke