“Obi-Wan was right,” says Vint Cerf, a vice-president at Google. The Force is all around us, “it just happens to be digital.”
Mr. Cerf is talking about ubiquitous computing. In the early 1970s, he co-designed the core technical protocols – later known as TCP/IP – that allow different networked computers to talk to each other. If you had asked him then to tell you what a computer is, the answer would probably have been straightforward: a screen, a processor and some sort of input device, such as a keyboard.
Attempting to answer the same question in 2013 is trickier. Networked sensors, cloud computing and voice commands don’t necessarily fit into this definition, and it’s about to get even more complicated. The computer as the main location for computing hardly applies anymore. Connected computing power is flowing into the world around us, empowering us in various ways. In doing so, it’s bearing an increasing resemblance to that mysterious Jedi power from Star Wars.
Mr. Cerf isn’t the first to connect technology with the supernatural. Science-fiction writer Arthur C. Clarke famously observed that any sufficiently advanced technology is indistinguishable from magic. We find ourselves at an inflection point: within a few years, the most visible manifestation of computing – the computer – will have largely disappeared, replaced by invisible sensors that react to our gestures, obey our voices and perhaps even predict our thoughts as surely as a human would.
When computers disappear and computing bleeds into the world around us, we will no longer interact with separate small devices; we will essentially live inside one giant computer. There will be benefits, to be sure, but also potential perils. How will we cope? Can we absorb the positives without being overwhelmed by the negatives? It’s going to be a difficult transition, and we’re going to need powerful wisdom to make it through.
“We need Yoda and Obi-Wan to help us out,” quips Mr. Cerf.
The idea of ubiquitous computing was first defined by a group of scientists at the Xerox Palo Alto Research Center in 1988. In their earliest papers on the subject, John Seely Brown and Mark Weiser mapped out computing’s future: it would flow off the desktop and into the environment around us. They envisioned a world where computers would be found in everything, from the walls of our homes and street-side poles to our clothes and cars. These computers would be networked, so that one machine could call on the resources and power of every other machine in the system. And they would innately understand how humans work.
In the two prior eras of computing – the mainframe and PC epochs – the opposite was true. People had to adjust to machines and their forms of input and output, whether it was punch cards and keyboards or printouts and monitors.
Mr. Weiser, before he died in 1999, said we were at the beginning of a third computing epoch. Most thinkers on the subject, including Seely Brown – now a visiting scholar at the University of Southern California – believe we’re firmly in the middle of it. The improving horsepower and connectivity of our devices have melted away those clunky interfaces, the last hurdle to realizing the ubiquitous computing vision of the 1980s.
Ubiquitous computing starts with ubiquitous computers, which now almost equal the human population in number, if smartphones are included in the definition. The telecommunications networks enabling their Internet connectivity are also quickly improving and expanding, and not just in the developed world. People in the poorest countries are leapfrogging PC-based computing and moving right to mobile devices. Pretty soon, every man, woman and child will have a powerful, networked computer in the palm of their hand.
For all their capabilities, however, these mobile devices still require too much from their users. Their displays, for one thing, are small and limiting.
So-called “Augmented Reality” will be an important part of the solution. Using monocular AR eyewear to overlay computer-generated data onto the real world, for example, the U.S. military is now allowing soldiers to see what their aerial drones see. The eyewear introduces new capabilities, such as highlighting friendlies and enemies in different colours, with the expectation of saving lives by preventing friendly-fire incidents.