
"Obi-Wan was right," says Vint Cerf, a vice-president at Google. The Force is all around us, "it just happens to be digital."

Mr. Cerf is talking about ubiquitous computing. In the early 1970s, he co-designed TCP/IP, the common technical protocols that allow different networked computers to talk to each other. If you had asked him then to tell you what a computer is, the answer would probably have been straightforward: a screen, a processor and some sort of input device, such as a keyboard.

Attempting to answer the same question in 2013 is trickier. Networked sensors, cloud computing and voice commands don't necessarily fit into this definition, and it's about to get even more complicated. The computer as the main location for computing hardly applies anymore. Connected computing power is flowing into the world around us, empowering us in various ways. In doing so, it's bearing an increasing resemblance to that mysterious Jedi power from Star Wars.

Mr. Cerf isn't the first to connect technology with the supernatural. Science-fiction writer Arthur C. Clarke famously observed that any sufficiently advanced technology is indistinguishable from magic. We find ourselves at an inflection point: within a few years, the most visible manifestation of computing – the computer – will have largely disappeared, replaced by invisible sensors that react to our gestures, obey our voices and perhaps even predict our thoughts as surely as a human would.

When computers disappear and computing bleeds into the world around us, it also means that instead of interacting with separate small computing devices, we will essentially live inside one giant computer. There will be benefits, to be sure, but there is also potential peril. How will we cope? Are we ready for all of it, and can we absorb the positives without being overwhelmed by the negatives? It's going to be a difficult transition, and we're going to require powerful wisdom to make it through.

"We need Yoda and Obi-Wan to help us out," quips Mr. Cerf.

The idea of ubiquitous computing was first defined by a group of scientists at the Xerox Palo Alto Research Center in 1988. In their earliest papers on the subject, John Seely Brown and Mark Weiser mapped out computing's future, showing how it would flow off the desktop and into the environment around us. They envisioned a world where computers would be found in everything, from the walls in our homes and street-side poles to our clothes and cars. The computers would also be networked, so that one machine could call on the resources and power of every other machine in the system. They would also innately understand how humans work.

In the two prior eras of computing – the mainframe and PC epochs – the opposite was true. People had to adjust to machines and their forms of input and output, whether it was punch cards and keyboards or printouts and monitors.

Mr. Weiser, before he died in 1999, said we were at the beginning of a third computing epoch. Most thinkers on the subject, including Seely Brown – now a visiting scholar at the University of Southern California – believe we're firmly in the middle of it. The improving horsepower and connectivity of our devices have allowed those clunky interfaces, the last hurdle to realizing the ubiquitous computing vision of the 1980s, to melt away.

Ubiquitous computing starts with ubiquitous computers, which now almost equal the human population in number, if smartphones are included in the definition. The telecommunications networks enabling their Internet connectivity are also quickly improving and expanding, and not just in the developed world. People in the poorest countries are leapfrogging PC-based computing and moving right to mobile devices. Pretty soon, every man, woman and child will have a powerful, networked computer in the palm of their hand.

For all their capabilities, however, these mobile devices still require too much from their users. Their displays, for one thing, are small and limiting.

So-called "Augmented Reality" will be an important part of the solution. Using monocular AR eyewear to overlay computer-generated data onto the real world, for example, the U.S. military is now allowing soldiers to see what their aerial drones see. The eyewear introduces new capabilities, such as highlighting friendlies and enemies in different colours, with the expectation of saving lives by preventing accidental friendly fire.

This year, AR glasses that are indistinguishable from regular sunglasses will hit the market for the rest of us when the likes of Google and Vuzix roll out consumer products. These will connect wirelessly to a smartphone and serve as a sort of secondary display. In the context of ubiquitous computing, they'll free digital information from the confines of a handheld screen and bring it out into the real world.

The same goes for so-called pico projectors, which can turn smartphones into miniature movie projectors. When combined with sensor technology such as cameras or motion detectors, pico projectors also become two-way, interactive communication tools. Projected virtual keyboards, already on the market, are a good example.

Input is also undergoing its own quiet transformation. In the past few years, touch screens, voice command and gesture control – think Apple's Siri and Microsoft's Kinect – have all arisen to complement and in many cases supplant the traditional mouse and keyboard. In many situations, they make computing feel natural and intuitive.

Motion or gesture control, in particular, will be integral in a world of ubiquitous computing. The original vision relied heavily on the ability to manipulate information and machines with just a wave of the hand. That capability arrived only recently through video game consoles. Nintendo popularized motion-controlled computing with its Wii controller in 2006, while Microsoft took it a step further in 2010 with the Kinect, which allowed for full-body gaming without the need for any handheld device.

The technology is now iterating and improving quickly. Leap Motion, a San Francisco-based startup, will soon start selling a chocolate-bar-sized plug-in for computers that, like Kinect, will allow users to control what they see on the screen with their hands. The device promises to be more fine-grained than Microsoft's product in that it can accurately track finger movements. Microsoft itself will likely unveil a new-and-improved second-generation Kinect with its next Xbox console later this year.

Leap chief executive Michael Buckwald sees motion control being incorporated into smartphones, where it will integrate with AR glasses to create immersive, 3D computing experiences akin to what has been seen in movies such as Minority Report and Iron Man. In the spirit of ubiquitous computing, the experience won't feel like computing.

"Suddenly, data will become so much more intuitive," he says.

Keyboards are likely to be rendered obsolete in all but the most specialized of cases by devices that simply allow us to speak commands to the walls around us. The Siri-like "Ubi," a puck-sized box that plugs into any electrical socket, is just one example. The Ubi attracted attention on Kickstarter last year by looking up recipes and connecting phone calls when asked to do so.

It may not be Star Wars exactly, but it's certainly approaching Star Trek, where the crew spoke to their ship – and it spoke back. There's little doubt: we're quickly being surrounded by computing, and the ways in which we interact with it are becoming much more natural and invisible.

***

So what happens if and when we want to unplug? Can we even do that anymore? Unfortunately, when you're surrounded by the Force, there's no off switch, and some people are already feeling the detrimental effects.

Back in the 1990s, when you weren't "at your computer" you couldn't be expected to check it. But with the rise of the "Crackberry" phenomenon – the obsessive use of one's smartphone, particularly the BlackBerry – computing no longer has a location: text messages and e-mail follow us everywhere, and employees are always reachable.

Studies have found the large majority of people check e-mail after work hours, take smartphones and laptops with them on vacation and send e-mail or texts during meals with family and friends. Often, employers don't officially require them to do so, but the tacit expectations are there. The work day and its related stresses and pressures have, for many people, effectively blended into personal time.

Socially, many are also finding it preferable to spend time on Facebook rather than with friends and family in person. A recent study by researchers at the University of Chicago's Booth School of Business found that most people consider Facebook, Twitter and e-mail harder to resist than cigarettes and alcohol, while a report from the Pew Research Center and Elon University last year concluded that young people in particular will suffer from a loss of patience and a lack of deep-thinking ability because of the sort of "fast-twitch wiring" enabled by perpetual connectedness.

This constant connectedness is killing the boredom we used to experience in our down time. That may be good – we're never bored anymore – but it may also be bad, because boredom may actually be necessary to achieve deeper levels of thinking. "Every major society, even the minor ones, have built in times where you have to shift your mind, whether it's prayer or Sundays off," says Genevieve Bell, an anthropologist working for microprocessor maker Intel. "As human beings, we have to be intermittently disconnected, or differently connected, versus this constant buzz of connectedness."

The growth of ubiquitous computing and of being constantly connected to a network of billions – whether of people or machines – may also be stunting our emotional growth and intelligence. There have been reports that computer-mediated communication strips out important emotional cues. We don't have to experience real-time emotional reactions if we replace face-to-face or phone conversations with e-mails and texts. Some think this means we are developing coarser personalities, which would account for the fact that we can be so rude and brusque online and over e-mail. Mr. Cerf, for one, likens it to the protective bubble of a car.

"When you get in a car and are surrounded by steel and glass, you feel free to say things you would never say face-to-face to anybody," he says. "In some ways, people behave on the Net in similar fashion."

Technologists, naturally, believe more technology may be the answer. Microsoft's Kinect can track your facial expressions, and in 2011 the Mood Meter at the Massachusetts Institute of Technology used Kinect to detect student smiles in an attempt to gauge general levels of happiness on the school's campus. MIT professor Rosalind Picard, meanwhile, has developed a face-tracking algorithm that can detect emotions you don't even know you're feeling, while smartphone apps such as StressSense can tell your mood from the sound of your voice.

These may be the early steps toward re-injecting emotional tonality into computer-mediated communications, and they could also lead to computers developing manners. While current automated systems can learn from your behaviour and thereby make helpful suggestions – a computer can easily figure out what your favourite bands are and alert you when they're coming to town, for example – they still don't know when it might be appropriate to interrupt you.

"Without an understanding of what we want to know and when we want to know it, it's difficult for systems to achieve their potential. We're just beginning to be able to do that effectively," says Alfred Spector, vice-president of research at Google. "No one gets it right. Even our best friends misread our moods sometimes. That's among the most important challenges."

Emotion-sensing technologies may also be combined to let your computer know how you are feeling so that it can adjust its reactions and output to your emotional state. If you're really upset, for example, maybe that telemarketing call won't come through.

Bill Buxton, who was at Xerox PARC in the 1980s and is now a principal researcher for Microsoft, says technologists often overlook such matters. For him, it's also a simple issue of annoyance. He gets particularly irked by those automated telemarketing calls – known as robocalls – that interrupt at the worst possible times. There's currently no way to tell such a machine to remove him from its list.

"We thought we were trying to do something right and we created a monster in other ways," he says.

There is also the possibility – or perhaps likelihood – that in the face of all this inexorable technological progression, old-fashioned humanity may re-assert itself. Already, some companies and families are applying the brakes to this seemingly out-of-control locomotive. Last year, Volkswagen pledged to turn off e-mail on BlackBerries for German workers after hours to prevent burnout. Mom and dad, meanwhile, are increasingly telling the kids to leave their phones in their bedrooms when sitting down for dinner.

"We're ready to take a deep breath and do it in a way that meets our goals as people a little bit better," says Sherry Turkle, an MIT professor and author of Alone Together: Why We Expect More From Technology and Less From Each Other. "Technology can make us forget what we know about life. For a couple years there, [total connectivity] seemed like a good idea."

These sorts of cultural affirmations suggest that different parts of the world might move into ubiquitous computing at different speeds. Some people are therefore resisting the pull of the Force – or perhaps the Dark Side, as it were – more than others.

When Intel hired Ms. Bell, a cultural anthropologist from Australia, in 1998, her initial mandate was to help the company understand women and non-American consumers. Her success led Intel, two years ago, to put her at the helm of an entire research and development lab, complete with a staff of 100 that includes social scientists, designers and fellow anthropologists.

She and her team have come to some surprising conclusions about how different cultures will react to ubiquitous computing. Ms. Bell found that in such countries as Japan, Singapore and South Korea, computers have achieved a higher level of natural interaction with humans because the public believes that machines are their friends.

Various cultural, economic and demographic differences – ranging from a higher proportion of elderly citizens to denser living environments, which mean less privacy and a more reserved style of personal conversation – have generally produced a greater trust in technology in Asia than in the West. That's why robots and other machines in your environment that monitor and talk to you are more advanced and accepted in Osaka than in New York or London. "If you go to Tokyo, everything talks to you and expects you to talk back," Ms. Bell says.

Westerners, meanwhile, have been psychologically conditioned into a "Terminator mentality" of wariness by decades of apocalyptic science fiction in which machines turn on humanity, as well as by media exposés of the misdeeds of companies and governments. So while robotic tools such as Siri and Kinect are old hat in the East, considerable questions have been raised in the West about how such devices and applications store and track biometric and voice data.

Such wariness may not stop the ultimate outgrowth of ubiquitous computing in the West, but it is likely to slow it. As Jared Diamond noted in his Pulitzer Prize-winning book Guns, Germs and Steel, cultural attitudes towards certain technologies can have major ramifications for the success or failure of nations. Japan's decision to give up guns in the 18th century came about because its strong traditional samurai class rejected them. The samurai's cultural dominance within Japan caused the nation's war-making ability to fall markedly behind that of states with less wealth and social cohesion over the subsequent centuries. These national differences in cultural acceptance of ubiquitous computing could very well determine who the leaders – and the followers – of the next century will be.

Technologists like Mr. Cerf want to see the benefits of ubiquitous computing adopted, but they also welcome this cautious approach – as long as it's reasoned and not based on irrational fears. The wine cellar in Mr. Cerf's basement, for example, is wired to send him a text message if the room temperature rises above 60 degrees Fahrenheit. He hasn't yet got around to setting up the system so that he can remotely restart the cooling system, but he intends to. If someone really wanted to, they could hack in and ruin his wine collection, so the question is not just how to prevent that from happening, but also why anyone would actually want to do so.

"The more we rely on these sorts of systems, the more fragile we become," he says. "But there is as much connectivity as there is disconnectivity associated with these mechanisms."
