“The cloud.” It's the buzziest tech buzzword in recent memory.
Thanks to fast, always-on Internet connections and vast server farms run by the likes of Amazon and Google, we're able to offload more and more of our digital lives onto remote servers. Our documents can live in the cloud. Our photos can live in the cloud. Our e-mail can live in the cloud.
And soon, robots may live in the cloud.
Or robot brains, at least.
At last week's InnoRobo summit in Lyon, France, researchers from the Willow Garage robotics lab publicly unveiled the Heaphy Project, a robotics system that focuses on “pushing computation into the cloud.”
Heaphy Project founder Tim Field explained that traditional robots require large amounts of on-board computing power to perform complicated tasks like voice recognition, spatial awareness and image processing.
A cloud robot requires far less computing power because it can offload most of that heavy-duty number-crunching and data storage to an off-site computer.
“For the vast majority of processing, you can do that anywhere,” says Mr. Field. “It doesn't need to be local to the robot. If you have a fast wireless connection then you can send images, 3D data, control… all the processing can be done in the cloud.”
The Siri feature on Apple's iPhone is a good example of this approach.
Rather than performing processor-intensive (and battery-draining) speech recognition directly on the phone itself, Siri offloads that work by sending your voice to a server farm controlled by Apple.
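The offloading pattern described above can be sketched in a few lines of Python. This is a minimal illustration, not Apple's or Willow Garage's actual code: the "cloud" is simulated by a local function, and all names (`cloud_transcribe`, `ThinClient`) are hypothetical. In a real system, the injected backend would be an HTTP or RPC call over the robot's wireless connection.

```python
# Sketch of the thin-client offloading pattern: the device captures data
# and ships the heavy work to a remote service. The "cloud" side is
# simulated here by an ordinary function; names are illustrative only.

def cloud_transcribe(audio_bytes: bytes) -> str:
    """Stand-in for a remote speech-recognition service.

    A real service would run a large acoustic model server-side;
    the device never sees that computation.
    """
    return f"<transcript of {len(audio_bytes)} bytes of audio>"

class ThinClient:
    """On-device code: capture data, send it away, use the result."""

    def __init__(self, recognize):
        # The backend is injected, so the same client code works whether
        # recognition runs locally or in a server farm.
        self.recognize = recognize

    def handle_voice_command(self, audio_bytes: bytes) -> str:
        # The only on-device costs are capture and network transfer;
        # the number-crunching happens wherever `recognize` lives.
        return self.recognize(audio_bytes)

client = ThinClient(recognize=cloud_transcribe)
print(client.handle_voice_command(b"\x00" * 16000))
```

The design choice worth noticing is the injected backend: swapping `cloud_transcribe` for an on-device recognizer requires no change to the client, which is what lets a cloud robot fall back gracefully, or trade battery for bandwidth.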
Cloud services, by their very nature, represent a shift toward increased external control over our personal devices and software. And science fiction has long played into fears about losing control of the technology around us.
But if the term “cloud robotics” conjures up images of drone-like robot armies (Skynet, anyone?), perhaps the most encouraging aspect of Mr. Field's Heaphy Project is how it incorporates good old-fashioned human intelligence.

Here's how it works.
“Heaphy is primarily a tele-operation interface, which allows anyone in the world to operate a robot,” says Mr. Field.
Remote operators (actual human beings) are recruited via Amazon's Mechanical Turk service. After about two hours of training in a simulated environment (for which they are paid about six dollars), operators are allowed to control a real, live robot.
By logging into the Heaphy web site, operators can see through a remote robot's eyes, manipulate its arms and perform tasks such as emptying a clothes dryer, retrieving a cup from a cupboard, or taking out the trash.
According to Mr. Field, these remote operators don't necessarily know much about the robot they're controlling. They simply connect to the cloud, and are “given information via the sensors of the robot.
“Likewise, the robot is connecting to the cloud, and doesn't know who the user is.” The cloud provides a mediating effect, acting as a third party between operator and robot.
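The mediating role the article describes can be sketched as a simple message broker: the operator and the robot each talk only to the cloud, never directly to each other. This is an illustrative sketch under that assumption; the `CloudBroker` class and its method names are invented here, not Heaphy's actual API.

```python
# Sketch of cloud-mediated teleoperation: both sides connect to the
# broker and neither needs to know who the other is. Names are
# hypothetical; a real system would use networked message queues.

class CloudBroker:
    def __init__(self):
        self.sensor_frames = []  # robot -> cloud -> operator
        self.commands = []       # operator -> cloud -> robot

    # --- Robot side: publish sensor data, poll for commands. ---
    def publish_sensors(self, frame):
        self.sensor_frames.append(frame)

    def next_command(self):
        # The robot sees only commands, not the operator's identity.
        return self.commands.pop(0) if self.commands else None

    # --- Operator side: view sensors, issue commands. ---
    def latest_frame(self):
        # The operator is "given information via the sensors of the
        # robot" without knowing anything else about the machine.
        return self.sensor_frames[-1] if self.sensor_frames else None

    def send_command(self, cmd):
        self.commands.append(cmd)

broker = CloudBroker()
broker.publish_sensors({"camera": "frame-001"})   # robot publishes
print(broker.latest_frame())                      # operator sees it
broker.send_command({"arm": "open_gripper"})      # operator commands
print(broker.next_command())                      # robot executes
```

Because both parties hold a reference only to the broker, either side can be swapped out (a different operator, a different robot) without the other noticing, which is what makes human intelligence rentable on demand, as the next paragraphs describe.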
Mr. Field calls this approach a “human in the loop” system that treats human intelligence like an on-demand computing resource. Cloud robots can requisition additional human intelligence on an as-needed basis in the same way companies can rent extra cloud servers or storage space from Amazon.
According to Bob Bauer of Willow Garage, the advantage of treating human intelligence this way is twofold: “One is that the cost of labour is clearly less. It's the ultimate outsourcing.”
The second advantage is flexibility. Mr. Bauer says robotics companies requiring human intelligence for small or specialized projects can buy it à la carte, “without having to maintain a workforce year-round.”
Cloud robotics is a relatively new and very small field right now, though its potential is fascinating. Several challenges will need to be overcome before cloud robots are adopted by the mainstream.
Practically speaking, connectivity is an issue. Wireless networks in homes and offices are usually pretty reliable, but if the network goes down, so do any robots that rely on the cloud. Server reliability is also important. Just ask anyone who's been frustrated when Apple's Siri says, “I can't take any requests right now. Please try again in a little while.”
But perhaps most importantly, trust and confidence will be paramount for cloud robot adoption. For most of us, the idea of sharing a physical space with a robot is new and unfamiliar (let alone the idea of sharing a physical space with a robot that receives commands from a vast, nebulous “cloud”).
So how can cloud robots earn humans' trust?
“I think it only happens through extended periods of experience with a product,” says Mr. Field. “If you'd asked people 10 years ago if they'd store all of their photos, all of their documents, all of their e-mail online, they'd say, 'No way, I want that on my hard drive.' So I don't think there's anything that you can do except build up trust over time.”