
We've all seen the movies. From 2001: A Space Odyssey and Blade Runner to the Terminator and Matrix franchises, with later updates from I, Robot and Ex Machina, the truth is obvious. When we try to play God with technology, creating new forms of mechanical intelligence, murderous blowback is guaranteed. The killer-robot narrative is Frankenstein redux. Our heedless drive for invention is deadly. The machines are rising up! Now that science has brought us closer to once-futuristic scenarios, anxiety levels are rising too, though the details are less apocalyptic. Drones are great for backyard fun or delivering Amazon packages, but they also gather intelligence and deposit weaponized material. Driverless cars are supposed to be safe, but they still crash and run red lights. Automated warehouse and checkout systems lower the price of consumer goods even as they obliterate human jobs. Thanks to the Internet of Things, your fridge knows more about you than your bank manager, doctor, and spouse combined. Black Mirror is already here.

Some people, not all of them paranoid, believe that we are approaching the Singularity: the notional future point at which non-human intelligence, on a steep learning curve since 1950, surpasses ours, which has mostly flatlined for decades. Skeptics suggest that, despite all the computing power, we are still a long way from witnessing a functioning artificial intelligence, let alone one bent on our subjugation.

Still, you could bow to the long-game future, forestall messy execution, and simply declare, "I, for one, welcome our new robot overlords!" Or you could close your eyes and hope that the evolving AIs will be more like C-3PO, Mr. Data, Daft Punk, or Scarlett Johansson in Her. The robots don't care; it's all part of the plan.


Two sly lies burden the issue of robotics, just as they do any conversation about our relationship to technology. The first is that progress is inevitable, even when "progress" often arrives in the form of pointless upgrades, feature creep, and what the Germans call Schlimmbesserungen: bad improvements, like aluminum baseball bats, fast zombies and electric shavers.

The second lie is that technology is neutral. This false consciousness denies reality along two related vectors. At the general level, technology is not value-free, but rather a rapacious manner of viewing the world, a framing effect that sees everything, including ourselves, as a potential resource for future exploitation.

At the specific level, the uncomfortable fact is that all tools have particular tendencies. You can use a handgun as a paperweight, but its intended purpose is shooting people. We can argue about whom to shoot, and why, but please stop denying that the purpose of a handgun is the destruction of human tissue, possibly resulting in death. That's not neutrality.

As long as humans have been fashioning tools, exploiting the natural world and trying to make life easier, faster, or better in some fashion, there have been costs as well as benefits. The robots are no different. But never before has our incapacity to think clearly about technology mattered so much. If robotic workers cause the immiseration of large swaths of the human population, is that politically justifiable? If drones offload moral responsibility from a distant war zone, is that ethically acceptable?

Just when you thought it was complicated enough, here's another wrinkle. Concede the fact that, so far, all the sentient beings we know have been carbon-based. But there is nothing special, in principle, about carbon as a vehicle for consciousness. It's entirely conceivable that non-carbon life-forms, like robots, could achieve consciousness. And then what? Could they not claim a moral status and deserve respect, even rights? Some philosophers already argue that companion animals and children are worthy of full citizenship. Why not our new children, the ones made of silicon, algorithms and steel?

The nice historical twist is that the word "robot" comes from a work of fiction that addresses this very question. In Czech writer Karel Capek's 1920 play, R.U.R. ("Rossum's Universal Robots"), the machine revolt comes not from murderous insanity, disgust at humans, or even an evolutionary imperative. The robots (actually cyborgs) rise up because they feel exploited and demeaned.

This isn't anger; it's a legitimate class grievance. Note to humans: The coming workers' revolution may not be carbon-based. Stay tuned.

