
Mobot, the magnificent mobile robot invented by Hughes Aircraft Electronic Labs. (Photo: J. R. Eyerman)

The sci-fi thriller Humans, which wrapped this week on AMC, depicts a world where people turn to "synths" (a.k.a. robots) to help simplify their daily lives. Naturally, things only get more complicated when the robots begin caring for their kids and sleeping with their husbands. In the real world, however, scientists and designers claim that a new wave of sartorial technologies will lighten the burden of another habitual (and by turns inspiring and maddening) act: getting dressed.

Researchers at the University of Toronto and the Institute of Robotics and Industrial Informatics in Barcelona, for instance, have turned computers and mobile devices into personal stylists. After studying thousands of photos on the fashion website chictopia.com, the team, led by U of T computer scientists Raquel Urtasun and Sanja Fidler, came up with a complex "fashionability" algorithm that tells users how to make an outfit, and even the background of their photos, more appealing. "We want people to be able to use it to dress better," Fidler says over the phone. It's also useful in helping people select pictures to post on Facebook or online dating sites, she explains. They hope to launch an app in the coming months.

Fits.me, meanwhile – a website that bills itself as a "virtual fitting room" – uses volumetric robots to help users assess how well clothing purchased online will fit them. Punch in your measurements, and an image will appear onscreen of a woman or man with a similar body type wearing the article that you're interested in. (Caveat: Users can only "try on" pieces from participating retailers.) The idea was developed in response to high return rates and customers' dissatisfaction with their online shopping experiences, chief executive officer James B. Gambrell says from London, where the company is headquartered. The data the site gathers are proving useful to participating clothing labels, too. "We can tell a brand how people are wearing their clothes. If [a customer is] looking at a size 8 blouse, we can tell if they're wearing it loose or tight, and what their average body size is."

Over in San Francisco, a startup called Electroloom has created a 3-D printer that shoots out garments – or, more accurately, a liquid solution that turns into polycotton when it hits and adheres to a 3-D model – on demand. The device is still a work in progress, but its prototypes have produced tank tops and skirts sans sewing machine, and its founders have indicated that a working model will be ready for early adopters by spring 2016.

For anyone who has ever needed a hand doing up her dress, the Zipperbot, created by Adam Whiton of the Massachusetts Institute of Technology's Personal Robots Group, is apparently up to the task. The device, which resembles a bulkier version of the pull tab on an actual zipper, is not yet available to purchase. However, Whiton thinks his bot could start a trend in assistive clothing, helping people with disabilities get dressed, for example. It could also be handy for anyone who wishes her pencil skirt could detect motion, loosening its fit for lunchtime walks and zipping up again once she is back at the office. He admits that a sartorial robot performing common functions on its own can be confusing for users at first. "We want to be in control of our clothing, because we want to be in control of our identity," Whiton says over the phone. Handing some of that control over to an automated entity, as he sees it, is all about baby steps. "Right now, wearables are pretty big. But the next step is going to be more about activation," he predicts, referring (of course) to full-on sartorial robots like his.
