The Future of Work


One of the underlying flaws of any workplace is the assumption that the cream rises to the top, meaning that the best people get promoted and are given opportunities to shine.

While it's tempting to be lulled into believing in a meritocracy, years of research on women and minorities in the work force demonstrate this is rarely the case. Fortunately, in most corporate settings, protocols exist to try to weed out discriminatory practices.

The same cannot necessarily be said for the sharing economy. While companies such as Uber and Airbnb boast transparency and even mutual reviews, they remain far from immune to discriminatory practices.

In 2014, Benjamin Edelman and Michael Luca, both associate professors of business administration at Harvard Business School, found that non-black hosts on Airbnb can charge 12 per cent more than black hosts for a similar property. In this new economy, that simply means black hosts earn less for a similar service. This sounds painfully familiar to those who continue to fight this battle in the corporate world – although in this case, it occurs without the watchful eye of a human-resources division.

In the corporate world, companies have shifted their focus from overt bias to subconscious bias, according to Mr. Edelman and Mr. Luca, but the nature of the bias in the sharing economy remains unclear.

The bias is either statistical, meaning users infer that a property is inferior based on the owner's profile, or "taste-based," meaning the decision to rent comes down to the user's own preferences. To curb discriminatory practices, the authors recommend concealing basic information, such as photos and names, until a transaction is complete, as Craigslist does.

Reached by e-mail this week, Mr. Edelman stood by that approach.

"Broadly, my instinct is to conceal information that might give rise to discrimination. If we think hosts might reject guests of [a] disfavoured race, let's not tell hosts the race of a guest when they're deciding whether to accept. If we think drivers might reject passengers of [a] disfavoured race, again, don't reveal the race in advance," he advised.

Mr. Edelman feels that those truly bent on discrimination will continue to do so, but that other, more casual discriminators will realize it's too costly.

An Uber driver who notices a passenger's race only at the pickup point might reason that he has already driven about five kilometres to get there. If he cancels, not only will he be without a fare, but Uber might also notice and become suspicious, Mr. Edelman surmised.

Not everyone agrees that less information is the best route to take to combat discrimination in the sharing economy. In fact, more information may be the fix, according to recent research conducted by Ruomeng Cui, an assistant professor at Indiana University's Kelley School of Business, Jun Li, an assistant professor at the University of Michigan's Stephen M. Ross School of Business, and Dennis Zhang, an assistant professor at the John M. Olin Business School at Washington University in St. Louis.

The trio of academics argues that rental decisions on platforms such as Airbnb are based on racial preferences only when not enough information is available. When more information is shared, specifically through peer reviews, discriminatory practices are reduced or even eliminated.

"We recommend platforms take advantage of the online reputation system to fight discrimination. This includes creating and maintaining an easy-to-use online review system, as well as encouraging users to write reviews after transactions. For example, sending multiple e-mail reminders or offering monetary incentives such as discounts or credits, especially for those relatively new users," Dr. Li said.

"Eventually, sharing-economy platforms have to figure how to better signal user quality; nevertheless, whatever they do, concealing information will not help," she added.

Still others believe technology itself can offer a solution to bias in the sharing economy. Among them is Copenhagen-based Sara Green Brodersen, founder and chief executive of Deemly, which launched last October. The company's mission is to build trust in the sharing economy through social ID verification and reputation software, which lets users take their reputation with them across platforms. For example, a user with ratings on Airbnb can combine them with their reviews on Upwork.

"Recent studies in this area suggest that ratings and reviews are what creates most trust between peers. [For example] when a user on Airbnb looks at a host, they put the most emphasis on the previous reviews from other guests more than anything else on the profile. Essentially, this means platforms could present anonymous profiles showing only the user's reputation, but not gender, profile picture, ethnicity, name and age and, in this way, we can avoid the bias which has been presented," Ms. Brodersen said.

Regardless of the solution, platforms and their users need to recognize that combatting discriminatory practices is their responsibility, and that the sharing economy, like the traditional work force, is no meritocracy.

"This issue is not going to be smaller on its own," Ms. Brodersen warned.
