Navneet Alang is a Toronto-based freelance technology culture columnist.
Just a few months ago, Ava Berkofsky explained something that shouldn't have been revelatory, but was: how to properly film black skin. Ms. Berkofsky is the director of photography on the HBO show Insecure, and she uses a number of lighting and makeup techniques specifically to make sure the show's black cast looks its best.
The fact that she needed to delve into the topic at all is a result of how film developed. Beginning in the 1940s, Kodak used a system of reference images called Shirley cards – which featured white models – to calibrate the colour accuracy of its film. Consequently, film was always slanted to flatter white actors, while dark-skinned performers were poorly lit.
In one sense, the nature of colour and light on film stemmed from obscure technical decisions. But as a result, dark-skinned viewers of film and television went decades without seeing themselves depicted with the same care and range as their fair-skinned counterparts. It is just one more way in which minorities are denied the chance to see themselves with the fullness and richness of their lives intact.
We often think of technology as a tool or a means to an end. But it is better understood as something that helps us mediate a relationship to the broader world. And if anything, the digital era has exacerbated how tech shapes the way in which we see the world and ourselves. Another recent example: Google's once-obscure Arts and Culture app suddenly went viral after users discovered its ability to match their selfies with famous works of art. It was fun and spread quickly, as everyone posted their matches with various paintings, marvelling at how much they did or did not resemble the figures the app had chosen.
Photo illustration: details from Albrecht Dürer, John Singleton Copley and Sidney Poitier in For The Love of Ivy. (The Globe and Mail)
Amid the chatter, however, users who are visible minorities quickly discovered that the app did a poor job matching them. East Asian users were matched with caricatures or generic Asian faces, while others found no match at all, prompting criticism that the app was, if not exactly racist, then at least biased.
Here again, the choices made in designing a technology had led to some people being able to revel in seeing themselves, but left others to look into a mirror and see no reflection. It highlights a growing issue in which the increasing importance of technology in our lives is also accompanied by a troubling set of blind spots and biases that manifest through how tech is designed.
Google itself was taken aback by the app's sudden success. Perhaps it shouldn't have been quite so surprising, though. The app hit on what truly drives viral culture: a now-rare moment of collective experience. The shared attention that has evaporated from mass media resurfaces on social media when, for the briefest of moments, half the people you know are engaged in the same thing.
But the exclusion of people of colour exacerbated the growing sense that the way in which technology is designed carries an unconscious set of choices that, very often, reflect the makeup of the people designing it, or otherwise reproduce existing inequalities.
There are clear reasons the app worked far better for white users. First, the database on which the app relied is part of Google's Cultural Institute, which holds only a limited selection of works of art – many of them European portraits from the 18th century – narrowing the pool of potential matches. And because the rapid rise of tech has, for obvious reasons, occurred first in the West, there has been a greater focus on scanning and cataloguing Western art, further tilting matches toward people of European descent.
It all sounds innocuous enough. But the decision to make an app that matched faces with painted art reflects an inherent bias that prioritized white users to the exclusion of others. That there are clear historical and practical reasons for the bias does not excuse the choice to reproduce that disparity; the explanation is, and should remain, distinct from the decision itself. There is a further factor at work, too: Western art has placed a greater emphasis on both realism and portraiture. That is, the choice to focus on portraiture is itself already wrapped up in a number of cultural assumptions.
Like the depiction of dark skin on film, the core of the issue is representation. There is an intricate relationship between how people understand themselves and their place in the world and their depiction in public spaces. In a way, that complex interplay is found in the doubled meaning of the word "representation" itself: It can mean to depict something, but also to stand for something, as in representative government, or a representative of a group. While missing out on a fun viral app is hardly the end of the world, it is nonetheless part of a much more significant pattern in so many other venues, whether politics, art or business, where women and minorities of all stripes cannot find examples to model themselves after. Representation matters because we don't just form our identities inside our own heads, but also in relation to idealized depictions we see out there in the broader world.
The choices made in the design of particular apps and social networks thus form part of a broader structure that affects how we as individuals relate to the world. It is also an increasing problem as decisions get automated. Just recently, Google's algorithm for classifying photos tagged two black people as gorillas – to which Google responded by simply removing the "gorilla" label from its classification system.
It highlights the pernicious and immensely difficult problem facing tech. Artificial intelligence and the machine learning that drives it work by filtering through enormous amounts of data and recognizing patterns over time. In theory, and sometimes in practice, those algorithms become more and more sophisticated. One clear, positive example is the rapid advancement in how well technology can recognize human speech. Digital assistants such as Amazon's Alexa or Apple's Siri can now fluently understand spoken language, and in a variety of accents, too.
But the history of the world is itself out of balance – biased, distorted, incomplete. Just as Virginia Woolf imagined Shakespeare's sister and what history might have looked like without the oppressive erasure of patriarchy, the knowledge that both we and our technology have to draw on reflects the one-sidedness of our history and all of its many prejudices: classism, misogyny, colonialism, heteronormativity and no end of other, very real "isms." If the databases we build to teach technology or sort information draw on history with no accounting for its slant, then our technology is doomed merely to replicate it.
Technology is thus not some neutral conduit through which we communicate or learn. It is instead a mediating layer that constructs reality in a particular way. And as we've seen recently, the choices made in the design of the platforms we use every day can have profound consequences. There is a logic to the design of tech that has created structures that run on attention, likes and similar metrics, and Facebook and Twitter have thus proved easy to game for bad actors, whether those bent on harassment or parties who wish to influence politics. There is also the growing problem of polarization and the extremism of online debate, much of which is not merely reflective of trends in society but is instead made worse by the nature of online platforms themselves.
We are living through a moment of reckoning for technology. In the past couple of decades, the digital era has represented an upheaval of historic proportions. As we have rushed headlong into our screen-based future, however, we have overlooked that what drives tech is not merely the promise of some gleaming, pristine utopia hovering just over the horizon, but instead the muck of history and its legacy of avarice, prejudice, violence and erasure. Just as filmmakers are now learning to manipulate light, so too must tech learn to shine and direct its gaze in new ways – uncovering and challenging the deep biases that lie beneath so that the future may, in a manner unflinching and honest, be a little brighter than the darkness of today.