When I first learned that Facebook had conducted an experiment that played on the emotions of its unknowing users, I must admit: I shrugged.
The study, conducted over the course of a week in 2012, hid updates containing certain emotional trigger terms from a small percentage of users' news feeds. The result: mood tends to behave a bit like a contagion, spreading throughout the network. As this piece in the New Yorker points out, it's likely not mood specifically that spreads, but mimicry itself – people in groups have a tendency to imitate one another.
It's an interesting conclusion, but also one that was arrived at because Facebook manipulated the experience of around 700,000 of its users without informing them. Since the study was made public, there has been unending debate over the ethics of these methods; the consensus is that though there's much to be learned from such a huge resource of data, it's less than ideal that a single private company controls access to it. As scholar Zeynep Tufekci reminds us, when these companies have access to so much information about our preferences and behaviour, they tend to use it to their own ends.
Yet, I shrugged because I've become so accustomed to the idea that my experience online is a constant stream of manipulation. Such apathy is, undoubtedly, the wrong approach to take. But I have another idea about what lies behind my blasé response: Sometimes the entirety of digital technology itself feels like one grand social experiment, of which the Facebook study was just a minor part.
We often think about technology as a utility or a tool – something that helps us achieve a goal. So when social media came along, what was obvious about it was what it let us accomplish: keeping in touch, organizing social events, posting photos and so on.
But when you use social media to find other people's photos and updates, you run into the same problem posed by every digital system: it must obscure some information in order to make the rest useful. For example, if right now you type the words "Argentina Belgium" into Google, what pops up is not a list of their populations or a comparison of their literary histories, but a summary of Saturday's World Cup soccer match. It's Google's incredibly complex algorithm that organizes and ranks what we see, because simply listing all the information in the world about those topics would be overwhelming.
Algorithms are a solution to the digital problem of "too much stuff." They are mathematical filtering systems that cut through noise. So now algorithms are in everything: not just Google searches, but in determining what shows up in your Facebook news feed, what recommendations you get on Netflix or Rdio – or, crucially, what kind of ads you might like to see.
That means that technology isn't just a tool to get things done – when algorithms tailor what we are and are not aware of, they shape our lens on reality. And it isn't just algorithms, either. Think of all the things that have changed in just the last 10 years. From how teens do homework to how we use smartphones, from using Facebook to organize our social lives to navigating our way around cities, digital technology has profoundly altered the day-to-day practices of our lives in an incredibly short time. In just a few short years, we've gone from zero smartphones to well over a billion of them.
In a sense, then, we are all involved in a grand experiment, as technology companies and entrepreneurs throw things at a wall to see what will stick. If Facebook experimented with the spread of sentiment, tech companies at large are experimenting with how humans socialize, gather information, relate to the physical world and more.
Granted, many of these experiments – from Google Maps to Twitter to Wikipedia – have innumerable upsides as they ease how we communicate, inform ourselves and find our way around. That's just the free market at work: companies create things and, if they're good, people use them. But think back to the rise of the car – another product that had many benefits and appealed to people's wants – and you can see that there's always an ambivalence to tech-driven change. The freedom of the automobile was a boon that also delivered such clear drawbacks as urban sprawl, pollution, a warped energy economy and a high rate of vehicular deaths. When technology reshapes life, there are benefits and tradeoffs, and it's our job to figure out whether we're getting enough of the former.
The question to ask is: "Whose interests are being served as we live through this period of widespread experimentation?" I think it would be hard to argue that, overall, the Web's capacity to help us educate, inform, connect and entertain ourselves is a bad thing.
But if digital tech is increasingly the space in which we do all those things, then how that reality is shaped – be it through algorithms, design or the availability of information – becomes crucial to interrogate. After all, Facebook playing with our moods is just the tip of the iceberg. Almost everything we encounter online is crafted to pull or push us in one direction or another, and you almost get the sense that it's because these companies are fighting to reshape reality in their own image – so that when you think about your social life, you'll automatically think "Facebook." Tech companies have so rapidly become so large and essential to our lives that challenging what they do has already started to feel impossible.
Yet we should never forget just how rapidly the digital world moves. Though Facebook, Twitter and Google seem to give us ever more reason for pause and ever less room for escape, some once seemingly immovable digital entities – Microsoft, Yahoo!, Sony – have been overshadowed by more nimble, responsive competitors. And as the reasons to be wary of the manipulations of digital life keep adding up, maybe change will emerge in the form of a competitor that understands what we want: to be connected and informed – and not to be the guinea pigs of those who wish to experiment with our lives.