
Mark Zuckerberg strode onto the stage of Facebook’s developers’ conference this week to signal the end of his months-long apology tour. Facebook was committed to “keep building,” the CEO said, inspired by a question he had recently asked himself: “What is that basic idea that the world would lose if Facebook went away?”

But inside the cavernous downtown San Jose convention centre, developer Thomas Eliot was asking himself a very different question: Is Facebook the right place for his small Berkeley, Calif., video game company to build its business?

He had just come from a session in which Facebook employees had walked through all the features the social media giant was rolling out for game developers such as himself. Many of them seemed to be about hooking users to come back to play games on the platform again and again.


“It could be like a cigarette smoker taking a cigarette every 15 minutes,” Mr. Eliot said. “Gambling is such an enormous market, and Facebook opens those potentially addictive behaviours up to another very large segment of the world’s population.”

Such concerns were not uncommon among developers at this year’s conference. At an event that had attracted 5,000 technologists to learn how to use social media to change the world, many were starting to wonder exactly what ethical tradeoffs that might entail.

Were the hyper-targeted ads they purchased on Facebook, Google and other major tech platforms an invasion of customers’ privacy? How much of their users’ personal information did they really need to access? Were they predominantly engineers building startups destined for astronomical growth, or were they also consumers who deserved protections? And if the answer was both, where should they draw the line?

The tech industry has been buffeted by a seemingly endless series of ethical scandals in recent months – from last year’s toxic workplace drama at Uber to a growing backlash against smartphone addiction to Facebook’s data privacy woes.

Yet, there are signs that Silicon Valley is undergoing a profound cultural and moral shift. Tech workers here now invoke Facebook’s one-time slogan “move fast and break things” with a sense of irony. Long the valley’s rallying cry for breathless innovation, many now see it as a warning for an industry whose obsessive focus on growth has led to a callous disregard for ethical concerns.

Under the harsh glare of the public spotlight, it seems, Silicon Valley may finally be finding its conscience.

For some, the realization that major tech companies are often making the same profit-driven moral tradeoffs as many of the businesses they’re trying to disrupt has been dispiriting.


Last month, Facebook designer Westin Lohne announced on Twitter that he had quit. “Morally, it was extremely difficult to continue working there as a product designer,” he wrote.

Others see the newfound self-awareness as a positive sign. “I think it is quite an important moment for computing,” said Mehran Sahami, a Stanford University computer science professor and former senior research scientist at Google. “We’ve gotten to that moment where people realize we need to think more broadly about what technology means for our society, as opposed to just what is the technical problem that can be solved with technology.”

Facebook has borne the brunt of the public backlash against tech’s ethical lapses after revelations this year that political consultancy Cambridge Analytica had improperly accessed data on as many as 87 million Facebook users. Some suspect the controversy has dealt a devastating blow to employee morale. “I’m really worried about this,” Adam Mosseri, head of the company’s news feed, wrote on Twitter. “I worry it’ll make it much more difficult to step up to the challenges we face.”

A company spokesperson acknowledged that Facebook has been going through a challenging time and that employees were actively discussing the fallout on internal forums, but said the company has not rolled out any specific efforts to boost morale.

So far, Facebook says, it hasn’t seen an increase in employee departures or a drop in new job applicants. Most employees are busy working on the numerous changes the company has pushed out since the Cambridge Analytica news broke in March. “These things would not be rolling out if people were jumping ship,” the spokesperson said.

But Facebook’s reputation has been dealt a blow, even within the computer engineering schools where students have long fought for coveted internships at the most prestigious tech firms.


Sam Resendez, a sophomore in computer science at the University of Southern California who is set to start an internship at Facebook in June, noticed a sudden, dramatic change in attitude toward the social media giant among his friends and classmates after the data-privacy controversy erupted.

“It has been very much a bombshell inside the tech industry,” he said. “Facebook was originally a very reputable company to go work at, and it certainly still is. But now the discussion has changed from this very prestigious technology company to work for to this shady data-stealing company. It’s been very, very drastic and it happened very quickly.”

Mr. Resendez, 19, signed a contract for the summer internship last year, excited at the prospect of working at a major tech company known for its strong engineering culture. He’s still looking forward to his internship, but the scandal has changed his perspective.

“At least initially, going in, you see all these companies pride themselves on being very socially positive,” he said. “I guess it was a bit of a reality check that there is all this ruthless pragmatism required to run a company.”

App developers, who have been at the centre of Facebook’s data-privacy scandal, are similarly questioning their relationships with the tech behemoths that control access to much of the online world.

When Albie Brown launched Spotter, an app he describes as the “Airbnb of driveways,” in his hometown of Providence, R.I., he used Facebook’s ad tools to micro-target Spotter to people he thought would be early adopters: students with disposable incomes who had recently bought cars and lived in a specific neighbourhood of the city.

The day his promotion launched, a friend who happened to fit his target demographic perfectly told him the ad had popped up on his Facebook profile. “Before then, it had all felt pretty theoretical, that these ads would just go out to this whole world out there,” Mr. Brown said. “It was just one of those clear moments where I thought: Okay, this is kind of creepy. This is a little bit more than I signed up for.”

Olivia Rae Brown sits for a photograph at Stanford University on April 27, 2018. (David Paul Morris/The Globe and Mail)

Olivia Brown experienced a similar revelation. As a teenager building apps for the iPhone, she would get frustrated when users would deny permission to access the features on their phones necessary for her program to run, then leave her a bad review because her app didn’t work.

It sent her digging into Apple’s data-privacy policies to understand why a company so friendly to app developers would also allow users to limit their products.

“After that, every time an app would ask me for permission to use my camera, I started asking: Why?” she said. “It was just this perfect combination where, both as a consumer and as an engineer, I was seeing what the impact of ethics in tech could be.”

Now a freshman studying computer science at Stanford, Ms. Brown, 19, has already ruled out a career at some of the region’s top tech firms. “There are some companies that I could never see myself working for at this point, like Facebook or Google, that in my mind are companies I barely want to use as a consumer any more.”

The Cambridge Analytica controversy served as a particularly incendiary flashpoint for public debate of the ethics of tech. But many in Silicon Valley say opinions of the industry’s ethical responsibilities have been gradually shifting over the past several years.

Computer science lecturer Cynthia Lee began noticing a change in her students at Stanford about three years ago, as broader protest movements over inclusion and diversity on campus swept through elite colleges and universities across the U.S. “At the same time, that’s when I saw our students on campus really starting to question whether tech was headed in the right direction,” she said.

Students have been a leading indicator of the industry’s changing priorities, she says, raising issues such as sexism and consumer privacy long before they hit the mainstream. “It’s a very stereotypically millennial thing. They want to work on something that matters and that is contributing positively,” Dr. Lee said. “But where it’s not merely a stereotypically millennial thing – and was very prescient on their part – was realizing that tech as currently constituted was not necessarily going to do that.”

Earlier this year, a group of students formed Stanford Students Against Addictive Devices and protested outside Apple’s headquarters in Cupertino, Calif. Among their demands was for the company to develop an “essential mode” for phones that would allow users to limit the devices to just calls, texts and taking photos.

The 2016 U.S. presidential election, and revelations that social media firms had helped elect a right-wing populist to the Oval Office, was another pivotal moment that sparked a rethink among technologists in the deeply liberal San Francisco Bay Area.

“Trump’s election was a bit of a wake-up call,” said David Judd, a member of the Tech Workers Coalition, a group that supports labour organizing efforts in the tech industry. “It wasn’t necessarily a bunch of people suddenly realizing that there are risks to the kind of data that’s being collected, but rather a bunch of people realizing that a bunch of other people also had that thought,” said Mr. Judd, who is on parental leave from his job at payment processing firm Stripe.

Adding to an overall atmosphere of disillusionment is the realization that careers in the tech industry can come with the same stress and downsides as more traditionally lucrative, white-collar industries such as finance.

Nicolle Zapien photographed in San Francisco on April 25, 2018. (David Paul Morris/The Globe and Mail)

Nicolle Zapien is dean of professional psychology and health at the California Institute of Integral Studies and a practising psychotherapist in San Francisco. Many of her clients affiliated with the tech industry come to her burned out from long hours trying to build the next billion-dollar startup, grappling with addiction and other mental-health issues – aggravated by the products they have helped create – or struggling to get a date in a world where so much communication now happens through apps.

“They’re coming into my office – millionaires, multimillionaires, 30 years old. They’re incredibly bright and they’re saying, ‘I’ve launched three startups, I’ve been part of Google and I don’t have any relationships. I’ve been working for the last decade, 16-hour days, and I don’t know how to connect with people. I have everything and I have nothing.’ ”

Technologists say that, despite their reputations as ambitious meritocracies, the valley’s most prestigious tech firms often have their share of bureaucracy, office politics and make-work projects meant to help departments hit performance targets rather than change the world.

“I think there’s been this realization that a lot of times, when it comes down to it, the average engineer at Facebook or Apple or any other company isn’t going to solve the world’s problems,” Ms. Brown said.

What’s more, the promise of huge financial windfalls and high-profile jobs building innovative technology, which lured engineers to companies such as Google, Amazon or Facebook back when they were just startups, is much harder to come by at what are now large, well-established firms.

“A lot of people in the industry were kind of promised something that a lot of other industries don’t quite promise, which is the idea of positive impact and positive change,” said Stephen Cognetta, a former product manager at Google. “It’s not an untrue narrative, but it’s a selective narrative. A lot of people are realizing that and feeling a little disillusioned with that idea and feeling like the tech industry isn’t what they had thought.”

While at Princeton University and later working on the team in charge of the Google Doodle, the ever-changing graphic that appears on the Google search engine, Mr. Cognetta volunteered on a suicide hotline in his spare time. He saw a wide gulf between the mental-health professionals who were deeply suspicious of technology’s impacts on society and the computer programmers who spent little time thinking about the mental-health consequences of the products they were building.

So Mr. Cognetta left Google in the spring of last year to pursue his interest in mental health. Earlier this year, he teamed up with Dr. Zapien’s institute to hold a hackathon aimed at developing new technologies to improve emotional well-being. The event brought together more than 300 technologists and mental-health professionals.

Part of the focus was to disrupt the hard-driving hackathon culture itself. Hackers swapped the typical all-night coding sessions and Red Bull for yoga breaks and guided meditation.

The event also made Mr. Cognetta more aware of the ways that technology was fuelling some of the mental-health problems that he was now trying to solve with technology.

“A lot of us in this world know that this is a problem. We’ve been told for many years that technology can do things like isolate us,” he said. “But it’s hard when you’re working in a company where users are represented by numbers on a dashboard to really identify with these sorts of issues.”

In June, he and Dr. Zapien will be part of a group hosting a “Reverse Hackathon” aimed at re-engineering a piece of existing technology to make it more ethical and socially responsible. The event has attracted high-profile sponsors, including Google and venture capital firm Greylock Partners.

They are not alone. As the public backlash against the ethical downsides of technology has gained momentum, more technologists are turning their efforts toward trying to fix those problems. Recently, a group that includes former Google design ethicist Tristan Harris and early Facebook investor Roger McNamee created the Center for Humane Technology to push for changes to addictive technologies they warn have “hijacked our minds.”

Computer science schools are also beginning to respond to the backlash by revamping curricula to more deeply incorporate lessons on the ethical consequences of technology. Stanford is working on a “significant reimagining” of an existing course on computer science ethics and public policy in order to bridge the divide between technology and the humanities, said Prof. Sahami of Stanford.

The revised course, which is expected to launch in January, will incorporate experts from other academic fields at the school, such as public policy, social science and ethical theory. The goal is to create a course for hundreds of students, teaching computer scientists about the ethical underpinnings of regulation while teaching humanities students how to use the technology they may one day be regulating.

“Ultimately, if we want to get good solutions to these things, they’re going to need to involve people with lots of different skills,” Prof. Sahami said. “It’s not just going to be solved by one faction of people.”

Such changes are the silver lining to what has been a dark year for Silicon Valley, holding out the promise that an industry that vowed to change the world has the ability to change itself.

As a woman working in a male-dominated field for the past 30 years, Dr. Lee long fought against the stubbornly held notion that computer science was an open-minded meritocracy that could do no wrong. The spotlight now being shone on the downsides of tech is a welcome change, she says.

“What’s been demoralizing to me is all those years of being frustrated that faculty, businesses and other power structures within the field that I loved didn’t seem to be taking this seriously,” she said. “It’s been a very energizing year to finally be having this conversation.”
