
For the first time, Facebook admits to dragging its feet in response to Russian hacking reports

A Facebook logo is seen at the Facebook Gather conference in Brussels. The company admitted it had not acted quickly enough to address concerns that Russian-backed organizations were using Facebook to interfere in the 2016 U.S. presidential election.

YVES HERMAN/REUTERS

Facebook Inc. is offering its first public acknowledgments that its technology was used to undermine democracy, a move that is seen as an effort to placate regulators in the United States and Europe, who continue to turn up the heat on the social-media giant.

In a series of blog posts this week, the company admitted it had not acted quickly enough to address concerns that Russian-backed organizations were using Facebook to interfere in the 2016 U.S. presidential election. "In 2016, we at Facebook were far too slow to recognize how bad actors were abusing our platform," wrote Samidh Chakrabarti, Facebook's product manager for civic engagement, in a blog post on Monday.

He outlined several efforts the company is making to "neutralize" the risks, but warned Facebook couldn't promise its technology would never be manipulated to undermine democracy. "I wish I could guarantee that the positives are destined to outweigh the negatives, but I can't," he wrote. "That's why we have a moral duty to understand how these technologies are being used and what can be done to make communities like Facebook as representative, civil and trustworthy as possible."


During a trip to Europe, during which regulators have pressed social-media giants to crack down on hate speech, Facebook chief operating officer Sheryl Sandberg pledged this week to improve privacy tools to give Facebook users more control over their data and to hire 20,000 people this year to screen Facebook for harmful content.

The changes are in response to data-privacy regulations set to come into force across the European Union in May that will require companies to be more transparent with the data they collect on customers or face stiff penalties.

"We know that tech companies need to do better and that we at Facebook need to do better. We have a lot to improve," Ms. Sandberg told a Brussels audience, which included EU lawmakers, on Tuesday. "We have not done enough to stop abuse of our technology."

The public admissions come days after Facebook chief executive Mark Zuckerberg announced a major overhaul of the company's news feed to prioritize content from friends and family over advertisers and third-party content providers. The change has largely been interpreted as Facebook's attempt to deal with a burgeoning crisis over the role social-media companies have played in fuelling political discord and harming public health with addictive technologies. The company also said it would survey users on the news sources they considered to be the most trustworthy.

Yet, in a sign that U.S. lawmakers are still unsatisfied with Silicon Valley's response to its political problems, two high-ranking California politicians this week urged Facebook and Twitter to investigate reports that Russian accounts were behind an online campaign to press the U.S. House intelligence committee to release a classified memo outlining what some lawmakers believe is an anti-Republican bias inside the FBI and Department of Justice.

In a letter this week, Senator Dianne Feinstein and Representative Adam Schiff – both senior members of Congressional committees investigating allegations of collusion between Russia and President Donald Trump's campaign – urged the two social-media companies to conduct a forensic investigation into whether Russian bots, or fake accounts, were continuing to spread divisive political content over social media.

"If these reports are accurate, we are witnessing an ongoing attack by the Russian government through Kremlin-linked social media actors directly acting to intervene and influence our democratic process," the lawmakers wrote.

The announcements from Facebook are early signs that the social-media firm is "acknowledging that they're a new type of media company" and working to take a more active role as a gatekeeper of the information that flows across its site, said Nicholas Grossman, a professor of international relations at the University of Illinois. Still, he expects Facebook will need to do more to police its platform, such as enlisting a panel of media and political experts to vet trusted news sources, if it hopes to avoid increased scrutiny from regulators. "I wouldn't be at all surprised that if they continue in this direction of having an influence on politics without taking that influence seriously that governments, especially in Europe, are more likely to look to regulate them more."


So far, Facebook has offered a series of statements and proposed changes, but has fallen short of outlining a comprehensive plan on how to address problems such as fake news or political interference, said Michael Connor, executive director at Open MIC, a non-profit group that has helped investors file shareholder resolutions against tech firms.

"If you carefully examine what they're saying and how they're saying it, you realize that they are very cleverly setting no real targets and making no real commitments," he said. "All they are saying is that they hope to do better."

During a panel on trust issues facing tech companies at the World Economic Forum in Davos, Switzerland, this week, Salesforce CEO Marc Benioff called on regulators to get more aggressive in holding tech leaders to account, comparing technology firms with other industries that require government intervention, such as the tobacco industry. "The signs are pointing toward more regulation, because when the CEOs won't take responsibility, then I think you have no choice but for the government to come in," he said.

Calls from within Silicon Valley for regulators to force tech firms to address their trust problem mirror a steep decline in public trust in social media. In a new global survey gauging trust in government, corporations and the media, communications firm Edelman found a large drop in the past year in public faith in social-media platforms, driven in part by people who reported they felt less informed than they had in the past and had trouble separating fake news from legitimate media sources.

"There are some upsides and downsides with these technologies, but there's enough there where people are beginning to question, 'I may not be getting information the same way I used to,'" said Steve Rubel, Edelman's chief content strategist. "And they are in part blaming the [social-media] platforms for that."
