Illustration by Nina Martinez

Maria Ressa is CEO, co-founder and president of Rappler, the Philippines’ top digital news site, and was the co-recipient of the 2021 Nobel Peace Prize for her work defending freedom of expression and democracy. She is the author of How to Stand Up to a Dictator: The Fight for Our Future, from which this essay has been adapted.

It was August, 2016. The Duterte administration had just come to power. Local officials along with law enforcement and military personnel from Rodrigo Duterte’s tenure as mayor of Davao were taking the top jobs in the capital. More worrying were the daily reports of deaths: bodies found in the streets of poor neighbourhoods, eyewitnesses whispering about killers descending upon homes in the night. Mr. Duterte’s drug war had begun turning Manila into a real-world Gotham City, without a caped crusader.

Rappler had one reporter and a production team assigned to work the overnight shift. They soon began reporting as many as eight dead bodies every night. The murders were gruesome: some victims hog-tied, their heads wrapped in duct tape, a cardboard sign placed on top reading “Drug dealer, huwag tularan”: don’t become like me.

I was relieved to be in the Singapore offices of Facebook, a world away from the violence. The purpose of my visit was to deliver a warning to our Facebook Asia-based partners, the people with whom I co-ordinated many of Rappler’s partnerships. Among them were Ken Teh, who among other things handled news groups in the Philippines from the company’s Singapore office; Clare Wareing, who headed Asia-Pacific policy communications; and Elizabeth Hernandez, who was in charge of public policy for the Asia-Pacific region.

Rappler’s relationship with Facebook had begun on a promising note. Ken Teh had contacted me in early 2015, tasked with building partnerships with news groups in the Philippines. Rappler was a logical choice given that we fused online journalism with social network theory. By the time Facebook was staffing up in Southeast Asia, the United Nations-based World Summit Award had already chosen Rappler as one of 40 “best and most innovative digital innovations.” Facebook had even showcased Rappler at F8, its annual conference for developers, in 2016.

When Facebook opened its first office in the Philippines that year, it released startling statistics: that Filipinos spent 1.7 times more time on Facebook and Instagram than watching TV. Filipinos had 60 per cent more Facebook friends than the global average, and they sent 30 per cent more messages than the global average. Sixty-five per cent of Filipinos accessed Facebook every day.

Part of why Rappler had started outperforming legacy news organizations so quickly was our use of Facebook. We had embraced the platform early and knew its performance in the Philippines better than Facebook itself did, often surprising its executives with what we had discovered in our daily data-monitoring operations. I even secretly entertained the idea of actually working for Facebook. It struck me that, like CNN for my generation, Facebook was determining the flow of information for this one.

After choosing my lunch from the lavish buffet spread, I followed Ken, Clare and Elizabeth to a long table and sat down to eat. “What we found is really alarming,” I proceeded to tell them. “I’ve never seen anything like this, but it’s clear how dangerous this can be.”


At Rappler, Maria Ressa and her team initially embraced Facebook until they saw its effect on political polarization in the Philippines. Michelle Siu/The Globe and Mail

What I was there to convey had a history.

The Philippines, the only former colony of the United States, boasted nearly 113 million people and an English-speaking, often college-educated labour force familiar with Western culture. That’s one reason why our country has long been a source of cheap labour for the West. In 2010, the Philippines overtook India as the world’s top hub for call centres, business process outsourcing (BPO) and shared services. More significantly, we became a prime source of internet scams, going back to the days of Hotmail and e-mail spam. Many foreign businesses experimenting in grey areas came to the Philippines because it had few or no internet regulations, and what regulations it did have, it didn’t enforce.

Our country was also where the hate factory 8chan, later 8kun, was based. Best known as a forum for violent extremists, it was later linked to QAnon: The American father and son suspected of creating it had been living on a pig farm south of Manila.

A lot of that changed after a global crackdown between 2010 and 2012, when internet security researchers and law enforcement agencies dismantled spambots and technology evolved to control them. So when those involved in that homegrown industry looked for new business opportunities, they turned to social media. Well before the 2016 presidential elections, the stage had already been set in our country for three converging trends that helped the government shamelessly consolidate power: click and account farms, information operations and the rise of political influencers in the greyer areas of the advertising industry.

As early as 2015, there were reports of account farms creating social-media phone-verified accounts, or PVAs, from the Philippines. They would become a global phenomenon. That same year, a report showed that most of Donald Trump’s Facebook likes came from outside the United States and that one in every 27 Trump followers online was from the Philippines. When the influence economy took off, some shady companies that sold Twitter likes and followers had offices in the Philippines. When Filipino politicians began to experiment with social media, many outsourced their operations to advertising and PR strategists who pulled together a spectrum of content and distribution accounts, from digital influencers to community fake account operators. They gave shape to the disparate elements already in the Philippines that operated in a grey area of law and ethics. Supply met demand, and disinformation became big business.

Uncomfortable questions came up for the Filipino advertising community, as they soon would for its counterparts in nations all over the world: How many of them were “freelancing” in that grey area? How many were working with “influencers” around the world and in emerging markets now known as creators of fake accounts and likes? How were they defining the line between influence and fraud when working with multinational clients? The design of social-media platforms encouraged all of that behaviour, so the technology platforms were having a corrupting influence on the values of our younger generation, especially those roped into working in the industry.

And what about the politicians who betrayed their commitment to the public by exploiting what had once been a marketing tool and shamelessly and insidiously manipulating the public they allegedly served?

It was all about power and money.

This evolution in the Philippines had begun in 2014, when online fans began using social media to support their stars, and political operatives discovered the potential of this kind of engagement.


Rappler's office in Manila, 2018. Rappler would face attacks from a network of Facebook accounts for its reporting on online disinformation. Dondi Tawatao/Reuters

One day, we invited a dozen kids with big digital footprints to our office. What they did became known as the AlDub phenomenon. AlDub was a play on the names of the Filipino actors Alden Richards and Maine “Yaya Dub” Mendoza, who appeared on a popular afternoon TV show about two lovers who never got to meet in person. Their fans began lobbying for the two to finally meet, to the point where their social-media followings smashed Twitter’s global record for the number of tweets about one subject.

Building fan groups helped create what were then the harmless precursors of what Facebook would later call “CIB” – co-ordinated inauthentic behaviour. They organized to artificially make hashtags trend higher, at times hijacking whatever else was trending. The groups became so large and successful that it was only a matter of time before corporate marketing seized on their tactics.

Then fandom turned into politics.

Let me show you how easily those shifts happened through the experience of a young man I’ll call Sam. He described how he had started creating pages when he was still in school, starting with an anonymous page that focused on romance. He had begun conversations by asking about people’s hottest date or their worst breakup. He had grown one of his communities to more than three million followers. He was only 15 years old when he began developing groups, tapping into what he thought were topics that appealed to Filipinos: One page was about joy, another about mental strength. About a year later, corporations began to ask him to mention their products. By the time he was 20 years old, he claimed to have at least 15 million followers across several platforms.

That was when he shifted from advertising to politics: He joined a team working for Mr. Duterte’s campaign. He claimed he had built a series of Facebook groups in different cities using their local dialects. It had started innocently enough with pieces on tourist attractions and local news. Then every now and then he would drop in crime stories. The group had started sharing one story every day at peak traffic time. Then he and his friends would write comments that connected the crime to drugs. That was, in part, how Mr. Duterte’s “drug war” became seen as something necessary in Philippine life.

That was the tactic Facebook didn’t pay attention to. Yet what we now call “astroturfing” – the fake bandwagon effect – was extremely effective.

In 2016, Rappler began tracking people who, like Sam, shifted the discourse, as well as the disinformation networks themselves. We were one of the few media organizations in the world doing so, which was another reason I had been eager to tell the Facebook Singapore team what we had found.


Filipino supporters of Ferdinand Marcos Jr. – also known as Bongbong, or by the initials BBM – wave their phones at a rally in Lipa. Eloisa Lopez/Reuters

I laid out how Rappler had charted three stages of the degradation of the online information ecosystem and political life in the Philippines. The first was the early experimentation and buildup of campaign machinery in 2014 and 2015. The second was the commercialization of a new online black ops industry. The third was the consolidation of power at the top and the spread of political polarization across the country.

Chances are that you’ve seen some version of this if you live in a democracy. These phases have been enabled by global decisions and realities far from the Philippines; more than ever before, what’s local is global, and global is local.

In the beginning, it was hard to know what was even happening. Since Rappler and I lived on social media, we felt the shifts more than we understood them. In the run-up to the 2016 election, we began seeing new distribution and messaging techniques for candidate Duterte on social-media platforms. In one instance, his supporters created a Facebook page calling for the death of a student who had asked a question critical of Mr. Duterte. When we called the campaign team, they asked their supporters “to be civil, intelligent, decent, and compassionate.” Those were early days.

In that same election, Ferdinand “Bongbong” Marcos, Jr., the son of Ferdinand Marcos, was running for vice-president. We observed over social media a distinct push to change the history of his family’s past, efforts to redefine and cleanse the Marcos family record. And we witnessed the strong presence of an “us-versus-them” worldview, which inspired anger and hate and helped polarize the electorate.

The second phase of degradation had to do with the commercialization of a new black ops industry that was capitalizing on an underground digital economy long operating in a legal grey zone. As early as 2014, before bots and fake accounts became notorious around the world, Rappler discovered information operations during the country’s telecommunication wars.

The Philippine Long Distance Telephone Company and its mobile provider, Smart, were fighting for users with Globe Telecom and its own mobile subsidiary. Smart was running a promotional campaign on Twitter and Facebook using the hashtag #SmartFreeInternet. Rappler chronicled how a combination of bots and fake accounts had suddenly shut down the entire #SmartFreeInternet online campaign: When someone used the hashtag, it would trigger a bot or fake account to automatically message them something negative.

That drew on an old strategy popularized in the computer industry in the United States in the 1990s known as “fear, uncertainty and doubt,” or FUD. The disinformation campaign spread negative information and lies to fuel fear. The conversation we mapped online reminded us of the Communist strategy “Surround the city from the countryside”: It effectively cut off Smart’s Twitter account from its targeted millennial audience. “Some corporations, interest groups, and governments are mobilizing fictitious social media resources at scale to disrupt other legitimate uses of these platforms,” we wrote in an article. “Left unchecked, practices like this could turn a platform like Twitter into a wasteland, discouraging people from participating and limiting the potential power of the crowd for good.”

And, indeed, just two years later, we saw FUD transfer to propaganda. It shouldn’t have been a surprise because the people who had experimented with it in 2014 were among the ones who turned to politics and rolled it out for Mr. Duterte in 2016.

I showed Ken, Clare and Elizabeth how we had first discovered the shift to politics: by investigating a network that was attacking Rappler and ABS-CBN, the largest TV network in the country.

First, my co-founder Chay Hofilena and her team meticulously recorded the attackers’ Facebook accounts, their “friends” accounts, and the groups the accounts belonged to on a spreadsheet. One chart compiled all 26 accounts, along with what the accounts claimed were their “facts”: where they worked, where they went to school, their jobs, where they lived. We took every column on that sheet and assigned a reporter to verify those details. Every single claim was a lie.

Those 26 accounts behaved differently from most users’: they belonged to more Facebook groups than they had actual friends. One example was the account of Mutya Bautista. Her public friends list showed that she had only 17 friends, but she was a member of more than 100 groups, including those campaigning for Ferdinand Marcos, Jr., overseas Filipino communities and buy-and-sell groups. Those groups had memberships ranging from tens of thousands to hundreds of thousands.

It took our team at least three months to manually count the reach of those individuals’ messages in those public groups. The arithmetic was stark: a single fake account posting into more than a hundred groups, each with tens of thousands of members, could reach three million to four million other accounts, proving the exponential reach of a lie. I believe Rappler was the first to quantify this.

I also showed the Singapore Facebook team how systematically those black ops players had weaponized social media by focusing their tactics according to demographics: the Philippines’ tiny upper class, the middle class and the mass base. They created content that would then be amplified through the distribution networks. Though Facebook was a key vector of distribution, the effort was across all social-media platforms.

What we were seeing was a kind of asymmetrical warfare online, except in this case it was the platforms and larger powers using the surreptitious tactics of a rebel group. Anyone who stood up to the lies spreading over pro-Duterte and pro-Marcos disinformation networks was gaslit, or told they were crazy. What the bad guys were doing, they ascribed to the good guys.

The same process was happening in other democracies around the world. In the United States, more lies were being spread among far-right and alt-right groups on Facebook, and the company had the data to prove it, but it did nothing, for fear of alienating Republicans. That meant that the public, its users – the targets of those information operations – were left completely vulnerable, with few defences available against what seemed like normal flows of information. Donald Trump flagrantly, delightedly lied all throughout his presidential campaign and into his presidency, and all of his lies took off through bottom-up social-media operations similar to those in the Philippines. Both Mr. Trump and Mr. Duterte changed what their populaces thought and how they behaved.


Donald Trump and Rodrigo Duterte chat at an ASEAN gala in Manila in 2017. Athit Perawongmetha/Reuters


I shared those discoveries with Ken, Clare and Elizabeth at lunch, urging them to provide more of their own data to verify what we had found. Where did they think this could lead?

“You have to do something,” I remember exclaiming, “because [if not] Trump could win.”

We all laughed because that didn’t seem possible, even in August, 2016.

By the end of our meeting, the others looked disturbed; I suspect it was because it was the first time they had dealt with anything like that and they didn’t know what to think. Frankly, Rappler understood the internet and data better than they did. At the very least, though, I thought Facebook would want to make a statement about our findings. Rappler was an alpha partner of Facebook, and I wanted the company to stop the insidious manipulation we were seeing so I could report what had been happening and what the company had done to stop it. I was so alarmed at that point that I thought it was more important to fix what was wrong than to just do the story.

But after our meeting, I heard nothing back.


Students light candles in Davao City on Sept. 5, 2016, in tribute to people killed in an explosion at a night market days earlier. Lean Daval Jr./Reuters

On Friday, Sept. 2, 2016, at 10 p.m., an explosion hit a night market in Davao City, Mr. Duterte’s hometown. The bombing killed more than a dozen people and injured dozens more.

The morning after the explosion, Mr. Duterte declared a nationwide “state of lawlessness.” The justification for the declaration included Mr. Duterte’s pet concern: illegal drugs. “These are extraordinary times,” Mr. Duterte said. “There is a crisis in this country involving drugs, extrajudicial killings, and there seems to be [an] environment of lawlessness, lawless violence.”

He stopped short of declaring martial law or a nationwide curfew, but he did call for a greater presence of soldiers all over the country. The government set up more checkpoints. Online, Duterte supporters began to justify the declaration. Public support was necessary: in the past, bombings like this had rarely brought such strong measures. When I sat down to breakfast that Saturday morning, I turned on my computer and was alarmed by what I saw. I immediately called our social-media head, Stacy de Jesus, and our head of research, Gemma Mendoza.

Within an hour, I called my co-founders and alerted them. I had never seen anything like this before.

A five-month-old article, “Man with Bomb Nabbed at Davao Checkpoint,” was our No. 1 story in real-time Google Analytics, and it had been trending at No. 1 for more than 24 hours. It would stay in the top 10 stories for more than 48 hours. That was the first time we became aware of a real-time, clumsily executed information operation to manipulate public opinion. Anonymous and fake accounts, meme pages, Duterte fan pages and dubious websites worked hand in hand to make it appear that our “man with bomb” story from March was a breaking-news story, seeming to justify Mr. Duterte’s declaration of a state of lawlessness. Filipinos were duped into sharing a lie.

That was how the state’s “co-ordinated, inauthentic behaviour,” as Facebook would belatedly call such operations, began in the Philippines. It was also the opening salvo in what would be open online warfare meant to tear down the public’s trust in the independent media, and specifically Rappler.

The old story had generated 32 page views (largely from Google search), but the day after the information operation began, the story catapulted to more than 105,000 page views.

To warn the public that the old story was being used to manipulate perceptions, we decided to publish a warning post on our Rappler Facebook page. “Rappler asks our community to verify sources of information, and stop sharing the dated article,” we posted on Facebook on Sunday, Sept. 4, 2016, at 6:18 p.m. “If you see it on your feed, please let others know this happened on March 25, 2016.” We also added a short editor’s note on the story page on our website, which would be the first sentence any reader would see: “This story was published on March 26, 2016.”

Months later, our Facebook post, which had warned our followers about that attempt to mislead the public, was taken down by Facebook itself. Our social-media head, Stacy, sent me Facebook’s reason for deleting our post: “This message was removed because it includes a link that goes against our community standards.”

When we complained, Facebook restored the post, but when I looked again a few months later, it had been taken down again. So we complained again. There was no response.

As of Aug. 1, 2021, the link was dead. It’s almost as though Facebook didn’t want its users to know it had ever happened.


Protesters picket Facebook's office in Manila in 2019, accusing it of inaction against fake news and hate speech. Bullit Marquez/The Associated Press


That was the beginning of my growing disillusionment with the company that had initially opened up such exciting possibilities for Rappler. Today, I’m beyond disillusioned. I believe that Facebook represents one of the gravest threats to democracies around the world, and I am amazed that we have allowed our freedoms to be taken away by technology companies’ greed. Tech sucked up our personal experiences and data, organized them with artificial intelligence, manipulated us with them, and created behaviour at a scale that brought out the worst in humanity. Harvard Business School professor emerita Shoshana Zuboff called this exploitative business model “surveillance capitalism.” We all let it happen.

Facebook today favours moneymaking over public safety. Its lobbying efforts enable it to bend and break the often lax content rules it sets for itself. It rarely prioritizes safeguards for the nearly three billion users on its platform, which brought in revenues of US$85.9-billion in 2020 and US$117.9-billion in 2021, an increase of 37 per cent.

There are three assumptions implicit in everything Facebook says and does: First, that more information is better; second, that faster information is better; third, that the bad – lies, hate speech, conspiracy theories, disinformation, targeted attacks, information operations – should be tolerated in service of Facebook’s larger goals.

The dangers of “more” and “faster” have led us to dystopia: the suffocation of our minds by junk, a loss of clarity of thought and a lack of concentration, and the empowerment of individual over collective thinking.

Lies repeated over and over become facts in this online ecosystem. As a journalist, I know that we are only as good as our last story and any error must be accounted for, fixed and publicly announced. That’s why we have correction pages. We report the facts because that creates our shared reality. And the reality is that lies, left unchecked, create and sustain flat-earthers, QAnon, Stop the Steal and a rabid anti-vax movement, to mention a handful of the most noxious conspiracy theories.

Mark Zuckerberg’s decisions to prioritize company over country and growth above all, added to a design that prioritizes lies over facts, have destroyed the information and trust ecosystem that gave birth to Facebook. When he accepts 1-per-cent disinformation on his site, it’s like saying it’s okay for 1 per cent of a population to carry a virus. Both can take over, and if not eradicated, they can ultimately kill.

Another harmful decision, one made by every social-media platform, is to grow the business through algorithms that recommend friends of friends. When we’re served friends of friends, we click more, growing our individual networks and, by extension, the platform’s.

So in 2016, after Rodrigo Duterte used Facebook to help him get elected, this “friends of friends” algorithm, along with his divisive us-against-them rhetoric, further radicalized Filipinos. If you were pro-Duterte and you were getting recommendations for posts from friends of friends, you moved farther right. If you were anti-Duterte, you moved farther left. And over time, the chasm between the two sides grew. This has been a global theme; substitute Narendra Modi, Jair Bolsonaro or Donald Trump for Mr. Duterte, and you get the point.

Algorithms serve up content that radicalizes us. If you click on a borderline conspiracy theory, the next content a platform serves you is even more radical, because it keeps you scrolling. Groups such as QAnon spread from the darkest corners of the web onto Twitter and Facebook until they were suspended and banned. It took years to get to that ban. In the meantime, what happened to the people who were swayed to believe in the conspiracy theories? What about their cognitive bias, which may lead them to see the bans as yet more evidence of a conspiracy?

Facebook is changing our behaviour, and it is using its global user database as a real-time laboratory. It changes individuals and societies. On a large scale, this kind of behavioural change is emergent behaviour, and no one can predict the organic change from the individual parts. I saw that happen slowly in the real world in Indonesia while studying the way the radical ideology of terrorism spreads. Today, online, it’s on steroids, crippling societies by destroying trust globally.

Comparisons to the lies and tactics of Big Tobacco in the 20th century are wholly justified. Facebook, and the politicians benefiting from it, know full well the harms they are unleashing on the public. Facebook is the world’s largest distributor of news, yet studies have shown that on social media, lies laced with anger and hatred spread faster and farther than facts. The very platforms that now deliver the news to you are biased against facts, biased against journalists. They are, by design, dividing us and radicalizing us – because spreading anger and hatred is better for Facebook’s business.

In the United States, the surge in extremism has become a full-blown crisis. The United Kingdom and Europe are still reeling from Brexit, the Syrian refugee crisis, and the rise of right-wing nationalism. Similar experiences have been replicated in Brazil, where social media, largely YouTube, moved Jair Bolsonaro and his supporters into the mainstream. In Hungary, Viktor Orban’s savvy promotion of anti-migrant toughness has enraptured voters. In India, the world’s largest democracy has fallen prey to the ugly Bharatiya Janata Party (BJP) machinery of Narendra Modi. Everywhere in the world, societies are being fed a steady diet of online violence that turns into real-world violence. Versions of white replacement theory are sparking mass shootings from Norway to New Zealand to the United States, powering the rise of “us against them” or, in a word, fascism.

Anger and hatred coalesce into moral outrage, which then turns into mob rule.

The world would be so different today if Mark Zuckerberg had not stuck to his ignorant, self-serving interpretation of U.S. Supreme Court Justice Louis D. Brandeis’s aphorism that the way to counter hate speech is more speech. Brandeis said those words in 1927, long before the time of Facebook, when a lie can be delivered a million times over. His formulation also works only if there is something of a level playing field, not the one the algorithms created. The company’s choices gave a bullhorn to hate speech, disinformation and conspiracy theories: emotive content that keeps you on site and scrolling, bringing in more revenue for the platform. If Facebook had taken its gatekeeping responsibilities as seriously as the journalists it took them away from, the world would be in a far better place today.

