A Canadian who became a vaccination advocate after the death of her son says she has been subjected to hundreds of online attacks in recent days from an anti-vaccination Facebook group, highlighting what she says is the social-media giant’s failure to curb false and dangerous information about immunization.
Jill Promoli started the advocacy group For Jude, For Everyone after her two-year-old son Jude died as a result of the flu in 2016. Last week, Ms. Promoli created a post on the group’s Facebook page to mark the start of National Immunization Awareness Week. In the post, she explained that her son received a flu shot, but failed to develop immunity and died after contracting the virus. She wrote that if more people were vaccinated, it would reduce the incidence and spread of such illnesses. The post was shared on an anti-vaccination Facebook group, and the For Jude, For Everyone page was inundated with posts that blamed Ms. Promoli for the death of her child and claimed that she was lying about the cause of his death.
“It was very aggressive,” said Ms. Promoli, who lives in Mississauga. “It became sort of a mob mentality.”
The case shows that groups can still spread false messages and attack vaccine advocates despite efforts by social-media companies to shut them down. Vaccination opponents have even used Facebook’s mechanism for reporting objectionable material to have Ms. Promoli’s page blocked.
The attacks seemed to come from members of a closed anti-vaccination Facebook group called Vaccine Education Network: Natural Health Anti-Vaxx Community. The group has a public page and a closed group that only members can view. A member of the closed group told Ms. Promoli her Facebook post was shared on that page and to expect negative comments. The administrator of the public group did not respond to a request for comment.
Ms. Promoli said she likes to promote conversations about vaccines, but felt compelled to remove many of the comments because of their misleading and hurtful content. She said that only seemed to embolden members of the anti-vaccination group, who continued to write on her page.
Ms. Promoli said this is the latest of several co-ordinated attacks from anti-vaccination groups, and that Facebook failed to take action each time. In fact, Facebook has blocked her advocacy page and her business page several times after anti-vaccination users reported them, saying they contained malicious content. Ms. Promoli said she has learned it is common for large numbers of anti-vaccination users to report pro-vaccination pages to get them blocked or removed. She added that after this happens, it is difficult to get the pages reactivated.
Ms. Promoli said she got her pages unblocked only after contacting a friend of a friend who worked at Facebook, and that many other pro-vaccination users have to start from scratch. She said this happens because no one at Facebook appears to be paying close attention.
“It’s difficult to get anybody’s attention there,” Ms. Promoli said.
In response to questions about Ms. Promoli’s situation, a Facebook spokesman said in an e-mail that the company has community standards to help deal with bullying and harassment. He said Facebook may remove offensive content that appears to target people, and that users can report such behaviour.
Tim Caulfield, Canada Research Chair in Health Law and Policy at the University of Alberta, said Ms. Promoli’s recent experience highlights the uphill battle public health officials face in trying to stop vaccine misinformation.
“It demonstrates how mobilized this community is and the degree to which this community will go to get their misinformation across,” he said.
Mr. Caulfield said there are no easy answers, as platforms such as Facebook were designed to allow open conversations. But the anti-vaccine rhetoric and its negative effects mean action is necessary to address the public health risk, he said. Earlier this year, an outbreak of measles in Vancouver was linked to a family who wrongly believed vaccines were dangerous. In New York State, an anti-vaccination group created a handbook promoting false information about vaccines and aimed it at the Orthodox Jewish community, which is facing a major measles outbreak.
Theresa Tam, Canada’s Chief Public Health Officer, said in an interview that she’s exploring strategies with social-media companies to address the spread of false information online. Dr. Tam, who participated in a panel last week with Facebook’s head of public policy for Canada, said part of the issue is striking a balance between allowing people to express their opinions and allowing them to promote misinformation.
Earlier this year, Facebook announced it was taking steps to combat anti-vaccine information, including reducing the ranking of pages that spread false information and rejecting ads that include false messages about vaccination.
“I’m challenging the internet companies to come up with different ideas,” Dr. Tam said. “I don’t think the conversation is over. They’re just beginning to see what they can do.”