Facebook is making it easier for users who express suicidal thoughts to get help. (Paul Sakuma)

How much does a Facebook status update reveal about our mental state?

It's a question many are asking after the popular social networking site moved to allow users to report content suggesting someone is contemplating suicide. The person posting the content is then sent an e-mail that includes a link to a live chat with a trained counsellor.

The decision raises the issue of whether individuals have a responsibility to patrol comments posted by friends and acquaintances, and more important, how to judge circumstances that may warrant intervention.

Should you alert officials if a co-worker's status update expresses frustration with life after an especially hard week? Or when a friend writes about feeling depressed for no reason?

Like many matters related to health, there are no black and white answers. And while experts support the ultimate goal of preventing suicide and connecting people with help, some doubt that the new measures will have the desired impact.

The major flaw in this type of program is that it is based on inferences, said Darcy Santor, professor of clinical psychology at the University of Ottawa, who studies adolescent mental health. The system requires people to make sensitive judgments about the mental state of others and is entirely subjective, which could lead to any number of unintended consequences, he said.

"[From]a public health perspective, it is very unlikely that this patchwork approach to a very serious problem is going to yield the kind of benefits that they would like," Dr. Santor said.

People who are putting themselves out there on Facebook know they have an audience, which could mean the content of their statements is exaggerated or not truly reflective of how they feel.

And while an individual may think he or she is simply trying to help by flagging a Facebook contact they believe may be suicidal, the consequences of making the wrong decision could be significant, Dr. Santor warned. In addition to how the individual in question may feel after being wrongly singled out as potentially suicidal, there is the possibility that the story will become the subject of gossip in a circle of friends, he said. False identification of suicidal thoughts or behaviour could also raise legal liability issues.

"This can start to damage peoples' reputations," Dr. Santor said. "Do we really want to put average people in a role of making a very complex, difficult decision that should frankly be left to individuals with training?"

At the same time, many people who are actually struggling with serious mental health issues and even suicidal thoughts may feel too much shame to express their thoughts to anyone, let alone online.

Wendy Craig, professor of psychology at Queen's University in Kingston and one of the country's leading experts on childhood bullying, said it's positive to see a major company such as Facebook taking steps to bring the issue of suicide out in the open.

But she questions the impact the initiative will have for individuals. It's one thing to receive a link to start an online chat with a crisis counsellor and another to have access to local supports and resources that can provide help over the long term, Prof. Craig said.

"I think that we have to be ethically careful in a sense to be sure we're not just going to support these kids in the moment and let it be," she said. "You need to connect them into the resources so they can have ongoing support."

Rather than relying on a Facebook gimmick, Dr. Santor stressed the importance of simply reaching out to people who may be in need. If an individual expresses frustration with life, he suggests talking to them about how they're doing and urging them to get help. If someone talks about ending his or her life, offer support, accompany them to meetings with a mental health professional or make sure they get to an emergency room, he said.

"This is not really rocket science," Dr. Santor said. "It's just, do the humane thing."

How it works

Using the new Facebook option to report suicidal content is relatively simple:

Click on "report story or spam" – found in a tab that appears next to items in the newsfeed.

A prompt will then appear asking whether you want to file a report about the content.

If you do, a box will appear. Choose the option "violence or harmful behaviour."

A dropdown menu will appear with options, including suicidal content.

You can also make a report at any time by visiting Facebook's "Report Suicidal Content" page.
