Rob Wipond is an investigative journalist and author of Your Consent Is Not Required: The Rise in Psychiatric Detentions, Forced Treatment, and Abusive Guardianships.
A teen recently called a self-described “anonymous and confidential” crisis hotline to talk about his feelings – then, minutes after the call ended, police arrived at his home, handcuffed him in front of his confused and horrified parents, and took him to a psychiatric hospital.
The hotline call responder had decided the boy might be at risk of killing himself, and covertly contacted 911 to trace his mobile phone. At the hospital, the boy’s belongings were confiscated, he was ordered to strip naked for bodily inspection, and – now sobbing uncontrollably – he was forcibly tranquillized. “It was a living hell,” the boy told me. “I felt like my world was ending, and everyone was making it worse.”
He then broke off communication – afraid that I, too, might breach my promise of confidentiality.
Many crisis and suicide hotlines have practised this kind of call tracing for decades, while making efforts to keep it secret. Technology has made call and text tracing easier, and complaints have become more visible on social media, where unwitting callers and texters describe feelings of betrayal and the devastating impact of police appearing at their homes, workplaces or schools and hauling them off for psychiatric evaluations – and, sometimes, prolonged hospitalizations and involuntary treatment. Many say they’ll never feel safe reaching out for help again.
What’s more, there’s no clear evidence that forcibly hospitalizing someone helps more than harms. Studies show even expert predictions of suicide barely beat random chance. Worse – perhaps because psychiatric hospitals tend to be depressing places – a meta-analysis found that, in the first three months after hospitalization, patients’ suicide rate was approximately 100 times the global average, and 200 times the global average for those who’d been admitted with suicidal thoughts.
Disturbingly, these kinds of incidents will likely become more common in Canada with the launch of the national 988 hotline. Starting Nov. 30, Canadians will be able to text or call the three-digit number to access free mental-health and suicide-prevention support. That system, overseen by Toronto’s Centre for Addiction and Mental Health and modelled after the U.S. 988Lifeline initiative, will employ tracing by covertly connecting to 911.
So it’s time to ask: Should government ban such non-consensual call and text tracing – or at least ban crisis hotlines’ false advertising about confidentiality?
Publicly, hotline operators often imply that they only trace the calls of people actively attempting suicide. In fact, most call-tracing policies, including what’s planned for Canada’s 988, apply to a much broader spectrum of people deemed to be at “imminent risk.”
Imminent risk refers to predicting harms that might emerge hours or days later. It includes suicidal feelings, and other distress or behaviours that could lead to harms. If a caller has a plan and the means to kill themselves, then they’re at imminent risk and might receive an “emergency intervention” from police – with or without the caller’s knowledge or consent.
But how many people with suicidal feelings have neither a plan nor access to any means of killing themselves? More problematically still, the (often-volunteer) call responders are trained to ask – without disclosing that it’s an assessment question – “If you were going to kill yourself, how would you?” Having an answer raises your risk score dramatically.
And is tracing based on imminent risk actually legal? That question has yet to be tested by privacy commissioners or the courts. Some privacy policies – such as that of the Crisis Centre of BC (which is joining the 988 system) – state that personal information is shared without consent “only as authorized by law.” A spokesperson for British Columbia’s privacy commissioner clarified that any breaching of privacy without consent must meet a high threshold of protecting health or safety, and be “clearly in the interests of the individual.”
Dr. Crawford argued the practice meets that threshold, saying, “We’re just trying to keep the person safe in that moment.” Yet minutes from internal U.S. 988Lifeline meetings show widespread recognition that police visits and forced hospitalizations can be “traumatizing” for people in emotional distress and create “dangers of brutalization, violence, and criminalization.” While Dr. Crawford said she has heard such concerns, she nevertheless believes that “emergency intervention is a necessary part of the service.”
Different approaches are possible. Call and text tracing could be abolished – after all, some hotlines never do it. Or the policy could be narrowed to truly apply only to those actively engaged in killing themselves. Or, the first time someone contacts 988, the policy could be openly discussed.
But for Canada’s 988, “confidential” is for promotional purposes only. Many call centres and crisis texting services also collect contents of conversations and share them with “third parties” for “research” and “service improvement.” Sometimes, those conversations are even used for profit, as was the case for Crisis Text Line (also part of Canada’s 988), which shared data with its for-profit AI spinoff until that practice was reported by Politico last year. And it’s highly debatable whether the often-automated processes by which these data are “anonymized and aggregated” before sharing would detect and purge all potentially revealing details about people’s personal fears, family conflicts or workplace frustrations.
So, for those who truly value confidential conversations, we may have only one choice: We must stop advising people to call crisis hotlines, and instead make ourselves more present to listen to and support each other – in private.