Hala Mahdi, 16, spent her summer teaching a computer how to think like a teen. She channelled the health-related concerns of her peers – from “Why am I feeling depressed?” to “What should I do about this zit on my forehead?” Then, with guidance from doctors and other medical professionals, Mahdi and 22 fellow teen interns at the Toronto-based startup EmojiHealth fed the answers into a computer program (“Don’t pop the zit. It’ll only make it worse”).

EmojiHealth is a teen-focused spinoff of ConversationHealth, a company that develops chatbots for hospitals, pharmaceutical companies and other medical organizations. (Chatbots are powered by artificial intelligence and allow people to have conversations with computers.) The Toronto-based organization isn’t the only company that sees opportunity to make health care more interactive online, and teens, having grown up getting all their information digitally, are a prime target audience.

Last September, Canadian charity Rethink Breast Cancer launched a Facebook bot targeted at teens to battle misconceptions about breast cancer, answering questions such as “Can underwire cause breast cancer?” or “Does this lump mean I have cancer?” There are similar AI-powered chat programs in other countries for taboo teen topics such as sex-ed and drug use.

“A lot of health topics can be uncomfortable because they are so personal,” says Alexandra Philp-Reeves, the 19-year-old co-founder of ConversationHealth. “We can take away that human awkwardness by replacing the person with a bot.”

And unlike a simple Google search, EmojiHealth’s content is preapproved by doctors and written in the kind of language teens already use, making it more accessible, she adds.

While AI is still in its infancy, emerging research suggests these types of tools can be useful. In a 2017 Stanford University study, a group of 18- to 28-year-olds suffering from anxiety, stress or depression was given access to a chatbot called Woebot. The bot monitored their moods and guided them through therapy techniques. Within two weeks, the students who used the bot saw a marked decrease in their anxiety, stress and depression levels, psychologist and study co-author Alison Darcy says.

Therapy patients need to do work outside of sessions to see a meaningful mental-health difference, but a lack of self-discipline means at-home techniques often fall by the wayside, she says. Chatbots can nudge participants to continue their therapy, while also checking on their emotional state. Woebot, for example, can analyze responses to questions such as “How do you feel today?” and offer up extra coping exercises when it thinks they could be useful. The bot also offers 24/7 support, Darcy says.
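
Her description implies a simple loop: ask a check-in question, gauge the answer and offer an exercise when the mood seems low. The toy Python sketch below illustrates that pattern with a keyword heuristic; the word list, the rule and the wording are all invented for illustration and are not how Woebot actually works.

```python
# A toy illustration of a mood check-in that can trigger a coping exercise.
# Real bots such as Woebot use trained language models, not keyword counts.

NEGATIVE_WORDS = {"sad", "anxious", "stressed", "awful", "hopeless", "down"}

def mood_looks_low(answer: str) -> bool:
    """Crude heuristic: does the answer contain negative-mood words?"""
    words = set(answer.lower().replace(",", " ").split())
    return bool(words & NEGATIVE_WORDS)

def check_in(answer: str) -> str:
    """Respond to 'How do you feel today?' and offer help if mood seems low."""
    if mood_looks_low(answer):
        # Offer an extra coping exercise when the mood seems low.
        return ("Sorry to hear that. Want to try a two-minute breathing "
                "exercise together?")
    return "Glad to hear it! I'll check in again tomorrow."

print(check_in("Honestly pretty stressed and anxious today"))
```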

“When someone is in a moment of distress [at] 2 a.m., they might not have the inclination to reach out to another person,” she says. “But we know it’s good for people to talk about things [that cause them stress, anxiety and depression].” That’s where the “chat” portion of the chatbot comes in handy, she says.

Chatbots could be particularly beneficial for teens, she theorizes. First, youth are far more comfortable using apps and bots than older generations. Second, there are so few resources devoted to teens in the mental-health space that any addition would be a benefit. But the real opportunity is in scale. While a therapist can see a couple dozen teens in a week, a chatbot could – theoretically – talk to hundreds of thousands of teens at once.

Of course, chatbots aren’t necessarily going to gain that reach overnight. For example, EmojiHealth, which makes no profit off its chatbots, will have to rely on word-of-mouth and promotion from Facebook to make itself known, Philp-Reeves says. But as more messaging platforms such as WhatsApp and iMessage begin to support chatbots, and as people become aware of the services, the goal is to get onto more platforms and reach greater numbers of teens.

EmojiHealth released its chatbots last month, with the technology available on Facebook Messenger, the messaging service Kik and free websites. The company has 10 different bots for topics such as epilepsy, diabetes, sexual health, depression and dermatology. Teens talk to a chatbot and the AI program allows it to understand what’s being said – no matter how a teen phrases it (“I feel depressed,” “I’m always sad,” “Wat to do when u feel depressed”). The bot provides curated information (“It’s okay to feel depressed,” “Would you like tips to manage your depression?”) and, if the situation warrants, advice to seek medical attention or call Kids Help Phone.
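
As a rough illustration of that pipeline, the hypothetical Python sketch below maps varied phrasings onto a single intent, then returns one curated answer per intent. The patterns and responses are invented, not EmojiHealth’s actual code; a production bot would use a trained language-understanding model rather than keyword rules.

```python
# A minimal, hypothetical sketch of intent matching: different phrasings
# of the same question resolve to one intent, which maps to a single
# curated, pre-approved answer.

import re
from typing import Optional

# Invented keyword patterns that might signal each intent.
INTENT_PATTERNS = {
    "depression": [r"\bdepress", r"\balways sad\b", r"\bfeel(ing)? down\b"],
    "acne": [r"\bzit\b", r"\bpimple\b", r"\bacne\b"],
}

# One curated, pre-approved response per intent.
CURATED_RESPONSES = {
    "depression": ("It's okay to feel depressed. "
                   "Would you like tips to manage your depression?"),
    "acne": "Don't pop the zit. It'll only make it worse.",
}

def classify(message: str) -> Optional[str]:
    """Return the first intent whose patterns match the message, if any."""
    text = message.lower()
    for intent, patterns in INTENT_PATTERNS.items():
        if any(re.search(p, text) for p in patterns):
            return intent
    return None

def reply(message: str) -> str:
    intent = classify(message)
    if intent is None:
        return "I'm not sure I understood. Can you say that another way?"
    return CURATED_RESPONSES[intent]

# "I feel depressed", "I'm always sad" and "Wat to do when u feel depressed"
# all resolve to the same intent, and so to the same curated answer.
print(reply("Wat to do when u feel depressed"))
```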

But despite the potential usefulness, sharing health information with a chatbot can be problematic because the technology is moving faster than the law can keep up with it, warns Frank Rudzicz, a University of Toronto professor and researcher at University Health Network who specializes in machine learning and health care.

“Data is valuable, and companies are particularly interested in teenagers,” Rudzicz says. “The way technology is progressing is a little like the Wild West. Companies [that develop bots] don’t necessarily know what the legislation or regulations are; they are evolving and it’s very easy to circumvent them.”

Beyond data privacy, there are concerns over how an app should escalate issues. There are processes in place for doctors if they see signs that a kid may hurt themselves or others. But what should a chatbot do?

“Should the app reach out to a loved one?” Rudzicz asks. “AI is still new and could make incorrect decisions – we don’t want the app to alert a parent saying [the teen] has depression when they don’t.”

EmojiHealth is taking a slow-and-steady approach, Philp-Reeves says. The bot, for example, is programmed to identify certain high-risk language, but it isn’t capable of intervening. Instead, it suggests that users seek further medical or mental-health help, while also flagging the conversation for the EmojiHealth team.

“[The bot is] not a person, we’re not comfortable with that liability,” she says. “We know we’re not [always] the tool for the teen to be using, and that’s when we direct them to other resources.”
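
In code, that “flag, don’t intervene” policy might look something like the hypothetical sketch below: detect risk language, point to human resources and notify a review queue, without the bot taking any action itself. The phrase list, the messages and the review hook are all invented for illustration.

```python
# A hypothetical sketch of "flag, don't intervene": spot high-risk
# phrases, direct the user to human help and notify a review queue.

HIGH_RISK_PHRASES = ("hurt myself", "end it all", "kill myself")

def flag_for_review(message: str) -> None:
    """Stand-in for alerting a human team; a real system would log securely."""
    print(f"[flagged for human review] {message!r}")

def handle_message(message: str) -> str:
    """Respond, escalating to human resources when risk language appears."""
    if any(phrase in message.lower() for phrase in HIGH_RISK_PHRASES):
        flag_for_review(message)
        # No direct intervention: the bot points the teen to people who
        # can help, such as a doctor or Kids Help Phone.
        return ("That sounds really hard. Please talk to a doctor, or "
                "reach out to Kids Help Phone.")
    return "I'm here for health questions. What's on your mind?"

print(handle_message("sometimes i want to hurt myself"))
```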
