Sheema Khan is the author of Of Hockey and Hijab: Reflections of a Canadian Muslim Woman.
The warning signs had been there all along. An assault on a 15-year-old boy; death threats against the man’s own parents; a police safety bulletin warning of his gun stash and his desire to kill a cop; violent attacks against his spouse; a weapons complaint to the RCMP; neighbours and relatives fearful of his sociopathic behaviour; rampant alcoholism. As an in-depth Globe feature reported, the nation’s worst mass shooter “was the kind of man who made people nervous, bragged about knowing how to dispose of bodies and built miniature coffins as a hobby.”
As we wait for the launch of a public inquiry, there are so many questions about the horrific events in Nova Scotia. Foremost: given all the warning signs, how is it that a man with known violent tendencies was never detained? And what can we learn from the past so that such a tragedy is not repeated?
This last question is now being addressed from a very different perspective due to advances in technology. Enter the field of machine learning (a subset of artificial intelligence), which analyzes reams of data from the past, discerns patterns therein and estimates future behaviour. It’s not too different from how we, as humans, operate when we learn a new skill.
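To make that idea concrete, here is a minimal sketch of the learn-from-the-past, estimate-the-future loop in Python with scikit-learn. The warning-sign features, numbers and labels below are invented purely for illustration; they have nothing to do with any real case or with #NeverAgainTech’s actual system.

```python
# A minimal sketch of the "learn from the past, estimate the future" loop.
# All feature names, data and labels here are hypothetical illustrations.
from sklearn.ensemble import RandomForestClassifier

# Each row is a past case described by made-up numeric warning-sign features:
# [prior_violence_reports, weapons_complaints, threatening_posts_online]
past_cases = [
    [0, 0, 0],
    [1, 0, 2],
    [3, 1, 5],
    [0, 1, 0],
    [2, 2, 4],
]
# 1 = the case escalated to violence, 0 = it did not (labels are invented)
outcomes = [0, 0, 1, 0, 1]

model = RandomForestClassifier(random_state=0)
model.fit(past_cases, outcomes)       # discern patterns in past data

new_case = [[2, 1, 3]]                # a new, unseen combination of signs
print(model.predict_proba(new_case))  # an estimated risk, not a verdict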
#NeverAgainTech is a U.S.-based organization devoted to using machine learning to minimize mass shootings. Its algorithms comb through vast amounts of data from past mass shootings, along with conversations lurking in the darkest recesses of the internet.
Its CEO and founder is 19-year-old Shreya Nallapati, who entered high school with a law career in mind. After her laptop was hacked, she devoted her time to cybersecurity so she could help others protect their data. As the only girl on her high school’s cybersecurity competition team, she saw her skills belittled by male teammates, so she asked her teacher for permission to compete solo. At the Colorado state competition, she placed higher than her former all-male team. As she told Forbes magazine: “Create your team of all girls, and you’ll be surprised how much you can accomplish.”
Then came the Marjory Stoneman Douglas High School shooting in 2018. Inspired by the eloquent words of activist Emma González, Ms. Nallapati knew her calling: to develop high-tech tools to end the madness of mass shootings. This generation is fed up with the proverbial “thoughts and prayers”; they are using their youthful energy to change the status quo. After Ms. Nallapati put out a call to a network of young women in tech, the #NeverAgainTech platform was formed in 2018, led by teenage girls. Now, the organization includes a cadre of seasoned mentors (both male and female), experts in fields outside of artificial intelligence and a few advocates of the Second Amendment. Their shared goal: prevent the massacre of people by gun violence.
How does it actually work? The platform sifts through massive data sets of past shootings to look at a range of shooter characteristics, including demographics and socioeconomic background. To discover how a mass shooter escalates from words to violence, the platform also combs through the dark corners of the internet, namely the message boards 4chan and 8kun (formerly 8chan). At least three suspected mass shooters posted manifestos on 8kun last year. Powerful algorithms “learn” the common features of a mass shooter and the signs of an impending attack. In 2019, #NeverAgainTech detected signs of three potential mass shootings and alerted authorities.
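As a rough illustration only, here is a hedged sketch of how posts scraped from a message board might be screened with a simple bag-of-words classifier. The example posts, labels and model choice are assumptions for the sake of the sketch; the organization’s real data, features and models are not public.

```python
# A hedged sketch of screening message-board text, assuming a simple
# bag-of-words classifier. The posts and labels below are invented;
# they are not #NeverAgainTech's data or method.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training posts, labelled 1 if they preceded an attack, else 0.
posts = [
    "excited for the game this weekend",
    "anyone selling concert tickets",
    "they will all pay for what they did to me",
    "tomorrow everyone will know my name, watch the news",
    "looking for hiking partners near denver",
    "i have the gear ready and a list of targets",
]
labels = [0, 0, 1, 1, 0, 1]

# TF-IDF turns each post into word-frequency features; logistic regression
# then learns which word patterns separate the two classes.
flagger = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
flagger.fit(posts, labels)

new_post = ["soon they will see what i am capable of"]
print(flagger.predict_proba(new_post)[0][1])  # estimated probability the post is concerning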
Now COVID-19 has added a new, dangerous context. In March, 3.7 million firearms-related background checks were conducted in the U.S., the most ever in a single month. As of mid-June, almost 46 million Americans had filed for unemployment during the pandemic. And hateful rhetoric against Asians increased 900 per cent on Twitter in February. #NeverAgainTech’s data scientists have found references to “ethnic cleansing” of Asians on 8kun. Learning from past data analysis, #NeverAgainTech predicts that “exorbitant rises in gun purchases, combined with growing economic uncertainty and extremist rhetoric, could result in a surge of mass shootings in the ensuing months after quarantine periods.”
I asked Ms. Nallapati, who will speak at the C2 Montreal business conference in the fall, whether her platform could have been used to prevent the horrific events in Nova Scotia. She said that, for now, it could not have, since the shooter did not have a presence on hate-based message boards. However, his many interactions with police could serve as data points to learn from. The platform is a work in progress, driven by the desire to use AI in the service of humans. It’s another valuable tool to keep society safe.