NOEL KING, HOST:
School districts in the U.S. are using computer programs to keep an eye on what students are doing online. The algorithms look for words that suggest self-harm, but they might not always be accurate.
Alisa Roth of APM has the story. And just a quick warning: suicide is discussed.
ALISA ROTH, BYLINE: The programs are supposed to keep track of students' online activity to make sure they're not looking at inappropriate content or talking about violence. But in the past year, the programs have been flagging something else. Kids are sad, writing things like this.
NICOLE PFIRMAN: I have no friends. How can I make friends? Or I'm feeling alone. I am hopeless. I don't have a reason to live.
ROTH: Nicole Pfirman is the mental wellness coordinator for schools in Mason, Ohio, northeast of Cincinnati. She says phrases like those help the district flag students who might be heading for a mental health crisis.
PFIRMAN: And it gives us insight into what the student's thinking that we otherwise would not see or hear.
ROTH: Like a lot of districts, Mason City started tracking students years ago. Now more and more of the companies that schools use to monitor online behavior are also offering to track students' mental health, like in this promotional video.
(SOUNDBITE OF ARCHIVED RECORDING)
UNIDENTIFIED PERSON: Gaggle identified 64,000 student references to suicide and self-harm. Each reference is a cry for help.
ROTH: On its website, the company Gaggle claims to have helped school districts save the lives of 927 students. But what that actually means, and how the company arrives at that count, is vague. Still, those kinds of promotions are persuasive. Gaggle says its customer base grew by more than 25% last year.
Mason City uses a different company to track its students, but the general concept is the same. Machine learning flags words that suggest a student is thinking about hurting themselves, then notifies the school district so it can intervene. The idea is plausible enough. But having a word turn up in a search doesn't necessarily mean a kid is planning something dangerous. And even the companies doing the tracking acknowledge that nobody really knows how well these programs actually prevent self-harm or suicide.
Ellen Yan directs Beacon. That's the division that tracks student suicide risk at the monitoring company GoGuardian.
ELLEN YAN: There is honestly not a lot of data at the national level in terms of suicide prevention.
ROTH: Getting that kind of data is challenging, in part because using the technology this way is still pretty new. There's a lot we still don't know, and some people are worried about protecting students' privacy. Effectiveness is also hard to measure because different companies watch different parts of what students do online. One company might miss an Internet search done on a phone; another could miss something posted on social media. And the programs can trigger false alarms - flagging a student who's writing a paper about "Romeo And Juliet," say, or gun control.
For educators worried about their students, though, anecdotal evidence about how well these programs work may be enough. Nicole Pfirman says there have been a few times when she believes an alert saved a kid's life.
PFIRMAN: Had we not been able to get the alert, respond to the alert and put the appropriate supports in place, say, on a Friday night, it is oftentimes difficult to think about what may have happened before Monday.
ROTH: And the technology companies are ready to help, positioning themselves on the front lines of student mental health. The latest example - this fall, Gaggle announced a new product line. It'll now contract with school districts to connect them with therapists for students who need mental health care.
For NPR News, I'm Alisa Roth.
KING: If you or someone you know is having suicidal thoughts, the National Suicide Prevention Lifeline is open 24 hours a day. You can call them at 1-800-273-8255.
(SOUNDBITE OF DEEB'S "FLUID DYNAMICS")