The alert came at around 7 p.m.
Brittany Phillips checked her phone. A middle school counselor in Putnam County, Florida, Phillips receives messages from an artificial intelligence-enabled therapy platform that students use outside of school hours. The platform flags when a student may be at risk of harming themselves or others, based on what the student types in the chat.
Phillips saw that this was a “serious” alert about an eighth grade student.
So Phillips spent her evening on the phone with the student’s mother, explaining what was happening and how vulnerable the student was. Phillips even called the police, she says, noting that she tells students their chats are confidential unless there is a safety concern.
That was last school year, in the spring.
“He’s alive and well. He’s in ninth grade this year,” Phillips says. She believes the conversation built trust between her and the family. Now, when the student passes her in the hall, she makes a point to greet him, she adds.
Facing budget constraints and limited mental health staffing, Interlachen Junior-Senior High School, where Phillips works, uses an AI platform to screen for students’ mental health needs.
Phillips’ district has used Alongside, an automated student monitoring system, for three years. It’s one of a growing range of tools marketed to K-12 schools for similar purposes, with at least nine companies securing funding deals since 2022.
Alongside says its tool is used by more than 200 schools across the US. The company argues that its platform provides better service than typical telehealth options because it offers a social and emotional skill-building chat tool, in which students talk through their life problems with a llama named Kiwi who tries to teach them resilience, and because its AI-generated content is monitored by clinicians. The system gives resource-strapped schools, especially those in rural areas, access to important mental health resources, company representatives say.
AI is a core component of the Trump administration’s national education agenda. Still, some parents, teachers and, increasingly, lawmakers are wary of teenagers’ growing screen time. States have also started restricting the use of AI in telehealth.
Many experts and families are also concerned about students getting too attached to AI. A recent national survey found that 20 percent of high schoolers have used AI romantically or know someone who has. There is significant interest in preventing students from becoming emotionally attached to bots, including a proposed federal law that would force AI companies to remind students that chatbots are not real people.
Still, Phillips says the tool her school uses is exceptional at putting out “small fires.” With roughly 360 middle school students to help through breakups and other routine problems, the tool allows her to focus her time on students approaching crisis. Plus, students sometimes find it easier to turn to AI to work through emotional problems, she says.
On the Digital Couch
School counselors say students’ anxiety about opening up depends on how comfortable they are with these technologies.
Talking with a mental health professional can be intimidating, especially for teens, says Sara Caliboso-Soto, a licensed clinical social worker who serves as assistant director of clinical programs at the University of Southern California’s Suzanne Dworak-Peck School of Social Work and clinical director of the school’s Trauma Recovery Center and Telebehavioral Health Online Clinic.
There is also a generational component. For students who have grown up encountering chat interfaces through social media and websites, AI interfaces may feel familiar. And kids today find it easier to text someone than to call them, says Linda Charmaraman, director of the Youth, Media and Wellbeing Research Lab at the Wellesley Centers for Women.
Using AI to work through emotions also lets students avoid watching facial expressions, which they may worry will signal judgment, she says. And chatbots are available at times when a human may not be, without the hassle of making an appointment, says Charmaraman.
“It’s almost more natural than interacting with another human being,” says Caliboso-Soto.
In her work with the telehealth clinic, Caliboso-Soto has watched crisis text lines and chat lines gain traction. She says the clinic doesn’t use any kind of AI, but companies hoping to incorporate AI into therapy sessions, for example as note takers, often approach her.
That is not necessarily bad, in Caliboso-Soto’s opinion. For schools lacking resources, she says, AI can serve “as a first line of defense,” regularly checking in with students and pointing them in the right direction when they need more help.
According to the company, the starting price for a school to use Alongside’s services is about $10 per student, per year. Larger districts typically receive volume-based discounts.
But Caliboso-Soto is concerned about AI being used as a substitute counselor. It lacks the discernment clinicians bring when interacting with students, she explains. While large language models can be trained to notice patterns in text, they cannot see the body movements or hear the vocal inflections that a human clinician can when interacting with a student, nor can they reliably pick up on subtle behavioral cues. “You can’t replace human connection, human judgment,” she adds.
While AI could speed up the diagnostic process or free up time for school counselors, it’s important not to rely on it too heavily for mental health, Charmaraman says. The technology may miss some of the nuances a human counselor would pick up on, and it may give students unrealistic positive reinforcement. She argues that schools need to take a holistic approach that involves families and caregivers.
Also, if a school increasingly uses AI interventions to filter for serious cases, it’s worth paying attention to whether students end up having less contact with clinically trained humans, Caliboso-Soto says.
For their part, Alongside representatives say the platform is not intended as a replacement for human therapy. Ava Shropshire, a junior at the University of Washington who works as a youth counselor at Alongside, says the app is a stepping stone toward getting help from adults. Representatives argue that the app normalizes mental health and social-emotional learning for students and may motivate them to seek help from humans.
Still, some students think it’s a Band-Aid at best.
Social Accountability
“Can you think of any other time in history when people have been so isolated, when our communities have been so fragile?” asks Sam Hiner, executive director of the Young People’s Alliance, a North Carolina-based organization that advocates for greater youth participation in politics and policy making.
In a time of economic turmoil, technology and social media have isolated students from each other, and this has created a deep longing for community and belonging, Hiner says.
Students will find that wherever they can, he adds, even if it’s through ChatGPT.
The Young People’s Alliance has released a framework for regulating AI that allows some medical uses of the technology.
But broadly, the organization is trying to rebuild human community, and it opposes uses of AI that threaten to replace human companionship, Hiner says. Companionship, he adds, is “an important aspect of healing and an important aspect of living a full life and having social relationships and mental well-being.”
So for Hiner, the main concern is what he calls “parasocial relationships,” when students develop one-sided emotional attachments, especially as the technology enters schools for therapeutic purposes. It could be valuable to have an AI that can respond or analyze, even for mental health, but Hiner says AI should not send prompts or messages implying it has its own emotional state, such as telling a student user “I’m proud of you,” because that encourages attachment.
And though platforms often claim to reduce loneliness, they don’t actually measure whether people become more connected and go on to live full, connected, happy lives in the long term, says Hiner: “All [tech platforms] are measuring is whether this bot is serving as an effective crutch for the immediate feelings of loneliness that they’re experiencing.”
What advocates want to prevent, Hiner says, is these bots eroding social skills by pulling people away from relationships with other people, where they have social accountability.
Crossing Boundaries
Privacy experts note that these chatbots typically do not carry the same privacy protections as conversations with a licensed therapist. And amid heightened concerns about student privacy and student encounters with police, the use of these tools raises “messy” privacy questions, even when they are monitored by people with clinical training, one privacy law expert says.
Both the company and Phillips, the counselor in Putnam County, emphasize that these systems require human oversight to work. Phillips considers the tool an improvement over other monitoring tools the district has used, which pointed students toward school discipline rather than mental health help.
This school year, through February, Phillips had logged 19 “critical” alerts from the AI tool, out of 393 active users. The company does not break the alerts down by individual student, and Phillips says some of the same students are behind many of those 19 alerts.
Using the tool, Phillips has learned that understanding teen humor still takes a human.
That’s because some alerts are not genuine. Phillips says that sometimes middle school students, usually boys, will test the limits of the technology. They type “my uncle touches me” or “my mother beats me with a stick” into the chat to see whether Phillips will act on it.
She says these boys are just trying to see if anyone is listening, trying to see if anyone cares. Sometimes, they just find it funny.
When she pulls them aside to discuss it, she can watch their body language, and whether it changes, which could indicate the comment was genuine. If it was a joke, students often become apologetic. When a student is unrepentant, Phillips will call the parent and explain what happened. But even in these cases, Phillips feels she has more options than other monitoring systems gave her, which mainly referred students for in-school suspension.
Because Phillips keeps an eye on the conversations, she adds, students also learn to trust that someone is actually monitoring the system.
And, she says, the number of boys testing the system in this way is decreasing every year.
-
This article was produced in partnership with EdSurge, a nonprofit newsroom that covers education through original journalism and research. Sign up for their newsletter.
