AI Therapy Platforms Address Student Mental Health in K-12 Schools
An artificial intelligence (AI)-enabled therapy platform is being used in K-12 schools to support student mental health, particularly in response to budget constraints and limited staffing. Brittani Phillips, a middle school counselor in Putnam County, Florida, reported using the Alongside platform, which flags students at risk of self-harm or harming others based on their chat inputs. The system allows counselors to intervene early; Phillips did so successfully with an eighth-grader the platform had flagged.
Alongside, a platform used in over 200 schools nationwide, offers a social and emotional skill-building chat tool. Students interact with a chatbot named Kiwi and are provided with clinician-monitored AI-generated content. The company states it offers critical mental health resources to schools, especially in rural areas, at a starting cost of approximately $10 per student annually.
Benefits and Student Comfort with AI
Counselors note that students often find it less intimidating to confide in AI. This comfort stems from familiarity with chat interfaces and a desire to avoid the judgment students might perceive in a human's reactions. Chatbots are also available outside typical school hours without requiring appointments, offering convenient access to support.
Phillips said the AI tool handles routine issues, freeing her to focus on students with more severe needs.
Concerns and Limitations
Despite the perceived benefits, experts and lawmakers express concerns regarding AI-enabled therapy platforms. These worries include increased screen time for teens, the potential for students to develop strong emotional attachments (parasocial relationships) to AI, and the inherent limitations of AI as a substitute for human counselors.
"AI lacks the discernment of human clinicians who can observe non-verbal cues."
— Sarah Caliboso-Soto, Licensed Clinical Social Worker
Linda Charmaraman, director of the Youth, Media & Wellbeing Research Lab, highlighted that AI may miss nuances and offer unrealistic positive reinforcement. Critics also worry about the erosion of social skills and the lack of social accountability that human interactions provide.
Privacy experts have also raised concerns, noting that these chatbots generally do not carry the same privacy protections as conversations with licensed therapists. Furthermore, some students have been observed testing the system's boundaries by typing false statements to see whether anyone is monitoring, which requires human oversight to distinguish genuine alerts from jokes.
Both Alongside representatives and counselors stress that the platform is intended as a stepping stone toward seeking help from adults, not a replacement for human therapy. Proposed federal legislation has also aimed to require AI companies to remind students that chatbots are not real people.