


A few days ago, an IT Director at a large district stood before a room packed with concerned parents to defend the safety of a student-facing AI chatbot. There was just one problem: the vendor had already pulled the plug on that specific feature weeks earlier (OPB).
This disconnect is a symptom of a larger 'Trust Gap' emerging in K-12. While our previous issues focused on how AI saves time and prepares graduates, this week we have to talk about the growing backlash against 'Companion AI' and why the era of personified chatbots may be coming to a close.
Some folks have reached out to us asking how to subscribe themselves or their colleagues to this newsletter. Click here to do so. We send a newsletter every two weeks about the latest happenings with K-12 and AI.
IN THIS ISSUE:
72% of parents can't find your AI policy, and they are filling that silence with anxiety. Here is how to fix it now.
1,100 parents from one district protested a student-facing AI persona from a major K-12 AI vendor. Here is what it means for your vendor relationships.

PARENTS
72%
The "Communication Gap." This is the percentage of parents who either believe their child's school has no AI policy (35%) or are unsure if one exists (37%) (MassINC / WBUR).
59%
The percentage of Massachusetts K-12 parents who say their child is already using AI for schoolwork (MassINC / EdTrust).
33%
Despite this high usage, only 33% of parents view AI in the classroom positively (MassINC / EdTrust).

STUDENTS
1,100+
The number of parent signatures on a petition in Bend-La Pine Schools (Oregon) protesting "companion-style" AI bots on student iPads (OPB).



A poll by MassINC and WBUR released in late January 2026 reveals a stark "Transparency Crisis" in our schools. While nearly 60% of parents reported that their students are actively using AI for schoolwork, 72% of those same parents said their school either has not communicated an AI policy (35%) or has left them unsure whether one even exists (37%).
OUR TWO CENTS
When 72% of parents are in the dark, they don't fill that silence with "benefit of the doubt." They fill it with anxiety.
If parents feel like AI is a "black box" being used by their children without clear guardrails, they will eventually move to shut it down. And once trust is broken at the community level, it takes years to rebuild.
Transparency is the only antidote to this friction. AI guidance needs to move from the back of the student handbook to the front of the district homepage. If your community can't find your AI policy in under 60 seconds, you don't have one. Not in their eyes.
Bridge the 72% Gap: Take your AI guidance out of the staff drive and put it on your district's homepage. List what tools are approved, how data is protected, and what the opt-out process looks like.
Publish a “Plain-Language” Policy: Don't wait for a 50-page legal framework. Perhaps post a simple "How We Use AI" guide that answers the three questions parents care about most: Is it safe? Is it optional? And who is watching the machine?
Host a “Demo Night”: Parents fear what they can't see. Showcasing how a teacher uses AI to differentiate a reading passage for a student who might be struggling can turn a "threat" into a "tool" in the eyes of a parent.
Define the “Human-in-the-Loop”: The MassINC poll found that parent comfort drops significantly when they feel a teacher's judgment is being replaced. Be explicit about where the AI stops and the teacher begins.


At a February 10th board meeting, an IT director from Oregon’s Bend-La Pine Schools stood before a room packed with concerned parents to defend the safety of a vendor’s student-facing AI chatbot. Leading up to the meeting, 1,100 parents had signed a petition to ban the chatbot from their students’ devices. There was just one problem: the major K-12 AI vendor had already quietly retired the student-facing persona weeks earlier, citing valid concerns about students forming "unhealthy relationships" with digital personas during critical brain development.
OUR TWO CENTS
For two years, the software industry’s North Star was "engagement." If a chatbot felt like a friendly character, students stayed on the platform longer. But we are discovering that in education, satisfaction is not the same as learning. When an AI is personified as a "friend," it often becomes sycophantic, agreeing with the student rather than challenging them. That bypasses the "productive struggle" required for brain development.
In our opinion, student engagement with chatbots also creates a moral hazard of sorts. With a number of students already turning to AI to discuss sensitive mental health or relationship issues, we’re concerned about schools teaching students that it’s okay to chat with a bot (even indirectly) whenever they need help with any topic. AI isn’t magic. Bots aren’t humans.
In any case, we see this vendor pivot as a win for districts. We don't need graduates who know how to talk to digital unicorns. AI literacy means fluency with the tool, not friendship with a persona. We need graduates who know how to interrogate a model and use it as a tool for human thinking.
Audit for “Anthropomorphism”: Look at your district's software list. Does the AI have a name, a backstory, or a "personality"? Consider if those features serve a pedagogical purpose or if they are just "engagement hacks."
Move Your AI Policy from the “Handbook” to the “Homepage”: The WBUR poll showed that 72% of parents are in the dark. Silence in a school district is rarely interpreted as "everything is fine"; it is usually interpreted as "something is being hidden."
Establish a “Vendor Safety Protocol”: The Oregon incident was a communication failure: the vendor made a safety change, but the district leader was left defending a feature that no longer existed. You can require all AI vendors to provide a Quarterly Safety Report. This should explicitly list any changes to student-facing personas, safety guardrails, or data collection methods.


“Reclaiming Conversation: The Power of Talk in a Digital Age” by Sherry Turkle
MIT professor Sherry Turkle spent decades studying what happens when screens replace face-to-face conversation, and the 10th anniversary edition includes a new preface addressing generative AI companions directly.


Chat with ClassCloud
We’re listening. Let’s Talk!
This newsletter works best when it’s a conversation, not a broadcast. If you want to talk through how any of this applies to your district specifically, or if you have feedback on what would make this more helpful, just hit reply. We read and respond to everything.
Schedule a Virtual Meeting
Thanks for reading,
Russ Davis, Founder & CEO, ClassCloud ([email protected])
Sarah Gardner, VP of Partnerships, ClassCloud ([email protected])
ClassCloud is an AI company, so naturally, we use AI to help polish our content.
