I promise I’m not the stereotypical cranky old teacher who doesn’t want to try new things. In fact, I’m quite the opposite. I was an early adopter! (Read through any of the first posts of this blog.) This isn’t just a case of me saying “no” before looking at all the options.
The fact is, I really and truly believe we need to rethink the use of AI in education. In fact, I’ve been trying to warn educators and parents for quite some time.
After talking with teachers, I know that many are worried about how to deal with students using AI to cheat. Those same teachers have to use AI to grade student work because of an unrealistic workload. But cheating and grading with AI are two of the lowest-level problems we have with AI.
My bigger concern is using AI to deliver instruction or tutoring to students, and that’s the main focus of this post.
When we use bots to stand in for DEEPLY HUMAN INTERACTIONS, we create problems for children. I am begging educators to push back on this!
Let’s take a step back and think about learning.
Learning is deeply personal and relational.
When you think about how a brain develops, how learning occurs for each individual, you see that every single child is learning about the world, navigating their surroundings, acquiring new information, and building upon previous learning. A very simplified example: I cannot teach a student to write a complete sentence if that child has not yet spoken complete sentences.
As educators, we know there are building blocks to each new skill and concept, and so many of those building blocks (how they are initially acquired, stored in short- and long-term memory, retrieved, etc.) can appear very differently in one child’s life than in another’s. Because of these differences in experience, how we teach students new material must be personal and relational. The more context I can help them create, the better the student is able to create their own “container” for the new learning.
More importantly, the better I know my students, the more effective I can be in helping them to build their own context — observing them during a learning activity, checking for and monitoring understanding, and adjusting the learning activity in real time to meet the needs of every single child.
I have the ability as a teacher to do this at Anastasis because it’s how we intentionally planned the school experience and culture… and we set up smaller class sizes, without which I couldn’t possibly be this responsive to all students’ individual needs. But… the key concept here is intentionally planning an environment where we know our students well enough to help them have deeply personal learning experiences. Students learn with and from each other, even across age/grade levels, and those experiences teach us so much more than simply “how to” or “what is.” Personalities shine through. Strengths and growth are regularly celebrated.
And yes… I know. Deeply personal learning experiences can be messy, time-consuming, and are anything but efficient. But…
LEARNING is just that… sometimes messy, often time-consuming, and absolutely anything but efficient.
So much of US education today is about systems and efficiency. (There’s an argument to be made that it’s the same issue in many parts of the world, but I do not have experience with enough international schools to write about that, so I won’t.) The daily school schedule is a system that we know doesn’t work well for every kid. In fact, we have loads of research on start times and on how much more sleep teenagers need than younger children, yet there are still many, many high schools that start earlier than recommended. Why?
We have some research measuring the impact of homework on elementary students and on students in grades 6-12, yet the system often ignores the research findings and continues using homework in ways that don’t support learning. Why?
Many schools continue using grades to rank and sort students rather than providing feedback and narratives that explain what a student knows and is able to do. Some schools grade behavior with the same measurements intended to show academic growth, resulting in scores that are artificially inflated or deflated by behavior. Why?
The answers to all those “Why” questions often have to do with the systems we have put in place.
I get it. There are millions of children in the US, and they are all currently* guaranteed a free and appropriate education. It takes huge systems to educate that many students, and we’re always looking for the next best thing that will educate the masses efficiently, in both time and money. Resources are spread very thin, and yet we still manage to educate children in the United States. (I’ve written a lot about how the quality of that education varies greatly from school to school and state to state… but this post has another focus.)
In trying to live within the systems we have created to educate millions of children efficiently, we have often sacrificed what is best for kids.
We sacrifice what is best for kids (human children) for what is best for the systems. Again, we create schedules of 7- to 8-hour school days with compulsory attendance and very little leeway for kids who get sick, who do not have enough to eat, who may not have a stable home life… the list goes on and on. And yet, in many education systems, we ignore the human parts of our children and treat them as if they are small containers we need to fill with as much information as we can each school year.
We have long known that the system doesn’t work for everyone. Too often, we insist that kids fit the system, instead of creating something that works for the kids.
Instead of changing the system, we keep trying to create new tools to make the current system even more efficient.
In doing so, we further remove the humanity from the equation, and present kids with ideas like AI tutors.
(For the purpose of this post, I’m going to suspend my urge to rant for multiple paragraphs about the ethics of AI tools: their overconsumption of vital natural resources, and how they scrape (read: steal) copyrighted work from creators who make a living from their craft. AI tools depend upon creators who have NOT consented to their work being used to train any AI tool. Unless you’re training an AI tool on your own work, and only your own work, I will not agree that you can use any AI tool ethically. </end rant>)
I’m not going to call out any single company trying to sell AI tools to schools; there are a lot of them, and this post isn’t about going after any one of them in particular.
What I absolutely want to do is BEG educators to stop, take a breath, and think through the moral implications, as well as the potential for extreme harm, in using AI tutors with students.
I’ve read advertising copy from several companies trying to sell AI tutors to schools, especially those with large class sizes, as a solution to helping meet each student’s individual needs. Some examples (paraphrased from several so as not to target any one specific company):
- Do you feel like you need more of YOU to be able to get around to each of the students in your class to help them understand a lesson?
- Are you struggling to help each one of your students to understand the directions for an assignment?
- Is it a hardship on you and the other students in your class when a student has been absent and you need to get them caught up?
- Do you lack a teaching assistant who can help keep groups of students busy while you work with a small group?
Use our AI tutor for any of these situations! Our tutor will ask your student what they need and then deliver personalized instructions and explanations to get them back on track!
When I was a younger teacher struggling with too many classes, too many students, and not enough time in the day to make it all work, this would have seemed like a gift that could solve so many issues.
As a more experienced teacher, I see it for what it is. Maybe it’s because I’ve seen snake-oil salespeople come and go with edtech tools that were supposed to make my life as a teacher easier… and learning better and more innovative (!) for students… but it’s easy for me to see through the charade.
But an AI tutor is a poor substitute for a necessary human interaction. It’s not personal. It’s not relational. You cannot get a true tutoring experience from a simulation.
Let’s break down the harms that come with an AI tutor.
1. Children, including teenagers, do not have the ability to FULLY comprehend abstract concepts.
Abstract thinking develops in the brains of children through adolescence and even young adulthood. Lacking the ability to fully comprehend abstract ideas, kids can have difficulty distinguishing between a human interaction and one with a bot. They can verbalize, “I know it’s not real,” and yet we find some young children trying to have conversations with a puppet, for example, as if the puppet can respond independently of the puppeteer. Older kids can verbalize that they know Siri or Alexa aren’t real people, but they might respond angrily or with frustration when they don’t get the response they’re looking for. It doesn’t make sense to be angry at software or hardware, but this is not an uncommon phenomenon.
Bots that “converse” with or “teach” children are programmed to be simulations of human interaction. For a child without a fully developed prefrontal cortex, this can create a disconnect between what is real and what isn’t at a time when those distinctions are very important learning milestones. While kids are more naturally skeptical than they’re usually given credit for, continually blurring the lines with something programmed to SEEM real sets the stage for bigger problems.
2. Anthropomorphization of responsive programs can cause real harm to children… and even adults.
Humans tend to anthropomorphize THINGS, and there’s very real harm to our mental health when we are unable to balance this with real-life experiences. Sure, many of us do this with our pets as well, but at least a living, breathing pet can have an authentic response, not a programmed response.
We see this with kids’ toys sometimes, but most toys can’t simulate authentic, real-time responses; that’s what makes them toys. (Some of us even had a Pet Rock growing up! All it did was sit in a box. So disappointing, right?) Mix toys and simulated responses together, though…
I was a young teacher in the 1990s, and so many of my students (7th-12th grades) had Tamagotchis or NeoPets. These virtual pets were advertised as a wonderful opportunity to teach your child about the responsibility of caring for a pet. Kids were obsessed with them! Most of the adults dismissed the virtual pets as just another fad and didn’t think much more about them… until someone’s virtual pet died. Maybe the family went on vacation, and the Tamagotchi was left behind. There were all sorts of scenarios like this, and most of the adults brushed off the kids’ concerns with “but it’s not even real.”
What I observed as a teacher: Some kids were fine. “No big deal. I’ll just start over.” However, there were several kids who were devastated that their virtual pet had died. We had to have a conversation with school counselors (who sometimes weren’t as responsive as I had hoped) about helping a child process that the pet wasn’t real… that the child wasn’t a horrible person for letting this happen, etc. Many of the adults dismissed the very real feelings of guilt and loss these kids experienced over a virtual entity. Those kids grew up to make memes about how sad they were when their Tamagotchis died. It’s treated as a joke now, but many of us who were teaching back then observed those kids and their sadness.
And the thing is… kids’ feelings are real. How they respond to something that feels like guilt, loss, or grief is very real, and it affects their overall well-being. Even if you, the adult, disagree about whether something is traumatic, a kid’s brain registers trauma regardless of what adults think and feel about the situation. When a child cannot separate what is real from what is “virtual,” the lines are blurred again… and the emotional and mental health consequences can and will be very real.
If you have not read “ELIZA Effects: Pygmalion and the Early Development of Artificial Intelligence” by Lawrence Switzky, I highly recommend it.
There are apps, programs, games, and other tools that have been developed to sell to kids (or FOR kids) to make money, without any consideration of the very real impact those tools have on the well-being of children.
Think I’m overreacting? These are just some of the most recent headlines.
AI friendships claim to cure loneliness. Some are ending in suicide. (This article is behind a Washington Post paywall)
Are AI Chatbots Safe for Children?
AI Companions and the Mental Health Risks for the Young (This article is behind a New York Times paywall)
I know that all these headlines reference at least one or two of the same individuals. There is a tendency in our culture to blame the victim in a way that diminishes the potential for harm from these tools. For me, if these AI chatbots are harmful to the mental health of even one kid… what are we even doing?!?
3. AI Bots/Tutors cannot solve problems outside of their programming, and it absolutely matters WHO is doing the programming and packaging of these tools.
(Also… what about student data/privacy?)
I’ve been in education for nearly 30** years, and I can tell you there is always an issue of trust between teachers and the community. Children are precious. We cannot allow just anyone to spend hours a day with an impressionable child, right? That’s why teachers are required to be licensed after having been educated in child development and teaching methods. Most schools require their teaching staff to have at least a bachelor’s degree in education, and many schools require ongoing education in order to continue teaching.
So…
- Who is programming the AI tools? Which groups of people do they represent?
- What are the implicit biases of the people programming the AI tools?
- Are there actual certified teachers involved in the programming?
- Are there actual certified teachers on staff at all in this AI company?
- Has your district administration done its due diligence by reading all the End User License Agreements (which are often longer than most humans could possibly read)?
- What exactly is stipulated in the contracts schools sign in the use of AI tutors and other tools?
- How can these tools make accommodations for students with an IEP? A 504 plan?
- Can you guarantee students and their families that this company selling you its AI tools will protect student privacy?
- What is the policy when the AI Tutor leads a student in the wrong direction, or provides factually incorrect information? How does it affect the student’s work/grades?
- What is the policy when harm is done by an AI Tutor?
- Can a family opt their child out of a class or teacher using an AI Tutor?
Years ago, I worked in the technology department of a suburban public school district. Although this was long before AI, the questions above are similar to the considerations we discussed in meetings with vendors before we would even think about deploying a new ed tech product in our schools. We could not purchase or sign contracts to use a student email product, for example, without ensuring that the laws about student data and privacy were being followed to a “t.”
I know there are a lot of educators who work for different AI companies. I hope they’re able to answer some of these questions in ways that help people feel that their children and their privacy are safe. Past experiences and interactions with ed tech entrepreneurs remind me that a lot of ed tech companies don’t consider these things as much as they want to find a niche market where they can sell a “quick solution.”
4. Human Teacher vs. AI Tutor: who gets actual face-to-face time with a human is an educational equity issue.
We already know that our educational system is not equitable. Because school finances and funding are tied to property taxes, schools in the United States vary greatly in the resources allocated for students. This is an issue that affects students of color, kids who live in poverty, kids with special needs, and so many more.
Now add in the question of who gets actual face time with a human teacher and who gets assigned an AI tutor. The “learning gap” will continue to grow into an insurmountable chasm.
If you’re skeptical about this point, please take a look around at the most expensive private schools in your area versus, say, a small rural public school.
- What kinds of opportunities are available for students in the private school in comparison to students in public schools?
- What happens when public schools close and for-profit charter schools open in their place?
- What are the demographics of the schools that receive the most funding?
- Where are we more likely, historically speaking, to see these issues arise?
- What kind of learning opportunities do students in those circumstances have?
Again, I ask… which students will have more face-to-face time with a human teacher if an AI solution is proposed?
As I mentioned above, tutoring is a personal and relational experience. If a student is struggling with the directions given by a teacher, sure, an AI tutor bot could repeat the instructions. But as a teacher, I already have a lot of tools for sharing written directions, accompanied by an audio file of ME reading the instructions, without having to use a bot. I’ve done this in products like Seesaw, Google Classroom, and Apple Classroom… or with something as analog as having the instructions written on a piece of paper or a whiteboard. I don’t need to use AI as a more expensive and problematic substitute for technology we already have.
Maybe the student needs a more in-depth explanation of what we just learned. An AI tutor doesn’t know WHY that student needs more explanation. Maybe they didn’t sleep well the night before. Maybe they’re hungry. Maybe they need to move around the room. Maybe they didn’t have context for the new concept. There are endless reasons why a student might need more explanation, and an AI tutor cannot read that in a child… NOR DO I WANT IT TO TRY. If a kid needs more explanation, I can bring them into a small group. I can partner that student with other students to learn together. There are numerous options available… why bring in a technology with the potential to do harm when I can solve that problem without it?
Human issues need human solutions… and kids need humans who care about them enough to know this.
Final Thoughts
I’m going to return now to my first main point in this post. Learning is deeply personal and relational. As I sit here typing this, soooo many faces of so many students are in my thoughts; I’m thinking about their unique personalities and the special gifts they bring to the world. I’m absolutely a better person for having them in my life, even for a tiny bit of time… and I hope they feel that I KNEW them and helped them learn.
If you’re not designing education around the kids in front of you, you’re trying to make the kids fit a system.
What we have learned at Anastasis is that challenging the system and the status quo, and thinking about the kids as the starting point… all of that is possible. When you design a living curriculum around the needs of children, the learning IS personal and relational. Kids feel successful, and we see proof of growth in every one of our graduates.
If we really want to improve education, AI is not going to save us.
Getting back to really knowing our students, cultivating relationships in the name of helping students grow and learn, and holding true to what learning means to THEM… well, that’s a pretty good start.
Oh, and that FUNDING thing. ALL kids deserve to be in safe, clean, healthy buildings where the adults know who the students are: not just their names, but really know them. We can’t do that when our schools are overcrowded and underfunded.
Kelly Tenkely and I discuss learning on our Dreams of Education podcast. We talk a lot about AI in education in episode 9 (released 12/5/24) and episode 10 (scheduled for 12/12/24).
NOTE: A lot of what I’ve written in this post is based on my experiences as an educator: a public school teacher, a private school teacher, an education technology/professional development coordinator, an adult ed teacher… and an endlessly curious learner. I’ve cited some (not all) sources in this post for expertise I don’t have. I’ve also made statements that, to me, seem to be common knowledge amongst most educators, and didn’t include citations for those. If you feel that some of these issues aren’t common knowledge and do require citations, by all means, please add them in the comments. Or write your own blog post. 🙂
*Let’s hope that free public education remains free and appropriate.
**That number might be over 30 by now. I stopped counting.