Two stories highlight how imitative AI does not belong in education. First, we see in South Korea how AI textbooks have been roundly rejected by students, teachers, and parents. Second, a new study shows that teachers who use imitative AI to write lesson plans produce plans that fail to inspire students and fail to promote critical thinking. Together, these stories demonstrate how imitative AI fails to keep its pushers' promises in education.
The first story is about imitative AI textbooks. The majority of people who encountered them found them wanting. The books contained numerous inaccuracies, which created additional work for teachers, who had to overcome those errors. There were also concerns about students' data privacy, though the piece is less clear, to me, about whether those concerns are realistic. Many people in the piece said the books were probably rushed and thus not held to the normal quality standards.
The study of teachers' lesson plans was similar. Plans created with the help of imitative AI tended to be superficial, with little class-specific content and very little designed to help students learn how to think critically. The authors of the study were hopeful that if teachers spent more time and energy on the plans, they could overcome those limitations.
I assume we all see the issue here.
Imitative AI is sold as a means of letting teachers and others in education do more with less. In both cases, doing more with less failed miserably. The textbooks contained enough inaccuracies that teachers had to spend significant time correcting them rather than teaching. Inaccuracies were not a significant story in the lesson plan study, likely because the plans tended to be general and teachers knew their material well enough to mitigate those issues. The imitative AI systems did, however, produce uninspired and shallow plans. Not very efficient.
Imitative AI fails in education because it is meant to make up for the lack of resources we apply to education, and it simply cannot do so. Given the inaccuracies in, and the shallowness of, its output, imitative AI cannot be trusted to do the work itself. Its output needs to be verified by someone who understands the work deeply. In some cases, this is fine. But in education, where the standards for accuracy are extremely high and where resources have been denied to schools for decades, that kind of extra checking is a burden on an already overwhelmed staff. Sure, if teachers had a proper amount of time, they could use imitative AI better. But if they had proper resources, what would the value of imitative AI actually be?
Very little, if any at all. Education is not a good fit for imitative AI. Its use retards the learning of students, and it fails to provide the kind of support that teachers need to be better educators. It is a simple money grab at this point, the 2020s equivalent of the MOOC gold rush earlier in the century. We owe our teachers and children a lot better than that.

