It’s a day ending in “y”, so it seems that we are going to get another study showing that the use of imitative AI systems is bad for your cognitive functions and retards learning. This time it is from MIT. It is ever more important that we resist imitative AI in education if we want to actually have education.
The MIT study is yet another addition to the growing pile of research showing that people who use imitative AI lose cognitive functionality. Specifically, in this study, the more help participants got, the less their brains learned and retained. This applies, of course, to all kinds of help, but in the study the use of imitative AI resulted in the worst performance among the participants. Worse, their performance remained depressed when they were asked to move from imitative AI to just using their brains to complete assignments. These systems quite literally make it more difficult for you to learn.
Imitative AI in education is a poison. This is not to say that imitative AI can never have any use, but it can only possibly be useful in situations where people who understand what the output is supposed to be can vet it for the bullshit it inevitably produces. An educational setting is, obviously, not filled with experts. Trying to learn using imitative AI is bound to fail. It does too much of the work for you, so the material never makes it into your brain, and while it does that work, a significant portion of the results are bullshit. How is anyone, much less a child, supposed to learn when the answers are provided for them and the answers are often wrong?
In some of the education conversations I follow (back in my misspent youth, I did some open source edtech work until I became convinced those systems did not actually help), there seems to be a drive to accept imitative AI in school settings. Part of this is a belief that imitative AI is inevitable. Part of it is the belief that kids not immersed in imitative AI will be left behind by kids who are. Neither is true, and both allow imitative AI companies to profit (well, make money; none of these firms actually turns a profit) at the expense of our kids.
Nothing, not even imitative AI, is inevitable. It was inevitable that we were going to be buying pizza with our cryptocurrency. It was inevitable that we would spend our work and free time in the Metaverse. It was inevitable that 3-D television would be in every living room and den. It was inevitable that NFTs would replace real art. None of those things happened, of course. Nothing is inevitable, no matter how much VC and government flacks might want you to think otherwise. Bubbles, like imitative AI appears to be, can stay inflated for a while. But they do always pop. Imitative AI needs too much money and provides too little real value (again, not none, but nowhere near enough) to remain a viable business.
Even if the bubble stays inflated for a long time, the idea that kids need to learn to use imitative AI, regardless of what it does to their ability to learn in general, is deeply mistaken. There is nothing special about prompting imitative AI. It is no more complicated than writing search queries. And even if it were, people learn how to use tools on the job all the time. There is no need to teach a specific tool, especially a tool that has been shown to impede actual learning.
The point of education is not to teach people what to think. It is to teach them how to think. A banal observation, I know, but one that we seem to have lost track of in our rush to make education friendlier to businesses. Imitative AI, based on all evidence and common sense, hurts people's ability to learn. Its use is not helping people learn how to think. Rather, it is teaching them that imitative AI can think for them. We owe our children so much more than that.


Another strong essay. IMHO, your voice is one of the clearest on the topic of "LLM AI as anti-education." So, thanks for that.
One thing, though. I do wish people would stop saying "The point of education is not to teach people what to think. It is to teach them how to think." I would eradicate that slogan for two reasons.
First, this is a favorite talking point of those who would destroy Education, since removing objective facts (the attack on epistemology) is a big part of the techno-feudalists' agenda.
Second, as any K-12 teacher will attest, it's also objectively wrong. *Of course* we teach kids what to think. And we should. It's covered here: https://markharbinger.substack.com/p/neither-a-toitler-nor-a-follower
Best, _Mark
Well said and thoughtful! The more help you get through AI (you could even say it's cheating), the less you utilize your brain. I have never used AI and wouldn't even know how. I guess I'm an old fart with a Masters in Education, but if using my brain works for me, I'll work it!!!