For the Millionth Time: Keep Imitative AI Away From Students
You do not learn as well using ChatGPT as you do by just searching Google. This is both astonishing (Google search is a crap fest at this point) and completely banal. But it is still a warning to keep students away from imitative AI.
A recent study demonstrates that imitative AI is worse at teaching you things than a basic Google search. To be fair to imitative AI, this should not be surprising. We have long known that having to read and understand multiple sources is a better way to learn than having someone simply summarize the answer for you. When we teach kids multiplication, we don’t just hand them the answer; we take them through the process. Even if imitative AI gives you the correct answer (and that is often a big if), you likely haven’t really learned anything. So why, then, does this matter? Because imitative AI drives you towards those summaries and away from research.
Imitative AI is based on the idea that it can do the work humans used to do. It can write for you, it can draw for you, it can code for you, it can therapy for you, it can have a relationship for you. Some of this is true (if you understand the language, it’s always going to type faster than you in a coding environment, for example). Much of it is over-hyped, even when it does provide some benefit. But all of it rests on the idea that it will do some work for some person. And having work done for you is not how you learn.
Learning is the process of retaining skills and information, and that requires effort. You have to participate in the learning: action, repetition, research, and so on. Having someone give you an answer does little to nothing to help you understand it. You must be an active participant in your education, and imitative AI is designed to make you passive. It expects you to tell it what you want and then to do the work for you. The problems that poses for learning should be obvious.
Now, some might ask, who cares? If imitative AI can do these things for you, does it really matter that you personally learn less? There are two problems with that. First, imitative AI does not and cannot answer correctly all of the time. In some situations, it cannot answer correctly even most of the time. You need to understand its output to ensure that you are getting correct work. Second, even if this were a solvable problem (and it is not; the math the systems are based on makes that impossible), you would still need people to have good educations. Imitative AI is a mere copy of previous work. You cannot have progress with it, merely the appearance of progress. If you want new things (new programming languages, new medical treatments, and so on), you must have people who understand those domains in order to advance them. And you cannot do that if no one learns.
The last remaining argument, then, is that imitative AI is inevitable and so students must learn to use it. Nothing in life is inevitable, or we would all be admiring our NFT collections while watching 3D television in the metaverse. Inevitable is something that is made, not something that happens. And the AI bubble argues very strongly against anyone being able to make imitative AI inevitable.
But even if it were inevitable, there is nothing special about prompting AI. It is a simple skill, one best learned on the job with the specific tool you find yourself using. It is not complicated or important enough to justify bringing the damage that imitative AI does into the classroom.
Learning is one of the most important things that humans do. It has allowed us, for good and ill, to build a society unlike anything any other animal has ever produced. Imitative AI has been proven time and again to be detrimental to learning. It is past time we stopped pretending otherwise and stopped harming our kids.

