Ban Imitative AI From Schools If You Want Kids to Learn
OpenAI, and I am sure others, are now pushing to get into schools in a much larger way, trying to be the Apple Macs of this generation. They have hired a former Coursera executive to make the push. And while it is amusing to watch OpenAI hire people from perhaps the most reviled monopoly in the country, this push is a very bad thing. No educational institution should ever be allowed to use imitative AI in any form.
This is not, believe it or not, a brief against imitative AI. Yes, I have plenty of moral and practical arguments against imitative AI as a tool. But even if imitative AI ran by sucking hate out of the atmosphere and spitting out unicorns, you still wouldn’t want imitative AI within a thousand miles of anyone learning something. Even if you don’t admit that its use atrophies intelligence, imitative AI cannot be used by people who are not familiar with the subject matter. It cannot help you learn.
First, you really should take note of the fact that using imitative AI does atrophy your critical thinking skills. Programmers who used it became less productive, and Microsoft’s own studies have shown that its use retards your critical thinking skills. And, of course, chatbots are terrible for the mental health of the children who use them. Using imitative AI to teach people appears to be a shortcut to ruining their education.
Even if you do not believe these studies, if you think they are overblown or too preliminary, the nature of imitative AI itself demands that you not use it to teach. The only way in which imitative AI can be useful is if it is babysat by people who understand the domain in which the work is being done. Imitative AI simply makes too many mistakes. Up to 60% of imitative AI search results are wrong. Now, I would suppose that domain-specific tools are better. But on the other hand, two programming imitative AI tools deleted production data. So, you know, they may not be that much better. You must understand your domain to get any use out of these tools. How, then, can you learn anything from them if you already need to understand what you are learning to make sure you are learning the correct material?
The argument that you need to learn to use these tools in school is also silly. Prompting is not magic. It’s not even especially hard. If you use imitative AI in the workplace, you will pick it up quickly. And each workplace is likely to use different tools. For especially tricky uses, places may standardize the prompts anyway, to ensure some level of consistency. It’s a simple tool, and the idea that you need special training to use it well, training that you cannot pick up on the job, is, at best, misguided.
Doing is thinking. You go to school to learn to think. No matter how technical your training, the goal is to help you learn how to apply your tools effectively to the problems those tools help you solve. Imitative AI keeps you from doing and thus from learning. By substituting its random responses for your own effort, imitative AI retards your ability to learn. In a very real sense, imitative AI is anti-education. If you care about students, you will keep it as far from them as possible.

