Imitative AI Firms Should Want Schools To Ban It. That They Do Not Is a Bad Sign.
There have been a couple of news items this weekend that highlight how desperate imitative AI companies have become, even if I am not sure they understand how desperately they are behaving. The schools participating in these programs certainly do not understand the reality of imitative AI, or they would ban it as a service to their students.
Imitative AI is not, as its proponents want you to believe, changing the world. Companies are increasingly abandoning imitative AI projects, finding that they do not scale or that they do not live up to their expectations. Only 25% of imitative AI projects deliver on their expected return on investment, a terrible figure for any category, much less one that promises to reconfigure work. An entire firm that promised to do your programming through imitative AI collapsed and revealed that its output was done by engineers, not AI. Work done by imitative AI is riddled with errors, errors the systems are poor at correcting. Newer models are producing MORE bullshit, not less. And, of course, using imitative AI is both terrible for your mental health and makes you dumber. No sane university would inflict that on its students.
Ohio State University (I am not indulging in the stupid “THE” nonsense. There are other state schools in your state, you pretentious dipshits), however, appears not to care about any of that. They intend to force imitative AI on all of their students, to make them ‘bilingual’ in AI, in the words of the university president. It is an intensely stupid idea, even if you think that imitative AI is going to be a large part of future work. Especially if you think that imitative AI is going to be a large part of future work. In fact, if you actually believed that, you would ban imitative AI immediately so that your students had a chance of being able to use the tools in the future.
Imitative AI cannot do your work for you, period, full stop. It has no model of the world, so it cannot under any circumstances be trusted to produce work without significant checking by someone who understands the domain the imitative AI tool is working in. Imitative AI is confidently wrong, and often wrong in odd ways. Worse, it gets less capable the more complex a problem gets, even when it has solved simpler versions of that problem. It cannot build upon anything, as it is merely calculating what should come next based on its training data and starting point. And no, better prompts do not solve that problem. In order to achieve any productivity gains from imitative AI, you MUST understand, well, the subject it is working in. And you cannot do that if you outsource your educational work to imitative AI itself.
You won’t learn — as noted above, you will become less capable. Even if you use it in moderation, it is doing the “thinking” for you. Programming, writing — those are not about producing a product but about learning how to think through problems. If you rely on imitative AI, you never think through the problem, you just produce an output. Every once in a while, I run across a programmer who clearly never learned to think about programming problems. They apparently got through by cheating or by leaning on online solutions and/or the features of their IDE to scrape by. But they cannot think about a problem, so they are helpless when something goes wrong or when they encounter something that does not neatly fit the easy solutions — which is most things in programming. Which is most things in most jobs. Those programmers, if they do not learn quickly, never last long. And Ohio State University wants to create an entire cohort of people who are those programmers — helpless, useless, waiting to be fired.
Imitative AI firms should be terrified by this announcement. Their tools can, under the right circumstances, help some people in some jobs be more productive. But schools that produce people who only know how to use the tools and do not know how to think about the domains the tools work in are setting their students and these tools up for failure. As mentioned above, employers are already seeing poor results from imitative AI projects, and this is with a generation of employees who were not taught to offload their thinking to a word calculator. Widespread use of imitative AI in schools makes the successful adoption of imitative AI tools less, not more, likely. Imitative AI businesses should be telling schools to introduce these tools, if at all, only at the very end of a student’s time at school. Only after a person understands how to reason, and understands their specific domain, can they hope to make good use of imitative AI tools. But imitative AI businesses cannot do that — because they are not close to making money.
Imitative AI firms have a problem. Even on their most expensive plans, they lose money — a lot of money — on each transaction. And unlike some other industries, there are no real savings in scaling. Imitative AI does not get cheaper the more it is used, and creating new models with hopefully new capabilities gets more expensive, not less, as the models are scaled up. While in theory a tool that moderately improves productivity in certain areas is a good little business, imitative AI costs so much that it cannot survive as a good little business. It needs to either replace entire industries — one of the reasons it focuses on Hollywood, as that is still human-intensive — or get a government bailout. State schools paying to license imitative AI tools for every one of their students? Adopted widely enough, that would be just such a bailout.
And so we get OpenAI pushing a whole new “education” initiative on schools, and other firms actively trying to convince other universities to harm their students the same way Ohio State is harming theirs. They need this short-term gain because their only hope is to be bailed out, either long enough to cash out or long enough to keep pushing bigger and bigger word calculators in the hope that, against all reason, they will produce something worth the billions, maybe trillions, they spend to get there. Instead of trying to make a business out of what these tools do well, they are rebelling against common sense and the benefit of their fellow people in the desperate hope that they will never have to pay the price. They would burn our lives for a future they will never see.

