Imitative AI Makes You Dumber
Imitative AI carries a lot of costs. The environmental costs are enormous; the damage done to artists' and writers' careers is real (and no, the use of other people's work for training data is not fair use, at least according to one court); and now, it seems, its use degrades your ability to think.
That is not my opinion, but rather the opinion of the people who create imitative AI. Microsoft, specifically, has done research:
A new paper from researchers at Microsoft and Carnegie Mellon University finds that as humans increasingly rely on generative AI in their work, they use less critical thinking, which can “result in the deterioration of cognitive faculties that ought to be preserved.”
“[A] key irony of automation is that by mechanising routine tasks and leaving exception-handling to the human user, you deprive the user of the routine opportunities to practice their judgement and strengthen their cognitive musculature, leaving them atrophied and unprepared when the exceptions do arise,” the researchers wrote.
This is not surprising. To the extent that imitative AI can do work for you, it is generally work that shortcuts your own cognitive processes. If you are not writing that report, then you are not really thinking through the issues in it. If you are letting the tool generate the boilerplate code for you, then you are not thinking through the problem, the trade-offs and issues. You are not thinking.
This is not just my opinion, by the way. It is the opinion of one of the leading imitative AI companies, Anthropic:
“While we encourage people to use AI systems during their role to help them work faster and more effectively, please do not use AI assistants during the application process,” the applications say. “We want to understand your personal interest in Anthropic without mediation through an AI system, and we also want to evaluate your non-AI-assisted communication skills. Please indicate 'Yes' if you have read and agree.”
Using these tools carries, among its more concrete costs, opportunity costs. It is not much of an issue when a senior developer, for example, uses one to shortcut their work a bit. As long as they properly vet the output, they might be able to speed up some repetitive tasks. Or they might not: going over all the output with a fine-tooth comb looking for hallucinations and mistakes is not necessarily a time saver. And that is just the concrete cost.
A senior developer, a good communicator, can understand the many, many, many limitations of imitative AI. But someone new to the task cannot. And more importantly, everyone starts out knowing little about their work. Work is a form of learning, a form of thinking. Anthropic clearly knows that there is a cost to using its tools: if you let people who are still learning a task use those tools, then they aren't learning, aren't thinking. You miss the opportunity to teach or to learn through work, through effort. You never become truly good at the task because you never truly learn it.
This has traditionally been one of the problems with outsourcing critical work: you lose the ability to do the work yourself and find yourself at the mercy of the people who can think, who can do the work. Inevitably, the cost of not having the knowledge, the expertise, overwhelms the supposed savings of having someone else do the critical thinking for you. Imitative AI has the same fundamental problem. It cannot generally do important, creative work. But by farming out the basic work, you cut off the next generation of people who would, or should, do the creative, important work, even if imitative AI worked perfectly. Which it does not.
Imitative AI is not a shortcut, not really. It gives short shrift to learning and thinking. You cannot get to understanding, to competence, by relying on the work of others. It's a cliché, but cheating only really hurts you, and imitative AI is perhaps the most hyped form of cheating in modern history. There is no shortcut for work, whatever the imitative AI firms may tell you in their ads.
And now, we know they know that as well.