Technologists: Please Pay Attention to Social Systems
See, I told you someone in tech would do something dumb soon.
Dumb may be a bit harsh for this conversation, but it certainly isn’t smart. A technologist and former teacher (I mean, he is a professor at the University of Albany, so perhaps you could still count him as a teacher, but in my experience, professors aren’t generally teachers first) has created a system, inevitably called AI though it really isn’t, to monitor teachers, with the intent that it serve as a sort of artificial mentor:
"I wanted that feedback from a peer or an instructional coach but I didn't always have access to that," he said.
Teachers often say they get little feedback on their teaching methods. Although many new teachers are assigned a mentor, that mentor also has their own classroom, and can usually only observe the teacher briefly. Principals also observe the classroom occasionally. But it's not enough, Foster said.
"I didn't have the ability to collect all the data that I wanted in my classroom environment," he said.
This system will not be used to help teachers, at least not in the majority of cases. This professor has designed a surveillance system, and that is pretty much all it is:
Enter AI. The AI tool uses cameras and audio recordings to report on whether the teacher looked at or walked through each section of the classroom, how often they used group work, and many other techniques. Even the words the teacher and students use are tracked.
"Am I engaging with student reasoning, am I asking students to explain their reasoning, am I giving my students opportunities to use rich mathematical vocabulary, am I using rich mathematical vocabulary?" Foster said, listing items the AI reviews.
This is merely a tool of the panopticon. This information will inevitably be used for two things, and almost certainly two things only: micromanaging teachers to the students’ detriment and punishing teachers who do anything other than the most rote teaching. Every day will be an exercise in box checking: did I spend enough time in this corner of the room, even though the kids there get the lesson and don’t need as much help? Did I use enough of the proper words, even if those words don’t resonate with the kids? Did I walk around enough, even though some kids freak out when I stand behind them? Did I make the machine happy, even if making the machine happy wasn’t the best way to teach this group of kids on this day? The pilot teachers are, of course, already complaining that the system doesn’t record everything important.
That failure is predictable. I am assuming this isn’t really imitative AI, though there are few details, and is closer to what we used to call an expert system, likely with some natural language interpretation tossed on top. Either way, the system is only going to recognize what it can clearly “see” and correlate with its rules and/or training data. Inevitably, there will be misses in what it can “observe” due to technological limitations, and inevitably there will be things good teachers do that it does not appreciate because its rules and/or training set do not encompass those actions. And since it is an ironclad rule of organizations that only that which is measured is rewarded, teachers who do the good things the system does not recognize will be punished.
The counterargument is that the system is not meant to be used that way. It is meant to advise teachers, to help them improve, not to punish them. To which I counter: pull the other leg, it’s got bells on. In a world where student achievement is largely measured by test results and business and political leaders are largely dismissive of anything that is not “science” or “mathematics” based, there is absolutely no way that this will not largely be used to control and constrain teachers. Anyone who has paid any attention over the last thirty or so years can see that, and it is wrong either to build something without paying attention to the society it will operate in or to pretend that you do not realize the likely uses of your technology.
And these are likely uses. Sometimes, many times, new technology has unintended or unforeseen consequences, both good and bad. But the uses for this technology are so plain, so clear, that it beggars belief that you could know anything about our current society and not see the way this will actually be used.
It may very well be bad that teachers do not get proper mentorship. The solution to that problem is, of course, hiring enough teachers so that new teachers get proper mentorship. Or giving teachers help with other tasks. Or giving them an environment that fosters collaboration among teachers, so that mentorship happens naturally. In other words, you have a staffing problem, so hire more staff, or build tools that the staff can use under their own power to lighten administrative tasks.
Now, the response to that notion is likely to be some version of “physician, heal thyself.” Where do I get off telling earnest technologists to pay attention to social systems when I am naive enough to believe that this country will hire teachers or support staff to fix a problem with teaching? And that response is at least partially correct. Better staffing or better tools under the control of the staff are very, very unlikely in our current environment. But that does not mean that this tool and tools like it are better than nothing. Sometimes, things are just bad. And this kind of panopticon of the classroom is just going to be bad.
Classrooms are all different. What works for one will not necessarily work for another, given the mix of students, situations, and so on. Heck, what works for a given classroom on one day may well not work on another. When a human being evaluates a teacher, they likely have enough experience and common sense to account for the conditions on the ground. A system like the one the professor is creating is not going to have that kind of flexibility. And so we get teachers who will have merely one more thing to worry about, one more thing to adjust their behavior toward, one more thing to think about aside from helping their kids. This system will inevitably make teaching and learning, overall, worse.
The sad part of all this is that the professor really does seem to genuinely want to help his fellow teachers. But because he thinks in terms of technology first and only, he does not see how his “solution” is going to interact with existing social systems to make the very problem he has identified worse. When you build something, the context in which it is going to be used should be an important part of your considerations. By ignoring that aspect of his solution, a person with good intentions is set to wreak havoc with the education of untold numbers of kids.
Everything you build operates in a social environment. You cannot ignore that environment and consider your tool complete or good or useful. That is the kind of blindness that inevitably leads to suffering for the people your tool is imposed upon. You, as a technologist, have an obligation to prevent that suffering as much as is humanly possible. Anything less is merely self-congratulatory nonsense.

