The Reverse-Centaur Mental Model for AI Use
I recently spoke with my potential advisor, who said she would like to hear my views on AI in the classroom when we meet next semester.
The next day, in a moment of serendipity, Cory Doctorow wrote about reverse-centaurs in his newsletter.
This is a fantastic mental model for AI use.
Don't become the assistant in your own thinking.
According to the essay, a centaur is a person who uses AI to assist them with mundane tasks. Doctorow's example involves using Whisper to scan 40 hours of audio files to find a specific claim from a podcast. Something he never would have had time to do himself.
A reverse-centaur is a person who becomes an assistant to the AI. The example given is a freelancer who is hired for a job but not given enough time to complete it correctly. The freelancer has no choice but to delegate all of the work to AI, because they're doing a job that used to be done by dozens of journalists and interns.
These are obviously extreme ends of a full spectrum of AI use, and most users are likely to fall somewhere in the middle. But even if you aren't a full reverse-centaur, it's important to remain mindful of how you're using AI in the moment.
My views on AI in the classroom
Students are going to use AI whether it’s against the rules or not. They need to be taught how to use it in a way that doesn’t disrupt their own thinking.
A fundamental aspect of note-making is learning and/or clarifying your thoughts through personal connections and analogies. Because personal thought is so foundational to a course on generative note-making, I think it's perfect for introducing these ideas.
One of the biggest hurdles currently is school administrations that are focused on preventing the most obvious reverse-centaur behavior: cheating. But this is one of those areas where we have a choice. We can try the abstinence-only route and hope impulsive teenagers will follow the rules and avoid AI altogether, or we can face the reality that there will always be students who figure out new ways to cheat. But these aren't the students administrators should be worried about.
The students who would benefit most from education have good intentions. They're not the type who would normally cheat on an exam or plagiarize a paper, but they don't see using AI against the rules as cheating. They may genuinely believe they're learning while unintentionally outsourcing the most important cognitive work to AI.
This is because AI makes it easy to fall into a version of the collector's fallacy. The collector's fallacy warns us that we often believe we know something when we actually just know about something. AI tells you a lot about something but often works against the knowing. Even if it doesn't hallucinate a straight-up lie, easy answers hinder our growth.
Because growth and learning require struggle. We have to wrestle with issues, concepts, theories, and philosophies that we don't understand.
I've played video games my entire life. For the first 13 years I had nothing to help me when I got stuck. I'm still incredibly proud of the fact that I beat the original Legend of Zelda on my own when I was 8. No internet, it didn't exist yet. No Nintendo Power Magazine, my mom wouldn't get me a subscription. Just me, a 5-subject notebook, and many hours mapping my way through mazes and burning trees one at a time, being forced to leave the screen with each new tree. Hours spent trying, failing, making notes, trying again, and failing again. And feeling so much pride in the successes.
Once I had access to the internet, try as I might to finish a game without looking up a solution, it was just too easy to cave when frustration kicked in and I just wanted to progress with the game.
That's going to be AI for many kids if we don't help them navigate the minefield without just saying "don't do it."
Centaurs in the classroom
Centaur mode for the classroom could look like AI scanning your notes to consolidate all your writing on a specific topic. It could be summarizing a freewriting session to eliminate all of the noise. It could be asking for a list of peer-reviewed sources on the topic that you plan to read yourself.
The common factor here is that you're doing the important work. You're reading and processing the sources provided. You're synthesizing the reading and writing on the topic to clarify your own thoughts. AI is just saving you time with admin tasks like searching and organizing.