AI Can Revolutionize Education, but Not in the Way We Think
My inbox is full of pitches from tech companies offering new AI tools that promise to revolutionize education by making it faster, more efficient, and more individualized. But what if we took the arrival of AI not as an opportunity to make the existing machine work better, but as an opportunity to rethink that machine?
From my perspective, the most interesting aspect of the first few years of AI in schools is less what we have learned about AI and more what it has revealed about education. "I'm a high schooler, and AI is demolishing my education," read the headline of a recent Atlantic piece. Similar pieces have been published about universities, where students no longer read books but instead rely on AI summaries. A recent podcast about AI in education was titled "The Homework Machine" and detailed the ways in which students were now using AI not simply to write essays but also to solve math equations and fill out science worksheets. In turn, schools and teachers have adopted AI detection software, and students have taken to AI humanizer software, which applies an additional filter to artificial intelligence writing to make it seem more "human." Truly a dystopian future.
But what we are learning here, to my mind, is less about AI and more about the nature of many classrooms. In schools where students are asked to fill out the same rote assignments year after year, where relationships between students and teachers are antagonistic or impersonal, and where students see little purpose in what they are doing other than punching their ticket for the next meritocratic hurdle, it is not surprising that we are seeing this escalating AI arms race between students and teachers. In many ways, what is happening here is not that different from fraternities distributing old copies of exams or the age-old practice of students copying each other's homework. AI is just accelerating the process.
We need to break out of this downward spiral. The real villain is not AI but the tasks that are so commonplace in contemporary schooling. And here is where there is a real opportunity for reinvention. Here is a theory of action for you: If students are given meaningful, original, and authentic work, if they see the purpose in what they are doing, and if they form deep and genuine relationships with their teachers, then they are less likely to pass off AI work as their own, and more likely to put in the time and effort that real learning requires. If students can be enlisted in their own education, then AI can become a tool for research and synthesis, much as it already has become in many offices. But if students and teachers view one another as opponents, then AI just becomes the latest salvo in the cat-and-mouse game between them.
To put it another way, much of what we assign for homework is exactly what AI was made for. Read this passage, summarize the main points, and fill in a worksheet or a PowerPoint slide. Solve this problem using the algorithm we gave you. That is basically low-level mechanical thinking, so it is not surprising that artificial intelligence is particularly well-suited to it. Add the fact that no human being would want to do these things, and you have a recipe for homework by Copilot. But if, instead, students are asked questions that matter to them, then they are more likely to do their own work. Relatedly, the more distinctive and original the task, the less likely it is that AI can provide the answer.
For example, I recently visited Crosstown High in Memphis, and students showed us with great pride the massive mural they were creating for the nursing home across the street. They had met with the residents, received input on which designs would be desirable, and were working in teams to transform a wall overgrown with weeds into something of which both the school and the nursing home could be proud. This is the kind of assignment that gets students excited about their learning, that creates work done publicly and in teams, and that is not susceptible to cheating, AI-assisted or otherwise.
Zooming out a bit, many schools, districts, and states now have portraits of a graduate, which are visions of the kinds of citizens they hope their young people will be when they leave. Ensconced in these documents are qualities like critical thinking, effective communication, collaboration, empathy, and citizenship. These are not things that AI can do for you. But there is a considerable mismatch between these widely stated goals and the daily realities of schooling, which, with some exceptions, tend toward the same kind of rote, traditional tasks they always have. The hidden curriculum of these tasks is highly individualistic and passive learning: sit in rows and lines, assimilate content, and then reproduce that content on exams, problem sets, and essays. The irony of the first wave of AI tools is that they promise to help students move more efficiently through this conventional grammar, when the real lesson is that, because AI can do these tasks, we should be channeling our human learning energies in a different and more ambitious direction.
For example, consider my colleague Marshall Ganz’s Organizing class. On the very first day of class, students are sent out into Harvard Square to try to organize something. Trial by fire. And then, over the course of the semester, they learn many of the techniques and strategies of organizing—conducting one-on-one interviews, sharing a personal narrative, learning how to weave together emotion with reason to develop an organized coalition in support of a cause. And they get to pick the cause—something that matters to them that they see as important. In this work, they might draw on AI, for example to summarize the themes of their interviews, but they are the ones leading the work. This is what Marshall refers to as “heads, hands, heart” pedagogy: students need to think and strategize (head), but they also need to discover their courage (heart), and they need to develop the skills to put their ideas into action (hands). When education draws together these qualities, it becomes a powerful human experience, not something that can be replaced by AI.
There also are educational situations where we wouldn't want to use AI at all. These are places where students are developing their own fundamental competencies: their ability to put one word after another, to analyze a mathematical problem or puzzle. Having AI do these things for you essentially short-circuits your own thinking. It's like GPS: it might get you where you are going, but you still won't know how to get there on your own. Education is fundamentally a process of growing one's capacities. It doesn't matter that you will never be Messi or Adele; it is still worth learning to kick a soccer ball or hit a high note, even if there is someone (or something) that can do it better. The more we can enlist students to take charge of their education, the more they will be able to make decisions about when, how, and under what circumstances AI can be helpful in their own growth. Particularly as students get older, involving them in these conversations is an important part of growing their ethical literacy around AI.
Finally, there is the role of AI in the workforce. While it is too soon to say definitively, it seems likely that AI is going to eliminate quite a bit of what we used to think of as moderately skilled white-collar work. Machines can move numbers from one spreadsheet to another, make PowerPoint presentations, write code, and even make music and advertisements.
In this world, what human skills will be valuable? I’d nominate at least the following three capacities:
- Originality and creativity will be prized: people who can come up with new ways of seeing things and generate fresh interpretations.
- Intra- and interpersonal skills will be critical: executive functioning, leading teams, organizing streams of work, and forging collective purpose.
- Judgment and decision-making may turn out to be the most important and most human skills of all: being able to look at various possible options and drafts and decide what is worth moving forward.
If these are the kinds of capabilities that human beings are going to need, we need to build an education system that reflects them. We have an opportunity to re-envision schools to meet the needs of the workforce and the citizenry, and if we do, AI will return to its rightful place as a tool in that larger human enterprise.
Photo at top courtesy of Thrive.
