Education policy maven Rick Hess of the American Enterprise Institute think tank offers straight talk on matters of policy, politics, research, and reform.
Hard to believe that it was just last fall that ChatGPT was unveiled. In the months since, AI has leapt from the realm of science fiction into our daily lives. Educators have struggled to make sense of these new tools and the attendant opportunities and risks—all while being bombarded by hype and hand-wringing. As we commence the first school year of the AI era, it seemed like a good time to check in with the American Enterprise Institute’s inimitable John Bailey, the individual I most trust to talk straight on what AI may mean for schooling. Over the years, John’s many roles have included serving as director of educational technology for the Pennsylvania and U.S. departments of education. In Education Next, John recently penned the must-read “AI in Education,” which covered a lot of ground and thus didn’t get to go as deep into practical suggestions as it might’ve. I wanted to give John a chance to talk more concretely about opportunities for educators, learners, and parents. Here’s what he had to say.
Rick: You recently penned a new piece on AI and schools. What’s the thumbnail sketch for folks who haven’t read it?
John: ChatGPT captured the country’s imagination with its ability to answer questions, write essays, summarize documents, write software, and analyze data. I tried to explain how these generative AI technologies work, how they can be used in education, and also some of the challenges they pose. It’s difficult to describe AI in the abstract, so I provide some example prompts that demonstrate what these systems are capable of.
Rick: OK, big picture: You know as well as anyone that schools have been swamped over the years with overhyped tech. Is AI different?
John: That’s very true and important to acknowledge. I’ve also been guilty of contributing to the hype only to be disappointed when the promised benefits fell short. The internet opened up a world of information and made it easier to distribute new resources. Personalized learning tried to curate resources to meet individual needs, but it was expensive and had limited capabilities. And many technology products just complicated the lives of teachers instead of simplifying them.
Generative AI is different. Rather than simply expanding access to resources, it expands human capabilities. That sounds strange until you play with these tools. Anyone with a phone and an internet connection now has on-demand access to AI assistants capable of performing a growing range of sophisticated tasks that you would normally assign to a human intern or teaching assistant, and you can direct them using normal, everyday language. We're only beginning to grasp the potential and challenges of putting these rapidly evolving AI capabilities into millions of hands.
Rick: What has you most optimistic about AI’s potential impact in education?
John: We’re still trying to understand the potential of these new AI systems, but they’re promising for two main reasons. First, they can handle an expanding range of tasks that could be helpful to teachers and students, and their ability to make sense of complicated data could support better decision-making. Second, all of this is achieved using natural human language, which dramatically lowers the barrier to entry for people to use these systems.
Rick: What has you most concerned?
John: While AI systems are advanced, they can produce misleading information, or “hallucinations.” There’s also the risk that the more we rely on tools like this, the greater the chance we’ll “fall asleep at the wheel” and simply defer to their outputs instead of reviewing them and exercising our own judgment. Bias is another problem, in all of its facets: racial, political, pedagogical, etc. Past disappointments with technology might also limit our view of AI; in fact, I worry we may be underestimating its potential. And finally, I’m concerned that in policy conversations, there is too much focus on minimizing potential harm from AI rather than maximizing its benefits.
Rick: You’ve written that AI could serve as an instructional assistant, serving as a tutor or helping explain difficult concepts to students. Can you get more concrete about what that might look like?
John: Understanding the capabilities of these systems allows us to harness them effectively in educational settings. They excel at conversation and concept explanation. With a simple natural-language prompt, they can emulate tutors offering real-world examples or adopt a Socratic approach. Systems like ChatGPT have enabled tools like Khan Academy’s Khanmigo, Duolingo Max, and CheggMate to offer instructional support.
Rick: Have there been any rigorous studies that speak to how effective these AI-based tutoring platforms are?
John: It’s been less than a year since these models were released, and even less time since some of these tutors were built, so there hasn’t been time to conduct efficacy studies. The research that I do think is important to follow is on the capabilities of these models. OpenAI’s technical report details how GPT-4 passed many of the assessments we use to measure human intelligence. Google released a similar report demonstrating that its medical model became the first to exceed a “passing” score on the U.S. Medical Licensing Examination. Microsoft researchers found GPT-4 is close to human-level performance on a wide range of tasks. A new Wharton study found these systems beat MBA students in generating creative ideas. The question is how we can deploy these capabilities to help improve teaching, strengthen learning, and better support parents.
Rick: You’ve also written that AI could help reduce the amount of time teachers devote to administrative tasks. Can you get a little more concrete about how this might work?
John: Teachers spend more and more time on administrative tasks and less on actual teaching. There’s a hope that these AI tools might take on some of that administrative work and free up teachers to spend more time with students. AI systems could serve in the role of a teaching assistant, helping to develop lesson plans, create differentiated materials, design student worksheets, and brainstorm ways of explaining a concept to students. In the years ahead, I think we’ll see these kinds of capabilities embedded in a lot of existing tools that teachers already use, like Google Classroom.
Rick: And you’ve suggested that AI can also help parents with education-related tasks—like requesting IEP services or generating bedtime stories. This seems like an oft-overlooked set of opportunities: What are some good places for parents to start?
John: I was speaking with a mom who was trying to secure some specialized IEP services for her son. But the process was bureaucratic and complicated, so much so that she typically uses a lawyer to engage with the district. I asked her to email me exactly what she would ask the lawyer to do, gave it to ChatGPT, and it produced a pretty incredible first draft of a letter. It wasn’t perfect, but it was remarkable in both the way it constructed the justification and the assertive tone in which it made its case.
Rick: Finally, in your Education Next piece, you also noted that AI could offer crucial assistance to administrators. What are some examples of what you have in mind?
John: Generative AI could be extremely helpful in drafting more parent-friendly documents. For example, I asked ChatGPT to take a sample letter to parents about assessment suggested by the Missouri education department and make it easier to understand and more persuasive about the value of assessments. It did a remarkable job, including fixing some of the typos in the original draft. AI could also be useful in producing meeting summaries and cleaning up transcripts from school board meetings. There are just a number of tedious tasks that seem ripe for AI automation.
Rick: What’ll it take for AI to do any of this well? Can you offer a few tips for educators and system leaders on that score?
John: The first step is really to play with these tools to better understand what they can do. Think about what you would email an intern to do and use that as a prompt. If it doesn’t get it right, just respond conversationally with what you need it to do better. Second, think about different ways these capabilities could be used in education. aiEDU has a number of helpful resources and conducts workshops around the country that can help. And finally, make sure to deploy these AI tools safely and responsibly. The U.S. Department of Education in partnership with Digital Promise produced a useful guide outlining some of these best practices.
The opinions expressed in Rick Hess Straight Up are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.