[Photo: Students at a high school in Karlsruhe, Baden-Württemberg, 26 July 2023]
Reflecting on his time as an undergraduate at Stanford, Sam Altman, CEO of OpenAI, said he explored courses outside computer science, including creative writing. “Taking writing classes was great!” he said in a May interview at Harvard Business School, encouraging students to “be relentlessly resourceful.”
The irony? Altman, who was speaking at one of the world’s top universities, leads the company behind ChatGPT, the AI tool that has upended school writing assignments — creating confusion among educators and students — since its launch nearly two years ago.
Now, in what may be the first such case, the parents of a Massachusetts high school student are suing school faculty and officials after the school disciplined him for using AI on a social studies paper. The family’s attorney said the student used AI to generate notes and ideas, not to write the paper itself.
It’s a case that is likely to draw attention, as schools worldwide grapple with the implications of these tools in the classroom. With AI hailed as an innovative tool by some and condemned as plagiarism by others, how should schools and educators adapt?
The greatest challenge facing schools today is ambiguity. With AI tools like ChatGPT and Microsoft Copilot becoming commonplace, school districts as well as private schools must establish clear, transparent policies on AI usage.
Academic integrity guidelines need thorough updates to clearly define the difference between plagiarism and the ethical use of AI. Without clarity, students, parents and teachers are left confused and frustrated.
Superintendents and administrators should spearhead discussions on this issue, establishing a shared understanding of AI’s role in classrooms. If schools are unprepared to enact comprehensive policies, a phased approach is necessary, gradually introducing guidelines that align with both classroom needs and broader educational goals.
But this change can’t stop at the administrative level.
Teachers must ensure that classroom expectations mirror school policies and communicate these to students before each assignment. Consistency between school and classroom policy is key to preventing confusion.
Sal Khan, founder of the non-profit Khan Academy, recently wrote a book, Brave New Words: How AI Will Revolutionize Education (and Why That’s a Good Thing). Khan argued that AI tutors such as Khanmigo can enhance students’ learning and even their writing skills. Developed and deployed ethically, AI can teach students to write a clear thesis statement and build an effective outline. It can also offer real-time feedback on writing, letting students benefit much as they would from working with a writing coach.
[Photo: Students Josephine Chan, Sahir S., Arohi Brahmachari, Iris Wang and others in Palo Alto, CA, March 31]
AI is transforming industries and the workforce, assisting professionals with writing, programming and marketing tasks. AI tools have been integrated into finance, manufacturing, e-commerce, IT and other industries to boost productivity. Just as we wouldn’t ban the use of the internet in schools, it doesn’t make sense to shield students from AI tools that will be critical to their careers and future success.
Major AI platforms like ChatGPT and Microsoft Copilot are already integrated into educational environments. For instance, OpenAI introduced ChatGPT EDU to universities and colleges.
Elite institutions around the country, including Harvard University, have already adopted the service. Princeton University has made Microsoft Copilot available to faculty and students.
High schools must adapt to this technology, following the example set by colleges and universities.
Banning AI in schools is not only impractical but also counterproductive. When teachers don’t want students to use AI for certain subjects, they should simply administer tests in a tech-free environment.
Rather than viewing AI as a threat — much like calculators, the internet, educational apps or YouTube videos have been in the past — educators should treat it as an assistant.
AI can provide organized information and insights, but it doesn’t eliminate the need for thoughtful work if teachers redesign their requirements.
By allowing students to use AI as a baseline, schools can raise the bar for evaluation. Creativity and originality will still stand out, and those who go beyond AI-generated content will earn higher grades.
AI can be leveraged to improve STEM education in K-12 schools. Machine learning and related technologies can be taught in connection with coding and robotics, so that middle and high schools can better prepare their students for college and careers.
[Photo: A student monitors his robot on a rescue course in Sydney, Australia, 9 August 2007]
We should also adapt what we learn, how we learn and how we evaluate learning outcomes.
Teachers can challenge students to incorporate AI while still demonstrating their own thinking. This might mean guiding students to use AI as a starting point and then pushing for higher-quality analysis and unique perspectives. Students may also be asked to debate AI’s viewpoints, an exercise that fosters critical thinking.
Part of AI’s integration into education involves teaching students to use it mindfully and responsibly. The Young Data Scientists League, a non-profit organization focused on empowering middle and high school students to use data science for the public good, is launching new projects around AI ethics. Its executive director, AI researcher Evan Shieh, argues that students can engage in projects such as sociotechnical audits of AI models to help make these systems more equitable.
Shieh and Faye-Marie Vassel, a postdoctoral fellow at the Stanford Institute for Human-Centered AI, show in a recent study that racial and gender biases persist in large language models and multimodal AI systems.
Therefore, using AI without critically evaluating the fairness and accuracy of its generated content and training datasets risks perpetuating biases or misleading learners.
Students should be given opportunities to critique stereotypes and discriminatory narratives in AI-generated content, and their feedback should be shared with the companies developing these AI models.
Citing AI-generated materials should also become standard practice. Just as students are taught to cite books, journal articles, websites and databases, they should learn to cite AI tools using formats like MLA or APA. Both associations have announced specific guidelines on how to cite AI-generated materials.
Finally, educators themselves need support.
In response to this author’s email, Po-Shen Loh, a mathematician at Carnegie Mellon University and founder of a non-profit that blends math with improvisational comedy for online peer tutoring, shared his insights on minimizing AI-related “cheating” and addressing a deeper civilizational challenge: what kind of humans do we want today’s education to nurture?
Loh urges educators to reconsider whether what they are teaching is genuinely meaningful in today’s world.
Loh’s teaching approach, both at Carnegie Mellon and with middle school students globally, emphasizes helping students “invent solutions to unfamiliar problems.” He makes homework only 10% of the grade, allowing collaboration and internet use, while 90% of the grade comes from in-class, device-free exams and quizzes. Exam questions are related to, but do not mirror, homework problems, encouraging students to “invent new ways of thinking.”
In his online math tutoring program, Loh pairs professional actors with star math students to help peer learners become more emotionally engaged, communicative, and supportive of others, all while tackling difficult math problems. Loh argues that education in the AI era should cultivate a service mindset, inspiring students to “create value for others,” rather than focusing only on personal gains.
AI can’t be demonized or dismissed — it’s becoming as prevalent as internet searches. Teachers need training to understand how AI works and how to incorporate it thoughtfully into the teaching and learning processes.
Like any tool, its value depends on how it’s used. We don’t consider looking up information online cheating, so why would using AI for research be any different?
AI is reshaping education, and it’s time for schools and educators to adapt. Rather than resisting this change, we must guide students to use AI responsibly.
We should re-envision assignments, learning objectives and evaluation standards. By doing so, we can ensure that students are prepared not only to grasp new knowledge and skills, but also to build a future in which AI is developed and used ethically.
Update: This article has been updated to include comments from Professor Po-Shen Loh of Carnegie Mellon University.