“Taking writing classes was great!”

On May 1, 2024, speaking at Harvard Business School, Sam Altman, CEO of OpenAI, reflected on his undergraduate years at Stanford, where he explored courses outside computer science, including creative writing. He also encouraged students to “be relentlessly resourceful.”

The irony? Altman leads the company behind ChatGPT, the AI tool most often accused of “threatening” writing in schools, yet there he was, praising writing classes at one of the world’s top universities.

Recently, a Massachusetts high school was sued by parents after disciplining a student for using AI to generate notes and an outline for a social studies paper.

As AI is hailed as innovation in one setting and condemned as plagiarism in another, how should schools and educators adapt?

Schools Must Clarify AI Guidelines

The greatest challenge facing schools today is ambiguity. With AI tools like ChatGPT and Microsoft Copilot becoming commonplace, school districts as well as private schools must establish clear, transparent policies on AI usage.

Academic integrity guidelines need thorough updates to clearly define the difference between plagiarism and the ethical use of AI. Without clarity, students, parents, and teachers are left confused and frustrated.

Superintendents and administrators should spearhead discussions on this issue, establishing a shared understanding of AI’s role in classrooms. If schools are unprepared to enact comprehensive policies, a phased approach is necessary, gradually introducing guidelines that align with both classroom needs and broader educational goals.

But this change can’t stop at the administrative level.

Teachers must ensure that classroom expectations mirror school policies and communicate those expectations to students before each assignment. Consistency between school and classroom policy is key to preventing confusion.

Time to Make AI the Baseline of Education

The founder of the non-profit Khan Academy, Sal Khan, recently wrote the book Brave New Words: How AI Will Revolutionize Education (and Why That’s a Good Thing). Khan argues that AI tutors such as Khanmigo can enhance students’ learning and even their writing skills. Developed and deployed ethically, AI can teach students to write a clear thesis statement and build an effective outline. It can also offer real-time feedback on writing, letting students benefit much as they would from working with a writing coach.

AI is transforming industries and the workforce, assisting professionals with writing, programming, and marketing tasks. AI tools have been integrated into finance, manufacturing, e-commerce, IT, and other industries to boost productivity. Just as we wouldn’t ban the use of the internet in schools, it doesn’t make sense to shield students from AI tools that will be critical to their careers and future success.

Major AI platforms like ChatGPT and Microsoft Copilot are already integrated into educational environments. For instance, OpenAI introduced ChatGPT EDU to universities and colleges.

Elite institutions around the country, including Harvard University, have already adopted the service, and Princeton University has made Microsoft Copilot available to faculty and students.

High schools must adapt to this technology, following the example set by colleges and universities.

Banning AI in schools is not only impractical but also counterproductive. In subjects where teachers don’t want AI used, they can simply administer assessments in a tech-free environment.

Redesign Assignments and Adjust Evaluation Criteria

Rather than viewing AI as a threat, educators should treat it as an assistant—much like calculators, the internet, educational apps, and YouTube videos have been in the past.

AI can provide organized information and insights, but it doesn’t eliminate the need for thoughtful work if teachers redesign their requirements.

By allowing students to use AI as a baseline, schools can raise the bar for evaluation. Creativity and originality will still stand out, and those who go beyond AI-generated content will earn higher grades.

AI can also be leveraged to improve STEM education in K-12 schools. Machine learning and related technologies can be taught in connection with coding and robotics, so that middle and high schools can better prepare their students for college and careers.

What we learn, how we learn, and how we evaluate learning outcomes should all adapt.

Teachers can challenge students to incorporate AI while still demonstrating their own thinking. This might mean guiding students to use AI as a starting point and then pushing for higher-quality analysis and unique perspectives. Students might also be asked to debate AI-generated viewpoints, an exercise that fosters critical thinking.

Train Students to Use AI Critically

Part of AI’s integration into education involves teaching students to use it mindfully and responsibly. The Young Data Scientists League, a non-profit organization that empowers middle and high school students to use data science for the public good, is launching new projects around AI ethics. Its executive director, AI researcher Evan Shieh, argues that students can take on projects such as sociotechnical audits of AI models to make these systems more equitable.

Shieh and Faye-Marie Vassel, a postdoctoral fellow at the Stanford Institute for Human-Centered AI, show in a recent study that racial and gender biases persist in large language models and multimodal AI systems.

Therefore, using AI without critically evaluating the fairness and accuracy of its generated content and training datasets risks perpetuating biases or misleading learners.

Students should be given opportunities to critique stereotypes and discriminatory narratives in AI-generated content, and their feedback should be shared with the companies developing these models.

Actively citing AI-generated materials should also become standard practice. Just as students are taught to cite books, journal articles, websites, and databases, they should learn to cite AI tools using formats like MLA or APA. Both associations have published specific guidelines on how to cite AI-generated materials.

Educators Need Training, Too

Finally, educators themselves need support.

AI can’t be demonized or dismissed—it’s becoming as prevalent as internet searches. Teachers need training to understand how AI works and how to incorporate it thoughtfully into the teaching and learning processes.

Like any tool, its value depends on how it’s used. We don’t consider looking up information online cheating, so why would using AI for research be any different?

AI is reshaping education, and it’s time for schools and educators to adapt. Rather than resisting this change, we must guide students to use AI responsibly.

We should re-envision assignments, learning objectives, and the standards by which we evaluate students’ work. By doing so, we can ensure that students are prepared not just to grasp new knowledge and skills, but for a future built on ethics-embedded AI.
