ChatGPT and other artificial intelligence, or AI, writing tools can generate humanlike stories, essays, poems and other written forms. Writers can use these tools in many ways, including as a muse that inspires ideas; a co-author that helps craft text; a reviewer that provides constructive feedback; an editor that checks the details; or a ghostwriter that writes without credit.
Educators have many concerns about the impact of these powerful tools on teaching, learning and using writing in schools. Should some uses of AI be considered appropriate, while others are treated as a modern form of plagiarism? Should students master certain writing skills before being allowed to use AI tools? Can we monitor how students use them? Do AI tools fundamentally change what students need to learn and how they should be taught?
A common initial response is to ban the use of ChatGPT. However, bans will be futile as AI writing capabilities become widely available and integrated into word-processing programs. We must accept that AI tools are changing how writing is accomplished in every field and recognize that students need to learn to use them effectively.
Reforming writing in schools requires careful consideration because it will involve changes in curriculum standards, teaching practices, student assessments, teacher preparation and education policies. In some ways, this parallels past changes in mathematics education, in which calculators went from being banned to being required. These changes take time and cannot move as quickly as AI tools are advancing.
The impressive capabilities of AI writing tools come with important limitations for educators to consider, including the following:
AI systems do not replicate human knowledge, cognition or emotion. AI systems are trained by processing an enormous corpus of digital text. By contrast, much of human knowledge stems from goal-driven activities, social interactions, modeling of others and other interactions in the real world. These experiences lead to embodied understandings of causes and effects; emotional intelligence involving understanding others’ needs, motives and perspectives; a sense of family, community and culture; and, perhaps most importantly, a sense of self. AI will never match the richness of the human experience.
AI writing quality is limited. Since AI-generated text is based on patterns found in the training texts, it often has a dull, written-by-committee style that lacks engaging, creative expression. In addition, AI tools struggle with complex ideas, so their output is often overly simplistic and unconvincing.
AI systems are often outdated. AI systems are trained when created and are not continuously updated, so they can produce outdated information and fail to respond well to requests that require timely knowledge.
AI systems can produce harmful content. The internet materials used to train AI systems can include racist, sexist, homophobic and other forms of offensive content. As a result, AI can generate unintended (or intended) toxic outputs.
AI systems can lack veracity. AI tools can fabricate statistics, historical events, quotes, references and all sorts of other information, often producing authoritative-sounding text that is simply untrue.
Given these limitations, AI tools do not produce quality text at the push of a button. Using them effectively requires that students learn a set of skills that form the acronym SPACE, encompassing new forms of human-computer interaction for accomplishing writing tasks.
Educators must understand and embrace the changes driven by advances in AI, and it is time to begin the challenging work of reforming how we teach students to write with AI tools. Success will require collaborations of educators, researchers, AI experts, policymakers and others across the public and private sectors, focusing on what students need to learn to be successful in the AI-augmented world in which they will — and already do — live.
•••
Glenn M. Kleiman, Ph.D., is a senior adviser at the Stanford Graduate School of Education’s Accelerator for Learning. He was previously executive director of the Friday Institute for Educational Innovation and professor of education at North Carolina State University.
The opinions in this commentary are those of the author.