Adapting your course to artificial intelligence
The release of ChatGPT, artificial intelligence software from the company OpenAI, has generated intense discussions about the growing capabilities of AI and how artificial intelligence might change jobs, disciplines, societies, and even the way we think. Artificial intelligence has been evolving for decades, but most educators have done little to consider how these new digital tools might change teaching, learning, and student expectations. For instance:
- What do we need to know about the latest AI platforms and how our students might use them?
- How might AI change how we teach, what we teach, what we assign students to do, and how we assess student work?
- What ethical issues do we need to address as we consider how AI might change the way we and our students think and act?
- The algorithmic assumptions and decisions behind AI software are hidden, so how do we evaluate potential new tools and learning platforms?
There are no definitive answers to those questions, but the resources on this page are intended to provide guidance and ideas on how to use artificial intelligence tools with your courses.
Some things to think about with AI
Talk with students about your expectations and how you will view (and grade) assignments generated solely with artificial intelligence. Emphasize the importance of learning and explain why you are having them complete the assignments you use. Why is your class structured as it is? How will they use the skills they gain? That sort of transparency has always been important, but it is even more so now. Students intent on cheating will always cheat. Some draw from archives at Greek houses, buy papers online, or have a friend do the work for them. ChatGPT is just another means of avoiding the work that learning requires. Making learning more apparent will help win over some students, as will flexibility and choices in assignments. This is also a good time to emphasize the importance of human interaction in learning, something artificial intelligence lacks.
As fluent as ChatGPT often seems, its answers rarely delve beneath the surface of a topic. It makes mistakes. It makes things up. Its responses provide no clues about how it is programmed or why it provides the answers it does. A Princeton researcher called it a "bullshit generator" because it creates plausible arguments without regard for truth.
All of that makes it a valuable teaching tool, though. By having students probe for answers, we can help them improve their skepticism, challenge assumptions, and question information. By having them fact-check, we can help them understand the dangers of fluid writing that lacks substance or that relies on fallacies. By having them use ChatGPT or other AI tools for early drafts, we can push them to ask questions about information, structure, and sources. By having them apply different perspectives to ChatGPT's results, we can help broaden their understanding of points of view and argument.
Many students are already using ChatGPT in their school work. We can no more ban students from using artificial intelligence than we can ban them from using phones or calculators or laptops. We need to talk with students about how to use ChatGPT and other AI tools effectively and ethically, though. No, they should not take AI-written materials and turn them in for assignments, but yes, they should use AI when appropriate. Businesses of all sorts are already adapting to AI, and students will need to know how to use it when they move into the workforce. Students in K-12 schools are using it and will expect access when they come to college. Rather than banning ChatGPT and other AI tools or fretting over how to police them, we need to change our practices, our assignments, and our expectations. We need to focus more on helping students iterate their writing, develop their information literacy skills, and humanize their work.
One way to help with that is reflection, which helps students develop their metacognitive skills. It can also help them understand how to integrate AI into their learning processes and how they can build and expand on what AI provides. Reflection can also help reinforce academic honesty. Rather than hiding how they completed an assignment, reflection helps students embrace transparency.
ChatGPT also requires new ways of thinking, and having students reflect on their use of it could prove valuable. For instance, ChatGPT rarely produces solid results on the first try, and it often takes several iterations of a question to get good answers. Sometimes it never provides good answers. That makes it much like web or database searching, which requires patience and persistence as you refine search terms, narrow your focus, identify specific file types, try different types of syntax and search operators, and evaluate many pages of results. Add AI to the expanding repertoire of digital literacies students need. (Teaching guides and e-books are already becoming available online.)
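That iterative questioning can be sketched in code. Everything below is illustrative: `ask()` is a hypothetical stand-in for whatever chatbot service you use (stubbed here with canned answers so the loop can run on its own), and the prompts are invented examples of moving from a vague question to a specific one.

```python
# Sketch of iterative prompt refinement. ask() is a hypothetical
# placeholder for a chatbot API call; the canned answers below only
# exist so the refinement loop itself is runnable.

def ask(prompt):
    """Stand-in for a chatbot query; a real version would call a service."""
    canned = {
        "causes of WWI": "Many factors caused World War I.",
        "causes of WWI, with three specific examples and sources":
            "1) Entangling alliances ... 2) Militarism ... 3) The July Crisis ...",
    }
    return canned.get(prompt, "I'm not sure.")

def refine(prompts):
    """Try successively more specific prompts, keeping each draft answer."""
    history = []
    for prompt in prompts:
        history.append((prompt, ask(prompt)))
    return history

drafts = refine([
    "causes of WWI",  # vague first attempt -> vague answer
    "causes of WWI, with three specific examples and sources",  # refined
])
for prompt, answer in drafts:
    print(f"PROMPT: {prompt}\nANSWER: {answer}\n")
```

The point for students is the shape of the process, not the code: keep every draft prompt and answer, compare them, and notice how specificity changes the output, much as refining search terms changes search results.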
ChatGPT's ability to create work as good as many early undergraduates means we will have to rethink assignments and curricula. For instance:
- Create assignments in which students start with ChatGPT and then have discussions about strengths and weaknesses. Have students compare the output from AI writing platforms, critique that output, and then create strategies for building on it and improving it.
- Use multistep, scaffolded assignments with feedback and revision opportunities.
- Emphasize assignment dimensions that are (currently) difficult for AI: synthesis, student voice and opinions.
- Use project-based learning.
Anne Bruder offers additional suggestions in Education Week, Ethan Mollick does the same on his blog, and Anna Mills has created a Google Doc with many ideas (one of a series of documents and curated resources she has made available). Paul Fyfe of North Carolina State provides perhaps the most in-depth take on the use of AI in teaching, having experimented with an earlier version of the ChatGPT model more than a year ago.
Students often struggle with questions to ask for research papers or topics to pursue for projects. ChatGPT's responses to questions are often bland and shallow, but it can also suggest ideas or solutions that aren't always apparent. It can become a partner, of sorts, in writing and problem-solving. It might suggest an outline for a project, articulate the main approaches others have taken to solving a problem, or provide summaries of articles to help decide whether to delve deeper into them. It might provide a counterargument to a position or opinion, helping strengthen an argument or point out flaws in a particular perspective. We need to help students evaluate those results just as we need to help them interpret online search results and help them interpret media of all types. ChatGPT can provide motivation for starting many types of projects, though.
ChatGPT is incapable of acknowledging its biases, although it has many. Maria Andersen, a teacher and creator of a technology startup, said that it had a white, male view of the world. Maya Ackerman of Santa Clara University told The Story Exchange: "People say the AI is sexist, but it's the world that is sexist. All the models do is reflect our world to us, like a mirror." ChatGPT has been trained to avoid hate speech, sexual content, and anything OpenAI considered toxic or harmful. Others have said that it avoids conflict, and that its deep training in English over other languages skews its perspective. Some of that will no doubt change in the coming months and years as the scope of ChatGPT expands. No matter the changes, though, ChatGPT will live in and draw from its programmers' interpretation of reality. Of course, that provides excellent opportunities for class discussions, class assignments, and critical thinking.
University policy on academic misconduct is broad enough to apply to problematic uses of artificial intelligence:
Academic misconduct by a student shall include, but not be limited to, disruption of classes; threatening an instructor or fellow student in an academic setting; giving or receiving of unauthorized aid on examinations or in the preparation of notebooks, themes, reports or other assignments; knowingly misrepresenting the source of any academic work; unauthorized changing of grades; unauthorized use of University approvals or forging of signatures; falsification of research results; plagiarizing of another's work; violation of regulations or ethical codes for the treatment of human and animal subjects; or otherwise acting dishonestly in research.
Student Affairs also emphasizes to students the importance of academic integrity. Its guidance says, in part:
Academic integrity is a central value in higher education. It rests on two principles: first, that academic work is represented truthfully as to its source and its accuracy, and second, that academic results are obtained by fair and authorized means. "Academic misconduct" occurs when these values are not respected. Academic misconduct at KU is defined in the University Senate Rules and Regulations. A good rule of thumb is "if you have to ask if this is cheating, it probably is."
The questions about appropriate use of AI in academic work will no doubt linger for years. For instance, where do we set the boundaries for use of AI? If students use PowerPoint to redesign their slides, is it still their work? If they use ChatGPT to write part of a paper, is it still their paper? If they use DALL-E or other visual tools to create artwork or AI-powered tools to create music, is that really original work? If they use AI-powered summarization tools or grammar checkers, have they done the work on their own?
Those are difficult questions, and we have yet to determine where to set the boundaries of AI use in our classes or our professional lives. That makes it especially important to talk with students about your expectations and to include a statement in your syllabus about those expectations.
ChatGPT is just one of a growing number of digital tools using artificial intelligence. Those tools can summarize information, create artwork, iterate searches based on the bibliographies of articles you mark, answer questions from the perspectives of historical figures and fictional characters, turn text into audio and video, create animated avatars, analyze and enhance photos and video, create voices, and perform any number of digital tasks. AI is integrated into phones, computers, lighting systems, thermostats, and just about any digital appliance you can imagine. So the question isn't whether to use AI; we already are, whether we realize it or not. The question is how quickly we are willing to learn to use it effectively in teaching and learning.
This is a brief list of articles, tools, and sites related to artificial intelligence and teaching. You will find additional resources in handouts for the CTE sessions "AI, writing, and critical thinking" and "Using AI in teaching."
Guides, tools and sites
Sentient Syllabus Project, a collaborative resource created by professors in Canada, the United States, and Japan to help instructors use artificial intelligence resources meaningfully. The site can be challenging to navigate, but it contains many thoughtful materials related to syllabus language, objectives, and activities.
AI Text Generators and Teaching Writing: Starting Points for Inquiry, curated by Anna Mills of the College of Marin and hosted by Colorado State University. Also see:
- A Teacher’s Prompt Guide to ChatGPT, by Andrew Herft.
- The practical guide to using AI to do stuff, by Ethan Mollick of the Wharton School of Business.
- AI in Higher Education Metasite, by Ray Schroeder.
- Update Your Course Syllabus for ChatGPT, by Ryan Watkins, George Washington University.
- Classroom Policies for AI Generative Tools, compiled by Lance Eaton of College Unbound. Material from classes at several universities.
AI Text Classifier. Created by OpenAI, which is upfront about the tool’s weaknesses: It can misidentify material and can be fooled with some editing; it requires at least 150 to 250 words; and it doesn’t work well in languages other than English or with text written by children. Related tools:
- AI Content Detector, a free beta tool created by Copyleaks, a company that sells plagiarism detection software.
- AI Content Detector, a free tool from Writer, a company that sells AI-driven writing software.
- Hive Moderation. Has a free option.
- GPTZero. Free for now, and you can use it with paragraphs of up to 250 words. There’s a wait list for the full version.
AI, writing, and assignment design
- How to cheat on your final paper: Assigning AI for student writing, by Paul Fyfe. AI & Society (accepted 4 September 2022). A professor writes about an AI experiment he undertook before ChatGPT roiled education.
- Embrace the Bot: Designing Writing Assignments in the Face of AI, by Eric Prochaska, Faculty Focus (23 January 2023).
- Teaching Actual Student Writing in an AI World, by Kevin Jacob Kelley, Inside Higher Ed (19 January 2023). Offers 10 strategies for engaging students in writing projects that avoid AI.
- Worried About ChatGPT? Don’t Be, by Hetal Thaker, Inside Higher Ed (23 January 2023). Argues that AI is part of larger questions the humanities have been grappling with for years (relevance, creativity, assessment) and that it is just another reminder of why we need to keep pushing for solutions.
- ChatGPT can’t be credited as an author, says world’s largest academic publisher, by James Vincent, The Verge (26 January 2023). Springer Nature says authors are free to use AI tools as long as they disclose their use. Medium announced a similar policy and cites policies from other organizations.
- A Message to Students About ‘The Bot,’ by Julia McKenzie Munemo, Inside Higher Ed (23 January 2023). The director of a writing center argues that AI will further demotivate students who see writing as a chore and classes as mere tasks. To educators, Munemo writes: “In a world where students are taught to write like robots, a robot can write for them.”
- Why I’m Not Scared of ChatGPT, by Christopher Grobe, Chronicle Review (3 February 2023).
- ChatGPT’s Fluent BS Is Compelling Because Everything Is Fluent BS, by Amit Katwala, Wired (9 December 2022). “ChatGPT was trained on real-world text, and the real world essentially runs on fluent bullshit,” from media to business to politics to education.
AI and teaching
- Will ChatGPT Change the Way You Teach?, by Beth McMurtrie, Chronicle of Higher Education newsletter (5 January 2023).
- Alarmed by A.I. Chatbots, Universities Start Revamping How They Teach, by Kalley Huang, New York Times (16 January 2023).
- ChatGPT Advice Academics Can Use Now, by Susan D’Agostino, Inside Higher Ed (12 January 2023). Thoughts from a variety of educators.
- Is ChatGPT a threat to education?, by Iqbal Pittalwala, UC Riverside News (24 January 2023). Instructors in computer science, biology, education policy, creative writing, physics and astronomy, economics, and public health offer their views.
- Designing Assignments in the ChatGPT Era, by Susan D’Agostino, Inside Higher Ed (31 January 2023).
- With ChatGPT, We’re All Editors Now, by Rachel Elliott Rigolino, Inside Higher Ed (31 January 2023). Argues that AI has increased the need for critical editing.
- Can ChatGPT fill in mentorship gaps for Gen Z workers?, by Cloey Callahan, Worklife (27 January 2023).
- The bots are here to stay. Do we deny or do we adapt?, by Doug Ward, Bloom’s Sixth (20 January 2023).
AI and critical thinking
- Conversation-Stopper: Will the development of AI tools make us less intelligent?, by John Symons, Return (26 July 2022).
- How ChatGPT robs students of motivation to write and think for themselves, by Naomi S. Baron, The Conversation (19 January 2023). A linguist argues that writing is a human process and that AI tools diminish students’ ability to improve their thinking.
- How to Teach Critical Thinking, by Daniel T. Willingham, Education: Future Frontiers (May 2019). A report commissioned by Australian educators provides a broad literature review of critical thinking research. That research suggests that critical thinking is discipline specific, even though we often consider it a general skill. The Hechinger Report also wrote about Willingham’s work.
- An AI toolkit for libraries, by Michael Upshall, UKSG Insights (1 November 2022). This article, published in a libraries journal, provides an excellent overview of how AI tools work and how we can evaluate them for potential use.
AI and ethics
- Teaching AI Ethics, by Leon Furze. An Australian educator’s excellent guide on the many facets of AI in education. Contains discussion questions and activities for various disciplines.
- Ethical guidelines on the use of AI in education, European Commission (25 October 2022).
- Australia’s AI Ethics Principles, Department of Industry, Science and Resources. This is focused more broadly than education, but educators may find many of the points useful for discussion.
Where to find AI tools
Futurepedia. A website with an extensive list of AI tools, along with information about cost.
PromptBase. A marketplace for buying and selling prompts for the AI tools DALL-E, GPT-3, and Midjourney.
Tools based on the GPT-3 language generation model that have free options: ChatGPT, OpenAI Playground, Chatsonic.
OpenAI Cookbook. A repository of code and prompts for interacting with OpenAI.
Textgeneratorsandteachingwriting. A listserv hosted by Colorado State.
Here are a few terms you are very likely to come across in reading about artificial intelligence.
Artificial intelligence. “The ability of a digital computer or computer-controlled robot to perform tasks commonly associated with intelligent beings,” according to Britannica.
Large language models. A system that uses machine learning to create the most likely probabilities of word sequencing. Large language models form the basis of artificial intelligence tools. “Large” refers to billions of words drawn from books and other digital texts.
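The idea of "most likely probabilities of word sequencing" can be shown with a toy example. This sketch is a deliberate simplification: real large language models use neural networks trained on billions of words, while this one merely counts which word follows which in a tiny invented corpus.

```python
from collections import Counter, defaultdict

# Toy illustration of the principle behind large language models:
# estimate the probability of the next word from counts of observed
# word sequences (here, simple bigrams in a tiny made-up corpus).

corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def next_word_probs(word):
    """Return {next_word: probability} given the previous word."""
    counts = following[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

print(next_word_probs("the"))
# "cat" follows "the" 2 times out of 4, so it gets probability 0.5
```

A language model generates text by repeatedly picking a likely next word this way; the "large" in large language model refers to doing the same thing with vastly more data and a far more sophisticated statistical machine.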
OpenAI. A company that creates digital models and databases that other organizations use for creating chatbots and other types of digital tools. Its work is at the heart of many such tools, including ChatGPT and DALL-E, and it has created a series of GPT models.
ChatGPT. An artificial intelligence platform that writes, generates code, solves problems, and answers questions based on natural-language prompts and questions. It has attracted widespread interest and concern since it was made freely available in the fall of 2022 because of its vast knowledge base and ability to perform tasks quickly and easily.
GPT. A digital model that allows computer software to create natural-language text, generate computer code, and interact in a human-like way. GPT stands for generative pre-trained transformer, a means of extrapolating new results from previous training. The most recent version, GPT-3, draws on nearly all text available on the internet.
GitHub. A repository that allows users to download, modify, and deploy code and digital materials that others have created. Git refers to a form of version control that allows users to keep track of changes in digital files.