Adapting your course to artificial intelligence
The increasing use of applications that draw on generative artificial intelligence has led to intense discussions about its growing capabilities and how it might change jobs, disciplines, societies, and even the way we think. Artificial intelligence has been evolving for decades, but most educators have given little thought to how these new digital tools might change teaching, learning, and student expectations. For instance:
- What do we need to know about generative AI platforms and how our students might use them?
- How might AI change how we teach, what we teach, what we assign students to do, and how we assess student work?
- What ethical issues do we need to address as we consider how AI might change the way we and our students think and act?
- How do we evaluate potential new tools and learning platforms when the algorithmic assumptions and decisions behind AI software are hidden?
There are no definitive answers to those questions, but the resources we have collected about generative AI are intended to provide guidance and ideas on how to approach these tools in your courses.
Some things to think about with AI
Talk with students about your expectations and how you will view (and grade) assignments generated solely with artificial intelligence. Emphasize the importance of learning and explain why you are having them complete the assignments you use. Why is your class structured as it is? How will they use the skills they gain?
That sort of transparency has always been important, but it is even more so now. Students intent on cheating will always cheat. Some draw from archives at Greek houses, buy papers online, or have friends do the work for them. ChatGPT and similar tools provide just another means of avoiding the work that learning requires. Helping students understand how a course furthers their learning will win over some of them, as will flexibility and choices in assignments. This is also a good time to emphasize the importance of human interaction in learning, something artificial intelligence lacks.
As fluent as ChatGPT often seems, its answers rarely delve beneath the surface of a topic. It makes mistakes. It makes things up. Its responses provide no clues about how it is programmed or why it provides the answers it does. A Princeton researcher called it a "bullshit generator" because it creates plausible arguments without regard for truth.
All of that makes it a valuable teaching tool, though. By having students probe for answers, we can help them improve their skepticism, challenge assumptions, and question information. By having them fact-check, we can help them understand the dangers of fluid writing that lacks substance or that relies on fallacies. By having them use ChatGPT or other AI tools for early drafts, we can push them to ask questions about information, structure, and sources. By having them apply different perspectives to ChatGPT's results, we can help broaden their understanding of points of view and argument.
Many students are already using ChatGPT in their schoolwork. We can no more ban students from using artificial intelligence than we can ban them from using phones or calculators or laptops. We need to talk with students about how to use ChatGPT and other AI tools effectively and ethically, though. No, they should not take AI-written materials and turn them in for assignments, but yes, they should use AI when appropriate. Businesses of all sorts are already adapting to AI, and students will need to know how to use it when they move into the workforce. Students in K-12 schools are using it and will expect access when they come to college. Rather than banning ChatGPT and other AI tools or fretting over how to police them, we need to change our practices, our assignments, and our expectations. We need to focus more on helping students iterate their writing, develop their information literacy skills, and humanize their work.
One way to help with that is reflection, which helps students develop their metacognitive skills. It can also help them understand how to integrate AI into their learning processes and how they can build and expand on what AI provides. Reflection can also help reinforce academic honesty. Rather than hiding how they completed an assignment, reflection helps students embrace transparency.
ChatGPT also requires new ways of thinking, and having students reflect on their use of it could prove valuable. For instance, ChatGPT rarely produces solid results on the first try, and it often takes several iterations of a question to get good answers. Sometimes it never provides good answers. That makes it much like web or database searching, which requires patience and persistence as you refine search terms, narrow your focus, identify specific file types, try different types of syntax and search operators, and evaluate many pages of results. Add AI to the expanding repertoire of digital literacies students need. (Teaching guides and e-books are already becoming available online.)
ChatGPT's ability to produce work as good as that of many early undergraduates means we will have to rethink assignments and curricula. For instance:
- Create assignments in which students start with ChatGPT and then have discussions about strengths and weaknesses. Have students compare the output from AI writing platforms, critique that output, and then create strategies for building on it and improving it.
- Use multistep, scaffolded assignments with feedback and revision opportunities.
- Emphasize assignment dimensions that are (currently) difficult for AI: synthesis, student voice, and personal opinion.
- Use project-based learning.
Anne Bruder offers additional suggestions in Education Week, Ethan Mollick does the same on his blog, and Anna Mills has created a Google Doc with many ideas (one of a series of documents and curated resources she has made available). Paul Fyfe of North Carolina State provides perhaps the most in-depth take on the use of AI in teaching, having experimented with an earlier version of the model behind ChatGPT more than a year ago.
Students often struggle to come up with questions to ask for research papers or topics to pursue for projects. ChatGPT's responses to questions are often bland and shallow, but it can also suggest ideas or solutions that aren't always apparent. It can become a partner, of sorts, in writing and problem-solving. It might suggest an outline for a project, articulate the main approaches others have taken to solving a problem, or provide summaries of articles to help decide whether to delve deeper into them. It might provide a counterargument to a position or opinion, helping strengthen an argument or point out flaws in a particular perspective. We need to help students evaluate those results just as we help them interpret online search results and media of all types, but ChatGPT can provide the motivation to get many types of projects started.
ChatGPT is incapable of acknowledging its biases, although it has many. Maria Andersen, a teacher and creator of a technology startup, said that it had a white, male view of the world. Maya Ackerman of Santa Clara University told The Story Exchange: "People say the AI is sexist, but it's the world that is sexist. All the models do is reflect our world to us, like a mirror." ChatGPT has been trained to avoid hate speech, sexual content, and anything OpenAI considered toxic or harmful. Others have said that it avoids conflict, and that its deep training in English over other languages skews its perspective. Some of that will no doubt change in the coming months and years as the scope of ChatGPT expands. No matter the changes, though, ChatGPT will live in and draw from its programmers' interpretation of reality. Of course, that provides excellent opportunities for class discussions, class assignments, and critical thinking.
University policy on academic misconduct is broad enough to apply to problematic uses of artificial intelligence:
Academic misconduct by a student shall include, but not be limited to, disruption of classes; threatening an instructor or fellow student in an academic setting; giving or receiving of unauthorized aid on examinations or in the preparation of notebooks, themes, reports or other assignments; knowingly misrepresenting the source of any academic work; unauthorized changing of grades; unauthorized use of University approvals or forging of signatures; falsification of research results; plagiarizing of another's work; violation of regulations or ethical codes for the treatment of human and animal subjects; or otherwise acting dishonestly in research.
Student Affairs also emphasizes to students the importance of academic integrity. Its guidance says, in part:
Academic integrity is a central value in higher education. It rests on two principles: first, that academic work is represented truthfully as to its source and its accuracy, and second, that academic results are obtained by fair and authorized means. "Academic misconduct" occurs when these values are not respected. Academic misconduct at KU is defined in the University Senate Rules and Regulations. A good rule of thumb is "if you have to ask if this is cheating, it probably is."
The questions about appropriate use of AI in academic work will no doubt linger for years. For instance, where do we set the boundaries for use of AI? If students use PowerPoint's automated design suggestions to rework their slides, is it still their work? If they use ChatGPT to write part of a paper, is it still their paper? If they use DALL-E or other visual tools to create artwork or AI-powered tools to create music, is that really original work? If they use AI-powered summarization tools or grammar checkers, have they done the work on their own?
Those are difficult questions, and we have yet to determine where to set the boundaries of AI use in our classes or our professional lives. That makes it especially important to talk with students about your expectations and to include a statement in your syllabus about those expectations.
ChatGPT is just one of a growing number of digital tools using artificial intelligence. Those tools can summarize information, create artwork, iterate searches based on the bibliographies of articles you mark, answer questions from the perspectives of historical figures and fictional characters, turn text into audio and video, create animated avatars, analyze and enhance photos and video, create voices, and perform any number of digital tasks. AI is integrated in phones, computers, lighting systems, thermostats, and just about any digital appliance you can imagine. So the question isn't whether to use AI; we already are, whether we realize it or not. The question is how quickly we are willing to learn to use it effectively in teaching and learning.
Additional resources
(Updated February 2024)
This is a brief list of articles and sites related to generative artificial intelligence and teaching. You will find additional resources in handouts for CTE sessions on AI, writing, and critical thinking; using AI in teaching; and AI, policy, and academic integrity.
Guidance for instructors and administrators
- Indecision About AI in Classes Is So Last Week, by Doug Ward and colleagues from the Bay View Alliance. Inside Higher Ed (1 December 2023).
- How Universities Can Move Forward With Generative AI in Teaching and Learning, by Doug Ward and colleagues from the Bay View Alliance. Change magazine (January/February 2024).
- How We Can Turn AI Into an Opportunity Rather Than a Threat, by Doug Ward, Academic Leader (2 January 2024).
- The bots are here to stay. Do we deny or do we adapt?, by Doug Ward, Bloom’s Sixth (20 January 2023).
Guides, tools and sites
- Critical AI Literacy for Educators, a Padlet curated by Kathryn Conrad and Sean Kamperman of the KU English Department. It includes excellent resources on AI literacy, ethics, plagiarism and academic misconduct, policies, potential applications, and tools.
- A Blueprint for an AI Bill of Rights for Education, by Kathryn Conrad. Critical AI blog (July 2023).
- Sentient Syllabus Project, a collaborative resource created by professors in Canada, the United States, and Japan to help instructors use artificial intelligence resources meaningfully. The site can be challenging to navigate, but it contains many thoughtful materials related to syllabus language, objectives, and activities.
- AI Text Generators and Teaching Writing: Starting Points for Inquiry, curated by Anna Mills of the College of Marin and hosted by Colorado State University. Also see:
- A Teacher’s Prompt Guide to ChatGPT, by Andrew Herft.
- The practical guide to using AI to do stuff, by Ethan Mollick of the Wharton School of Business.
- AI in Higher Education Metasite, by Ray Schroeder.
- Update Your Course Syllabus for ChatGPT, by Ryan Watkins, George Washington University.
- Classroom Policies for AI Generative Tools, compiled by Lance Eaton of College Unbound. Material from classes at several universities.
- ChatGPT Resources for Educators, compiled by Ben Erwin, Syracuse.
AI, writing, and assignment design
- Using AI to Implement Effective Assignment Design: Five Strategies, Including Prompts, by Ethan Mollick and Lilach Mollick. SSRN (24 March 2023).
- How to cheat on your final paper: Assigning AI for student writing, by Paul Fyfe. AI & Society (accepted 4 September 2022). A professor writes about an AI experiment he undertook before ChatGPT roiled education.
- Embrace the Bot: Designing Writing Assignments in the Face of AI, by Eric Prochaska, Faculty Focus (23 January 2023).
- Teaching Actual Student Writing in an AI World, by Kevin Jacob Kelley, Inside Higher Ed (19 January 2023). Offers 10 strategies for engaging students in writing projects that avoid AI.
- Worried About ChatGPT? Don’t Be, by Hetal Thaker, Inside Higher Ed (23 January 2023). Argues that AI is part of larger questions the humanities have been grappling with for years (relevance, creativity, assessment) and that it is just another reminder of why we need to keep pushing for solutions.
- Why I’m Not Scared of ChatGPT, by Christopher Grobe, Chronicle Review (3 February 2023).
AI and teaching
- Will ChatGPT Change the Way You Teach?, by Beth McMurtrie, Chronicle of Higher Education newsletter (5 January 2023).
- Alarmed by A.I. Chatbots, Universities Start Revamping How They Teach, by Kalley Huang, New York Times (16 January 2023).
- ChatGPT Advice Academics Can Use Now, by Susan D’Agostino, Inside Higher Ed (12 January 2023). Thoughts from a variety of educators.
- Is ChatGPT a threat to education?, by Iqbal Pittalwala, UC Riverside News (24 January 2023). Instructors in computer science, biology, education policy, creative writing, physics and astronomy, economics, and public health offer their views.
- Designing Assignments in the ChatGPT Era, by Susan D’Agostino, Inside Higher Ed (31 January 2023).
- With ChatGPT, We’re All Editors Now, by Rachel Elliott Rigolino, Inside Higher Ed (31 January 2023). Argues that AI has increased the need for critical editing.
AI and critical thinking
- How ChatGPT robs students of motivation to write and think for themselves, by Naomi S. Baron, The Conversation (19 January 2023). A linguist argues that writing is a human process and that AI tools diminish students’ ability to improve their thinking.
- How to Teach Critical Thinking, by Daniel T. Willingham, Education: Future Frontiers (May 2019). A report commissioned by Australian educators provides a broad literature review of critical thinking research. That research suggests that critical thinking is discipline specific, even though we often consider it a general skill. The Hechinger Report also wrote about Willingham’s work.
- An AI toolkit for libraries, by Michael Upshall, UKSG Insights (1 November 2022). This article, published in a libraries journal, provides an excellent overview of how AI tools work and how we can evaluate them for potential use.
AI and ethics
- Teaching AI Ethics, by Leon Furze. An Australian educator’s excellent guide on the many facets of AI in education. Contains discussion questions and activities for various disciplines.
- Ethical guidelines on the use of AI in education, European Commission (25 October 2022).
- Australia’s AI Ethics Principles, Department of Industry, Science and Resources. This is focused more broadly than education, but educators may find many of the points useful for discussion.
AI tools to explore
(Updated February 2024)
The number of tools that use generative artificial intelligence has grown enormously since ChatGPT was released in November 2022. Some mimic the broad functionality of ChatGPT, but most focus on specific disciplines, specific tasks, or specific types of output. Sites like Future Tools, There’s an AI for That, and Futurepedia categorize the thousands of AI-related tools available.
The easiest way to get started with generative AI, though, is to try one of the most popular tools: ChatGPT, Microsoft Copilot, Google's Gemini, or Claude. Many other tools are more focused. Some of the tools below were made specifically for faculty or students. Others are more broadly focused but have useful capabilities for instructors. Most of the tools included here are free or have free components, although they provide additional functions if you pay for them. We aren't endorsing any of these tools. Rather, we see potential in what they do. You will need to decide for yourself which tools best fit your needs.
Academic search tools
- Consensus. Uses natural language to search academic literature. Consensus then displays half a dozen results along with what it considers relevant information from each paper. Free version has considerable functionality; enhanced features cost $7 a month.
- Elicit. Provide a research question and Elicit searches academic literature and provides references and summaries of related papers. (It draws from the Semantic Scholar database.) Select the ones that seem most relevant and Elicit will then search for additional research related to those papers. You can then extract information into a table and ask questions about the contents of papers. Strongest in topics related to biomedicine and machine learning. Costs $10 a month after an initial trial.
- Connected Papers. Searches Semantic Scholar and then maps related literature based on an article you choose. Free for up to five searches a month.
- OpenRead. Allows you to search, summarize, and query academic papers. You can also upload papers to analyze and take notes alongside them. Also creates visual maps of connections among papers. Free version allows unlimited uploads of PDFs but limits the number of AI-related queries. Paid versions are $5 and $20 a month.
- Semantic Scholar. Free search engine for scientific literature. Uses AI for classifying papers, powering natural language search, summarizing literature, and identifying connections among papers. Searches literature from more than 50 publishers.
General search
- Perplexity. Analyzes search queries, suggests improvements, and then provides sources and a summary. Allows specialized searches of academic literature, programming literature, law, marketing, YouTube, Wolfram Alpha, and Reddit. Allows uploading of PDFs, text, or images. Also has a writing tool. Free for up to five queries every four hours. Paid version is $20 a month.
- Exa. Calls itself a neural search engine that searches by meaning rather than keywords. Exa says its strengths include finding such things as academic literature, GitHub code, personal pages, company pages, news, art, blogs, and jobs. It allows you to restrict searches to a particular period of time and to particular sites. It is free for a limited number of searches per month.
Research assistants
- Research Rabbit. Free tool for finding, organizing, and summarizing academic literature. Uses AI to create concept maps that link authors and ideas among academic papers. Will also email you about papers it finds that are related to your online research collections.
- Melon. Allows you to save online material to an online workspace, where you can group material into collections. Once Melon has analyzed the materials, you can summarize, ask questions about the collections, search for specific topics across your collection, and get help with writing. Works with webpages, YouTube videos, and social media text. You can also share the material you have collected. Free for now, with unspecified limits.
- NotebookLM, by Google. Load PDFs, Google Docs, or copied text to a notebook and then use generative AI to ask questions about the materials, summarize, and write. The notebook chat will even suggest questions based on its analysis of the documents. Each notebook is limited to 20 documents.
- Pinpoint. A Google tool that allows you to analyze and search large numbers of documents, audio files, and images. This is part of Google’s Journalist Studio, which includes tools for fact-checking, searching, and visualizing data.
- Fabric. Allows you to gather notes, files, images, videos, links, and entire cloud drives in one place and use AI to search and find connections. Allows tagging and document notations, and has browser plugins and desktop apps. Free while in beta.
Education-related tools
- Hugging Face. Allows you to create a chatbot using one of several different platforms. These are easy to set up, although you will need to spend some time providing directions to guide your bot, which you can then share with a URL.
- AI Tutor Pro. An online site that uses ChatGPT to guide learning about subjects you choose or to check your understanding of something. It was created by a government-funded nonprofit organization that creates online courses for underserved populations in the Province of Ontario.
- Explainpaper. A free tool for interpreting academic research. Allows you to highlight sections of academic papers you have uploaded and have AI explain. You can then ask follow-up questions. The paid version offers additional functions.
- Smartie. A free tool for creating course outlines, learning outcomes, class activities, and other course elements with a focus on equity. It also has a rubric-creation tool.
Further discussions
A Google group called AI in Education has frequent discussions about generative AI and allows members to ask and answer questions.
A glossary of AI terms
It’s easy to get lost when reading about artificial intelligence. Knowing some of the frequently used terms can help. Sites that can provide additional definitions and information include code.org, CompTIA, and Wikipedia.
Algorithm. A set of procedures or instructions that breaks problem-solving into a series of steps.
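For example, a classic search algorithm breaks the problem of finding a value in a sorted list into a short series of repeatable steps. Here is a minimal sketch in Python:

```python
def binary_search(items, target):
    """Return the index of target in a sorted list, or -1 if absent."""
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2      # Step 1: check the middle item
        if items[mid] == target:
            return mid               # Step 2: found it
        elif items[mid] < target:
            low = mid + 1            # Step 3: discard the lower half
        else:
            high = mid - 1           # Step 4: discard the upper half
    return -1                        # The target is not in the list

print(binary_search([2, 5, 8, 12, 23], 12))  # prints 3
```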
API. Shorthand for application programming interface, a method of connecting computer systems so that a receiving system draws on the abilities of the sending system. For instance, many companies have created apps that use, modify, or extend the capabilities of the system behind ChatGPT.
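As a rough illustration, the sketch below shows how an app might connect to the system behind ChatGPT through OpenAI's chat completions API. The model name and prompt are placeholders, and a real application would add error handling:

```python
import os
import requests

# Rough sketch of an API call to the service behind ChatGPT.
# The endpoint and payload follow OpenAI's chat completions API;
# the model name and prompt here are illustrative placeholders.
response = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    json={
        "model": "gpt-4",
        "messages": [
            {"role": "user",
             "content": "Suggest three discussion questions about AI ethics."},
        ],
    },
    timeout=30,
)
print(response.json()["choices"][0]["message"]["content"])
```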
Artificial general intelligence. The goal of creating a computer system that can match or exceed human intelligence across a wide range of tasks. This is often referred to as AGI.
Artificial intelligence. A broad term for computer systems that handle work often done by people.
Bot or chatbot. This generally refers to a computer interface that responds to natural language questions in ways that mimic human interaction. A bot can also be a physical robot that carries out commands or performs tasks that humans might do.
Gemini. Google’s version of a chatbot. Has text or voice interaction. Analyzes uploaded images. Prompts limited to about 4,000 characters.
ChatGPT. An artificial intelligence platform that creates writing, generates code, solves problems, and answers questions based on natural-language prompts and questions. It has attracted widespread interest since it was made available in the fall of 2022 because of its vast knowledge base and ability to perform tasks quickly and easily. It does not have access to the internet, but various plugins and add-ons can give it that ability.
Copilot. Microsoft's chatbot with integrated web search. It offers free access to GPT-4, the most powerful large language model provided by OpenAI. Analyzes uploaded images. Has text or voice interaction. A Notebook feature allows prompts of up to 18,000 characters (vs. 4,000 for the main chatbot). Provides additional privacy when faculty or staff log in with KU credentials.
Generative AI. A computer system that creates, or generates, text, images, or code based on information a user provides. ChatGPT, Microsoft Copilot, DALL-E, and Gemini are all forms of generative AI.
Large language models. An approach to artificial intelligence that analyzes enormous amounts of text and creates probabilities for word sequencing. This analysis is often called training. A large language model allows ChatGPT and similar tools to respond to questions in ways that sound human. They aren’t human, though. They simply string words together in ways that mimic the patterns they have analyzed. “Large” refers to billions of words drawn from books and other digital texts. You will often see these referred to as LLMs.
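In miniature, that training looks something like the toy sketch below: count which word follows which in a small text, then turn the counts into probabilities. (Real models work over billions of tokens with neural networks, not simple counts.)

```python
from collections import Counter, defaultdict

# Toy illustration of word-sequencing probabilities: count which word
# follows which in a tiny corpus, then convert the counts to probabilities.
corpus = "the cat sat on the mat and the cat slept".split()

following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

total = sum(following["the"].values())
for word, count in following["the"].items():
    print(f"P({word} | the) = {count / total:.2f}")
# P(cat | the) = 0.67
# P(mat | the) = 0.33
```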
GPT. A digital model that allows computer software to create natural-language text, generate computer code, and interact in a human-like way. GPT refers to generative pre-trained transformer, a means of extrapolating new results from previous training. The most recent version, GPT-4, was trained on a vast corpus of text drawn from the internet.
GitHub. A repository that allows users to download, modify, and deploy code and digital materials that others have created. Git refers to a form of version control that allows users to keep track of changes in digital files.
LLaMA. A large language model created by Meta, the company that owns Facebook.
Modeling. Development of a decision-making process in AI systems.
OpenAI. A company that creates digital models and databases that other organizations use for creating chatbots and other types of digital tools. Its work is at the heart of many such tools, including ChatGPT and DALL-E.