Some thoughts about generative AI as the semester starts
By Doug Ward
The shock has worn off, but the questions about how to handle generative artificial intelligence in teaching and learning seem only to grow.
Those questions lack easy answers, but there are concrete steps you can take as we head into the third year of a ChatGPT world:
- Create a clear policy about generative AI use in your class.
- Talk with students about generative AI frequently. Encourage questions.
- Talk frequently about the skills students gain in your class. Explain why those skills are important and how students can use them. Do this early in the semester and then with each assignment.
- Build community and trust in your classes. Student use of generative AI is a symptom of underlying issues of trust, perceptions of value, and social pressures, among other things.
- Create assignments that help students explore generative AI. You don't have to like or promote generative AI, but students need to understand its strengths and weaknesses, and how to approach its output with a critical eye.
- Experiment with generative AI yourself and consider how it is changing, or might change, your discipline and your teaching.
That’s just a start. As I said, the questions about generative AI keep piling up. Here are a few additional updates, thoughts, and observations.
What is the university doing with AI?
Several initiatives are under way, and there are many opportunities to learn more about generative AI.
- AI Task Force. A task force that includes members of the Lawrence and medical school campuses began work in the fall. It will make recommendations on how the university might approach generative AI. It will then be up to university leaders, faculty, and university governance to decide what types of policies (if any) to pursue.
- Faculty Senate and University Senate. Both governance bodies have had discussions about generative AI, but no formal policies have emerged.
- University bot. The university has contracted with a vendor to provide a chatbot for the KU website. The bot is still being developed, but vendor interviews focused on such uses as interacting with prospective students, responding to text queries, sending reminders, and answering questions related to IT and enrollment management.
- AI in Teaching Working Group. This group, through the Center for Teaching Excellence, meets monthly online, and it has a related Teams site. If you are interested in joining either, email Doug Ward (dbward@ku.edu).
- AI think tank. Lisa Dieker (lisa.dieker@ku.edu) has organized the AI Tech User Think Tank through the FLITE Center in the School of Education and Human Sciences. It is intended primarily for connecting faculty interested in AI-related grant work and research, but meetings cover many types of AI-related issues. Contact her if you are interested in joining.
- Digital Education Council. The School of Education and Human Sciences has joined the Digital Education Council, an international group of universities and corporations focused on collaborative innovation and technology. Much of the group’s recent work has focused on use of generative AI in education and industry.
- Libraries AI discussion group. The KU Libraries staff has been actively exploring how generative AI might change the way people search, find, and use information. A Teams discussion site supports that work. Most conversations are, of course, library related, but participants often share general information about AI or about library trials.
- CTE AI course. CTE has made AI-related modules available for instructors to copy, use, or adapt in their own courses. The modules cover such areas as how generative AI works, why it creates many ethical quandaries, how it can be used ethically, and what the future of AI might entail. Anyone interested in gaining access to the modules should email Doug Ward (dbward@ku.edu).
What about a policy for classes?
The university has no policy related to AI use in classes, and we know of no policy at the school level, either. That means it is crucial for instructors to talk with students about expectations for AI use and to include syllabus language about use of, or prohibitions on, generative AI.
We can’t emphasize that enough: Talk with students about generative AI. Encourage them to ask questions. Make it clear that you welcome those questions. No matter your policy on use of generative AI, help students understand what skills they will gain from your class and from each assignment. (See Maintaining academic integrity in the AI era.)
What are we hearing about AI use among students?
Students have been conflicted about generative AI. Some see use of it as cheating. Some view the training of generative AI on copyrighted material as theft of intellectual property. Some worry about privacy and bias. Others worry about AI’s environmental impact.
Even so, large percentages of students say they use generative AI in their coursework, even if instructors ask them not to. They expect faculty to adapt to generative AI and to help them learn how to use it in jobs and careers. For the most part, that hasn’t happened, though.
Most students welcome the opportunity to talk about generative AI, but many are reluctant to do so out of fear that instructors will accuse them of cheating. That has to change. Only by engaging students in discussions about generative AI can we find a way forward.
Why are so many students using generative AI?
Many instructors assume students are lazy and want to cheat. The reality is far more complex. Yes, some avoid the hard work of learning. Most, though, use generative AI for other reasons, which include the following:
- Students feel unprepared. Many students struggled during the pandemic. Expectations of them diminished, and many never gained the core reading, writing, math, and analytical skills they need in college. College requirements and expectations have largely remained the same, though, with students unsure how to cope. Generative AI has become a way to make up for shortcomings.
- They feel overwhelmed. Some students have families or other obligations, many work 20 or more hours a week, and most still feel lingering effects from the pandemic. Anxiety, depression, and related mental health issues have increased. That mix pushes many students to take shortcuts just to get by.
- They feel pressured to achieve high GPAs. Scholarships often require a 3.5 GPA or higher, and students who want to attend graduate school or medical school feel a need to maintain high GPAs. That can push them toward AI use if they fear falling below whatever benchmark they have set for themselves or that others have imposed on them.
- They lack skills in time management. Students who wait until the last minute to study or to complete assignments create unnecessary stress for themselves. When they discover that an assignment can’t be completed at the last minute, they turn to AI for help.
- They worry about job expectations. Students have been getting mixed messages about generative AI. Some instructors denounce it and see any use of it as cheating. At the same time, many employers say they expect graduates to know how to use it. Current students are especially job-oriented. Depending on what they hear and read, they may see experience with generative AI as more important than skills they would gain by doing coursework themselves.
- They see a degree as a consumer product. As the cost of college has increased, many students have started looking at a degree in transactional terms. A degree is simply a means to a job. They are paying a lot of money, the reasoning goes, and that should give them the right to use whatever tools they want to use and to approach class in whatever way helps them succeed.
- They don’t see value in an assignment or class. This is a big aspect of most types of academic misconduct. Most students want to learn, but they don’t always understand why they must take particular classes or complete some assignments. If students don’t see value in an assignment or a class, they may simply turn the work over to generative AI.
- Generative AI is ubiquitous. We have all been using some form of AI for years, and generative AI is being added to most software. It often seems impossible to avoid, and many students already see it as just another tool at their disposal.
None of that should be seen as an excuse to cheat, but instructors need to keep it in mind as they structure their classes. Here are some strategies they can use:
- Never assume students have particular skills. Even if they had those skills in the past, they may not have them today. Provide opportunities for review or for practicing and learning crucial skills.
- Scaffold assignments. Assign papers or projects in smaller segments so that students can get feedback and revise their work. Avoid high-stakes tests and use low-stakes quizzes and exams instead.
- Make expectations transparent. Explain the skills students need to succeed and the expectations you have for successful completion of assignments. Provide examples. Provide resources for reviewing or learning expected skills. Encourage students to talk with you privately if they feel ill-prepared to handle an assignment.
- Provide flexibility. This might involve dropping the lowest grades in segments of coursework, giving students, say, two opportunities to turn in work a few days after a deadline, or granting leniency if they communicate with you about challenges they are having. Most students still need structure, but flexibility will diminish pressure on them and on you.
Can’t detectors solve problems of student AI use?
Results from AI detectors are not proof that a student has cheated. Detection tools have proved unreliable, and students can fool them with minimal effort. Research has consistently shown that AI detectors are far more likely to flag the work of students for whom English is not a first language. Because of those problems, many universities have disabled AI detectors. Faculty at KU have access to a Turnitin detector. If you use it, treat the results as a starting point for conversations with students, not as evidence.
What are you hearing from the Generative AI in Teaching Working Group?
Discussions have been robust and energizing, covering such topics as creating policies about student use of generative AI, integrating generative AI into courses, finding and experimenting with new tools, helping students learn about the pitfalls and biases of AI tools, and finding ways to expand learning with generative AI. Members share news from conferences, colleagues, readings, and research. They offer ideas, insights, recommendations, and support. Here are a few things that come up frequently.
- More instructors need to experiment. Most members of the working group have been experimenting with generative AI tools and with ways to integrate them into teaching and learning. That work can sometimes feel isolating, especially if colleagues are unwilling to learn about generative AI or to acknowledge student use of it.
- We need access to tools. Many people who want to explore feel held back by lack of access to appropriate tools and by the university’s slow response to generative AI.
- We need time and resources. Experimenting with generative AI takes time, and integrating it into classes takes more time, especially because many long-used approaches need to be rethought.
- We need to help students learn about generative AI. Most students are uninformed: they use generative AI with little or no understanding of how to prompt effectively, how to scrutinize chatbot output, or how to choose appropriate tools. They also need to know how businesses are using AI. The university needs courses that delve into the ethical aspects of generative AI, including privacy, safety, and appropriate use, and existing classes need to integrate these topics as well.
Will the students of the near future be able to get a meaningful education without AI assistance?
Absolutely. The human aspects of education remain as crucial as ever. A college education provides access to a community of learners, interaction with professors and peers, support from experienced staff and advisors, and an opportunity to explore the outer and inner worlds with a diverse group of people.
But ...
We are still trying to figure out how and where generative AI fits into human-centered learning. People often associate generative AI with ChatGPT, but thousands of specialty tools have been developed. Many of those tools show promise in education, including those that adapt to individual students, provide opportunities to practice skills, provide feedback and direction, and allow people to explore topics they might never have considered before. We have only scratched the surface on how generative AI might transform teaching and learning.
One thing is clear: Students who hand off the challenging aspects of learning to generative AI will eventually fail. Learning takes work. It’s hard. It takes time and practice. Generative AI may be able to assist at times, but it can’t learn for students.
************************
Doug Ward is associate director of the Center for Teaching Excellence and an associate professor of journalism and mass communications.