Using AI ethically in writing assignments



The use of generative artificial intelligence in writing isn’t an either/or proposition. Rather, think of a continuum in which AI can be used at nearly any point to inspire, ideate, structure, and format writing. It can also help with research, feedback, summarization, and creation. You may also choose not to use any AI tools. This handout is intended to help you decide.

A starting point

Many instructors fear that students will use chatbots to complete assignments, bypassing the thinking and intellectual struggle involved in shaping and refining ideas and arguments. That’s a valid concern, and it offers a starting point for discussion:

Turning in unedited AI-generated work as one’s own creation is academic misconduct.

Most instructors agree on that point. After that, the view of AI becomes murkier. AI is already ubiquitous, and its integrations and abilities will only grow in the coming years. Students in grade school and high school are also using generative AI, and they will arrive at college expecting to do the same. So how do we respond?

Writing as process and product

We often think of writing as a product that demonstrates students’ understanding and abilities. It can serve that role, especially in upper-level classes. In most classes, though, we don’t expect perfection. Rather, we want students to learn the process of writing. Even as students gain experience and our expectations for writing quality rise, we don’t expect them to work in a vacuum. They receive feedback from instructors, classmates, friends, and others. They get help from the writing center. They work with librarians. They integrate the style and thinking of sources they draw on. That’s important because thinking about writing as a process involving many types of collaboration helps us consider how generative AI might fit in.   


Generative AI as a writing assistant

We think students can learn to use generative AI effectively and ethically. Again, rather than thinking of writing as an isolated activity, think of it as a process that engages sources, ideas, tools, data, and other people in various ways. Generative AI is simply another point of engagement in that process. Here’s what that might look like at various points:

Early in the process

  • Generating ideas. Most students struggle to identify appropriate topics for their writing. Generative AI can offer ideas and provide feedback on students’ ideas.  
  • Narrowing the scope of a topic. Most ideas start off too broad, and students often need help in narrowing the scope of writing projects. Instructors and peers already help with that; generative AI becomes just another voice in the process.
  • Researching. Some chatbots have access to the internet, combining search engines with generative AI. Specialty tools or plugins allow search of research-oriented databases. All have the potential to help students throughout the writing process.
    • Finding initial sources. Bing and Bard can help students find sources early in the writing process. Specialty tools like Semantic Scholar, Elicit, Prophy, and Dimensions can provide more focused searches, depending on the topic.
    • Finding connections among ideas. Research Rabbit, Aria (a plug-in for Zotero), and similar tools can create concept maps of literature, showing how ideas and research are connected. Elicit identifies patterns across papers and points to related research. ChatGPT Pro can also find patterns in written work and, when used with a plugin, create word clouds and other visualizations.
    • Gathering and formatting references. Tools like EndNote and Zotero allow students to store and organize sources. They also save time by formatting citations in whatever style the writer needs.
    • Summarizing others’ work. ChatGPT, Bing, and specialty AI tools like Elicit do a good job of summarizing research papers and webpages, helping students decide whether a source is worth additional time.
    • Interrogating research papers or websites. This is a new approach AI has made possible. An AI tool analyzes a paper (often a PDF) or a website, and researchers can then ask questions about the content, ideas, approach, or other aspects of the work. Some tools can also provide additional sources related to a paper.
    • Analyzing data. Many of the same tools that can summarize digital writing can also create narratives from data, offering new ways of bringing data into written work.
    • Finding hidden patterns. Students can have an AI tool analyze their notes or ideas for research, asking it to identify patterns, connections, or structure they might not have seen on their own.
  • Outlining. ChatGPT, Bing, and other tools do an excellent job of outlining potential articles or papers. That can help students organize their thoughts throughout the research and writing process. Each area of an outline provides another entry point for diving deeper into ideas and potential writing topics.
  • Creating an introduction. Many writers struggle with opening sentences or paragraphs. Generative AI can provide a draft of any part of a paper, giving students a boost as they bring their ideas together.

 

Deeper into the process

  • Creating drafts. Students might ask AI to draft an opening paragraph, an area where many writers struggle. They might ask for a draft of a section or even an entire paper. It is then up to them to edit, fact-check, and add their own voice and intellectual contributions to the writing. This is the function most faculty members worry about: they see it as a way to bypass the intellectual work of analyzing and synthesizing sources and creating original work. Some students will certainly do that. Others will use AI drafts as motivation, speeding the writing process and ultimately improving their work.
    • Thinking critically. Creating good prompts for generative AI involves considerable critical thinking. This isn’t a matter of asking a single question and receiving perfectly written work; it involves trial and error, clarification, and repeated follow-ups. Even after that, students will need to edit, add sources, and check the work for AI-generated fabrications or errors.
  • Creating titles or section headers for papers. This is an important but often overlooked part of the writing process, and the headings that generative AI produces can help students spot potential problems in focus.
  • Helping with transitions and endings. These are areas where students often struggle or get stuck, just as they do with openings.
  • Getting feedback on details. Students might ask an AI tool to provide advice on improving the structure, flow, grammar, and other elements of a paper.
  • Getting feedback on a draft. Instructors already provide feedback on drafts of assignments and often have students work with peers to do the same. Students may also seek the help of the writing center or friends. Generative AI can also provide feedback, helping students think through large and small elements of a paper. We don’t see that as a substitute for any other part of the writing process. Rather, it is an addition.

A caveat

Generative AI has many weaknesses. It is built to generate an answer whether it has an appropriate one or not. Students can’t blame AI for errors, and they are still accountable for everything they turn in. Instructors need to help them understand both the strengths and the weaknesses of generative AI, including the importance of checking all details.

A range of AI use

A better understanding of the AI continuum provides important context, but it doesn’t address a question most instructors are asking: How much is too much? There’s no easy answer to that. Different disciplines may approach the use of generative AI in very different ways. Similarly, instructors may set different boundaries for different types of assignments or levels of students. Here are some ways to think through an approach:

  • Talk with students about AI use. This helps create a climate of trust and openness, and helps students approach their work ethically.
    • Discuss ethics. What are the ethical foundations of your field? What principles should guide students? Do students know and understand those principles? What happens to professionals who violate those principles?
    • Be honest. Most professions, including academia, are working through the same issues instructors are. We are all experimenting and trying to define boundaries even as the tools and circumstances change. Students need to understand those challenges. We should also bring students into conversations about appropriate use of generative AI. Many of them have more experience with AI than instructors do, and adding their voices to those discussions makes it more likely that they will follow whatever guidelines we set.
  • Set boundaries. You may, for instance, ask students to avoid using AI for particular assignments or for generating complete drafts. (Again, this may vary by discipline.) Just make sure students understand why you want them to forgo AI and how doing so will help them develop skills they need to succeed in future classes and in the professional world.
  • Make the purpose of assignments clear. Students need clear goals, but you, as an instructor, also need to consider why you are asking students to complete each assignment. In many cases, the process of writing for a particular discipline is as important as any product students may produce. As AI tools become more familiar, they are very likely to become an expected part of the process.
    • Review your assignments. If AI can easily complete them, students may not see the value or purpose. How can you make assignments more authentic, focusing on real-world problems and issues students are likely to see in the workplace?
    • Scaffold assignments. Having students complete assignments in smaller increments reduces pressure and leads to better overall work.
  • Include reflection. Have students treat AI as part of their method and reflect on their use of it. This might be a paragraph or two at the end of a written assignment in which they explain which AI tools they used, how they used them, and what AI ultimately contributed to their written work. Also have them reflect on the quality of the material AI provided and on what they learned from using the tools. This type of reflection helps students develop metacognitive skills (thinking about their own thinking). It also gives instructors important information about how students are approaching assignments and what additional revisions they might need to make.
  • Engage with the Writing Center, KU Libraries, and other campus services about AI, information literacy, and the writing process. Talk with colleagues and watch for advice from disciplinary societies. This isn’t something you have to approach alone.

What now?

Generative AI is evolving rapidly. Many existing tools have incorporated it, and new ones are proliferating. Step back and consider how AI has already become part of academic life:

  • AI-augmented tools like spell-check and auto-correct brought grumbles, but there was no panic.
  • Grammar checkers followed, offering advice on word choice, sentence construction, and other aspects of writing. Again, few people complained.
  • Citation software has evolved along with word-processing programs, easing the collection, organization, and formatting of sources.
  • Search engines used AI long before generative AI burst into the public consciousness.

As novel as generative AI may seem, it offers nothing new in the way of cheating. Students could already buy papers on the internet, copy and paste from an online site, have someone else write a paper for them, or tweak a paper from the files of a fraternity or sorority. So AI isn’t the problem. It has simply forced instructors to confront long-standing issues in academic structure, grading, distrust, and purpose. Those issues are beyond the scope of this handout, but they leave us with some final questions to consider:

Why are we so suspicious of student intentions? And how can we create an academic climate that values learning and honesty?