By Doug Ward

Adapting colleges and universities to generative artificial intelligence was never going to be easy. Surveys released over the past two weeks provide evidence of just how difficult that adaptation will be, though.

Here’s a summary of what I'm seeing in the results:

Faculty: We lack the time, understanding, and resources to revamp classes for an AI age. A few of us have been experimenting, but many of us don’t see a need to change.

Administrators: We think generative AI will allow our institutions to customize learning and improve students' research skills, but we need to make substantial changes in teaching. We are spending at least some time and money on AI, but most of our institutions have been slow to adapt and aren’t prepared to help faculty and staff gain the skills and access to the tools they need.

That’s oversimplified, but it captures some of the broad themes in the surveys, suggesting (at least to me) a rocky path over the coming years. And though the challenges are real, we can find ways forward. From what I'm seeing in the surveys, we need to help instructors gain experience with generative AI, encourage experimentation, and share successes. We also need to do a better job of defining AI, generative AI, and use of AI, especially in class policies and institutional guidance. The surveys suggest considerable confusion. They also suggest a need to move quickly to help students gain a better understanding of what generative AI is, how it can be used effectively, and why it raises so many ethical challenges. In most cases, that will require a rethinking of how and what we teach. We have provided considerable guidance on the CTE website, and we will continue to explore options this spring.

Some of the more specific results from the surveys can help guide us toward the areas that need attention.

Bar chart comparing faculty who view AI as a challenge vs those who see it as an opportunity
APAC: Asia Pacific region; EMEA: Europe, the Middle East and Africa; LATAM: Latin America.


U.S. educators see AI differently from their global counterparts

Faculty in the United States and Canada view generative AI in a far gloomier way than their colleagues in other countries, a survey from the Digital Education Council suggests. They are far more likely to say that generative AI is a challenge and that they will not use it in their teaching in the future.

Worldwide, 35% of the survey’s respondents said generative AI was a challenge to education and 65% said it was an opportunity. Regionally, though, there were considerable differences, with 43% of faculty in the U.S. and Canada calling AI a challenge, compared with 35% in Europe, the Middle East, and Africa; 30% in the Asia Pacific region; and 22% in Latin America.

Similarly, a much greater percentage of faculty in the U.S. and Canada said they did not expect to use AI in their teaching in the future. Looked at another way, 90% to 96% of faculty in other regions of the world said they expected to integrate AI into their classes, compared with 76% in the U.S. and Canada.

Alessandro Di Lullo, chief executive of the Digital Education Council, said in a briefing before the survey results were released that faculty skepticism in the U.S. and Canada surprised him. Historically, he said, instructors in both countries have had a “propensity towards innovation and more openness towards innovation.”

Bar chart showing leaders' feelings on preparation of institutions to handle generative AI

AAC&U survey suggests need but little momentum

A survey released this week by the Association of American Colleges and Universities and Elon University offered a similarly sober assessment of U.S. higher education’s handling of generative AI. That survey included only university leaders, with large percentages saying their institutions weren’t prepared to help faculty, students, or staff work with generative AI even though they anticipate a need for substantial change. 

Leaders of small colleges and universities expressed more concern than those at larger institutions. Eighty-seven percent of leaders at small institutions (those with fewer than 3,000 students) said that preparing faculty to guide students on AI was a key challenge, compared with 51% to 54% at larger institutions. Leaders said the biggest challenges included faculty’s lack of familiarity with – and resistance to – generative AI tools; worries that AI will diminish student learning; lack of training and infrastructure to handle generative AI; and security.

“Use of these tools is an attack on everything we do,” one leader said in the survey.

Most leaders said they were concerned about academic integrity, student reliance on AI tools, and digital inequities, but they also said generative AI would enhance learning and improve student skills in research and writing, along with creativity. Among leaders at institutions with 10,000 or more students, 60% said they expected the teaching model to change significantly in the next five years to adapt to generative AI. 

Most leaders see a need for some immediate changes, with 65% saying that last year's graduates were not prepared to work in jobs that require skills in generative AI.

Figuring out the role of generative AI in teaching

In the Digital Education Council survey, 61% of faculty respondents said they had used generative AI in their teaching, although most reported minimal to moderate use, primarily for creating class material but also for completing administrative tasks, helping students learn about generative AI, engaging students in class, trying to detect cheating, and generating feedback for students.

Among the 39% of respondents who said they didn’t use generative AI, the reasons included lack of time, uncertainty about how to use it in teaching, and concern about risks. Nearly a quarter said they saw no clear benefits of using generative AI.

That tracks with what I have seen among faculty at KU and at other universities. Many see a need for change but aren't sure how to proceed. Most have also struggled with how to maintain student learning now that generative AI can be used to complete assignments they have developed over several years.

Danny Bielik, president of the Digital Education Council, said in a briefing that administrators needed to understand that many instructors were struggling to see the relevance of generative AI in their teaching.

“It's a wake-up call and a reminder to institutional leadership that these people exist, they're real, and they also need to be brought along for the journey if institutions are starting to make decisions,” Bielik said.

Bar chart showing skills that faculty say educators will need in the future


'The role of humans is changing'

Other elements of the survey tracked along familiar lines:

  • Views of AI. 57% of respondents said they had a positive view of AI in education and 13% had a negative view. The rest were somewhere in between.
  • Roles of instructors. 64% said they expected the roles of instructors to change significantly because of generative AI; 9% expected minimal or no change. Relatedly, 51% said AI was not a threat to their role as an instructor, and 18% said their role was threatened. Those who considered themselves more proficient with generative AI were more likely to say that teaching would need to adapt.
  • AI as a skill. Two-thirds of respondents said it was important to help students learn about generative AI for future jobs. Even so, 83% said they were concerned about students’ ability to evaluate the output of chatbots, with a similar percentage saying they worried about students becoming too reliant on AI.
  • Use of AI in class. 57% of faculty surveyed said they allowed students to use generative AI on assignments as long as they followed instructor stipulations and disclosed its use; 23% said no AI use was permitted, and 11% said AI use was mandatory.

Di Lullo said he was surprised by some of the results, especially because “the role of humans is changing” and colleges and universities need to adapt.

Bielik said the survey results were a “very good indication that there are people not necessarily sitting on the fence, but they're not paying as much attention to it as we are.”

Yet another recent poll supports that observation. Just a few days after the Digital Education Council survey was released, a Gallup poll said that nearly two-thirds of Americans didn't realize they were already using AI-infused technology. That technology includes such things as assistant software like Siri and Alexa, navigation software, weather apps, social media, video streaming, and online shopping. Overall, Gallup said, Americans tend to see generative AI in negative terms, with young adults (age 18 to 29) expressing the highest percentage of concern about its impact. Three-fourths of young adults said they were especially worried about job prospects as use of generative AI grows. Those of all ages who know about AI’s integration into existing technology view it more positively.

As we rethink our teaching, we need to build community and trust among students and encourage them to help us find a way forward. We also need to help students understand how the skills they gain in college will help them become more adaptable to change. First, though, we need to adapt ourselves.

Bar chart on areas where college and university leaders see challenges to adopting generative AI tools

***********************************

Doug Ward is associate director of the Center for Teaching Excellence and an associate professor of journalism and mass communications.


By Doug Ward

The shock has worn off, but the questions about how to handle generative artificial intelligence in teaching and learning seem only to grow.
Those questions lack easy answers, but there are concrete steps you can take as we head into the third year of a ChatGPT world:

  • Create a clear policy about generative AI use in your class.
  • Talk with students about generative AI frequently. Encourage questions.
  • Talk frequently about the skills students gain in your class. Explain why those skills are important and how students can use them. Do this early in the semester and then with each assignment.
  • Build community and trust in your classes. Student use of generative AI is a symptom of underlying issues of trust, perceptions of value, and social pressures, among other things.
  • Create assignments that help students explore generative AI. You don't have to like or promote generative AI, but students need to understand its strengths and weaknesses, and how to approach its output with a critical eye.  
  • Experiment with generative AI yourself and consider how it is changing – or might change – your discipline and your teaching.

That’s just a start. As I said, the questions about generative AI keep piling up. Here are a few additional updates, thoughts, and observations.

A university building embedded in a hill made of computer chips and diodes, with Q&A rising from the top

What is the university doing with AI?

Several things have been taking place, and there are many opportunities to learn more about generative AI.

  • AI Task Force. A task force that includes members of the Lawrence and medical school campuses began work in the fall. It will make recommendations on how the university might approach generative AI. It will then be up to university leaders and faculty and university governance to decide what types of policies (if any) to pursue.
  • Faculty Senate and University Senate. Both governance bodies have had discussions about generative AI, but no formal policies have emerged.
  • University bot. The university has contracted with a vendor to provide a chatbot for the KU website. The bot is still being developed, but vendor interviews focused on such uses as interacting with prospective students, responding to text queries from students, providing reminders to students, and answering questions related to IT and enrollment management.
  • AI in Teaching Working Group. This group, through the Center for Teaching Excellence, meets monthly online, and it has a related Teams site. If you are interested in joining either, email Doug Ward (dbward@ku.edu).
  • AI think tank. Lisa Dieker (lisa.dieker@ku.edu) has organized the AI Tech User Think Tank through the FLITE Center in the School of Education and Human Sciences. It is intended primarily for connecting faculty interested in AI-related grant work and research, but meetings cover many types of AI-related issues. Contact her if you are interested in joining.
  • Digital Education Council. The School of Education and Human Sciences has joined the Digital Education Council, an international group of universities and corporations focused on collaborative innovation and technology. Much of the group’s recent work has focused on use of generative AI in education and industry.
  • Libraries AI discussion group.  The KU Libraries staff has been actively exploring how generative AI might change the way people search, find, and use information. A Teams discussion site has been part of that. Most conversations are, of course, library related, but participants often share general information about AI or about library trials.
  • CTE AI course. CTE has made AI-related modules available for instructors to copy, use, or adapt in their own courses. The modules cover such areas as how generative AI works, why it creates many ethical quandaries, how it can be used ethically, and what the future of AI might entail. Anyone interested in gaining access to the modules should email Doug Ward (dbward@ku.edu). 

What about a policy for classes?

The university has no policy related to AI use in classes, and we know of no policy at the school level, either. That means it is crucial for instructors to talk with students about expectations on AI use and to include syllabus information about use of, or prohibitions on, generative AI.

We can’t emphasize that enough: Talk with students about generative AI. Encourage them to ask questions. Make it clear that you welcome those questions. No matter your policy on use of generative AI, help students understand what skills they will gain from your class and from each assignment.  (See Maintaining academic integrity in the AI era.)  

What are we hearing about AI use among students?

Students have been conflicted about generative AI. Some see use of it as cheating. Some view the training of generative AI on copyrighted material as theft of intellectual property. Some worry about privacy and bias. Others worry about AI’s environmental impact. 

Even so, large percentages of students say they use generative AI in their coursework, even if instructors ask them not to. They expect faculty to adapt to generative AI and to help them learn how to use it in jobs and careers. For the most part, that hasn’t happened, though. 

Most students welcome the opportunity to talk about generative AI, but many are reluctant to do so out of fear that instructors will accuse them of cheating. That has to change. Only by engaging students in discussions about generative AI can we find a way forward.

Why are so many students using generative AI?

Many instructors assume students are lazy and want to cheat. The reality is far more complex. Yes, some avoid the hard work of learning. Most, though, use generative AI for other reasons, which include the following:

  • Students feel unprepared. Many students struggled during the pandemic. Expectations of them diminished, and many never gained the core reading, writing, math, and analytical skills they need in college. College requirements and expectations have largely remained the same, though, with students unsure how to cope. Generative AI has become a way to make up for shortcomings. 
  • They feel overwhelmed. Some students have families or other obligations, many work 20 or more hours a week, and most still feel lingering effects from the pandemic. Anxiety, depression, and related mental health issues have increased. That mix pushes many students to take shortcuts just to get by. 
  • They feel pressured to achieve high GPAs. Scholarships often require a 3.5 GPA or higher, and students who want to attend graduate school or medical school feel a need to maintain high GPAs. That can push them toward AI use if they fear falling below whatever benchmark they have set for themselves or that others have imposed on them. 
  • They lack skills in time management. Students who wait until the last minute to study or to complete assignments create unnecessary stress for themselves. They then discover that assignments can’t be completed at the last minute, and they turn to AI for help.
  • They worry about job expectations. Students have been getting mixed messages about generative AI. Some instructors denounce it and see any use of it as cheating. At the same time, many employers say they expect graduates to know how to use it. Current students are especially job-oriented. Depending on what they hear and read, they may see experience with generative AI as more important than skills they would gain by doing coursework themselves. 
  • They see a degree as a consumer product. As the cost of college has increased, many students have started looking at a degree in transactional terms. A degree is simply a means to a job. They are paying a lot of money, the reasoning goes, and that should give them the right to use whatever tools they want to use and to approach class in whatever way helps them succeed.
  • They don’t see value in an assignment or class. This is a big aspect of most types of academic misconduct. Most students want to learn, but they don’t always understand why they must take particular classes or complete some assignments. If students don’t see value in an assignment or a class, they may just turn over any work to generative AI.
  • Generative AI is ubiquitous. We have all been using some form of AI for years, and generative AI is being added to most software. It often seems impossible to avoid, and many students already see it as just another tool at their disposal. 

None of that should be seen as an excuse to cheat, but instructors need to keep it in mind as they structure their classes. Here are some strategies they can use:

  • Never assume students have particular skills. Even if they had these skills in the past, they may not today. Provide opportunities for review or for practicing and learning crucial skills.
  • Scaffold assignments. Assign papers or projects in smaller segments so that students can get feedback and revise their work. Avoid high-stakes tests and use low-stakes quizzes and exams instead.
  • Make expectations transparent. Explain the skills students need to succeed and the expectations you have for successful completion of assignments. Provide examples. Provide resources for reviewing or learning expected skills. Encourage students to talk with you privately if they feel ill-prepared to handle an assignment.
  • Provide flexibility. This might involve dropping the lowest grades in segments of coursework; giving students, say, two opportunities to turn in work a few days after a deadline; or offering leniency if they communicate with you about challenges they are having. Most students still need structure, but flexibility will diminish pressure on them and on you.

Can’t detectors solve problems of student AI use?

Results from AI detectors are not proof that a student has cheated. Detection tools have proved unreliable, and students can fool them with minimal effort. Research has been consistent that AI detectors are far more likely to flag the work of students for whom English is not a first language. Because of those problems, many universities have disabled AI detectors. Faculty at KU have access to a Turnitin detector. If you use it, use the results as information for talking with students.

What are you hearing from the Generative AI in Teaching Working Group?

Discussions have been robust and energizing, covering such topics as creating policies about student use of generative AI, integrating generative AI into courses, finding and experimenting with new tools, helping students learn about the pitfalls and biases of AI tools, and finding ways to expand learning with generative AI. Members share news from conferences, colleagues, readings, and research. They offer ideas, insights, recommendations, and support. Here are a few things that come up frequently.

  • More instructors need to experiment. Most members of the working group have been experimenting with generative AI tools and with ways to integrate them into teaching and learning. That can sometimes feel isolating, especially if colleagues are unwilling to learn about generative AI or to acknowledge student use of it. 
  • We need access to tools. Many people who want to explore feel held back by lack of access to appropriate tools and by the university’s slow response to generative AI. 
  • We need time and resources. Experimenting with generative AI takes time, and integrating it into classes takes more time, especially because many long-used approaches need to be rethought.
  • We need to help students learn about generative AI. Most are uninformed and use generative AI with little or no understanding of how to prompt effectively, how to scrutinize the output of chatbots, or how to choose appropriate tools. Students also need to know how businesses are using AI tools. The university needs courses that delve into the ethical aspects of generative AI, including privacy, safety, and appropriate use. Existing classes also need to integrate these topics.

Will the students of the near future be able to get a meaningful education without AI assistance?

Absolutely. The human aspects of education remain as crucial as ever. A college education provides access to a community of learners, interaction with professors and peers, support from experienced staff and advisors, and an opportunity to explore the outer and inner worlds with a diverse group of people.

But ...

We are still trying to figure out how and where generative AI fits into human-centered learning. People often associate generative AI with ChatGPT, but thousands of specialty tools have been developed. Many of those tools show promise in education, including those that adapt to individual students, provide opportunities to practice skills, provide feedback and direction, and allow people to explore topics they might never have considered before. We have only scratched the surface on how generative AI might transform teaching and learning.

One thing is clear: Students who hand off the challenging aspects of learning to generative AI will eventually fail. Learning takes work. It’s hard. It takes time and practice. Generative AI may be able to assist at times, but it can’t learn for students.

************************

Doug Ward is associate director of the Center for Teaching Excellence and an associate professor of journalism and mass communications.


By Doug Ward

Kansas ranks near the bottom in the percentage of schools offering foundational computer science education, according to a study by Code.org, the Computer Science Teacher Association, and the Expanding Computing Education Pathways Alliance.

Nationwide, 57.5% of schools offered a computer science class in 2023. Kansas was more than 20 percentage points below that average, with 36% of schools offering a foundational course. Only three states had lower percentages: Louisiana (35%), Montana (34%), and Minnesota (28%).

That has important implications for higher education. Many Kansas students who attend KU may have little understanding of how generative artificial intelligence and the large language models behind it work. That puts them at a disadvantage in understanding how to use generative AI effectively and how to approach it critically. Computer science courses aren't the only way students can learn about generative AI, but a growing number of states see those courses as crucial to the future.  

Shuchi Grover, director of AI and education research at Looking Glass Ventures, delved into that in a recent speech at the National Academies of Sciences, Engineering, and Medicine.

“You want children to be equipped with understanding the world they live in,” Grover said. “Think about how much technology is all around them. Is it wise to completely leave them in the dark about what computing and AI is about?”

From 2023 State of Computer Science Education: https://advocacy.code.org/stateofcs/


More than 10,000 schools nationwide do not offer a computer science course, the Code.org report says. Not surprisingly, schools with 500 students or fewer are the least likely to offer such a course, as are rural schools (which are often the same). The report noted a disparity in access for students of color, students with disabilities, and students who come from low-income families. Young women represented only 31% of students enrolled in foundational computer science courses.

Like Grover, the authors of the Code.org study make a compelling point about the connection between computer science and generative AI. The report says (in bold): “We cannot prepare students for a future with AI without teaching them the foundations of computer science.”

I'm all in favor of teaching digital literacy, computer literacy, and AI literacy. Students can learn those skills in many ways, though. Requiring a computer science course seems less important than providing opportunities for students to explore computer science and improve their understanding of the digital world.

Efficiency vs. creativity

A couple of other elements of Grover’s talk at the National Academies are worth noting.

An audience member said that generative AI was generally portrayed in one of two ways: using it to do existing things better (efficiency) or to approach new problems in new ways (“to do better things”). Most studies have focused on efficiency, he said, to the exclusion of how we might apply generative AI to global challenges.

Grover said that she thought we definitely needed to focus on bigger issues. Efficiency has a role, though.

“This idea of efficiency in the school system is fraught,” Grover said. “Time fills up no matter how many efficiency tools you give them. And I think it’s unfair. Teachers all over the world, especially in the U.S. and I also see in India, are so overworked. ... I think it’s good that AI can help them with productivity and doing some of that drudgery – you know, the work that just fills up too much time – and take that off their plate.”

Schools in the United States have been slow to respond to generative AI, she said, because the system is so decentralized. Before the use and understanding of generative AI can spread, she said, “a teacher has to be able to use it and has to be able to see value.”

That will require listening.

“I think we need to listen to teachers – a lot. And maybe there’s something we can learn about where we need to focus our efforts. … Teachers need to have a voice in this – a big voice.”

Briefly …

Cheap AI ‘video scraping’ can now extract data from any screen recording, by Benj Edwards. Ars Technica (17 October 2024).

Stanford Researchers Use AI to Simulate Clinical Reasoning, by Abby Sourwine. Government Technology (10 October 2024).

Forget chat. AI that can hear, see and click is already here, by Melissa Heikkilä. MIT Technology Review (8 October 2024).

Colleges begin to reimagine learning in an AI world, by Beth McMurtrie. Chronicle of Higher Education (3 October 2024).

Secret calculator hack brings ChatGPT to the TI-84, enabling easy cheating, by Benj Edwards. Ars Technica (20 September 2024).

United Nations wants to treat AI with the same urgency as climate change, by Will Knight, Wired, via Ars Technica (20 September 2024).


By Doug Ward

As I prepared to speak to undergraduates about generative artificial intelligence last October, I struggled with analogies to explain large language models.

Those models are central to the abilities of generative AI. They have analyzed billions of words, billions of lines of code, and hundreds of millions of images. That training allows them to predict sequences of words, generate computer code and images, and create coherent narratives at speeds humans cannot match. Even programmers don’t fully understand why large language models do what they do, though.

So how could I explain those models for an audience of novices?

The path I took in creating an analogy illustrates the strengths and weaknesses of generative AI. It also illustrates a scenario that is likely to become increasingly common in the future: similar ideas developed and shared simultaneously. As those similar ideas emerge in many places at once, the role of individuals in developing those ideas will also grow increasingly important – through understanding of writing, coding, visual communication, context, and humanity.

AI-generated image of an AI cookbook

Getting input from generative AI

In my quest for an analogy last fall, I turned to Microsoft Copilot for help. I prompted Copilot to act as an expert in computer programming and large language models and to explain how those models work. My audience was university undergraduates, and I asked for an analogy to help non-experts better understand what goes on behind the scenes as generative AI processes requests. Copilot gave me this:

Generative AI is like a chef that uses knowledge from a vast array of recipes to create entirely new and unique dishes. Each dish is influenced by past knowledge but is a fresh creation designed to satisfy a specific request or prompt.

I liked that and decided to adapt it. I used the generative tool Dall-E to create images of a generative AI cookbook, a chef in a futuristic kitchen, and food displayed on computer-chip plates. I also created explanations for the steps my large language model chef takes in creating generative dishes.

How a large language model chef works

Within this post, you will see the images I generated. Here’s the text I used (again modified from Copilot’s output):

A chef memorizes an enormous cookbook (a dataset) so that it knows how ingredients (words, images, code) are usually put together.

Someone asks for a particular dish with special ingredients (a prompt), so the chef creates something new based on everything it has memorized from the cookbook.

The chef tastes the creation and makes sure it follows guidance from the cookbook.

Once the chef is satisfied, it arranges the creation on a plate for serving. (With generative AI, this might be words, images or code.)

The chef’s patrons taste the food and provide feedback. The chef makes adjustments and sends the dish back to patrons. The chef also remembers patrons’ responses and the revisions to the dish so that next time the dish can be improved.
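For readers who want to see the analogy in code, here is a minimal sketch in Python. It is purely illustrative and entirely my own toy: a real model replaces the lookup table below with a neural network trained on billions of examples, but the flow of the steps is roughly the same.

```python
# A toy rendering of the chef analogy. Everything here is hypothetical;
# real systems use learned neural networks, not lookup tables.

cookbook = {                # Step 1: the memorized "cookbook" of patterns
    "peanut": "butter",
    "butter": "and",
    "and": "jelly",
}

def cook(prompt: str) -> str:
    """Steps 2-4: take a patron's request, extend it one ingredient
    (word) at a time based on the cookbook, and plate the result."""
    dish = prompt.split()
    while dish[-1] in cookbook:
        dish.append(cookbook[dish[-1]])  # Step 3: follow the cookbook's guidance
    return " ".join(dish)                # Step 4: serve the plated dish

print(cook("peanut"))  # -> "peanut butter and jelly"

# Step 5: patrons give feedback. Real systems are tuned further on human
# ratings of their output so that future "dishes" improve.
```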

AI-generated image of a chef and a small robot in a futuristic kitchen

A striking similarity

I explain all that because I came across the same analogy in Ethan Mollick’s book Co-intelligence. Mollick is a professor at the University of Pennsylvania whose newsletter and other writings have been must-reads over the past two years because of his experimentations with generative AI, his early access to new tools, and his connections to the AI industry.

In the first chapter of Co-intelligence, Mollick provides some history of AI development and the transformer technology and neural networks that make generative AI possible. He then explains the workings of large language models, writing:

Imagine an LLM as a diligent apprentice chef who aspires to become a master chef. To learn the culinary arts, the apprentice starts by reading and studying a vast collection of recipes from around the world. Each recipe represents a piece of text with various ingredients symbolizing words and phrases. The goal of the apprentice is to understand how to combine different ingredients (words) to create a delicious dish (coherent text).

In developing that analogy, Mollick goes into much more detail than I did and applies well-crafted nuance. The same analogy that helped me explain large language models to undergraduates, though, helped Mollick explain those models to a broader, more diverse audience. Our analogies had another similarity: They emerged independently from the same tool (presumably Microsoft Copilot) about the same time (mid- to late 2023).

Why does this matter?

I don’t know for certain that Mollick’s analogy originated in Copilot, but it seems likely given his openness about using Copilot and other generative AI tools to assist in writing, coding, and analysis. He requires use of generative AI in his entrepreneurship classes, and he writes frequently about his experimentations. In the acknowledgements of his book, he gives a lighthearted nod to generative AI, writing:

And because AI is not a person but a tool, I will not be thanking any LLMs that played a role in the creation of this book, any more than I would thank Microsoft Word. At the same time, in case some super-intelligent future AI is reading these words, I would like to acknowledge that AI is extremely helpful and should remember to be kind to the humans who created it (and especially to the ones who wrote books about it).

It was a nice non-credit that acknowledged the growing role of generative AI in human society. 

I understand why many people use generative AI for writing. Good writing takes time, and generative AI can speed up the process. As Mollick said, it’s a tool. As with any new tool, we are still getting used to how it works, what it can do, and when we should use it. We are grappling with the proprieties of its use, the ethical implications, and the potential impact on how we work and think. (I’m purposely avoiding the impact on education; you will find much more of that in my other writings about AI.)

I generally don’t use generative AI for writing, although I occasionally draw on it for examples (as I did with the presentation) and outlines for reports and similar documents. That’s a matter of choice but also habit. I have been a writer and editor my entire adult life. It’s who I am. I trust my instincts and my experience. I’m also a better writer than any generative AI system – at least for now.

I see no problem in the example that Mollick and I created independently, though. The AI tool offered a suggestion when we needed one and allowed us to better inform our respective audiences. It just happened to create similar examples. It was up to us to decide how – or whether – to use them.

AI-generated image of food on a computer-chip plate

Where to now?

Generative AI systems work by prediction, with some randomness. The advice and ideas will be slightly different for each person and each use. Even so, the systems’ training and algorithms hew toward the mean. That is, the writing they produce follows patterns the large language model identifies as the most common and most likely based on what millions of people have written in the past. That’s good in that the writing follows structural and grammatical norms that help us communicate. It is also a central reason generative AI has become so widely used in the past two years, with AI drawing on norms that have helped millions of people improve their writing. The downside is that the generated writing often has a generic tone, devoid of voice and inflection.
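To make “prediction, with some randomness” concrete, here is a small illustrative sketch in Python. The word scores below are numbers I invented for a toy example, not any real model’s values, but the “temperature” setting it demonstrates is a real control in generative AI systems: it governs how strongly sampling hews toward the most common choice.

```python
import math
import random

# Toy scores a model might assign to words that follow "The weather is".
# Higher score = more common in training data. These numbers are invented.
logits = {"nice": 2.0, "cold": 1.2, "pleasant": 0.8, "effervescent": -1.5}

def sample_next(logits: dict, temperature: float = 1.0) -> str:
    """Convert scores to probabilities (a softmax) and sample one word.
    Low temperature concentrates probability on the most common word,
    the "hewing toward the mean" described above; higher temperature
    flattens the distribution and admits rarer, more surprising choices."""
    scaled = [score / temperature for score in logits.values()]
    total = sum(math.exp(s) for s in scaled)
    probs = [math.exp(s) / total for s in scaled]
    return random.choices(list(logits), weights=probs)[0]

print(sample_next(logits, temperature=0.2))  # almost always "nice"
print(sample_next(logits, temperature=2.0))  # occasionally "effervescent"
```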

Research suggests that the same thing happens with ideas generative AI provides. For example, a study in Science Advances suggests that generative AI can improve creativity in writing but that stories in which writers use generative AI for ideas have a sameness to them. The authors suggest that overuse of generative AI could eventually lead to a generic quality in AI-supported stories.

My takeaway is that use of generative AI in writing comes with a cognitive and creative cost. We may get better writing, and research so far suggests that the weakest writers benefit the most from AI’s advice. Other research suggests that use of generative AI can make writing more enjoyable for weaker writers. On the other hand, a recent study suggests that human-written work is still perceived as superior to that produced by generative AI.

Mollick argues that generative AI can be an excellent partner in writing, coding, and creative work, providing a nudge, pointing the way or reassuring us in tasks that inevitably lead to inspirational lulls, dead ends, and uncertainty. The title of his book, Co-intelligence, represents his assertion that AI can augment what we do but that we, as humans, are still in control.

That control means that writing with a strong voice and uniquely human perspective still stands out from the crowd, as do ideas that push boundaries. Even so, I expect to see similar ideas and analogies emerging more frequently from different people in different places and shared simultaneously. That will no doubt lead to conflicts and accusations. As generative AI points us toward similar ideas, though, the role of individuals will also grow increasingly important. That is, what generative AI produces will be less significant than how individuals shape that output.   


By Doug Ward

Canvas will soon be absorbed by KKR, one of the world’s largest investment firms.

That is unlikely to have any immediate effect on Canvas users. The longer-term effects – and costs – are impossible to predict, though.

Instructure, the company behind Canvas, has agreed to be acquired by KKR for $4.8 billion. KKR and similar companies have a reputation for laying off employees and cutting salaries and other expenses at companies they acquire. The investment firms look at it another way: They simply increase efficiency and make companies healthier.

KKR also owns TeachingStrategies, an online platform for early childhood education. Earlier this year, it acquired the publisher Simon & Schuster. It also owns such companies as Doordash, Natural Pet Food, the augmented reality company Magic Leap, and OverDrive, which provides e-books and audio books to libraries. (The Lawrence Public Library uses OverDrive’s Libby platform.)

The acquisition of Instructure occurred in the same week that the online program manager 2U filed for bankruptcy protection. The company was valued at $5.8 billion in 2018, according to The Chronicle of Higher Education, but its finances faded as institutions began to rethink agreements in which the company, like similar providers, took 50% or more of tuition dollars from online classes.

The acquisition and the bankruptcy are reminders of how connected education and learning are to the world of high finance. Even as institutions struggle to make ends meet, they spend millions of dollars on technology for such things as learning management systems, online tools, online providers, communication, video and audio production, internet connection, wifi, tools for daily tasks like writing and planning, and a host of services that have become all but invisible.

A multi-billion-dollar market

By one account, education technology companies raised $2.8 billion in funding last year. That doesn’t include $500 million that Apollo Funds invested in the publisher Cengage. The total is down substantially from 2021 and 2022, when investors put more than $13 billion into education technology companies, according to Reach Capital, an investment firm that focuses on education. That bump in financing took place as schools, colleges, and universities used an infusion of government pandemic funds to buy additional technology services.

None of that is necessarily bad. We need start-up companies with good ideas, and we need healthy companies to provide technology services. Those tools allow educators to reach beyond the classroom and allow the steady functioning of institutions. They also make education, which rarely tries to create its own technology, a captive audience for companies that provide technology services.

The companies have used various strategies to try to gain a foothold at colleges and universities. Over the past decade, many have provided free access to instructors who adopt digital tools for classes. Students then pay for those services by the semester. That charge may seem trivial, but students rarely know about it before they begin classes, and even a small additional fee can create financial hardship for some.

The university pays for tools like Canvas, drawing on money from tuition and fees and a dwindling contribution from the state. That lowers individual costs by spreading them among a larger body of users, and it makes costs to students more transparent. It also commits the university to tens or hundreds of thousands of dollars in spending each year – money that investment firms like KKR see as well worth the investment in companies like Instructure.


By Doug Ward

The future of colleges and universities is neither clear nor certain.

The current model fails far too many students, and creating a better one will require sometimes painful change. As I’ve written before, though, many of us have approached change with a sense of urgency, offering ideas for a future university that better serves students and student learning.

The accompanying video is based on a presentation I gave at a recent Red Hot Research session at KU about the future of the university. It synthesizes many ideas I’ve written about in Bloom’s Sixth, elaborates on a recent post about the university climate study, and builds on ideas I explored in an essay for Inside Higher Ed.

The takeaway: We simply must value innovative teaching and meaningful service in the university rewards system if we have any hope of effecting change. Research is important, but not to the exclusion of our undergraduate students.


Doug Ward is the associate director of the Center for Teaching Excellence and an associate professor of journalism. You can follow him on Twitter @kuediting.


By Doug Ward

Colleges and universities in Kansas will receive more than $100 million this year from congressional earmarks in the federal budget, according to an analysis by Inside Higher Ed.

That places Kansas second among states in the amount earmarked for higher education, according to Inside Higher Ed. Those statistics don't include $22 million for the Kansas National Security Innovation Center on West Campus, though. When those funds are added, Kansas ranks first in the amount of earmarks for higher education ($120.8 million), followed by Arkansas ($106 million), and Mississippi ($92.4 million).

KU will receive more than a third of the money flowing to Kansas. That includes $1.6 million for a new Veterans Legal Support Clinic at the law school and $10 million each for facilities and equipment at the KU Medical Center and the KU Hospital.

Nationwide, 707 projects at 483 colleges and universities will receive $1.3 billion this year through earmarks, Inside Higher Ed said. In Kansas, the money will go to 17 projects, with some receiving funds through multiple earmarks.

All but three of the earmarks for Kansas higher education projects were added by Sen. Jerry Moran. Rep. Jake LaTurner earmarked nearly $3 million each for projects at Kansas City Kansas Community College and Tabor College in Hillsboro, and Rep. Sharice Davids earmarked $150,000 for training vehicles for the Johnson County Regional Police Academy.

Kansas State’s Salina campus will receive $33.5 million for an aerospace training and innovation hub. K-State’s main campus will receive an additional $7 million, mostly for the National Bio and Agro-Defense Facility.

Pittsburg State will receive $5 million for a STEM ecosystem project, and Fort Hays State will receive $3 million for what is listed simply as equipment and technology. Four private colleges will share more than $7 million for various projects, and community colleges will receive $5.6 million.


2024 federal earmarks for higher education in Kansas

Institution | $ amount | Purpose
K-State Salina | 28,000,000 | Aerospace training and innovation hub
KU | 22,000,000 | Kansas National Security Innovation Center
Wichita State | 10,000,000 | National Institute for Aviation Research tech and equipment
KU Medical Center | 10,000,000 | Cancer center facilities and equipment
KU Hospital | 10,000,000 | Facilities and equipment
Wichita State | 5,000,000 | National Institute for Aviation Research tech and equipment
Pittsburg State | 5,000,000 | STEM ecosystem
K-State Salina | 4,000,000 | Equipment for aerospace hub
K-State | 4,000,000 | Facilities and equipment for biomanufacturing training and education
Fort Hays State | 3,000,000 | Equipment and technology
K-State | 3,000,000 | Equipment and facilities
KCK Community College | 2,986,469 | Downtown community education center dual enrollment program
Tabor College | 2,858,520 | Central Kansas Business Studies and Entrepreneurial Center
McPherson College | 2,100,000 | Health care education, equipment, and technology
KU | 1,600,000 | Veterans Legal Support Clinic
K-State Salina | 1,500,000 | Flight simulator
Newman University | 1,200,000 | Agribusiness education, equipment, and support
Seward County Community College | 1,200,000 | Equipment and technology
Benedictine College | 1,000,000 | Equipment
Wichita State | 1,000,000 | Campus of Applied Sciences and Technology, aviation education, equipment, technology
Ottawa University | 900,000 | Equipment
Cowley County Community College | 264,000 | Welding education and equipment
Johnson County Community College | 150,000 | Training vehicles for Johnson County Regional Police Academy
Total | 120,758,989 |

A return of earmarks

Congress stopped earmarks, which are officially known as congressionally directed spending or community project funding, in 2011 amid complaints of misuse. They were revived in 2021 with new rules intended to improve transparency and limit overall spending. They are limited to spending on nonprofits and on local, state, and tribal governments. Earmarks accounted for $12 billion of the $460 billion budget passed in March, according to Marketplace.

Earmarks have long been criticized as wasteful spending and corruption, with one organization issuing an annual Congressional Pig Book Summary (a reference to pork-barrel politics) of how the money is used. Others argue, though, that earmarks are more transparent than other forms of spending because specific projects and their congressional sponsors are made public. They also benefit projects that might otherwise be overlooked, empowering stakeholders to speak directly with congressional leaders and making leaders more aware of local needs.

Without a doubt, though, they are steeped in the federal political process and rely on the clout individual lawmakers have on committees that approve the earmarks. That has put Moran, who has been in the Senate since 2010, in a good position through his seats on the Appropriations Committee; the Commerce, Science, and Transportation Committee; and the Veterans Affairs Committee.

What does this mean for higher education?

It’s heartening that higher education in Kansas will see an infusion of more than $100 million in federal funding.

Earmarks generally go to high-profile projects that promise new jobs, that promise new ways of addressing big challenges (security, health care), or that have drawn wide attention (cybercrimes, drones, STEM education). A Brookings Institution analysis found that Republican lawmakers like Moran generally put forth earmarks that have symbolic significance, “emphasizing American imagery and values.” In earmarks for higher education in Kansas over the past two years, that includes things like job training, biotechnology, library renovation, support for veterans, and research into aviation, cancer, Alzheimer’s, and manufacturing.

One of the downsides of earmarks, at least in terms of university financial stability, is that they are one-time grants for specific projects and do nothing to address shortfalls in existing college and university budgets or the future budgets for newly created operations. They also require lawmakers who support higher education, who have the political influence to sway spending decisions, and who are willing to work within the existing political structure. For now, at least, that puts Kansas in a good position.   


Doug Ward is an associate director at the Center for Teaching Excellence and an associate professor of journalism and mass communications.


By Doug Ward

A short history lesson:

April Fools’ Day originated in 1920, when Joseph C. McCanles (who was only vaguely related to the infamous 19th-century outlaw gang) ordered the KU marching band (then known as the Beak Brigade) to line up for practice on McCook Field (near the site of the current Great Dismantling).

It was April 1, and McCanles was not aware that the lead piccolo player, Herbert “Growling Dog” McGillicuddy, had conspired with the not-yet-legendary Phog Allen to play a practical joke.

McCanles, standing atop a peach basket, raised his baton and shouted, “March!”

Band members remained in place.

“March!” McCanles ordered again.

The band stood and stared.

Then McGillicuddy began playing “Yankee Doodle” on his piccolo and Allen, disguised in a Beak Brigade uniform, raised a drum stick (the kind associated with a drum, not a turkey) and joined the rest of the band in shouting: “It’s April, fool!”

McCanles fell off his peach basket in laughter, and a tradition was born.

Either that, or April Fools’ Day was created in France, or maybe in ancient Rome, or possibly in India. We still have some checking to do.

Regardless, we at CTE want you to know that we take our April Fools seriously – so seriously, in fact, that we have published the latest issue of Pupil magazine just in time for April Fools' Day.

As always, Pupil is rock-chalk full of news that you simply must know. It is best read with “The Washington Post March” playing in the background. We don’t like to be overly prescriptive, though, especially with all the strange happenings brought on by an impending solar eclipse.


Doug Ward is an associate director of the Center for Teaching Excellence and an associate professor of journalism and mass communications.

April Fools' issue of Pupil magazine with a large picture of students in rounded glass tubes that fit in a pneumatic system


By Doug Ward

We need to talk.

Yes, the conversation will make you uncomfortable. It’s important, though. Your students need your guidance, and if you avoid talking about this, they will act anyway – usually in unsafe ways that could have embarrassing and potentially harmful consequences.

So yes, we need to talk about generative artificial intelligence.

"Students gasp as professor stands at front of room and shows a lifelike robot"

Consider the conversation analogous to a parent’s conversation with a teenager about sex. Susan Marshall, a teaching professor in psychology, made that wonderful analogy recently in the CTE Online Working Group, and it seems to perfectly capture faculty members’ reluctance to talk about generative AI.

Like other faculty members, Marshall has found that AI creates solid answers to questions she poses on assignments, quizzes, and exams. That, she said, makes her feel like she shouldn't talk about generative AI with students because more information might encourage cheating. She knows that is silly, she said, but talking about AI seems as difficult as talking about condom use.

It can seem that way, but as Marshall said, we simply must have those conversations.

Sex ed, AI ed

Having frank conversations with teenagers about sex, sexually transmitted diseases, and birth control can seem like encouragement to go out and do whatever they feel like doing. Talking with teens about sex, though, does not increase their likelihood of having sex. Just the opposite. As the CDC reports: “Studies have shown that teens who report talking with their parents about sex are more likely to delay having sex and to use condoms when they do have sex.”

Similarly, researchers have found that generative AI has not increased cheating. (I haven't found any research on talking about AI.)

That hasn't assuaged concern among faculty members. A recent Chronicle of Higher Education headline captures the prevailing mood: “ChatGPT Has Everyone Freaking Out About Cheating.”

When we freak out, we often make bad decisions. So rather than talking with students about generative AI or adding material about the ethics of generative AI, many faculty members chose to ignore it. Or ban it. Or use AI detectors as a hammer to punish work that seems suspicious.

All that has done is make students reluctant to talk about AI. Many of them still use it. The detectors, which were never intended as evidence of cheating and which have been shown to have biases toward some students, have also led to dubious accusations of academic misconduct. Not surprisingly, that has made students further reluctant to talk about AI or even to ask questions about AI policies, lest the instructor single them out as potential cheaters.

Without solid information or guidance, students talk to their peers about AI. Or they look up information online about how to use AI on assignments. Or they simply create accounts and, often oblivious and unprotected, experiment with generative AI on their own.

So yes, we need to talk. We need to talk with students about the strengths and weaknesses of generative AI. We need to talk about the ethics of generative AI. We need to talk about privacy and responsibility. We need to talk about skills and learning. We need to talk about why we are doing what we are doing in our classes and how it relates to students’ future.

If you aren’t sure how to talk with students about AI, draw on the many resources we have made available. Encourage students to ask questions about AI use in class. Make it clear when they may or may not use generative AI on assignments. Talk about AI often. Take away the stigma. Encourage forthright discussions.

Yes, that may make you and students uncomfortable at times. Have the talk anyway. Silence serves no one.

JSTOR offers assistance from generative AI

Ithaka S+R has released a generative AI research tool for its JSTOR database. The tool, which is in beta testing, summarizes and highlights key areas of documents, and allows users to ask questions about content. It also suggests related materials to consider. You can read more about the tool in an FAQ section on the JSTOR site.

Useful lists of AI-related tools for academia

While we are talking about Ithaka S+R, the organization has created an excellent overview of AI-related tools for higher education, assigning them to one of three categories: discovery, understanding, and creation. It also provides much the same information in list form on its site and on a Google Doc. In the overview, an Ithaka analyst and a program manager offer an interesting take on the future of generative AI:

These tools point towards a future in which the distinction between the initial act of identifying and accessing relevant sources and the subsequent work of reading and digesting those sources is irretrievably blurred if not rendered irrelevant. For organizations providing access to paywalled content, it seems likely that many of these new tools will soon become baseline features of their user interface and presage an era where that content is less “discovered” than queried and in which secondary sources are consumed largely through tertiary summaries.

Preparing for the next wave of AI

Dan Fitzpatrick, who writes and speaks about AI in education, frequently emphasizes the inevitable technological changes that educators must face. In his weekend email newsletter, he wrote about how wearable technology, coupled with generative AI, could soon provide personalized learning in ways that make traditional education obsolete. His question: “What will schools, colleges and universities offer that is different?”

In another post, he writes that many instructors and classes are stuck in the past, relying on outdated explanations from textbooks and worksheets. “It's no wonder that despite our best efforts, engagement can be a struggle,” he says, adding: “This isn't about robots replacing teachers. It's about kids becoming authors of their own learning.”

Introducing generative AI, the student

Two professors at the University of Nevada-Reno have added ChatGPT as a student in an online education course as part of a gamification approach to learning. The game immerses students in the environment of the science fiction novel and movie Dune, with students competing against ChatGPT on tasks related to language acquisition, according to the university.

That AI student has company. Ferris State University in Michigan has created two virtual students that will choose majors, join online classes, complete assignments, participate in discussion boards, and gather information about courses, Inside Higher Ed reports. The university, which is working with Yeti CGI, a Grand Rapids company, to develop the artificial intelligence software for the project, said that tracking the virtual students as they move through their programs would help staff better understand how to help real students, according to Michigan Live. Ferris State is also using the experiment to promote its undergraduate AI program.

Doug Ward is associate director of the Center for Teaching Excellence and an associate professor of journalism and mass communications.

By Doug Ward

Something has been happening with class attendance. Actually, there are several somethings, which I’ll get to shortly. First, though, consider this:

  • Since the start of the pandemic, many students have treated class attendance as optional, making discussion and group interaction difficult.
  • Online classes tend to fill quickly, and students who enroll in physical classes often ask for an option to “attend” via a video connection.
  • Many K-12 schools report record rates of absences. Students from low-income families are especially likely to miss class, according to the Hechinger Report. In many cases, Hechinger says, parents have lost trust in school and don’t see it as a priority.

The first two points are anecdotal, but faculty nationwide have reported drops in attendance. This spring, some KU instructors say that students have been eager to participate in class, perhaps more so than at any time since the pandemic. In other cases, though, attendance remains spotty.

So what’s going on?

Here are a few observations:

  • Instructors became more flexible during the pandemic, and students found that they didn’t need to attend class to succeed. They have continued to expect that same flexibility.
  • As college grew more expensive, some students began seeing a degree as just another consumer product. They have long been told that a degree leads to higher incomes (which it does, although less so than it once did), so the degree (not the work along the way) becomes the focus. A 2010 study, for example, said that students who see education as a product are more likely “to feel entitled to receive positive outcomes from the university; they are not, however, any more likely to be involved in their education.”
  • Many instructors say that a KU attendance policy approved last year has complicated things. That policy was intended to provide flexibility for students who have legitimate reasons for missing class. Many students and faculty have taken that to mean nearly any absence should be excused.

Broader trends are in play, as well:

  • Many students in their teens and 20s feel that they “lost something in the pandemic,” as Time magazine describes it. Rather than building social networks and engaging with the world, they were forced to distance themselves. As a result, the “pandemic produced a social famine, and its after-effects persist,” Eric Klinenberg, a professor at New York University, writes in the Time article.
  • Many students continue to struggle with depression, anxiety and other mental health issues, with 50% to 65% saying they frequently or always feel stressed, anxious or overwhelmed, according to a recent study.

A reassertion of independence

Students have also reasserted their independence as instructors have revised attendance policies and stipulated the importance of participation. A Fall 2022 opinion piece in the University Daily Kansan expressed a common sentiment.

“If professors make every class useful and engaging, then students who value their academic and future success will show up and be present in the learning,” Natalie Terranova, a journalism student, wrote in the Kansan. “Professors have a responsibility to the students to teach, but the students have a responsibility to themselves to prioritize what is most important to them.”

She’s right, of course, and her peers at many other student newspapers have made much the same argument. We all make choices about where to devote our time. If something is useful and important, we make time for it. If it isn’t, we don’t. And though students have long sought to declare their independence during their college years, their experiences during the pandemic seem to have made many of them more comfortable skipping class, seeing that as a right.

At the same time, faculty have come under increasing pressure to help students succeed. If too many students fail or withdraw, the instructor is often blamed. Many instructors, in turn, have made class attendance a component of students’ grades, with good reason. Considerable research suggests that students who attend class get better grades. Class is also part of a structure that improves learning, and a recent study says that students who commit to attending class are more likely to show up.

A high school teacher’s observations

A recent Substack article by a high school teacher offered observations about student behavior that further illuminate the attendance challenge. That teacher, Nick Potkalitsky, who is also an education consultant in Ohio, says students are still stressed, lonely, and sometimes bitter about what they missed out on during the pandemic. They have trouble concentrating and need repeated reminders to focus on the task at hand. With more complex tasks, they need more scaffolding, direction, and oversight than they did before the pandemic.

He offered some additional insights from his interactions with students:

  • They struggle to connect in person. Students were dependent on technology “for almost the entirety of their social, academic, and personal lives” during the pandemic, Potkalitsky writes. “Students hunger for connection,” he says, but if they don’t already belong to an online community, the tight bonds within those communities make it difficult for newcomers to fit in.
  • They dislike classrooms, where they often struggle to stay focused. They gain energy from playgrounds, parks, hiking paths, and other outdoor settings that allow them to move.
  • They crave immersion and autonomy. They like to immerse themselves in a subject, something he attributes to social media. “When school does not and cannot provide these kinds of stimulation, many students disengage and await the next opportunity to use their handheld devices,” he writes.
  • They “are experiencing a crisis in trust in authorities and themselves.” They chafe at the idea of school returning to “normal,” and their wariness has been reinforced by schools’ clumsy response to generative AI. “This generation knows that it needs guidance, but desires the kind of assistance that empowers,” Potkalitsky says.

Yes, those are high school students, but they will soon be college freshmen. They also exhibit many of the same behaviors faculty have observed in KU students.

Jenny Darroch, dean of business at Miami University of Ohio, writes in Inside Higher Ed that faculty and administrators need “to recognize that today’s students engage differently — and did so before the pandemic. They expect to be recognized for the knowledge they have and their ability to self-direct as they learn and grow.”

Clearly, student attitudes, expectations, skills, needs, and behaviors are changing. Attendance is perhaps just the most visible place where we see those changes. Many – perhaps most – students care deeply about learning and take class attendance seriously. Many don’t, though, and the challenges of addressing that behavior are unlikely to fade anytime soon.

We have much work to do.

Need help? At CTE, we have provided advice about motivating students, balancing flexibility and structure, and using active learning and group work to make classes more engaging and to make the value of attending class more apparent.

Briefly …

  • Online enrollment remains strong. A new analysis of federal data shows that enrollment in online courses has held steady even as enrollment in many in-person courses declines, the Hechinger Report says. That trend certainly holds true at KU, where the number of credit hours generated by online courses rose 17% in Fall 2023 compared with Fall 2022. The Fall 2023 totals are 49% higher than those in Fall 2019, the semester before the pandemic began in the U.S.
  • An AI pilot through NSF. The National Science Foundation has begun a pilot of what it calls the National Artificial Intelligence Research Resource. It describes the project as “a concept for a national infrastructure that connects U.S. researchers to computational, data, software, model and training resources they need to participate in AI research.” NSF is working on the pilot with 10 federal agencies and 25 organizations (mostly technology companies). You can contribute your thoughts through a survey for faculty, researchers, and students. The survey is available until March 8.

Doug Ward is an associate director of the Center for Teaching Excellence and an associate professor of journalism and mass communications at the University of Kansas.
