AI trends that are shaping the future of education


By Doug Ward


Beneath the growing use of generative artificial intelligence lie many trends and emerging habits shaping the future of technology, jobs, and education. Social, political, and economic forces were already creating tectonic shifts beneath educational institutions. Generative AI has added to and accelerated the tremors over the past two and a half years, leaving many educators feeling angry and powerless. 

Faculty work at a table and whiteboard. (Photo illustration by Google Gemini)

Regardless of our views about generative AI, we must adapt. That will mean rethinking pedagogy, assignments, grading, learning outcomes, class structures, majors, and perhaps even disciplines. It will mean finding ways of integrating generative AI into assignments and helping students prepare to use AI in their jobs. That doesn’t mean all AI all the time. It does mean making skill development more transparent, working harder at building trust among students, and articulating the value of learning. It means having frequent conversations with students about what generative AI is and what it can and can’t do. It means helping students understand that getting answers from chatbots is no substitute for the hard work of learning. Importantly, it means ending the demonization of AI use among students and recognizing it as a tool for learning.

I’ll be writing more about that in the coming year. As a prelude, I want to share some of the significant trends I see as we head into year three of ChatGPT and a generative AI world.   

Use of generative AI

Younger people are far more likely to use generative AI than older adults are. According to a Pew Research Center survey, 58% of 18- to 29-year-olds have used ChatGPT, compared with 34% of all adults. In late 2024, more than a quarter of 13- to 17-year-olds said they had used generative AI for schoolwork, Pew Research said. As teenagers make AI use a habit, we can expect them to continue that habit in college. 

Young people have long been quicker to adopt digital technology than their parents and grandparents (and their teachers). They are less set in their ways, and they gravitate toward technology that allows them to connect and communicate, and to create and interact with media. Once again, they are leading changes in technology use.

AI use among college students is widespread

In a worldwide survey of college students, 86% said they had used AI in their studies, and many students say generative AI has become essential to their learning. In interviews and a focus group conducted by The Chronicle of Higher Education, students said they used AI to brainstorm ideas, find weak areas in their writing, create schedules and study plans, and make up for poor instruction.  

Some students said they relied on AI summaries rather than reading papers or books, complaining that reading loads were excessive. Others rely on generative AI to tutor them because they either can’t make it to professors’ office hours, don’t want to talk with the professors, or don't think professors can help them. Some students also use ChatGPT to look up questions in class rather than participate in discussion. Some, of course, use generative AI to complete assignments for them.

That use of AI to avoid reading, writing, and discussion is frustrating for faculty members. Those activities are crucial to learning. Many students, though, see themselves as being efficient. We need to do a better job of explaining the value of the work we give students, but we also need to scrutinize our assignments and consider ways of approaching them differently. Integrating AI literacy into courses will also be critical. Students need – and generally want – help in learning how to use generative AI tools effectively. They also need help in learning how to learn, a skill they will need for the rest of their lives.  

Most faculty have been skeptical of generative AI 

Most instructors lack the time or desire to master use of AI or to make widescale changes to classes to adapt to student use of AI. A Pew poll suggests that women in academia are considerably more skeptical of generative AI than men are, and U.S. and Canadian educators are more skeptical of generative AI than their counterparts in other countries. Research also reinforces what was already apparent: Generative AI can impede learning if students use it to replace their thinking and engagement with coursework.

All of that has created resentment, helplessness, and a hardening of resistance. Some instructors say AI has devalued teaching. Others describe it as frightening or demoralizing. In a New York Times opinion piece, Meghan O’Rourke, a creative writing professor at Yale, writes about the almost seductive pull she felt as she experimented with ChatGPT. Ultimately, though, she describes large language models as “intellectual Soylent Green,” a reference to the science fiction film in which the planet is dying and the food supply is made of people.

Educators are facing “psychological and emotional” issues as they try to figure out how to handle generative AI in their classes. I have seen this firsthand, although AI is just one of many forces bearing down on faculty. I’ve spoken with faculty members who feel especially demoralized when students turn in lifeless reflections that were obviously AI-generated. “I want to hear what you think,” one instructor said she had told her students. Collectively, this has led to what one educator called an existential crisis for academics.

Use of AI in peer review creeps upward

Some publishers have begun allowing generative AI to help speed up peer review and deal with a shortage of reviewers. That, in turn, has led some researchers to embed hidden prompts in papers to try to gain more favorable reviews, according to Inside Higher Ed. A study in Science Advances estimates that generative AI was used in writing more than 13% of biomedical abstracts in 2024.

Use among companies continues to grow

By late 2024, 78% of businesses were using some form of AI in their operations, up from 55% in 2023. Many of those companies are shifting to use of local AI systems rather than cloud systems, in large part for security reasons. Relatedly, unemployment rates for new graduates have increased, with some companies saying that AI can do the work of entry-level employees. Hiring has slowed the most in information, finance, insurance, and technical services fields, and many highly paid white-collar jobs may be at risk. The number of internships has also declined. The CEO of Anthropic has warned that AI could lead to the elimination of up to half of entry-level white-collar jobs. If anything even close to that occurs, it will destroy the means for employees to gain experience and raise even more questions about the value of a college education in its current form.

Efforts to promote use of AI

Federal government makes AI a priority in K-12

The Department of Education has made use of AI a priority for K-12 education, calling for the integration of AI into teaching and learning, the creation of more computer science classes, and the use of AI to “promote efficiency in school and classroom operations,” improve teacher training and evaluation, and support tutoring. It mentions “AI literacy,” but implies that the term means learning to use AI tools, which is only part of what students need.

Technology companies have responded by providing more than $20 million to help create an AI training hub for K-12 teachers. The digital publication District Administration says education has reached “a turning point” with AI, as pressure grows for adoption even as the federal focus on ethics and equity has faded and federal guidelines do little to promote accountability, privacy, or data security.

The push for more technology skills in K-12 comes as growth in computer science majors at universities has stalled and students weigh their job prospects amid layoffs at technology companies. It also means that students are likely to enter college with considerable experience using generative AI in coursework, potentially deepening conflicts with faculty if colleges and universities fail to adapt.

Canvas to add generative AI

Instructure plans to embed ChatGPT into Canvas soon. Instructure’s announcement about this is vague, though not all that surprising, especially because Blackboard has added similar capabilities. Instructure calls the new functions IgniteAI and says they can be used for “creating quizzes, generating rubrics, summarizing discussions, aligning content to outcomes.” It says these will be opt-in features for institutions. (A Reddit post provides more details of what Instructure demonstrated at its annual conference.) What this means for the KU version of Canvas isn’t clear, but the Educational Technology staff will be evaluating the new tools.

Google and OpenAI create tools for students and teachers

Google and OpenAI have offered tailored versions of their generative AI platforms for teachers and students. Google has added Gemini to its Google for Education tools and has released Gemini for Education, pitching it as transformative because of its ability to personalize learning and "inspire fresh ideas." The free version offers only limited access to its top models and Deep Research function, but the paid version, which is used primarily by school districts, has full access.

OpenAI has added what it calls study mode to ChatGPT. The company says study mode takes a Socratic approach to help “you work through problems step by step instead of just getting an answer.” A PCWorld reviewer found the tool helpful, saying it “actually makes me use my brain.” MIT Technology Review said, though, that it was “more like the same old ChatGPT, tuned with a new conversation filter that simply governs how it responds to students, encouraging fewer answers and more explanations.” It said the tool was part of OpenAI’s push “to rebrand chatbots as tools for personalized learning rather than cheating.”
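That description suggests the behavior comes largely from instructions layered on top of the underlying model rather than from a different model. A rough illustration of the idea appears below, written with the OpenAI Python library; the tutoring prompt and the model name are assumptions made for the sake of the sketch, not OpenAI’s actual implementation of study mode.

    # Illustration only: approximating a Socratic "study mode" with a system prompt.
    # Assumes the openai Python package and an OPENAI_API_KEY in the environment;
    # the tutoring instructions and model name are assumptions, not OpenAI's own.
    from openai import OpenAI

    client = OpenAI()

    TUTOR_INSTRUCTIONS = (
        "Act as a tutor. Do not give final answers. "
        "Ask one guiding question at a time, check the student's reasoning, "
        "and confirm an answer only after the student has worked it out."
    )

    def study_mode_reply(student_message: str) -> str:
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # assumed model name
            messages=[
                {"role": "system", "content": TUTOR_INSTRUCTIONS},
                {"role": "user", "content": student_message},
            ],
        )
        return response.choices[0].message.content

    print(study_mode_reply("How do I find the derivative of x^2 * sin(x)?"))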

AI companies see education as a lucrative market. By one estimate, educational institutions’ spending on AI will grow by 37% a year over the next five years, which would nearly quintuple that spending over the period. Magic School, Curipod, Khanmigo, and Diffit are just four of many AI-infused tools created specifically for educators and students. That is important because student use in K-12 normalizes generative AI as part of the learning process.

To attract more students to ChatGPT, OpenAI made its pro version free for students for a few months in the spring. Google went even further, offering the pro version of Gemini free to students for a year. That means many students have access to more substantial generative AI tools than faculty do.

Social and technological trends 

Online search is changing quickly

Nearly every search engine now uses generative AI to create summaries rather than providing lists of links. Those summaries usually cite only a small number of articles, and the chief executive of The Atlantic said Google was “shifting from being a search engine to an answer engine.” As a result, fewer people are clicking through to articles, and news sites and other organizations that rely on advertising report substantial declines in web traffic. Bryan Alexander speculates that if this trend continues, we could see a decline in the web as an information source. The Wall Street Journal said companies’ use of generative AI was “rewiring how the internet is used altogether.” This poses yet another challenge for educators as students draw on AI summaries rather than working through articles and synthesizing information on their own.
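The mechanics help explain the shift: an “answer engine” typically retrieves a handful of pages and asks a model to synthesize a single response from them, so most of the underlying links never get a click. Below is a minimal, hypothetical sketch of that pattern; search_web() and call_model() are placeholders, not real APIs.

    # Hypothetical sketch of the "answer engine" pattern behind AI search summaries.
    # search_web() and call_model() are placeholders, not real library calls.
    def search_web(query: str, k: int = 5) -> list[dict]:
        """Stand-in for a search index: return the top-k results as url/snippet pairs."""
        return [{"url": f"https://example.org/{i}", "snippet": f"Snippet {i} about {query}"}
                for i in range(k)]

    def call_model(prompt: str) -> str:
        """Stand-in for a request to a language model."""
        return "[one synthesized answer citing a few numbered sources]"

    def answer_engine(query: str) -> str:
        sources = search_web(query)
        numbered = "\n".join(f"[{i + 1}] {s['url']}: {s['snippet']}"
                             for i, s in enumerate(sources))
        # The model condenses a few sources into one answer; the links themselves go unvisited.
        return call_model(f"Question: {query}\nSources:\n{numbered}\n"
                          "Write a short answer, citing sources by number.")

    print(answer_engine("How are AI summaries changing web traffic?"))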

Use of AI agents is spreading

Agents allow AI systems to act autonomously. They generally work in sequence (or in tandem) to complete a task. A controlling bot (a parent) sends commands to other bots (child systems), which carry out those commands, gather and check information, and either act on their own or push information back up the line for the parent bot to act on.
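To make that pattern concrete, here is a minimal, hypothetical sketch in Python. The call_model() function stands in for whatever chatbot API a system might use; the structure, not any particular vendor’s product, is the point.

    # Hypothetical sketch of parent/child agent orchestration.
    # call_model() is a placeholder, not a real library call.
    def call_model(role: str, prompt: str) -> str:
        """Stand-in for a request to a language model acting in a given role."""
        return f"[{role} response to: {prompt[:60]}]"

    def child_agent(task: str) -> str:
        """A child bot carries out one narrow step and reports back."""
        return call_model("worker", f"Complete this step and report the result: {task}")

    def parent_agent(goal: str) -> str:
        """The parent bot plans, delegates steps to child bots, then acts on their reports."""
        plan = call_model("planner", f"List the steps needed to accomplish: {goal}")
        steps = [line.strip() for line in plan.splitlines() if line.strip()]
        reports = [child_agent(step) for step in steps]  # delegate each step
        return call_model("planner", f"Goal: {goal}\nReports: {reports}\nDecide the next action.")

    print(parent_agent("Summarize this week's discussion posts and draft feedback"))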

Businesses have been cautious about deployment of agents, in part because of cost and security. Interest and spending have intensified, though, and companies have been using agents in such areas as customer service, inventory management, code generation, fraud detection, gene analysis, and the monitoring of digital traffic. One executive has said that software as a service was turning into "agent as a service." 

Software companies have also made agent technology available to the public. OpenAI’s agent can log into a learning management system and complete assignments. Perplexity’s new browser uses AI to search, summarize, and automate tasks, and it has been used to write in Google Docs at a pace that mimics a human typist. ChatGPT agents can complete homework assignments autonomously, connect to other applications, log into websites, and even click the human-verification boxes that many websites put up. ChatGPT has also been used to automate grading and analyze teaching. The website Imaginative says companies are in a race to create agents that “organize your day without forcing you to switch apps.” Just how effective current agents are is open to debate, but the use of autonomous systems is growing.

Many children use AI for companionship

A vast majority of teenagers prefer human friendship over AI companions, but a third say that interacting with an AI companion is at least as satisfying as speaking with a human, according to Common Sense Media. An Internet Matters report says children as young as 9 use generative AI for companionship and friendship. They practice conversations, consult about what to wear, and ask questions about such things as feelings and body image. Some college students say that generative AI is diminishing relationships with other students.

Video games gaining AI capabilities

Video game makers are experimenting with generative technology that gives characters memories and allows them to adapt to game play. Stanford and Google researchers have added simulations of real people to games. Genie, a tool from Google's DeepMind division, creates an interactive world based on user prompts or images, and allows users to change characters and scenery with additional prompts. Similar approaches are already being used in educational technology, and it seems likely that we will eventually see AI characters act as teachers that can adapt to students’ work, voices, and even facial expressions as they guide students through interactive scenarios. 

Audio, video, and image abilities improve

As the speed of AI models improves, AI companies see voice as a primary means of user interaction with chatbots. Already, general-purpose AI models like ChatGPT, Gemini, Copilot, and Claude can analyze and create images and video, act on voice commands, and converse with users. Gemini will analyze information on a screen and provide advice on using and troubleshooting applications. A company called Rolling Square has created earbuds called Natura AI, which are the only means of accessing its AI system. Users interact with agents, which the company calls “AI people,” to do nearly anything that would usually require a keyboard and screen. A company called Rabbit has made similar promises with a device it released last year. It followed up this summer with an AI agent called Intern.

That is just one aspect of voice technology. More than 20% of online searches are done by voice, and the number of voice assistants in use has doubled since 2020, to 8.4 billion. Those include such tools as Alexa (Amazon), Siri (Apple), and Gemini (Google). The use of tools like Otter, Fireflies, and Teams to monitor and transcribe meetings is growing, and it is common to see someone’s chatbot attend an online meeting as a proxy. Students are using transcription tools to record lectures, and use of medical transcription is growing substantially. Companies are using voice agents on websites and for customer service calls, and companies and governments are using voice as a means of digital verification and security.

  • AI eyeglasses. Companies are creating eyewear with AI assistants embedded in them. The glasses translate text and spoken language, read and summarize written material, search the web, take photographs and record video, recognize physical objects, and connect to phones and computers. They usually contain a small display on one lens, and some can speak to you through bone conduction speakers. The trend toward miniaturization will make keeping technology out of the classroom virtually impossible.
  • AI audio. The capability of AI systems to generate audio and music continues to improve. Technology from ElevenLabs, for example, is used in call centers, educational technology, and AI assistants. It can clone voices, change voices, or create new voices. Google’s NotebookLM creates podcasts from text, audio, and video files you give it, and other companies have begun offering similar capabilities. Tools like Suno and Udio create music from written prompts. Google’s assistant technology answers and screens calls on smartphones. AI is making the use of voice so prevalent that one blogger argues that we are returning to “an oral-first culture.”

So now what?

As use of generative AI grows among students, instructors must find ways to reimagine learning. That doesn't mean that everyone should adopt all things AI. As these trends indicate, though, the number of tools (and toys) that technology companies are infusing with AI is growing rapidly. Some of them offer promise for teaching and learning. Others will make cheating on traditional assignments easier and virtually impossible to detect. Adapting our classes will require experimentation, creativity, and patience. At CTE, we have many things planned (and already available) to help with that process, and we will continue to develop materials, provide examples, and help faculty adapt. We see opportunities for productive change, and we encourage instructors to join us.   


Doug Ward is associate director of the Center for Teaching Excellence and an associate professor of journalism and mass communications.

Tagged AI, future of higher education, teaching and technology, trends