By Doug Ward

A year after the release of a know-it-all chatbot, educators have yet to find a satisfying answer to a nagging question: What are we supposed to do with generative artificial intelligence?

One reason generative AI has been so perplexing to educators is that there is no single step that all instructors can take to make things easier. Here are a few things we do know, though:

  • Students are using generative AI in far larger numbers than faculty, and some are using it to complete all or parts of assignments. A recent Turnitin poll said 22% of faculty were using generative AI, compared with 49% of students.

  • Students in other developed countries are far more likely to use generative AI than students in the U.S., two recent polls suggest.
  • Students are as conflicted as faculty about generative AI, with many worried about AI’s impact on jobs, thinking, and disinformation.
  • Many faculty say that students need to know how to use generative AI but also say they have been reluctant to use it themselves.
  • Detectors can provide information about the use of generative AI, but they are far from flawless and should not be the sole basis for accusing students of academic misconduct.

Perhaps the biggest lesson we have learned over the past year is that flexibility in teaching and learning is crucial, especially as new generative AI tools become available and the adoption of those tools accelerates.

We don’t really have an AI problem

It’s important to understand why generative AI has made instructors feel under siege. In a forthcoming article in Academic Leader, I argue that we don’t have an AI problem. We have a structural problem:

Unfortunately, the need for change will only grow as technology, jobs, disciplines, society, and the needs of students evolve. Seen through that lens, generative AI is really just a messenger, and its message is clear: A 19th-century educational structure is ill-suited to handle changes brought on by 21st-century technology. We can either move from crisis to crisis, or we can rethink the way we approach teaching and learning, courses, curricula, faculty roles, and institutions.

That’s not a message most faculty members or administrators want to hear, but it is impossible to ignore. Colleges and universities still operate as if information were scarce and as if students could learn only from faculty members with Ph.D.s. The institutional structure of higher education was also created to exclude or fail students deemed unworthy. That’s much easier than trying to help every student succeed. We are making progress at changing that, but progress is slow even as change accelerates. I’ll be writing more about that in the coming year.

Faculty and staff are finding ways to use AI

Many instructors have made good use of generative AI in classes, and they say students are eager for such conversations. Here are a few approaches faculty have taken:

  • Creating AI-written examples for students to critique.
  • Allowing students to use AI but asking them to cite what AI creates and separately explain the role AI played in an assignment.
  • Having students use AI to create outlines for papers and projects, and to refine project goals.
  • Allowing full use of AI as long as students check the output for accuracy and edit and improve on the AI-generated content.
  • Having students design posters with AI.
  • Using examples from AI to discuss the strengths and weaknesses of chatbots and the ethical issues underlying them.
  • Using paper and pencils for work in class. In recent discussions with CTE ambassadors, the term “old school” came up several times, usually in relation to bans on technology. As appealing as that may seem, that approach can put some students at a disadvantage. Many aren’t used to writing by hand, and some with physical impediments simply can’t.
  • For non-native English speakers, generative AI has been a confidence-builder. By evaluating their writing with a grammar checker or chatbot, they can improve phrasing and sentence construction.
  • Some faculty members say that generative AI saves time by helping them create letters of recommendation, event announcements, and case studies and other elements for class.

Sara Wilson, an associate professor of mechanical engineering and a CTE faculty fellow, uses what may be the best approach to AI I’ve seen. In an undergraduate course that requires a considerable amount of programming, she allows students to use whatever tools they wish to write their code. She meets individually with each student – more than 100 of them – after each project and asks them to explain the concepts behind their work. In those brief meetings, she said, it is fairly easy to spot students who have taken shortcuts.

Like faculty, students are often conflicted

Many students seem as conflicted as faculty over generative AI. In a large introductory journalism and mass communications class where I spoke late this semester, I polled students about their AI use. Interestingly, 21% said they had never used AI and 45% said they had tried it but had done little beyond that. Among the remaining students, 27% said they used AI once a week and 7% said they used it every day. (Those numbers apply only to the students in that class, but they are similar to results from the national polls I mentioned above.)

In describing generative AI, students used terms like “helpful,” “interesting,” “useful,” and “the future,” but also “theft,” “scary,” “dangerous,” and “cheating.” Recent polls suggest that students see potential in generative AI in learning but that they see a need for colleges and universities to change. In one poll, 65% of students said that faculty needed to change the way they assess students because of AI, the same percentage that said they wanted faculty to include AI instruction in class to help them prepare for future jobs.

Students I’ve spoken with describe AI as a research tool, a learning tool, and a source of advice. Some use AI as a tutor to help them review for class or to learn about something they are interested in. Others use it to check their writing or code, conduct research, find sources, create outlines, summarize papers, draft an introduction or a conclusion for a paper, and help them in other areas of writing they find challenging. One CTE ambassador said students were excited about the possibilities of generative AI, especially if it helped faculty move away from “perfect grading.”

Time is a barrier

For faculty, one of the biggest challenges with AI is time. We’ve heard from many instructors who say that they understand the importance of integrating generative AI into classes and using it in their own work but that they lack the time to learn about AI. Others say their classes have so much content to cover that working in anything new is difficult.

Instructors are also experiencing cognitive overload. They are being asked to focus more on helping students learn. They are feeling the lingering effects of the pandemic. In many cases, class sizes are increasing; in others, declining enrollment has created anxiety. Information about disciplines, teaching practices, and world events flows unendingly. “It’s hard to keep up with everything,” one CTE ambassador said.

Generative AI dropped like a boulder into the middle of that complex teaching environment, adding yet another layer of complexity: Which AI platform to use? Which AI tools? What about privacy? Ethics? How do we make sure all students have equal access? The platforms themselves can be intimidating. One CTE ambassador summed up the feelings of many I’ve spoken with who have tried using a chatbot but weren’t sure what to do with it: “Maybe I’m not smart enough, but I don’t know what to ask.”

We will continue to provide opportunities for instructors to learn about generative AI in the new year. One ongoing resource is the Generative AI and Teaching Working Group, which will resume in the spring. It is open to anyone at KU. CTE will also be part of a workshop on generative AI on Jan. 12 at the Edwards Campus. That workshop, organized by John Bricklemyer and Heather McCain, will have a series of sessions on such topics as the ethics of generative AI, strategies for using AI, and tools and approaches to prompting for instructors to consider.

We will also continue to add to the resources we have created to help faculty adapt to generative AI. Existing resources focus on such areas as adapting courses to AI, using AI ethically in writing assignments, using AI as a tutor, and handling academic integrity. We have also provided material to help generate discussion about the biases in generative AI. I have led an effort with colleagues from the Bay View Alliance to provide information about how universities can adapt to generative AI. The first of our articles was published last week in Inside Higher Ed. Another, which offers longer-term strategies, is forthcoming in Change magazine. Another piece for administrators will be published this month in Academic Leader.

Focusing on humanity

If generative AI has taught us anything over the past year, it is that we must embrace humanity in education. Technology is an important tool, and we must keep experimenting with ways to use it effectively in teaching and learning. But technology can’t provide the human bond that Peter Felten talked about at the beginning of the semester and that we have made a priority at CTE. Something Felten said during his talk at the Teaching Summit is worth sharing again:

“There’s literally decades and decades of research that says the most important factor in almost any positive student outcome you can think about – learning, retention, graduation rate, well-being, actually things like do they vote after they graduate – the single biggest predictor is the quality of relationships they perceive they have with faculty and peers,” Felten said.

Technology can do many things, but it can’t provide the crucial human connections we all need.

In an ambassadors meeting in November, Dorothy Hines, associate professor of African and African-American studies and curriculum and teaching, summed it up this way: “AI can answer questions, but it can’t feel.” As educators, she said, it’s important that we feel so that our students learn to feel.

That is wise advice. As we continue to integrate generative AI into our work, we must do so in a human way.


Doug Ward is associate director of the Center for Teaching Excellence and an associate professor of journalism and mass communications.
