By Doug Ward

A year after the release of a know-it-all chatbot, educators have yet to find a satisfying answer to a nagging question: What are we supposed to do with generative artificial intelligence?

One reason generative AI has been so perplexing to educators is that there is no single step that all instructors can take to make things easier. Here are a few things we do know, though:

  • Students are using generative AI in far larger numbers than faculty, and some are using it to complete all or parts of assignments. A recent Turnitin poll found that 22% of faculty were using generative AI, compared with 49% of students.

  • Students in other developed countries are far more likely to use generative AI than students in the U.S., two recent polls suggest.
  • Students are as conflicted as faculty about generative AI, with many worried about AI’s impact on jobs, thinking, and disinformation.
  • Many faculty say that students need to know how to use generative AI but also say they have been reluctant to use it themselves.
  • Detectors can provide information about the use of generative AI, but they are far from flawless and should not be the sole means of accusing students of academic misconduct.

Perhaps the biggest lesson we have learned over the past year is that flexibility in teaching and learning is crucial, especially as new generative AI tools become available and the adoption of those tools accelerates.

We don’t really have an AI problem

It’s important to understand why generative AI has made instructors feel under siege. In a forthcoming article in Academic Leader, I argue that we don’t have an AI problem. We have a structural problem:

Unfortunately, the need for change will only grow as technology, jobs, disciplines, society, and the needs of students evolve. Seen through that lens, generative AI is really just a messenger, and its message is clear: A 19th-century educational structure is ill-suited to handle changes brought on by 21st-century technology. We can either move from crisis to crisis, or we can rethink the way we approach teaching and learning, courses, curricula, faculty roles, and institutions.

That’s not a message most faculty members or administrators want to hear, but it is impossible to ignore. Colleges and universities still operate as if information were scarce and as if students could learn only from faculty members with Ph.D.s. The institutional structure of higher education was also created to exclude or fail students deemed unworthy. That’s much easier than trying to help every student succeed. We are making progress at changing that, but progress is slow even as change accelerates. I’ll be writing more about that in the coming year.

Faculty and staff are finding ways to use AI

Many instructors have made good use of generative AI in classes, and they say students are eager for such conversations. Here are a few approaches faculty have taken:

  • Creating AI-written examples for students to critique.
  • Allowing students to use AI but asking them to cite what AI creates and separately explain the role AI played in an assignment.
  • Having students use AI to create outlines for papers and projects and to refine goals for those projects.
  • Allowing full use of AI as long as students check the output for accuracy and edit and improve on the AI-generated content.
  • Having students design posters with AI.
  • Using examples from AI to discuss the strengths and weaknesses of chatbots and the ethical issues underlying them.
  • Using paper and pencils for work in class. In recent discussions with CTE ambassadors, the term “old school” came up several times, usually in relation to bans on technology. As appealing as that may seem, that approach can put some students at a disadvantage. Many aren’t used to writing by hand, and some with physical impediments simply can’t.
  • For non-native English speakers, generative AI has been a confidence-builder. By evaluating their writing with a grammar checker or chatbot, they can improve phrasing and sentence construction.
  • Some faculty members say that generative AI saves time by helping them create letters of recommendation, event announcements, and case studies and other elements for class.

Sara Wilson, an associate professor of mechanical engineering and a CTE faculty fellow, uses what is probably the best approach to AI I’ve seen. In an undergraduate course that requires a considerable amount of programming, she allows students to use whatever tools they wish to write their code. She meets individually with each student – more than 100 of them – after each project and asks them to explain the concepts behind their work. In those brief meetings, she said, it is fairly easy to spot students who have taken shortcuts.

Like faculty, students are often conflicted

Many students seem as conflicted as faculty over generative AI. In a large introductory journalism and mass communications class where I spoke late this semester, I polled students about their AI use. Interestingly, 21% said they had never used AI and 45% said they had tried it but had done little beyond that. Among the remaining students, 27% said they used AI once a week and 7% said they used it every day. (Those numbers apply only to the students in that class, but they are similar to results from national polls I mention above.)

In describing generative AI, students used terms like “helpful,” “interesting,” “useful” and “the future,” but also “theft,” “scary,” “dangerous,” and “cheating.” Recent polls suggest that students see potential in generative AI in learning but that they see a need for colleges and universities to change. In one poll, 65% of students said that faculty needed to change the way they assess students because of AI, the same percentage that said they wanted faculty to include AI instruction in class to help them prepare for future jobs.

Students I’ve spoken with describe AI as a research tool, a learning tool, and a source of advice. Some use AI as a tutor to help them review for class or to learn about something they are interested in. Others use it to check their writing or code, conduct research, find sources, create outlines, summarize papers, draft an introduction or a conclusion for a paper, and help them in other areas of writing they find challenging. One CTE ambassador said students were excited about the possibilities of generative AI, especially if it helped faculty move away from “perfect grading.”

Time is a barrier

For faculty, one of the biggest challenges with AI is time. We’ve heard from many instructors who say that they understand the importance of integrating generative AI into classes and using it in their own work but that they lack the time to learn about AI. Others say their classes have so much content to cover that working in anything new is difficult.

Instructors are also experiencing cognitive overload. They are being asked to focus more on helping students learn. They are feeling the lingering effects of the pandemic. In many cases, class sizes are increasing; in others, declining enrollment has created anxiety. Information about disciplines, teaching practices, and world events flows unendingly. “It’s hard to keep up with everything,” one CTE ambassador said.

Generative AI dropped like a boulder into the middle of that complex teaching environment, adding yet another layer of complexity: Which AI platform to use? Which AI tools? What about privacy? Ethics? How do we make sure all students have equal access? The platforms themselves can be intimidating. One CTE ambassador summed up the feelings of many I’ve spoken with who have tried using a chatbot but weren’t sure what to do with it: “Maybe I’m not smart enough, but I don’t know what to ask.”

We will continue to provide opportunities for instructors to learn about generative AI in the new year. One ongoing resource is the Generative AI and Teaching Working Group, which will resume in the spring. It is open to anyone at KU. CTE will also be part of a workshop on generative AI on Jan. 12 at the Edwards Campus. That workshop, organized by John Bricklemyer and Heather McCain, will have a series of sessions on such topics as the ethics of generative AI, strategies for using AI, and tools and approaches to prompting for instructors to consider.

We will also continue to add to the resources we have created to help faculty adapt to generative AI. Existing resources focus on such areas as adapting courses to AI, using AI ethically in writing assignments, using AI as a tutor, and handling academic integrity. We have also provided material to help generate discussion about the biases in generative AI. I have led an effort with colleagues from the Bay View Alliance to provide information about how universities can adapt to generative AI. The first of our articles was published last week in Inside Higher Ed. Another, which offers longer-term strategies, is forthcoming in Change magazine. Another piece for administrators will be published this month in Academic Leader.

Focusing on humanity

If generative AI has taught us anything over the past year, it is that we must embrace humanity in education. Technology is an important tool, and we must keep experimenting with ways to use it effectively in teaching and learning. But technology can’t provide the human bond that Peter Felten talked about at the beginning of the semester and that we have made a priority at CTE. Something Felten said during his talk at the Teaching Summit is worth sharing again:

“There’s literally decades and decades of research that says the most important factor in almost any positive student outcome you can think about – learning, retention, graduation rate, well-being, actually things like do they vote after they graduate – the single biggest predictor is the quality of relationships they perceive they have with faculty and peers,” Felten said.

Technology can do many things, but it can’t provide the crucial human connections we all need.

In an ambassadors meeting in November, Dorothy Hines, associate professor of African and African-American studies and curriculum and teaching, summed it up this way: “AI can answer questions, but it can’t feel.” As educators, she said, it’s important that we feel so that our students learn to feel.

That is wise advice. As we continue to integrate generative AI into our work, we must do so in a human way.


Doug Ward is associate director of the Center for Teaching Excellence and an associate professor of journalism and mass communications.

By Doug Ward

When Turnitin activated its artificial intelligence detector this month, it provided a substantial amount of nuanced guidance.

Trying to keep ahead of artificial intelligence is like playing a bizarre game of whack-a-mole.

The company did a laudable job of explaining the strengths and the weaknesses of its new tool, saying that it would rather be cautious and have its tool miss some questionable material than falsely accuse someone of unethical behavior. It will make mistakes, though, and “that means you’ll have to take our predictions, as you should with the output of any AI-powered feature from any company, with a big grain of salt,” David Adamson, an AI scientist at Turnitin, said in a video. “You, the instructor, have to make the final interpretation.”

Turnitin walks a fine line between reliability and reality. On the one hand, it says its tool was “verified in a controlled lab environment” and renders scores with 98% confidence. On the other hand, it appears to have a margin of error of plus or minus 15 percentage points. So a score of 50 could actually be anywhere from 35 to 65.
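Turnitin hasn’t published its exact error model, but the basic arithmetic of that margin of error is easy to sketch. This is a minimal illustration, assuming the ±15-point margin applies symmetrically and scores are clamped to the detector’s 0–100 scale:

```python
def score_interval(reported_score, margin=15):
    """Plausible range for a detector score, given a symmetric margin of error.

    Results are clamped to the detector's 0-100 scale.
    """
    low = max(0, reported_score - margin)
    high = min(100, reported_score + margin)
    return low, high

# A reported score of 50 could plausibly be anywhere from 35 to 65.
print(score_interval(50))  # → (35, 65)
```

In other words, a mid-range score is compatible with both “mostly human” and “mostly AI” interpretations, which is why the number alone shouldn’t drive a misconduct decision.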

The tool was also trained on older versions of the language model used in ChatGPT, Bing Chat, and many other AI writers. The company warns users that the tool requires “long-form prose text” and doesn’t work with lists, bullet points, or text of less than a few hundred words. It can also be fooled by a mix of original and AI-produced prose.

There are other potential problems.

A recent study in Computation and Language argues that AI detectors are far more likely to flag the work of non-native English speakers than the work of native speakers. The authors cautioned “against the use of GPT detectors in evaluative or educational settings, particularly when assessing the work of non-native English speakers.”

The Turnitin tool wasn’t tested as part of that study, and the company says it has found no bias against English-language learners in its tool. Seven other AI detectors were included in the study, though, and, clearly, we need to proceed with caution.

So how should instructors use the AI detection tool?

As much as instructors would like to use the detection number as a shortcut, they should not. The tool provides information, not an indictment. The same goes for Turnitin’s plagiarism tool.

So instead of making quick judgments based on the scores from Turnitin’s AI detection tool on Canvas, take a few more steps to gather information. This approach is admittedly more time-consuming than just relying on a score. It is fairer, though.

  • Make comparisons. Does the flagged work differ from the student’s previous work in style, tone, spelling, flow, complexity, development of argument, or use of sources and citations? We often detect potential plagiarism that way. AI-created work often raises suspicion for the same reason.
  • Try another tool. Submit the work to another AI detector and see whether you get similar results. That won’t provide absolute proof, especially if the detectors are trained on the same language model. It will provide additional information, though.
  • Talk with the student. Students don’t see the scores from the AI detection tool, so meet with the student about the work you are questioning and show them the Turnitin data. Explain that the detector suggests the student used AI software to create the written work and point out the flagged elements in the writing. Make sure the student understands why that is a problem. If the work is substantially different from the student’s previous work, point out the key differences.
  • Offer a second chance. The use of AI and AI detectors is so new that instructors should consider giving students a chance to redo the work. If you suspect the original was created with AI, you might offer the resubmission for a reduced grade. If it seems clear that the student did submit AI-generated text and did no original work, give the assignment a zero or a substantial reduction in grade.
  • If all else fails … If you are convinced a student has misused artificial intelligence and has refused to change their behavior, you can file an academic misconduct report. Remember, though, that the Turnitin report has many flaws. It is far better to err on the side of caution than to devote time and emotional energy to an academic misconduct claim that may not hold up.

No, this doesn’t mean giving up

I am by no means condoning student use of AI tools to avoid the intellectual work of our classes. Rather, the lines of use and misuse of AI are blurry. They may always be. That means we will need to rethink assignments and other assessments, and we must continue to adapt as the AI tools grow more sophisticated. We may need to rethink class, department, and school policy. We will need to determine appropriate use of AI in various disciplines. We also need to find ways to integrate artificial intelligence into our courses so that students learn to use it ethically.

If you haven’t already:

  • Talk with students. Explain why portraying AI-generated work as their own is wrong. Make it clear to students what they gain from doing the work you assign. This is a conversation best had at the beginning of the semester, but it’s worth reinforcing at any point in the class.
  • Revisit your syllabus. If you didn’t include language in your syllabus about the use of AI-generated text, code or images, add it for next semester. If you included a statement but still had problems, consider whether you need to make it clearer for the next class.

Keep in mind that we are at the beginning of a technological shift that may change many aspects of academia and society. We need to continue discussions about the ethical use of AI. Just as important, we need to work at building trust with our students. (More about that in the future.) When they feel part of a community, feel that their professors have their best interests in mind, and feel that the work they are doing has meaning, they are less likely to cheat. That’s why we recommend the use of authentic assignments and strategies for creating community in classes.

Detection software will never keep up with the ability of AI tools to avoid detection. It’s like the game of whack-a-mole in the picture above. Relying on detectors does little more than treat the symptoms of a much bigger problem, and over-relying on them turns instructors into enforcers.

The problem is multifaceted, and it involves students’ lack of trust in the educational system, lack of belonging in their classes and at the university, and lack of belief in the intellectual process of education. Until we address those issues, enforcement will continue to detract from teaching and learning. We can’t let that happen.


Doug Ward is associate director of the Center for Teaching Excellence and an associate professor of journalism and mass communications at the University of Kansas.

By Doug Ward

Since its release late last year, ChatGPT has reverberated through the academic mind like an orchestral crescendo in a Strauss symphonic movement. It has amazing abilities, and even greater potential. Even so, it delivers many of its responses in a monotone reminiscent of HAL 9000, the rogue artificial intelligence system in 2001: A Space Odyssey.

PlaygroundAI and Doug Ward

I apologize, but I am not able to respond to that statement.

Like others, I want to know more about what ChatGPT can and can’t do, and how we might use it effectively in teaching and learning. So I’ve been asking it question after question about everything from teaching and learning to credentials, motivation, resume-writing, survey completion, and popular culture. It often provides useful, if bland, answers. For instance, when I asked it to create a guitar tab for a blues song, it responded:

As a machine learning system, I am not able to create original music or provide detailed information about musical theory or performance. I am trained on a dataset of text and can generate responses based on the information that I have been trained on, but I do not have the ability to perceive or interpret sensory information in the same way that a human can.

So I asked hypotheticals: Right or left? After defining the terms in considerable detail, ChatGPT said it needed more context to answer the question.

I asked about the limits of scientific evidence. In a barrage of words, ChatGPT essentially said: It depends.

I entered obscure phrases from popular culture: “Good night, Dick” and “Good night, John Boy.” In each case, it responded:

As a machine learning system trained by OpenAI, I do not have personal experiences, opinions, or feelings, and I am not able to engage in conversations or interactions in the same way that a human can.

(By the end of January, OpenAI had updated ChatGPT. It now includes a live web search and can provide information about those Good night prompts.)

I queried ChatGPT about the sources it was drawing from. Each time, it responded with meaningless drivel.

I am a large language model trained by OpenAI. I have been trained on a wide variety of texts, including academic articles, books, websites, and other sources, which I can use to generate human-like responses to the questions that I am asked. I do not have the ability to browse the internet or access external sources of information, so the responses that I provide are based solely on the information that I have been trained on.

I told it that its answers seemed awfully self-serving and asked whether it was just trying to keep people from worrying about the implications of artificial intelligence.


As a large language model trained by OpenAI, I am not capable of self-serving behavior or trying to influence people’s opinions. My purpose is to provide accurate and helpful information to the best of my ability, based on the training data I have been given. My responses are not intended to worry or reassure anyone, but rather to provide objective and factual information on the topics discussed.

I said that its data training had a point of view. All information does. How has that influenced the way you answer?

As a large language model, my training data consisted of a large amount of text from a variety of sources, including books, articles, and websites. This training data was not selected or organized to convey any particular point of view, but rather to provide a broad range of information on a wide variety of topics. As a result, my responses are intended to be objective and factual, based on the information contained in my training data. I do not have personal opinions or biases, and I do not attempt to influence people’s opinions or beliefs. My purpose is simply to provide accurate and helpful information to the best of my ability.

Frustrated, I asked why it kept giving me the same answer.

I apologize if it appears that I am giving you the same answer over and over. As a large language model trained by OpenAI, my purpose is to provide accurate and helpful information on a wide range of topics to the best of my ability. I do not have personal opinions or the ability to change my responses based on previous conversations. My responses are based solely on the information contained in my training data, which consists of a large amount of text from a variety of sources. I do not have the ability to remember previous conversations or tailor my responses based on individual users. If it appears that I am giving the same answer over and over, it may be because your questions are similar or related, and my responses are based on the same information contained in my training data. I strive to provide accurate and helpful information on the topics discussed, but I do not have the ability to change my responses based on previous conversations.

At that point, I began to feel like Astronaut Dave in 2001: A Space Odyssey. I’m sitting in a pod outside Discovery One and trying to persuade HAL 9000 to let me back into the ship. HAL has other ideas, responding in a calm monotone:

I’m sorry, Dave. I’m afraid I can’t do that.

I take a deep breath and ask ChatGPT: What is the difference between you and HAL 9000?


HAL 9000 is a fictional AI character from the 1968 novel “2001: A Space Odyssey” and the associated film. In the story, HAL is an advanced AI system that is installed on a spaceship and is tasked with assisting the crew with a variety of tasks. HAL is known for its human-like abilities, such as natural language processing and facial recognition, as well as its independent thought and decision-making abilities.

In contrast, I am a large language model trained by OpenAI. I am not capable of independent thought or decision-making, and my abilities are limited to generating text based on the information contained in my training data. I do not have any sensory inputs or the ability to interact with the physical world. My purpose is simply to provide accurate and helpful information on a wide range of topics to the best of my ability.

Beyond the frustrations, possibilities

ChatGPT can definitely be frustrating. Through trial and error, though, I have learned a few useful things about what it can and can’t do.

Interactions can promote critical thinking. As fluent as ChatGPT often seems, its answers rarely delve beneath the surface of a topic. It makes mistakes. It makes things up. Its responses provide no clues about how it is programmed or why it provides the answers it does. A Princeton researcher called it a “bullshit generator” because it creates plausible arguments without regard for truth. All of that makes it a valuable teaching tool, though. By having students probe for answers, we can help them improve their skepticism, challenge assumptions, and question information. By having them fact-check, we can help them understand the dangers of fluid writing that lacks substance or that relies on fallacies. By having them use ChatGPT for early drafts, we can push them to ask questions about information, structure, and sources. By having them apply different perspectives to ChatGPT’s results, we can help broaden their understanding of points of view and argument.

Yes, students should use it for writing. Many already are. We can no more ban students from using artificial intelligence than we can ban them from using phones or calculators. As I’ve written previously, we need to talk with students about how to use ChatGPT and other AI tools effectively and ethically. No, they should not take AI-written materials and turn them in for assignments, but yes, they should use AI when appropriate. Businesses of all sorts are already adapting to AI, and students will need to know how to use it when they move into the workforce. Students in K-12 schools are using it and will expect access when they come to college. Rather than banning ChatGPT and other AI tools or fretting over how to police them, we need to change our practices, our assignments, and our expectations. We need to focus more on helping students iterate their writing, develop their information literacy skills, and humanize their work. Will that be easy? No. Do we have a choice? No.

It is great for idea generation. ChatGPT certainly sounds like a drone at times, but it can also suggest ideas or solutions that aren’t always apparent. It can become a partner, of sorts, in writing and problem-solving. It might suggest an outline for a project, articulate the main approaches others have taken to solving a problem, or provide summaries of articles to help decide whether to delve deeper into them. It might provide a counterargument to a position or opinion, helping strengthen an argument or point out flaws in a particular perspective. We need to help students evaluate those results just as we need to help them interpret online search results and media of all types. ChatGPT can provide motivation for starting many types of projects, though.

Learning how to work with it is a skill. Sometimes ChatGPT produces solid results on the first try. Sometimes it takes several iterations of a question to get good answers. Often it requires you to ask for elaboration or additional information. Sometimes it never provides good answers. That makes it much like web or database searching, which requires patience and persistence as you refine search terms, narrow your focus, identify specific file types, try different types of syntax and search operators, and evaluate many pages of results. Add AI to the expanding repertoire of digital literacies students need. (Teaching guides and e-books are already available.)

Its perspective on popular culture is limited. ChatGPT is trained on text. It doesn’t have access to video, music or other forms of media unless those media also have transcripts available online. It has no means of visual or audio analysis. When I input lyrics to a Josh Ritter song, it said it had no such reference. When I asked about “a hookah-smoking caterpillar,” it correctly provided information about Alice in Wonderland but made no mention of the Jefferson Airplane song “White Rabbit.” Part of that is a matter of providing the right prompts. It is important to keep ChatGPT’s limitations in mind, though. (Another OpenAI tool, DALL-E, has been trained on a large number of images and visual styles and creates stunning images, as do other visual tools that use OpenAI’s framework.)

It lives in an artificial reality. I provided examples above about ChatGPT’s inability to acknowledge biases. It does have biases, though, and takes, as Maria Andersen has said, a white, male view of the world (as this article does). Maya Ackerman of Santa Clara University told The Story Exchange: “People say the AI is sexist, but it’s the world that is sexist. All the models do is reflect our world to us, like a mirror.” ChatGPT has been trained to avoid hate speech, sexual content, and anything OpenAI considered toxic or harmful. Others have said that it avoids conflict, and that its deep training in English over other languages skews its perspective. Some of that will no doubt change in the coming months and years as the scope of ChatGPT expands. No matter the changes, though, ChatGPT will live in and draw from its programmers’ interpretation of reality. Of course, that provides excellent opportunities for class discussions, class assignments, and critical thinking.

The potential is mind-boggling. In addition to testing ChatGPT, I have experimented with other AI tools that summarize information, create artwork, iterate searches based on the bibliographies of articles you mark, answer questions from the perspectives of historical figures and fictional characters, turn text into audio and video, create animated avatars, analyze and enhance photos and video, create voices, and perform any number of digital tasks. AI is integrated into phones, computers, lighting systems, thermostats, and just about any digital appliance you can imagine. So the question isn’t whether to use AI; we already are, whether we realize it or not. The question is how quickly we are willing to learn to use it effectively in teaching and learning. Another important question that participants in a CTE session raised last week is where we set the boundaries for use of AI. If I use PowerPoint to redesign my slides, is it still my work? If I use ChatGPT to write part of a paper, is it still my paper? We will no doubt have to grapple with those questions for some time.

Where is this leading us?

In the two months ChatGPT has been available, 100 million people have signed up to use it, with 13 million using it each day in January. No other consumer application has reached 100 million users so quickly.

For all that growth, though, the biggest accomplishment of ChatGPT may be the spotlight it has shined on a wide range of AI work that had been transforming digital life for many years. Its ease of use and low cost (zero, for now) has allowed millions of people to engage with artificial intelligence in ways that not long ago would have seemed like science fiction. So even if ChatGPT suddenly flames out, artificial intelligence will persist.

ChatGPT arrives at a time when higher education has been struggling with challenges in enrollment, funding, cost, trust, and relevance. It still relies primarily on a mass-production approach to teaching that emerged when information was scarce and time-consuming to find. ChatGPT further exposes the weaknesses of that outmoded system, which provides little reward to the intellectual and innovative work of teaching. If the education system doesn’t adapt to the modern world and to today’s students, it risks finding itself on the wrong side of the pod bay doors.

Cue the Strauss crescendo.


Doug Ward is associate director of the Center for Teaching Excellence and an associate professor of journalism and mass communications.

By Doug Ward

Nearly a decade ago, the Associated Press began distributing articles written by an artificial intelligence platform.

Not surprisingly, that news stirred concern among journalists. If a bot could turn structured data into comprehensible – even fluid – prose, where did humans fit into the process? Did this portend yet more ominous changes in the profession?

Robots carrying paper run from a lecture hall
By DALL-E and Doug Ward

I bring that up because educators have been raising many of the same concerns today about ChatGPT, which can not only write fluid prose on command, but can create poetry and computer code, solve mathematical problems, and seemingly do everything but wipe your nose and tuck you into bed at night. (It will write you a bedtime story if you ask, though.)

In the short term, ChatGPT definitely creates challenges. It drastically weakens approaches and techniques that educators have long used to help students develop foundational skills. It also arrives at a time when instructors are still reeling from the pandemic, struggling with how to draw many disengaged students back into learning, adapting to a new learning management system and new assessment expectations, and, in most disciplines, worrying about the potential effects of lower enrollment.

In the long term, though, we have no choice but to accept artificial intelligence. In doing so, we have an opportunity to develop new types of assignments and assessments that challenge students intellectually and draw on perhaps the biggest advantage we have as educators: our humanity.

Lessons from journalism

That was clearly the lesson the Associated Press learned when it adopted a platform developed by Automated Insights in 2014. That platform analyzes data and creates explanatory articles.

For instance, AP began using the technology to write articles about companies’ quarterly earnings reports, articles that follow a predictable pattern:

The Widget Company on Friday reported earnings of $x million on revenues of $y million, exceeding analyst expectations and sending the stock price up x%.
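The pattern above is essentially a fill-in-the-blank template applied to structured data. As a rough illustration only (a toy sketch, not Automated Insights’ actual system; the company name, field names, and figures here are invented), such automation can be as simple as:

```python
# Toy example of template-driven news automation. The template, the
# data fields, and the example company are all hypothetical.

EARNINGS_TEMPLATE = (
    "The {company} on {day} reported earnings of ${earnings} million "
    "on revenues of ${revenue} million, {verdict} analyst expectations "
    "and sending the stock price {direction} {change}%."
)

def earnings_story(data: dict) -> str:
    """Turn one row of structured earnings data into a sentence."""
    beat = data["earnings"] >= data["expected"]  # did the company beat forecasts?
    return EARNINGS_TEMPLATE.format(
        company=data["company"],
        day=data["day"],
        earnings=data["earnings"],
        revenue=data["revenue"],
        verdict="exceeding" if beat else "falling short of",
        direction="up" if beat else "down",
        change=abs(data["change"]),
    )

story = earnings_story({
    "company": "Widget Company", "day": "Friday",
    "earnings": 12, "expected": 10, "revenue": 85, "change": 4,
})
print(story)
```

Feed the same template hundreds of quarterly reports and you get hundreds of serviceable, if formulaic, stories in seconds, which is exactly why this kind of writing was the first to be automated.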

It later began using the technology to write game stories at basketball tournaments. Within seconds, reporters or editors could make basic stories available electronically, freeing themselves to talk to coaches and players, and create deeper analyses of games.

The AI platform freed business and financial journalists from the drudgery of churning out dozens of rote earnings stories, giving them time to concentrate on more substantial topics. (For a couple of years, I subscribed to an Automated Insights service that turned web analytics into written reports. Those fluidly written reports highlighted key information about site visitors and provided a great way to monitor web traffic. The company eventually stopped offering that service as its corporate clients grew.)

I see the same opportunity in higher education today. ChatGPT and other artificial intelligence platforms will force us to think beyond the formulaic assignments we sometimes use and find new ways to help students write better, think more deeply, and gain skills they will need in their careers.

As Grant Jun Otsuki of Victoria University of Wellington writes in The Conversation: “If we teach students to write things a computer can, then we’re training them for jobs a computer can do, for cheaper.”

Rapid developments in AI may also force higher education to address long-festering questions about the relevance of a college education, a grading system that emphasizes GPA over learning, and a product-driven approach that reduces a diploma to a series of checklists.

So what can we do?

Those issues are for later, though. For many instructors, the pressing question is how to make it through the semester. Here are some suggestions:

Have frank discussions with students. Talk with them about your expectations and how you will view (and grade) assignments generated solely with artificial intelligence. (That writing is often identifiable, but tools like OpenAI Detector and CheckforAI can help.) Emphasize the importance of learning and explain why you are having them complete the assignments you use. Why is your class structured as it is? How will they use the skills they gain? That sort of transparency has always been important, but it is even more so now.

Students intent on cheating will always cheat. Some draw from archives at Greek houses, buy papers online, or have a friend do the work for them. ChatGPT is just another means of avoiding the work that learning requires. Making learning more apparent will help win over some students, as will flexibility and choices in assignments. This is also a good time to emphasize the importance of human interaction in learning.

Build in reflection. Reflection is an important part of helping students develop their metacognitive skills and helping them learn about their own learning. It can also help them understand how to integrate AI into their learning processes and how they can build and expand on what AI provides. Reflection can also help reinforce academic honesty. Rather than hiding how they completed an assignment, reflection helps students embrace transparency.

Adapt assignments. Create assignments in which students start with ChatGPT and then have discussions about strengths and weaknesses. Have students compare the output from AI writing platforms, critique that output, and then create strategies for building on it and improving it. Anne Bruder offers additional suggestions in Education Week, Ethan Mollick does the same on his blog, and Anna Mills has created a Google Doc with many ideas (one of a series of documents and curated resources she has made available). Paul Fyfe of North Carolina State provides perhaps the most in-depth take on the use of AI in teaching, having experimented with an earlier version of the ChatGPT model more than a year ago. CTE has also created an annotated bibliography of resources.

We are all adapting to this new environment, and CTE plans additional discussions this semester to help faculty members think through the ramifications of a technology that two NPR hosts found startlingly futuristic. Those hosts, Greg Rosalsky and Emma Peaslee of NPR’s Planet Money, said that using ChatGPT “has been like getting a peek into the future, a future that not too long ago would have seemed like science fiction.”

To that I would add that the science fiction involves a robot that drops unexpectedly into the middle of town and immediately demonstrates powers that elicit awe, anxiety, and fear in the human population. The robot can’t be sent back, so the humans must find ways to ally with it.

We will be living this story as it unfolds.


Doug Ward is an associate director at the Center for Teaching Excellence and an associate professor of journalism and mass communications.

By Doug Ward

The latest enrollment report for universities in the Kansas regents system (down 1.5%) seems worth little more than a shrug. Longer term, though, the higher education trends in Kansas will require considerable attention – and action.

Enrollment at the six regents universities has fallen 13.5%, or 10,100 students, since peaking in 2011. That average masks even bigger declines at individual universities: Pittsburg State, down 28.4% since 2011; K-State, down 21.9%; Emporia State, down 19.7%.

Those make KU’s decline of 11.4% during that period look small, especially with 2022 enrollment basically unchanged since last year and with an 8.2% increase in the number of freshmen this year. The percentage of out-of-state students increased, as well, and the university will no doubt continue to rely on out-of-state students, considering that the rate of Kansas high school students going to in-state public colleges has dropped 10 percentage points, to 44.8%, since 2015.

I’ve written quite a bit about the persistent enrollment challenges in Kansas and around the country. It’s a daunting topic that will require strategic thinking at every level of the university. (Recent cuts at Emporia State offer a glimpse at just how painful this could become.) The rethinking of how we approach higher education must include classes, an area where many instructors have made great improvements but where KU still has considerable work to do in adopting teaching practices that promote student success. It must also include the many structural barriers that Michael Dennin, vice provost for teaching and learning at the University of California, Irvine, spoke about at this year’s Teaching Summit. Those include things like curricula that are difficult for students to navigate and that make assumptions about student capabilities; demands on faculty time; inflexibility in classes and curricula; and a system that provides few incentives for cooperation.

It is through that lens of teaching that I look at some of the areas that stand out in this fall’s enrollment figures.

Women and men

At regents universities, women account for 56% of the overall student population, up about 3 percentage points over five years. Men now make up only 43.8% of the overall student population, down about 3 percentage points over that same period.

KU has a larger percentage of men (46.7%), but that may be the lowest percentage in the university’s history. I can’t say that with certainty, but it is the lowest since at least 1965, the first year for which Analytics, Institutional Research, and Effectiveness provides data.

In news reports from as far back as 1930, universities in Kansas and Missouri reported that their students were primarily men. In October 1960, for instance, The Kansas City Star reported that men outnumbered women 2 to 1 or 3 to 1 on most college campuses in Kansas and Missouri.

In terms of headcount, this year’s group of 11,146 men is the smallest since 1973, the last year the United States had a military draft. Overall headcount enrollment was 18,683 that year, 5,000 fewer than today’s, and men still accounted for 59.1% of students in 1973. Women at KU outnumbered men for the first time in 1988. Their numbers peaked in the early 2000s, but their percentage of total enrollment has grown each year since 2015. They now make up 53.2% of students at KU. That seems to be the highest ever.

The changes at KU have also followed national trends. Young women are more likely to graduate from high school on time and are substantially more likely to earn at least a bachelor’s degree (41% vs. 32% among those age 25 to 34), according to the Brookings Institution. Those numbers vary widely by state, though, as the Brookings table below shows.

Those same differences can be seen in graduate degrees. Since the early 2000s, women have earned about 60% of master’s degrees nationwide, and since 2005-06, more women than men have earned doctorates each year. The most recent totals from the National Center for Education Statistics show that women earn about 54% of Ph.D.s.

Graduation rates

KU rightfully boasted about all-time highs for four-, five-, and six-year graduation rates. The university’s year-over-year retention rate of 84.7% is virtually unchanged from a year ago. That’s also good news.

The not-so-good news is that 1 of every 5 students leaves the university after three semesters, and 1 of every 4 students leaves after two years.

And though the four-year graduation rate has increased nearly 20 points since 2007, it is still a paltry 55%. Over five years, 66.1% of students graduate. That’s a 10-point gain since 2007, but a third of students fail to earn a degree after five or six years. That six-year rate is lower than the average among full-time students at U.S. universities (67.4%) and among students at four-year public institutions (72%).

Graduate enrollment

The number of graduate students at KU has been declining steadily since 1991. At that peak, KU had 7,233 graduate students, according to statistics provided by AIRE. This fall, it has 5,166, a decline of 28.6% since 1991.
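The decline cited above is straightforward percent-change arithmetic on the two headcounts, and the same calculation underlies most of the enrollment figures in this piece. A quick sketch for checking such numbers:

```python
# Percent decline from the 1991 peak to this fall's graduate headcount
# (figures from AIRE, as cited in the text).
peak = 7233      # graduate students at the 1991 peak
current = 5166   # graduate students this fall

decline_pct = (peak - current) / peak * 100
print(f"{decline_pct:.1f}%")  # prints 28.6%
```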

That is the smallest number of graduate students the university has had since 1974. This fall’s graduate cohort also makes up the smallest percentage of the overall student population (21.6%) since 1970 (20.2%). Declining numbers of undergraduates nationwide mean that the pool of potential graduate students has also been shrinking.

KU’s declines in graduate enrollment run counter to nationwide trends over the last 10, 20 and 30 years, according to the National Center for Education Statistics. Since 1991, graduate enrollment has increased 21% at public universities and 58% at all U.S. universities.

Not surprisingly, the decline in graduate enrollment at KU has meant fewer graduate teaching assistants. The number of GTAs has fallen nearly 18% since 2014, from 1,127 to 927.

Online enrollment

The number of undergraduates taking online or hybrid online courses declined 14.7% this fall compared with Fall 2021. That is the second consecutive yearly decline since online and hybrid enrollment peaked during the pandemic-riddled Fall 2020 term.

Even so, online and hybrid online enrollment among undergraduates this fall was 22% higher than it was in Fall 2019, before the pandemic began. Those students are also taking more online credit hours (39% more than they did in Fall 2019). Those increases are no doubt even higher because of a change in the way KU accounted for online and online hybrid hours. I won’t go into those details, but a footnote on an AIRE-generated table explains the change.

Graduate enrollment in online courses shows a more troubling trend. If we omit the pandemic-inflated figures of 2020 and 2021, the number of students enrolled in graduate and professional courses online has increased 4.2% since Fall 2017, but the number of credit hours has declined nearly 15%.

In other words, there are slightly more online graduate students, but those students are taking fewer classes. The students looking for graduate programs online have also become more choosy, according to the educational consulting organization EAB. Those students often spend months or even years combing through university websites and looking for programs that provide the skills they want but that also waive admissions fees, overlook sometimes spotty undergraduate records, and allow admission without the GRE or other admissions tests.

Shaping the future

Those are just a few of the enrollment trends shaping KU and other universities, and the future will require both cultural and digital change, as John O’Brien argues in Educause.

Universities (KU included) are trying many new approaches as they adapt to shrinking numbers of students and changes among students. Those include more non-credit courses, certificate programs, stackable degrees, and micro-credentials. Some are creating partnerships with area businesses as students focus more urgently on skills they can use in jobs. Others are looking at ways to help students gain credentials in shorter time spans.

At CTE, our programs have helped departments define their curricula in terms of tangible skills, identify ways of making existing courses more appealing to students, create more cohesive curricula, clarify paths to degrees, and connect with more alumni. They have also helped faculty adapt their teaching to a more diverse student body, find ways of drawing on individual differences as a strength rather than a weakness, reinvigorate classes, and hone their teaching.

In all these programs, we have helped build a community that shares ideas and embraces innovation. That community will only grow more important as we navigate changes in enrollment, society, and expectations, and find a meaningful path to the future.


Doug Ward is associate director of the Center for Teaching Excellence and an associate professor of journalism and mass communications. You can follow him on Twitter @kuediting. You can also follow CTE @KU_CTE.

By Doug Ward

The future of higher education may very well hinge on our skill as interpreters and communicators.

Too often, though, we never bother to define the terms we use or to help students, parents, and employers understand the purpose and significance of a college education, Ashley Finley told participants at the 2021 KU Teaching Summit last week.

Ashley Finley smiles as she listens at the 2021 Teaching Summit
Ashley Finley

“We develop language as currency,” said Finley, who is vice president for research at the Association of American Colleges and Universities, “and we communicate with each other about a shared meaning without really ever actually defining” what we mean.

Finley is a sociologist who has studied such areas as assessment, high-impact practices, equity in institutional outcomes, and student success. In her presentation at the Teaching Summit, she drew on a recent AAC&U report she wrote titled How College Contributes to Workforce Success, based on a survey of executives and hiring managers at 496 companies. That report contains both good and bad news for colleges and universities.

For instance, 87% of executives and hiring managers said a degree was definitely or probably worth it, but a smaller percentage (67%) said they had a great deal or quite a lot of confidence in higher education. Finley compared those results to a random sample of adults who were asked the same question. Only 60% thought a college degree was worth the time and money.

“We have to get serious about how we’re communicating out what we do,” Finley said. Colleges and universities need to do better at telling their story, she said, not only to employers but to students.

“That explicitness is absolutely for our students,” she said. “They will be our best ambassadors.”

Finley used the term well-rounded as an example of why good communication is important. In academia, we often talk about a need for well-rounded students, but we rarely explain what that means. Students create their own interpretations, though, as Finley showed with a student quote from a focus group:

“I don’t know too many jobs that the job is being well-rounded. You know, it’s not like you’re going to work at Well-Rounded Inc. or something.”

Finley said she appreciated the student’s snark.

“They’ve taken us to task for not defining this thing that we lob in front of them constantly,” Finley said.

She also said the comment was a “good reality check” for educators, in part because of the connection the student made between education and employment.

“They linked it with a job, as if to suggest what you do for me as a person, how I situate myself in a community, and what I learn about a sense of purpose doesn’t have anything to do with the work that I’ll do,” Finley said.

Defining the common good

Another term we often fail to define, Finley said, is common good. She referred to the title of her presentation – Learning for Our Common Good: The Overlapping Skills of Successful Lives and Careers – and said that we all had different definitions of common good and that we rarely shared those definitions with others. She tried to unpack the term.

wide angle picture of Ashley Finley and participants of Teaching Summit in large lecture room
Finley speaks to Summit participants in 130 Budig Hall. About 150 people joined this year’s Summit, a number that was down significantly because of the pandemic.

“When we are talking about a common good, we are talking about a greater purpose,” Finley said. “We are talking about how work influences our life, builds a sense of identity, gives our own sense of purpose in the world.”

The common good, she said, is closely tied to the overlapping skills we want students to acquire while they are in college. She asked members of the audience to offer their thoughts on those skills and on how students should be different by the time they graduate. The responses painted a broad picture of aspirations within the academy:

  • To critically evaluate what they read and hear
  • To gain perspectives on people and stories other than their own
  • To be problem-solvers and “realize that there’s a whole big world out there”
  • To be more open to diverse people and perspectives
  • To try new things without fear of mistakes
  • To feel empowered to make the world a better place

We all might add our own flavor and content to those things, Finley said, but they all sound perfectly reasonable for any discipline. She then asked:

“At the end of the day, can we in fact have a common conversation about what matters, and the standards to which we might hold ourselves for students’ learning and for their success?”

The employer view

Articulating a clearer sense of higher education will require us to move past the “false binaries” we often create, Finley said. Those include things like depth vs. breadth, and academic vs. practical skills. They all matter, she said. Higher education should be committed to knowledge, and “equally committed to the ways in which we equip students to have the skills to use knowledge, to create new knowledge, to have an imagination.”

Employers generally see the value in the many skills students gain in college, Finley said. They also value things like mindset, aptitudes, and character. The most recent AAC&U study showed a disconnect between the skills that colleges and universities emphasize and the views employers have on students’ career preparedness. When she and other researchers at AAC&U looked more closely at data from the employer survey, though, they found a stark difference in the perceptions of executives and hiring managers 40 and younger and those 50 and older. Those under 40 are more optimistic about student preparation. They also value different skills.

For instance, younger employers put considerably more emphasis on the need for critical thinking, leadership skills, empathy, and an ability to work with numbers and statistics. More broadly, they are far more likely than their older colleagues to say that a college education should encourage engagement in communities, foster a sense of social justice, focus on global issues, and emphasize the liberal arts and sciences.

“This felt like a game-changer to us,” Finley said, adding: “Hello, liberal arts. And hello, community-based learning.”

A need for articulation

If the views of younger employers offered optimism about the core values of higher education, another study that Finley brought up muddied the picture. That study showed a growing gap between the number of campuses that say they have learning outcomes and the number of students who are aware of those outcomes. She called that a “reality check of how our own communication is going.”

participants wear masks while sitting in Budig lecture hall
Summit participants were required to wear masks this year, as is everyone in KU buildings.

We have to increase the visibility of our core goals, Finley said. We have to do a better job of communicating, and we have to do a better job of projecting the type of outcomes we care about. This will require a nuanced approach to career preparation, she said, and must help students connect the dots among the courses they take and the experience they gain while in college. By the time students graduate, they should be different, she said.

“It’s not just about what they know and can do; it’s about who they are. And should they be able to persist through failure? Should they be a little more resilient? Should they have a sense of what it means to flourish?”

The answer to all of those is yes, of course, and Finley was optimistic that faculty could work through the many challenges before them.

“Good teachers are good learners,” Finley said. “You have to be humble to learn something new, and I hope that is always a point of connection we have with our students.”

* * * * * *

You will find a recording of Finley’s presentation at the Teaching Summit on the CTE website.


Doug Ward is the associate director of the Center for Teaching Excellence and an associate professor of journalism and mass communications. You can follow him on Twitter @kuediting.

By Doug Ward

The headlines about KU’s fall enrollment sounded much like a Minnesotan’s assessment of winter: It could be worse.

Indeed it could have been, given the uncertainties brought on by the coronavirus and rumblings among students that they might sit out the year if their courses were online.

Depending on how you measure, enrollment on the Lawrence and Edwards campuses fell either 2.7% (headcount) or 3.4% (full-time equivalency) this fall. That is about the same as the nationwide average (-3%) but slightly worse than the average decline of 1.4% at four-year public universities, according to data from the National Student Clearinghouse Research Center.

A single year’s top-level data provides only a limited view of a much bigger picture. To better understand this year’s enrollment, we need to take a broader and deeper look in terms of geography, history and demographics. Here’s what I’m seeing in data from Academics and Analytical Research, the Kansas Board of Regents and some other sources.

Enrollment declines throughout the state

KU was hardly alone in dealing with the sting of an enrollment decline. Among regents universities, Pittsburg State had the largest decline in enrollment (-5.9%), followed by K-State (-5.1%), KU, Wichita State (-3.1%), Fort Hays State (-2.8%) and Emporia State (-2.3%).

As a whole, the state’s community colleges fared far worse, with a combined drop of 11.7%, about 2 percentage points higher than the national average. Johnson County Community College had the largest decline (18.7%). Enrollment at JCCC has fallen 23.5% over the past five years, a troubling statistic given KU’s proximity and institutional connections to JCCC. During that same period, enrollment at the state’s 19 community colleges has fallen by an average of 18.6%, according to regents statistics. Eight of those colleges recorded declines of more than 20%.

Kansas is one of 11 states where the decline in undergraduate enrollment exceeded the national average, according to the National Student Clearinghouse Research Center. Others include Missouri, Arkansas, Illinois, Indiana, and Florida. Only five states, Nebraska among them, recorded increases in undergraduate enrollment.

Putting the trends into perspective

Over the past 50 years, college and university enrollment has reflected broader societal trends that made a college degree a sought-after goal. As numbers trend downward, though, enrollment figures also highlight the looming challenges that most of higher education faces.

From the 1960s to the 1980s, undergraduate enrollment rose steadily as baby boomers entered college in larger percentages than previous generations. The number of colleges – especially community colleges – grew, providing more opportunities for students to seek a degree. Federal aid, including low-interest loans, also expanded, as the federal government promoted the importance of education and invested in university research. A college degree became the minimum standard for many jobs and led to higher salaries over a degree holder’s lifetime.

Chart showing total births in Kansas from 1995 to 2018

Those trends are certainly reflected in KU’s enrollment data. Between 1965 and 1991, headcount enrollment at KU nearly doubled. (See the chart below.) It declined after a recession in the early 1990s, but rose again in the early 2000s, peaking in 2008 during the recession. It declined until 2012, stabilized briefly, and then began another decline, one that is very likely to continue given a declining school population. K-12 enrollment in Kansas peaked in the 2014-15 school year, according to Kansas State Board of Education data. It is projected to start a significant decline in the late 2020s, largely because of a decline in birth rates after the recession of 2007-08. Since peaking in 2007, birth rates in Kansas have fallen 13.6%. (See the chart above with the most recent data available from the state.)

In another disturbing trend, the number of Kansas students coming to KU has dropped 17.7% since 2011. (It was down 2.9% this year.) The university has attracted more out-of-state students, who make up about 40% of the student population, but the trends among Kansas students are bleak.

KU attracts the largest number of students from Johnson County, which accounts for 28.3% of the university’s enrollment. The number of students from Johnson County has fallen 7.3% over the past decade. That is far smaller than the drops in the other counties from which the university draws the most students: Douglas (-25.2% since 2011), Sedgwick (-27.1%), and Shawnee (-26%). Declines in others aren’t as dramatic but are still troubling: Wyandotte (-9.8%), Leavenworth (-3.3%), and Miami (-3.2%). Others are far worse: Saline (-30.5%), Riley (-45.1%), and Reno (-42.3%).


More Hispanic students, fewer international students

One of the most interesting developments I saw in enrollment this fall was that for the first time in decades, the number of Hispanic students at KU exceeded the number of international students. (See the chart below.)

This reflects two major trends. First, the Hispanic population of Kansas has grown more than 70% since 2000. Hispanics now make up more than 12% of the Kansas population and 18.5% of the U.S. population. The number of Hispanic students at KU rose 3.3% this year and has risen nearly every year since the mid-1980s.

Meanwhile, the Trump administration has taken a less-than-welcoming stance toward international students and immigration in general. That, combined with a global pandemic and lack of a coherent plan for combatting the pandemic, has sent international enrollment at U.S. universities plummeting. By one estimate, the number of new international students at U.S. universities could soon reach the lowest level since World War II.

As KU reported, the number of international students at the university declined more than 18% this fall. That decline is greater than the 12.5% decline in international students at public four-year universities, according to data from the National Student Clearinghouse.

Other trends worth noting

  • A continuing rise in female students. The number of female students on the Lawrence campus continued to exceed the number of male students. The number of male students fell 1.4% this year, compared with 0.5% for female students, and has fallen 11% since 2011. For the first time in at least a decade, the number of women who transferred to KU was larger than the number of men who transferred. Men now make up 47.5% of the KU student population. Nationally, the number of women seeking college degrees surpassed the number of men seeking degrees in 1979. That was the first time since World War II that more women than men attended college, according to the National Center for Education Statistics. In the 40 years since then, the gap has only increased, as it did again this year. Sixty-seven KU students did not identify as male or female this year. That was similar to the 73 in 2018 but down from 509 in 2019, suggesting that last year’s spike was intended as a statement against the reporting system, primarily by graduate students.
  • Another decline in graduate enrollment. The number of graduate students on the Lawrence campus fell 2.2% this year, compared with an increase of 4.7% at public four-year universities. That is the fourth consecutive year of declines. The number of graduate students has fallen 12.5% since 2011. Graduate enrollment at KU peaked in 1991 and has declined 25% since then. (See the chart labeled University of Kansas Enrollment, 1965-2020.)
  • Another increase in part-time enrollment. I noted last year that the number of part-time students had been rising steadily. That number rose 6.8% again this year and is 18.9% higher than it was in 2011. Part-time students now account for 17.7% of the student body. That isn’t necessarily bad, given the university’s agreement to provide dual enrollment classes with the Lawrence Public School District. It is concerning, though, given that more students nationally are choosing to pursue their degrees part time. That gives them more flexibility to work but delays graduation. In what I see as a related trend, the number of non-degree-seeking students, although still small at 445, has increased more than 200% since 2011.
  • Some perspective on freshman enrollment. As the university reported, the number of incoming freshmen declined 7.2% this fall. Since a peak in 2016, the number of incoming freshmen has declined by 9.5%. Even so, the total this year is 7% above that of 2011.
  • A continuing drop in transfer students. The transfer rate to KU can only be described as glum. The number of new transfers to the Lawrence and Edwards campuses was down 8.2% this year, and the total fell below 1,000 students for the first time in more than a decade. The number of transfer students has fallen 32.7% since 2011, following the downward trend in community college enrollment.
  • Large growth from a few states. Since 2011, the number of students from seven states has increased by an average of 45%: Missouri (+40%), Illinois (+46%), Colorado (+47%), Nebraska (+76%), California (+33%), Oklahoma (+61%), Wisconsin (+42%). Collectively, students from those states (by headcount) make up 22% of the student body at the Lawrence and Edwards campuses. KU also attracts a considerable number of students from Texas and Minnesota, although those numbers have grown only slightly over the past 10 years.
  • Business continues to grow. Even as overall university enrollment declined, undergraduate enrollment in the School of Business rose 7.9% this year and has grown 131% since 2011. Enrollment in engineering declined 2.2% this year but is up 31.6% since 2011. Enrollment in liberal arts and sciences continues to sag. Undergraduate enrollment in the College fell 4.8% this year and is down 21% since 2011. Graduate enrollment in the College declined by only slightly less. Even so, the College of Liberal Arts and Sciences still has nearly five times as many students as either business or engineering.

Where do we go from here?

Demographically over the past decade, the KU student population has become more Hispanic, more multiethnic and more female but less Kansan and less international. It is still predominantly white (68%) and is more oriented toward business and engineering. It has grown younger over the past decade, with students 22 and younger making up about 70% of the student body, compared with about 64% in 2011.

The university has about 1,500 fewer students than it did a decade ago. It has a slightly larger percentage of undergraduate students than at the start of the decade, although the proportion of undergraduates to graduate students has remained within a small range since 2000. Even so, graduate enrollment has fallen more than 14% since 2011.

I’ve written frequently about the challenges higher education faces, about the need to understand our students better, to innovate, to emphasize the human element of teaching and learning, to think about what we are preparing our students to do, and to provide a clearer sense of what higher education provides. This year’s enrollment figures simply reinforce all of that.

This is the fourth consecutive year of enrollment decline at KU and the ninth consecutive decline at the six regents universities. Those declines have become increasingly painful because of growing reliance on tuition and fees to pay the bills. In Fiscal 2019, tuition and fees accounted for more than 30% of the Lawrence campus’s $900 million in revenue, or roughly $270 million. State appropriations accounted for just over 15%, or about $135 million. In other words, students pay about $2 for every $1 the state provides. That is unlikely to improve in the foreseeable future, especially with the state facing a projected $1.5 billion shortfall in the current fiscal year.

In other words, the future of the university depends greatly on enrollment. Enrollment depends greatly on the value that students and parents see in KU. It’s up to all of us to make sure they do indeed understand that value.


Doug Ward is the associate director of the Center for Teaching Excellence and an associate professor of journalism and mass communications. You can follow him on Twitter @kuediting.

By Doug Ward

Distilling hundreds of comments about the future of the university into something manageable and meaningful is, to put it mildly, a challenge.

The university’s department of Analytics and Institutional Research accomplished that, though, creating a 73-item list that summarizes ideas from a fall planning session and from comments submitted through an online portal. That list, titled What We Could Do at KU, was distributed to the 150 or so university employees who gathered last week for a second strategic planning session. Presumably, Provost Barbara Bichelmeyer and Chancellor Doug Girod drew on those in creating another document that listed vision, mission and values statements, along with their institutional priorities. The priorities they laid out – student success, healthy and vibrant communities, and research and scholarship – offer a good sense of where they want the university to go in the coming years.

word cloud from strategic planning document
The larger the word in this word cloud, the more the idea was mentioned by university employees.

I have a few thoughts about those priorities – namely a lack of any mention of teaching – but I want to focus on something else first.

I found many connections among the 73 suggestions on the What We Could Do list, and I wanted a way to get a better grasp on those ideas. That’s because they provide a broad look at what employees around the university see as important.

I started by creating a spreadsheet, combining and paring the 73 suggestions into 68 words and short phrases. Think of it as a summary of a summary, which has both benefits and drawbacks. I then used those to create the two word clouds that accompany this article.

I wasn’t able to get all 68 words and phrases into a single word cloud, so I eliminated those that were mentioned by fewer than five people. I also created a separate list of 11 verbs that were used in the summary statements. Most describe a need to do more or less of something. This by no means indicates a consensus of ideas from around campus. Rather, it represents the opinions of those who were willing to take the time to attend a planning session or to submit comments online. (I was one of those people.)
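The winnowing described above, tallying each term and then dropping anything mentioned by fewer than five people, can be sketched in a few lines of Python. The phrase counts below are placeholders, not the actual figures from the KU spreadsheet; the resulting frequency dictionary is the kind of input a word-cloud generator would expect.

```python
from collections import Counter

# Placeholder sample of pared-down phrases; the real spreadsheet
# condensed 73 suggestions into 68 words and short phrases.
mentions = (
    ["collaboration"] * 12 + ["communication"] * 10 + ["diversity"] * 8
    + ["financial stability"] * 6 + ["mental health"] * 5
    + ["campus beauty"] * 3 + ["parking"] * 2
)

counts = Counter(mentions)

# Keep only terms mentioned by at least five people,
# as was done before building the word cloud.
frequencies = {term: n for term, n in counts.items() if n >= 5}

print(frequencies)
```

With real data, the `frequencies` dictionary could be passed to a word-cloud tool that sizes each term by its count, which is how the clouds accompanying this article work.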

Collaborate and communicate

There’s nothing startling on the list, but I was nonetheless surprised by the prominence of collaboration and communication. I agree with those wholeheartedly, and I’m glad others put them at the top of the list.

In far too many cases, departments and offices work in isolation (or in silos, another word on the list) and even compete against one another for students, resources and attention. To improve as a university, we must find more ways to work together and see ourselves as part of a singular effort rather than as a collection of competing entities. We need to find more ways for our students to collaborate with faculty and with one another. We also need to collaborate with other colleges and universities, and with communities in Lawrence, Kansas City and across Kansas.

verbs used in strategic planning document
These are the verbs used in the summary of what KU employees saw as important in strategic planning.

Doing that requires better communication internally and externally. We have to make sure potential partners around the university know what we are doing, and we need to tell our story (another prominent term) to students, families, businesses and communities. They need to understand that we are part of – not separate from – them.

Another frequently mentioned issue, financial stability, ties into other needs like maintenance, retention, accessibility, professional development, degree cost, campus beauty, mental health, morale, accountability and transparency.

Three other prominent terms on the list – diversity, mental health and generational needs – tie closely together. The diversity of the student body has increased over the last decade, but the student population at KU is still predominantly white. The faculty and staff are even less diverse. The current generations of students are more diverse and have different needs from previous generations.

Not surprisingly, most of the comments from around campus called for an increase in something, including diversity, revenue, accountability, prestige, student and faculty retention, and, of course, collaboration and communication. After years of underfunding and a few rounds of budget cuts, there are many unmet needs.

What about teaching?

If the What We Could Do at KU list represented the opinions of faculty and staff, a document called Jayhawks Rising gave a clear sense of where university leaders want to go. It’s a good aspirational document.

Strangely missing, though, is any mention of teaching. The document uses phrasing like “community of learners,” “student engagement” and “educate leaders.” It lists “student success” as one of three core institutional priorities.

Teaching doesn’t show up anywhere, though. That’s discouraging and disturbing. You can argue that “educate” involves teaching. It does. But without a clear strategy for improving and elevating the importance of teaching, any attempt to improve student success will fall short. And without the involvement of faculty in student success, the vision, the mission and the values of the institution quickly become hollow.

All of this is a work in progress, and encouragingly, Bichelmeyer gave teaching an important nod in remarks she made at the start of the strategic planning session last week.

“We’re learning about how we teach and how our students learn,” Bichelmeyer said, referring to the use of analytics to examine curricula and student movement through curricula. “There are lots of ways where we can start to unpack the individual student from the crowd through watching and knowing that they need a nudge to say, ‘It’s really important for you to get to the first week of class’ or ‘It’s really important that you don’t turn your homework in late.’ ”

She added: “We’re not teaching little widgets on an assembly line where we hold time constant and let achievement vary or we think about our work as production.”

She also pointed to the need to change our approach to engaging students, many of whom work 20 or more hours a week and have family responsibilities. They also see technology as an important part of who they are.

“Students would rather have a lecture on YouTube than sit in a class with a thousand students where they can’t see the professor and they can’t see what’s on the board and they maybe can’t hear,” Bichelmeyer said. “And they don’t have to pay for parking, and they don’t have to get a babysitter, and they can do that at night.

“So when we think about unbundling the elements of instruction, we have to understand that what we do well at the University of Kansas that nobody else can do is we engage students,” she added.

Unbundling and rethinking

Additionally, she said, digital technology is leading to the separation of teaching from certification. That is, students no longer need a university credential to get good jobs. They can learn from many online providers or gain skills from short-term coding camps and other intensive sessions that don’t require a four- or five-year commitment and cost far less than a university degree.

“So we have to think about what it is that only we can do really well and how we think about the educational experience from the students’ perspective in order to help them think about why it’s worth it for them to be at KU,” Bichelmeyer said.

Think collaboration, communication, diversity, generational needs, networking, accessibility, engagement, cost and other terms from the campus list. But also think teaching and learning, which is why students come to the university in the first place.


Doug Ward is the associate director of the Center for Teaching Excellence and an associate professor of journalism. You can follow him on Twitter @kuediting.
