Series of three photos. A lone tree across water at sunset; a home atop a mountaintop; a rainbow behind a rainbow-colored ice cream cone.

By Doug Ward

The pandemic has taken a heavy mental and emotional toll on faculty members and graduate teaching assistants.

That has been clear in three lunch sessions at CTE over the past few weeks. We called the sessions non-workshops because the only agenda was to share, listen, and offer support. I offered some takeaways from the first session in March. In the most recent sessions, we heard many similar stories:

  • Teaching has grown more complicated, the size of our classes has grown, and our workloads and job pressures have increased. Stress is constant. We are exhausted and burned out. “I feel hollow,” one participant said.
  • Students need more than we can give. Many students are overwhelmed and not coming to class; they can’t even keep track of when work is due. They also aren’t willing to complete readings or put in minimal effort to succeed. All of that has drained the joy from teaching. “I have to psych myself up just to go to class some days,” one instructor said.
  • We don’t feel respected, and we have never been rewarded for the vast amounts of intellectual and emotional work we have put in during the pandemic. Instead, the workload keeps increasing. “It feels like the message is: ‘We hear you. Now shut up,’” one participant said.
  • We need time to heal but feel unable to ease up. Nearly all of those who attended the non-workshops were women, who often have additional pressures at home and feel that they will be judged harshly on campus if they try to scale back. “Society expects us to bounce right back, and we can’t,” one participant said.

Much has been written about the strain of the pandemic and its effect on faculty members and students. We can’t offer grand solutions to such a complex problem, which has systemic, cultural, psychological, and individual elements. We can offer support in small ways, though. So here is a motley collection of material intended to provide a modicum of inner healing. Some of these will require just a few minutes. Others will require a few hours. If none of them speak to you, that’s OK. Make sure to seek out the things that do brighten your soul, though.

An image (times 6)

I asked an artificial intelligence image generator called Catbird to create representations of serenity in everyday life. You will find three of those at the top of the page and three at the bottom. They won’t solve problems, but they do provide a momentary escape.

A song

“What’s Up,” by 4 Non Blondes. I recently rediscovered this early ’90s song, and its message seems more relevant than ever. It addresses the challenges of everyday life even as it provides a boost of inspiration. Even if you aren’t a fan of alt-rock, it’s worth a listen just to hear Linda Perry’s amazing voice.

A resource for KU employees

GuidanceResources. Jeff Stolz, director of employee mental health and well-being, passed along a free resource for KU employees. It is provided by the state Employee Assistance Program and can be accessed through the GuidanceResources site or mobile app. Employees can sign up for personal or family counseling, legal support, financial guidance, and work-life resources. The first time you log in, you will need to create an account and use the code SOKEAP.

A TED Talk

Compassion Fatigue: What is it and do you have it?, by Juliette Watt. Compassion fatigue, Watt says, is “the cost of caring for others, the cost of losing yourself in who you’re being for everyone else.”

A recent article

My Unexpected Cure for Burnout, by Catherine M. Roach. Chronicle of Higher Education (20 April 2023). Try giving away books and asking students to write notes in return.

A book

Unraveling Faculty Burnout: Pathways to Reckoning and Renewal, by Rebecca Pope-Ruark (Johns Hopkins University Press, 2022). Pope-Ruark’s book focuses on women in academia and draws on many interviews to provide insights into burnout. The electronic version is available through KU Libraries.

A quote

From Pope-Ruark’s book:

I learned to offer myself grace and self-compassion, but it took a while, just as it had taken a while for my burnout to reach the level of breakdown. Once I was able to shift my mind-set away from needing external validation to understanding myself and my authentic needs, I was able to understand Katie Linder when she said, “It’s important for me to have that openness to growth, asking, ‘What am I supposed to be learning through the situation?’ even if it’s really hard or it’s not ideal or even great.”

If you don’t feel like devoting time to a book right now, consider Pope-Ruark’s article Beating Pandemic Burnout in Inside Higher Ed (27 April 2020).

A final thought

Take care of yourself. And find serenity wherever you can.

Three square pictures. A sunrise over mounds of foliage; a large cat and a small cat sitting on a rock; an abstract sunset.


Doug Ward is associate director of the Center for Teaching Excellence and an associate professor of journalism and mass communications.

By Doug Ward

When Turnitin activated its artificial intelligence detector this month, it provided a substantial amount of nuanced guidance.

Montage of gophers and men trying to hit moles that pop up from the ground at a university quad
Trying to keep ahead of artificial intelligence is like playing a bizarre game of whack-a-mole.

The company did a laudable job of explaining the strengths and the weaknesses of its new tool, saying that it would rather be cautious and have its tool miss some questionable material than falsely accuse someone of unethical behavior. It will make mistakes, though, and “that means you’ll have to take our predictions, as you should with the output of any AI-powered feature from any company, with a big grain of salt,” David Adamson, an AI scientist at Turnitin, said in a video. “You, the instructor, have to make the final interpretation.”

Turnitin walks a fine line between reliability and reality. On the one hand, it says its tool was “verified in a controlled lab environment” and renders scores with 98% confidence. On the other hand, it appears to have a margin of error of plus or minus 15 percentage points. So a score of 50 could actually be anywhere from 35 to 65.
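That arithmetic can be sketched in a few lines. This is only an illustration of how a ±15-point margin of error widens a single reported score into a range; the clamping of scores to the 0–100 bounds is my assumption, not something Turnitin documents.

```python
def plausible_range(score: int, margin: int = 15) -> tuple[int, int]:
    """Return the (low, high) band a reported AI-detection score could fall in,
    given a symmetric margin of error, clamped to the 0-100 scale (assumed)."""
    low = max(0, score - margin)
    high = min(100, score + margin)
    return low, high

print(plausible_range(50))  # a reported 50 could plausibly be anywhere from 35 to 65
print(plausible_range(90))  # high scores press against the 100 ceiling: 75 to 100
```

The point of the exercise is simply that a single number hides a wide band of uncertainty, which is why the score should be treated as information rather than proof.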

The tool was also trained on older versions of the language model used in ChatGPT, Bing Chat, and many other AI writers. The company warns users that the tool requires “long-form prose text” and doesn’t work with lists, bullet points, or text of less than a few hundred words. It can also be fooled by a mix of original and AI-produced prose.

There are other potential problems.

A recent study in Computation and Language argues that AI detectors are far more likely to flag the work of non-native English speakers than the work of native speakers. The authors cautioned “against the use of GPT detectors in evaluative or educational settings, particularly when assessing the work of non-native English speakers.”

The Turnitin tool wasn’t tested as part of that study, and the company says it has found no bias against English-language learners in its tool. Seven other AI detectors were included in the study, though, and, clearly, we need to proceed with caution.

So how should instructors use the AI detection tool?

As much as instructors would like to use the detection number as a shortcut, they should not. The tool provides information, not an indictment. The same goes for Turnitin’s plagiarism tool.

So instead of making quick judgments based on the scores from Turnitin’s AI detection tool on Canvas, take a few more steps to gather information. This approach is admittedly more time-consuming than just relying on a score. It is fairer, though.

  • Make comparisons. Does the flagged work differ in style, tone, spelling, flow, complexity, development of argument, or use of sources and citations from the student’s previous work? We often detect potential plagiarism that way, and AI-created work often raises suspicion for the same reasons.
  • Try another tool. Submit the work to another AI detector and see whether you get similar results. That won’t provide absolute proof, especially if the detectors are trained on the same language model. It will provide additional information, though.
  • Talk with the student. Students don’t see the scores from the AI detection tool, so meet with the student about the work you are questioning and show them the Turnitin data. Explain that the detector suggests the student used AI software to create the written work and point out the flagged elements in the writing. Make sure the student understands why that is a problem. If the work is substantially different from the student’s previous work, point out the key differences.
  • Offer a second chance. The use of AI and AI detectors is so new that instructors should consider giving students a chance to redo the work. If you suspect the original was created with AI, you might offer the resubmission for a reduced grade. If it seems clear that the student did submit AI-generated text and did no original work, give the assignment a zero or a substantial reduction in grade.
  • If all else fails … If you are convinced a student has misused artificial intelligence and has refused to change their behavior, you can file an academic misconduct report. Remember, though, that the Turnitin report has many flaws. It is far better to err on the side of caution than to devote lots of time and emotional energy to an academic misconduct claim that may not hold up.

No, this doesn’t mean giving up

I am by no means condoning student use of AI tools to avoid the intellectual work of our classes. Rather, the lines of use and misuse of AI are blurry. They may always be. That means we will need to rethink assignments and other assessments, and we must continue to adapt as the AI tools grow more sophisticated. We may need to rethink class, department, and school policy. We will need to determine appropriate use of AI in various disciplines. We also need to find ways to integrate artificial intelligence into our courses so that students learn to use it ethically.

If you haven’t already:

  • Talk with students. Explain why portraying AI-generated work as their own is wrong. Make it clear to students what they gain from doing the work you assign. This is a conversation best had at the beginning of the semester, but it’s worth reinforcing at any point in the class.
  • Revisit your syllabus. If you didn’t include language in your syllabus about the use of AI-generated text, code or images, add it for next semester. If you included a statement but still had problems, consider whether you need to make it clearer for the next class.

Keep in mind that we are at the beginning of a technological shift that may change many aspects of academia and society. We need to continue discussions about the ethical use of AI. Just as important, we need to work at building trust with our students. (More about that in the future.) When they feel part of a community, feel that their professors have their best interests in mind, and feel that the work they are doing has meaning, they are less likely to cheat. That’s why we recommend use of authentic assignments and strategies for creating community in classes.

Detection software will never keep up with the ability of AI tools to avoid detection. It’s like the game of whack-a-mole in the picture above. Relying on detectors does little more than treat the symptoms of a much bigger problem, and over-relying on them turns instructors into enforcers.

The problem is multifaceted, and it involves students’ lack of trust in the educational system, lack of belonging in their classes and at the university, and lack of belief in the intellectual process of education. Until we address those issues, enforcement will continue to detract from teaching and learning. We can’t let that happen.


Doug Ward is associate director of the Center for Teaching Excellence and an associate professor of journalism and mass communications at the University of Kansas.
