By Doug Ward

Here’s a secret about creating a top-notch assessment plan:

Make sure that it involves cooperation among faculty members, that it integrates assignments into a broader framework of learning, and that it creates avenues for evaluating results and using them to make changes to courses and curricula.

Lorie Vanchena, Nina Vyatkina and Ari Linden of the department of Germanic languages and literatures accepted the Degree-Level Assessment Award from Stuart Day, interim vice provost for academic affairs.

Actually, that’s not really a secret – it’s just good assessment practice – but it was the secret to winning a university assessment award this year. Judges for both the Degree-Level Assessment Award and the Christopher H. Haufler KU Core Innovation Award cited the winners’ ability to cooperate, integrate and follow up on their findings as elements that set them apart from other nominees.

The Department of Germanic Languages and Literatures won this year’s degree-level assessment award, and the Department of Curriculum and Teaching won this year’s Haufler award. The awards were announced at last week’s annual Student Learning Symposium. Each comes with $5,000.

The German department focused its plan on two 300-level courses that serve as a gateway to the major, and on its capstone course. Stuart Day, the interim vice provost for academic affairs, said the University Academic Assessment Committee, which oversees the award, found the plan thorough, manageable and meaningful. It is one of the strongest assessment plans in place at the university, he said. It emphasizes substantive learning outcomes, uses a variety of methods for assessment, and includes a plan for making ongoing improvements.

Reva Friedman accepted the Haufler KU Core Innovation Award from DeAngela Burns-Wallace, vice provost for undergraduate studies.

DeAngela Burns-Wallace, vice provost for undergraduate studies, said the plan created by curriculum and teaching had similar characteristics, using a rich approach that integrates active learning, problem solving and critical thinking. The department created a “strong and intentional feedback loop for course improvement,” she said, and created a clear means for sharing results throughout the department.

So there again is that secret that isn’t really a secret: A strong assessment plan needs to include cooperation among colleagues, integration of assignments and pedagogy, and follow-ups that lead to improvements in the curriculum.

That sounds simple, but it’s not. Reva Friedman, associate professor of curriculum and teaching, and Lorie Vanchena, associate professor of Germanic languages and literatures, both spoke about the deep intellectual work that went into crafting their plans. That work involved many discussions among colleagues and some failed attempts that eventually led to strong, substantive plans.

“Everything we’re doing informs everything else we’re doing,” Friedman said.

She also offered a piece of advice that we all need to hear.

“All of us have our little castles with moats around them, and we love what we do,” she said. “But we need to partner in a different way.”

A new resource for teaching media literacy

In a world of “alternative facts,” we all must work harder to help students learn to find reliable information, challenge questionable information, and move beyond their own biases. To help with that, KU Libraries recently added a media literacy resource page to its website. Instructors and students will find a wealth of useful materials, including definitions, evaluation tools, articles and websites.


Doug Ward is the associate director of the Center for Teaching Excellence and an associate professor of journalism. You can follow him on Twitter @kuediting.

By Doug Ward

All too often, we pursue teaching as an individual activity. We look at our classes as our classes rather than as part of a continuum of learning. And we are often ill-prepared to help other instructors engage in a course’s evolution when they take it over. We may pass along course material, but rarely do we pass along the background, context, and iterations of a course’s development.

In a recent portfolio for the Center for Teaching Excellence, Holly Storkel and Megan Blossom explain how they did exactly that, demonstrating the benefits of collaboration in improving learning and in keeping the momentum of improvement intact.

Holly Storkel in her office in the Dole Human Development Center

Storkel, a professor and chair of speech-language-hearing, added active learning activities to a 400-level class called Language Science, a required undergraduate class on the basic structure of the English language. The changes were intended to help students improve their critical thinking and their interpretation of research articles. Blossom, who was a graduate teaching assistant for the class, built on that approach when she later took over as an instructor.

Storkel had taught the class many times and had been mulling changes for several years to help students improve their ability to find and work with research.

“I decided they should start reading research articles and get more familiar with that: understand how to find a research article, understand how to get it from the library, have basic skills of how to read a research article,” Storkel said in an interview. “And this class is supposed to be kind of the sophomore-junior level so that then, as they move to the junior-senior level, they would have the skills to find a variety of papers and do the synthesis across the papers, where that sort of thing is the next level up. But I figured, ‘You can’t synthesize information if you didn’t understand what it is to begin with.’ ”

Blossom, who is now an assistant professor at Castleton University in Vermont, taught the same class three semesters later, building on Storkel’s work but making several changes based on the problem areas that she and Storkel identified. She reduced the number of research articles that students read in an attempt to give them more time in class for discussion. She also added pre-class questions intended to help students better prepare for in-class discussions, worked to make those discussions more interactive, and provided structured questions to help students assess articles.

In later discussions, Blossom let students guide the conversations more, having them work in pairs to interpret a particularly challenging article. To gain a better understanding of methods, students also created experimental models like those used in the article. Blossom pooled their results and had students compare the differences in their findings.

In their course portfolio, Storkel and Blossom said the changes improved class discussions about research and helped instructors devote more one-on-one attention to students in class. That was especially helpful for students who struggled with concepts. They also said the process itself provided benefits for students.

The benefits of collaboration

In a recent interview, Storkel said that collaboration was crucial in gaining a shared understanding of what students were learning from the class and where they were struggling. Rather than telling Blossom what to do, Storkel talked through with her how they might make the class better. She suggested that others use the same approach to improving classes.

“I think one thing that I would say to that is sort of sharing what you know so that you can get on the same page,” Storkel said. “Look at some student work and say, ‘Here’s how I taught the class. Here’s what the performance on this assignment looked like. They were doing pretty well with this but there were some struggles here, and so that might be something you want to think about if you’re going to keep some of these activities, or even if you’re doing different activities this seems to be a hard concept for them to learn or this process seems to be the part that’s really a stumbling block.’ ”

Storkel suggested that faculty engage in more conversations about the courses they teach and use course portfolios to make shared information more visible.

Portfolios provide a means to look at a class “and say, ‘What skills are people taking away from this? Where am I having a challenge?’ ” Storkel said, adding: “It’s already in a format then that is shareable and that’s more than just, ‘Here are my lecture notes’ or ‘Here are my slides. Here’s the syllabus.’ Here’s what actually happened. I think having rich records that can be easily handed off is good.”

Assessment also provides opportunities for increased sharing of experiences in courses, Storkel said.

“That might be another place where you can have a conversation around teaching, and then it might not even be attached to a particular class but more, ‘Here’s a particular skill. Students aren’t always getting it.’ So as I approach this class where that skill needs to be incorporated or we expect that to happen, now I’ve some idea of what might be challenging or not.”

It all starts with a willingness to share experiences, to put defensiveness aside, and to focus on what’s best for students.



Award winners, Student Learning Symposium, by Lu Wang
Chris Brown and Bob Hagen accepted the university degree-level assessment award for work that they and others have done in the environmental studies program. Chris Fischer, right, accepted the Chris Haufler Core Innovation Award on behalf of the physics department. Joining them at the Student Learning Symposium on Saturday were Provost Jeff Vitter, left, and Haufler, second from right. (Photo by Lu Wang)

By Doug Ward

Chris Brown sees assessment as a way to build community.

It brings together faculty members for much-needed discussions about learning. It helps departments explain to colleagues, administrators, and accreditors the value of a degree. And it helps create a better learning environment for students as courses undergo constant evaluation and revision.

“Assessment is not a box to check off on a thing that says you’re done and now you can stop thinking about it,” said Brown, director of the environmental studies program at KU. “It’s about people who are engaged in an ongoing process of learning themselves about how to do their jobs better.”

Brown’s program received the university’s degree-level assessment award at Saturday’s Student Learning Symposium in Lawrence. He was joined by two colleagues, Bob Hagen and Paul Stock, in accepting the award, which comes with $5,000 the program can use in advancing teaching and assessment.

Brown said everyone at KU was “basically taking crash courses” in assessment, which he describes as a series of questions about student learning:

  • How do you document?
  • What do you document?
  • How do you decide what’s valuable to document and what’s not valuable to document?
  • What changes do you need to make based on the evidence you’ve gathered?

Moving from informal to formal

Instructors in all departments have been engaging in informal assessment for years, Brown said.

“It’s every time we talk to each other about one way we think we could have done things better for a particular course, or all the times we’ve looked at our curriculum and decided to make changes,” he said. “The degree-level assessment we’ve been doing has taken that to a formal level.”

Faculty members in environmental studies began focusing on that formal assessment process a few years ago when the program did a self-study as part of an external review, Brown said. That forced them to take a hard look at what students were learning and why they thought the degree was valuable.

“We’re an interdisciplinary major,” Brown said. “Our foundational course should cover all the divisions of the college – the natural sciences, the social sciences and the humanities – as it relates to environmental studies. So there were a bunch of different moments that came together, really piqued people’s interest across our faculty and really made us say, ‘What do we want with this degree?’”

As it created a formal assessment process, environmental studies looked first at core writing skills, largely because instructors weren’t happy with the final projects students were turning in for upper-level courses. It was clear students were struggling with collecting evidence, structuring arguments, and making those arguments clear in their written work, he said. So faculty members broke larger assignments into smaller segments and gave more feedback to students along the way as they moved toward their final projects. Doing so has led to a dramatic improvement in those projects.

It has also led to opportunities for instructors to share their successes and struggles in classes. They also freely share class material with colleagues. Brown says that openness allowed him to teach an environmental ethics course for the first time with meaningful and successful results.

“I could not have done that if I weren’t in conversations with colleagues,” Brown said. “That’s what this comes down to.”

Brown makes assessment sound easy.

“Once the formal process began, it really helped solidify that we do need to get together at specific faculty meetings as a whole group,” he said.  “When I call those faculty meetings, I don’t have to pull teeth. Everybody comes. It’s not difficult. Perhaps it’s the nature of the major. People seek out contact across these various fields because it’s an interesting and rewarding conversation. Assessment has given us one more reason to come together and talk about what we value.”

Finding colleagues to help

He urges others interested in moving assessment forward to seek out like-minded colleagues, those with whom they are already having discussions about teaching.

“It really doesn’t have to start with any greater number of people than two,” Brown said. “Start there if that’s all you have.”

Talk about goals for students and goals for your major. Determine how you know students and the major are meeting those goals. Then think about how you can gather meaningful information and use that information in ways that lead to greater success. Then carry that conversation forward with other colleagues, including those in other departments. Draw on the many workshops and discussions at CTE.

“That’s hundreds of colleagues from various fields who are eager to talk with you about what you do and to help you and others see that what we’re doing with teaching and learning is intellectual work,” Brown said.

Again, assessment loops back to the idea of building community.


The lighter side of assessment

A short film that helped lead off Saturday’s Student Learning Symposium showed that assessment isn’t always serious business.

By Doug Ward

Let’s peer into the future – the near future, as in next semester. Or maybe the semester after that.

You’ll be teaching the same course that is wrapping up this week, and you’ll want to make some changes to improve student engagement and learning. Maybe some assignments tanked. Maybe you need to rearrange some elements to improve the flow of the course. Maybe you need to give the course a full makeover. By the time the new semester rolls around, though, the previous one will be mostly a blur.

So why not take a few minutes now to reflect on the semester? While you’re at it, why not solicit feedback from students?


To help, here are 20 questions to ask yourself and your students. This isn’t an exhaustive list. Rather, it’s a way to think about what you’ve accomplished (or haven’t) and how you can do better.

Learning and assessment

Use of class time

Assignments

  • What assignments or discussion topics worked best?
  • Which ones flopped? Why?
  • How might you improve the way you use Blackboard or other online resources?

Some questions to ask your students

I also like to spend time talking with students about the class. Sometimes I do that as a full class discussion. Other times, I use small groups. Either way, I ask some general questions about the semester:

  • What worked or didn’t work in helping you learn?
  • What would help next time?
  • How has your perspective changed since the beginning of the class?
  • What will you take away from the course?
  • How did the format of the class affect your learning and your motivation?

Sometimes students don’t have answers right away, so I encourage them to provide feedback in the self-evaluations I ask them to write, or in their course evaluations.

I promised 20 questions, so I’ll end with one more: What questions would you add to the list?



By Doug Ward

Sylvia Manning offers an insightful characterization of a college education that summarizes the challenges all of us in higher education face today. In a paper for the American Enterprise Institute, she writes:

The reality is that no one can guarantee the results of an educational process, if only because a key element is how the student engages in that process. The output or outcome measures that we have are crude and are likely to remain so for considerable time to come. For example, the percentage of students who graduate from an institution tells us next to nothing about the quality of the education those students received.

Poster that says "Just because kids know how to use Twitter, Snapchat, and Instagram doesn't mean they know how to use technology to enhance their learning."
A good message about students and technology from Sean Junkins, via Twitter: http://bit.ly/1yFYfY5

Manning is right. In a piece for Inside Higher Ed last year, I argued that students and administrators had become too caught up in the idea of education as a product. Far too many students see a diploma, rather than the learning that goes into it, as their primary goal. I tell students that I can’t make them learn. My job is to provide the environment and the guidance to help them learn. They have to decide for themselves whether they want to take advantage of the resources I provide – and to what degree. Only after they do that can learning take place.

Colleges and universities face a similar conundrum. They have come under increasing pressure to provide ways to measure their effectiveness. As Manning says, though, they have struggled to find effective ways to do that. Most focus on graduation rates and point to the jobs their graduates get. Many, like KU, are working to decrease the number of students who drop or fail classes. Those are solid goals, but they still don’t tell us anything about what students have learned.

I’m not convinced that we can truly do that at a university level, at least not in the form of simplistic numeric data that administrators and legislators seem to want. There’s no meaningful way to show that student learning grew X percent this semester or that critical thinking increased at a rate of X over four years, although critics of higher education argue otherwise.

A portfolio system seems the best bet. It provides a way for students to show the work they have done during their time in college and allows them to make their own case for their learning. Portfolios also provide a means for students to demonstrate their potential to employers. By sampling those portfolios, institutions can then get a broad overview of learning. With rubrics, they can create a statistic, but the real proof is still qualitative rather than quantitative.

As an instructor, I see far more value in the nuances of portfolios, projects and assignments than I do in the rigid numerical data of tests and quizzes. Until that thinking gains a wider acceptance, though, we’ll be stuck chasing graduation rates and the like rather than elevating what really matters: learning.

A defense of liberal arts, along with a challenge

Without a backbone of liberal arts, science and technology lack the ability to create true breakthroughs. That’s what Leon Botstein, president of Bard College, argues in The Hechinger Report. Botstein makes a strong case, but he also issues a stinging rebuke to programs that refuse to innovate.

“Students come to college interested in issues and questions, and ready to tackle challenges, not just to ‘major’ in a subject, even in a scientific discipline,” Botstein writes. “…What do we so often find in college? Courses that correspond to narrow faculty interests and ambitions, cast in terms defined by academic discourse, not necessarily curiosity or common sense.”

Bravo!

He argues for fundamental changes in curricula and organization of faculty, but also in the way courses are taught. The only aspect of education “that is truly threatened by technology is bad teaching, particularly lecturing,” he says. Instead, technology has expanded opportunities for learning but has done nothing to diminish the need for discussion, argument, close reading and speculation. He calls for renewed attention to helping students learn to use language, and for using the liberal arts to help students become literate in the sciences.

I’d be remiss if I didn’t bring up Botstein’s comparison of teaching and learning to sex, along with the slightly sensational but certainly eye-grabbing headline that accompanied his article: “Learning is like sex, and other reasons the liberal arts will remain relevant.”

Related: At Liberal Arts Colleges, Debate About Online Courses Is Really About Outsourcing (Chronicle of Higher Education)

Briefly …

College instructors are integrating more discussions and group projects into their teaching as they cut down on a lecture-only approach, The Chronicle of Higher Education reports. … David Gooblar of Pedagogy Unbound offers advice on handling the seemingly never-ending task of grading. … Stuart Butler of the Brookings Institution suggests ways to “lower crazy high college costs.” They include providing better information to students, revamping accreditation, and allowing new models of education to compete with existing universities.



By Doug Ward

Assessment often elicits groans from faculty members.

It doesn’t have to if it’s done right. And by right, I mean using it to measure learning that faculty members see as important, and then using those results to revise courses and curricula to improve student learning.

In a white paper for the organization Jobs for the Future, David T. Conley, a professor at the University of Oregon, points out many flaws that have cast suspicion on the value of assessment. He provides a short but fascinating historical review of assessment methods, followed by an excellent argument for a drastic change in the ways students are assessed in K-12. He also raises important issues for higher education. The report is titled A New Era for Educational Assessment.

Conley says that the United States has long favored consistency in measuring something in education over the ability to measure the right things. Schools, he says, “have treated literacy and numeracy as a collection of distinct, discrete pieces to be mastered, with little attention to students’ ability to put those pieces together or to apply them to other subject areas or real-world problems.”

One reason standardized testing has recently come under scrutiny, he says, is that new research on the brain has challenged assumptions about fixed intelligence. Rather, he says, researchers have come to an “understanding that intellectual capacities are varied and multi-dimensional and can be developed over time, if the brain is stimulated to do so.” Relatedly, they have found that attitudes toward learning are as important as aptitude.

The Common Core has also put pressure on states to find alternatives to the typical standardized test. The Core’s standards for college readiness include such elements as the ability to research and synthesize information, to develop and evaluate claims, and to explain, justify and critique mathematical reasoning – complex abilities that defy measurement with multiple-choice questions. Schools have been experimenting with other means to better measure sophisticated reasoning, Conley writes. They include these:

  • Performance tasks that require students to parse texts of varying lengths and that may last from 20 minutes to two weeks. (KU’s Center for Education Testing & Evaluation has been working on one such test.)
  • Project-centered assessment, which gives students complex, open-ended problems to solve.
  • Portfolios, which collect a broad sample of student work to demonstrate proficiency in a wide range of subjects.
  • Collaborative problem-solving, which sometimes involves students working through a series of online challenges with a digital avatar.
  • Metacognitive learning strategies, which Conley describes as ways “learners demonstrate awareness of their own thinking, then monitor and analyze their thinking and decision-making processes” and make adjustments when they are having trouble. Measuring these strategies often relies on self-reporting, something that has opened them to criticism.

Conley sees opportunities for states to combine several forms of assessment to provide a deeper, more nuanced portrait of learning. He calls this a “profile approach” and says it could be used not only by teachers and administrators but also colleges and potential employers. He asks, though, whether colleges and universities are ready to deal with these more complex measurements. Higher education has long relied on GPAs and test scores for deciding admissions, and more nuanced assessments would require more time to evaluate and compare. He says, though, that “the more innovative campuses and systems are already gearing up to make decisions more strategically and to learn how to use something more like a profile of readiness rather than just a cut score for eligibility.”

Conley raises another important issue for higher education. Over the past decade, high schools have focused on making students “college and career ready,” although definitions of those descriptions have been murky. Because of that, educators have “focused on students’ eligibility for college and not their readiness to succeed there.” Conley and others have identified key elements of college readiness, he says. Those include such things as hypothesizing and strategizing, analyzing and evaluating, linking ideas, organizing concepts, setting goals for learning, motivating oneself to learn, and managing time.

The takeaway? Assessment is moving in a more meaningful direction. That’s good news for both students and wary faculty members.



By Doug Ward

At a meeting of the CTE faculty ambassadors last week, Felix Meschke brought up a challenge almost every instructor faces.

Meschke, an assistant professor of finance, explained that he had invited industry professionals to visit his class last semester and was struck by how engaged students were. They asked good questions, soaked up advice from the professionals, and displayed an affinity for sharing ideas with speakers from outside the university.

The interaction was marvelous to watch, Meschke said, but how could he assess it? He could ask a question on an exam, he said, but that didn’t seem right. The content of the discussions wasn’t as important as the discussions themselves and the opportunities those discussions brought to students.

In a sense, Meschke had answered his own question: His observations were a form of assessment. I suggested that he log those observations so he could provide documentation if he needed it. No, that wouldn’t provide a numerical assessment, but it would provide the kind of assessment he needed to make decisions on whether to do something similar in the future.


All too often we think of assessment as something we do for someone else: for administrators, for accreditors, for legislators. Assessment is really something we need to do for ourselves, though. Thinking of it that way led to an epiphany for me a few years ago. Like so many educators, I approached assessment with a sense of dread. It was one more thing I didn’t have time for.

When I started thinking of assessment as something that helped me, though, it didn’t seem nearly so onerous. I want to know how students are doing in my classes so I can adapt and help them learn better. I want to know whether to invite back guest speakers. I want to know whether to repeat an assignment or project. I want to know what students report about their own learning. All of those things are natural parts of the teaching process.

That sort of thinking also helped me to realize that assessment doesn’t have to be quantitative. Assessments like quiz and exam grades can indeed point to strengths and weaknesses. If a large majority of students fails an exam, we have to ask why. Was there a problem in the way students learned a particular concept? A flaw in the wording of the exam? A lack of studying by students?

I rarely give exams, though. Rather, I use things like projects, journals and participation.

I use rubrics to grade the projects and journals, but the numbers don’t tell me nearly as much as the substance of the work. Only through a qualitative assessment do I get a sense of what students gained, what they didn’t gain, and what I need to rethink in future semesters.

In the class Meschke described, students applied their learning through active participation. Trying to put a numerical value on that would in some ways cheapen the engagement the students showed and the opportunities they gained in interacting with professionals. Observing those interactions provided excellent feedback to Meschke, though, and by writing a brief summary of those observations, he could provide documentation for others.

The message was clear: Do it again next semester.

And when it comes to assessment, the message is clear, as well: Do it for yourself.

Additional resources

Portfolio Assessment: An Alternative to Traditional Performance Evaluation Methods in the Area Studies Programs, by Mariya Omelicheva

Assessment Resources for departments and programs at KU

Combining Live Performance and Traditional Assessments to Document Learning, by the School of Pharmacy

