Innovations move assessment in a positive direction


Assessment often elicits groans from faculty members.

It doesn’t have to if it’s done right. And by right, I mean using it to measure learning that faculty members see as important, and then using those results to revise courses and curricula to improve student learning.

In a white paper for the organization Jobs for the Future, David T. Conley, a professor at the University of Oregon, points out many flaws that have cast suspicion on the value of assessment. He provides a short but fascinating historical review of assessment methods, followed by an excellent argument for a drastic change in the ways students are assessed in K-12. He also raises important issues for higher education. The report is titled A New Era for Educational Assessment.

Conley says that American education has long favored consistency of measurement over the ability to measure the right things. Schools, he says, “have treated literacy and numeracy as a collection of distinct, discrete pieces to be mastered, with little attention to students’ ability to put those pieces together or to apply them to other subject areas or real-world problems.”

One reason standardized testing has recently come under scrutiny, he says, is that new research on the brain has challenged assumptions about fixed intelligence. Rather, researchers have come to an “understanding that intellectual capacities are varied and multi-dimensional and can be developed over time, if the brain is stimulated to do so.” Relatedly, they have found that attitudes toward learning are as important as aptitude.

The Common Core has also put pressure on states to find alternatives to the typical standardized test. The Core’s standards for college readiness include such elements as the ability to research and synthesize information, to develop and evaluate claims, and to explain, justify and critique mathematical reasoning – complex abilities that defy measurement with multiple-choice questions. Schools have been experimenting with other ways to measure such sophisticated reasoning, Conley writes. They include these:

  • Performance tasks that require students to parse texts of varying lengths and that may last from 20 minutes to two weeks. (KU’s Center for Education Testing & Evaluation has been working on one such test.)
  • Project-centered assessment, which gives students complex, open-ended problems to solve.
  • Portfolios, which collect a broad sample of student work to demonstrate proficiency across a range of subjects.
  • Collaborative problem-solving, which sometimes involves students working through a series of online challenges with a digital avatar.
  • Metacognitive learning strategies, which Conley describes as ways “learners demonstrate awareness of their own thinking, then monitor and analyze their thinking and decision-making processes” and make adjustments when they are having trouble. Measuring these strategies often relies on self-reporting, something that has opened them to criticism.

Conley sees opportunities for states to combine several forms of assessment to provide a deeper, more nuanced portrait of learning. He calls this a “profile approach” and says it could be used not only by teachers and administrators but also by colleges and potential employers. He asks, though, whether colleges and universities are ready to deal with these more complex measurements. Higher education has long relied on GPAs and test scores in deciding admissions, and more nuanced assessments would require more time to evaluate and compare. He says, though, that “the more innovative campuses and systems are already gearing up to make decisions more strategically and to learn how to use something more like a profile of readiness rather than just a cut score for eligibility.”

Conley raises another important issue for higher education. Over the past decade, high schools have focused on making students “college and career ready,” although definitions of those terms have been murky. Because of that, educators have “focused on students’ eligibility for college and not their readiness to succeed there.” Conley and others have identified key elements of college readiness, he says. Those include such things as hypothesizing and strategizing, analyzing and evaluating, linking ideas, organizing concepts, setting goals for learning, motivating oneself to learn, and managing time.

The takeaway? Assessment is moving in a more meaningful direction. That’s good news for both students and wary faculty members.


Doug Ward is an associate professor of journalism and the associate director of the Center for Teaching Excellence. You can follow him on Twitter @kuediting.
