Opinion

We Aren’t Using Assessments Correctly

By John Hattie — October 27, 2015

Much of the testing discussion in the United States today is grounded in several widely accepted notions: that we first must get the actual assessment instrument right, that there is an important distinction between “formative” and “summative” assessment, that teachers need to understand the language of assessment, and that we should drop tests on schools like “precision bombs” for the purpose of measuring a student’s performance and progress.

These notions are misguided, as decades of research from around the world on what matters most in student learning demonstrate. In fact, the major purpose of assessment in schools should be to provide interpretive information to teachers and school leaders about their impact on students, so that these educators have the best information possible about what steps to take with instruction and how they need to change and adapt.


So often we use assessment in schools to inform students of their progress and attainment. Of course this is important, but it is more critical to use this information to inform teachers about their impact on students. Using assessments as feedback for teachers is powerful. And this power is truly maximized when the assessments are timely, informative, and related to what teachers are actually teaching.

We hardly need more data—schools are awash with data. Instead, we need better interpretations of these data. When we built the New Zealand school assessment system 15 years ago, we started by designing the reports, rather than the tests. The reports were piloted to ensure we met our own two criteria:

Did the teachers and school leaders interpret the reports correctly? (And if not, we needed to change the reports.)

Did the reports lead to consequential action?

After many iterations of the reports, we were able to meet these criteria, and then, and only then, did we backfill them with great assessment items. The reports, not the tests, are what matter to teachers and school leaders, who are hungry for information to help them know their impact on their students. At the system level, policymakers are happy to receive continual information on the status of learning in the country without all the negativity that comes with high-stakes testing. Given that the New Zealand assessment system is voluntary, it’s worth noting that a majority of teachers and schools still use it today, 15 years later.


Until we see tests as aids to enhance teaching and learning, and not primarily as barometers of how much a student knows now, on this day, on this test, developing more tests will add little and will remain an expensive distraction. Educators need to understand what each student already knows, and where that student needs to go next in the teaching process. They need to be experts in using an array of interventions to help their students succeed, and in evaluating the impact they have made.

As an education researcher, I know this because I spent more than 15 years collecting nearly 1,200 meta-analyses of 65,000 education studies focused on the learning of more than 250 million students around the world. Because almost any education intervention can claim some positive effect on learning, I developed a threshold for determining which effects are meaningful.

Far and away the most effective teaching intervention we found was what I call “visible learning”: raising the quality of the feedback teachers receive about their impact. Expert teachers assess the visible impact they have on their students, constantly monitor learning and seek feedback about their teaching, and then evaluate and adjust their teaching methods based on these findings. Teachers show students how to self-assess their performance, and discuss how they can improve. These are the major influences on raising achievement.

Students can be beneficiaries of assessment, provided we start by asking, “Who owns the test data?” The answer must be the students—it is their schooling, their lives, their futures that are at stake in classrooms—and so the focus should be first on developing “student assessment capabilities.” It is then incumbent on teachers to teach students how to recognize the best time to assess their learning, and how to interpret the information from those tests. This is the core idea of what we call formative interpretation—students learning to know what to do next in light of their progress.

The difference between formative and summative is important. As the University of Illinois assessment expert Bob Stake once put it, “When the cook tastes the soup, it is formative; when the guests taste the soup, it is summative.” Both can be valuable, and it is crucial to note that they do not refer to the tests. The same measure can be used for formative or summative interpretations.

So to speak of formative assessment is misleading, as it is the interpretation during the learning that is critical. But if we instead jump straight to “student assessment capabilities,” we avoid false distinctions, we place the students back at the center, and we are happy only when students understand their own learning based on test evidence. This is a powerful call to action for teachers—to learn about their own impact from assessments, and to teach students to do what teachers themselves are asked to do: evaluate the impact of learning.

This also means we do not need the notion of “teacher assessment literacy”—why should teachers be asked to learn the language of measurement people? Instead, we measurement people should learn how to speak in the language of learning and teaching and provide interpretations that are in turn correctly interpreted by teachers, with consequential actions and decisions. Similarly, we need reports from student assessments that help students understand their own progress in learning—what they can do, what they cannot yet do, where to go next. If only tests were created with these aims.

Assessment can be powerful in classes, schools, and school systems, but we need to stop the high-stakes methods and the overreliance on developing tests to maximize precision. We need to instead focus on the power of assessment as feedback to help teachers maximize their impact, and ask teachers to teach students how to interpret their own information from assessments.

A version of this article appeared in the October 28, 2015 edition of Education Week as “The Effective Use of Testing: What the Research Says.”
