Assessment Q&A

Picking an Interim Assessment? Do This First, Say School Leaders

By Sarah Schwartz — July 31, 2023 5 min read

One of the many decisions that falls to school district leaders is whether to use an interim assessment—one that’s given every couple of months—to measure student progress.

These tests can serve several different purposes, including predicting performance on state exams and identifying subsets of skills for which students might need support.

Picking the right tool is a high-stakes decision. Teachers may use the results of these tests to adjust their instruction or determine which students will receive interventions. But it can also be hard to identify exactly what test will best suit a district’s specific purposes.

Last year, the nonprofit curriculum reviewer EdReports announced that it would start releasing reviews of interim assessments as well, judging their technical quality and usability. These kinds of outside evaluations are hard to come by right now, since the tests themselves are often proprietary products created by private companies.

But earlier this year, the organization put the plan on hold indefinitely, because not enough assessment companies agreed to participate.


It was disappointing news for Christine Droba, the assistant superintendent of teaching and learning in North Palos School District 117 in Palos Hills, Ill.

“An external review would be huge,” she said, because it would remove some of the burden of assessing the validity and reliability of these tests from teachers’ and other educators’ shoulders.

Droba and North Palos superintendent Jeannie Stachowiak spoke with Education Week about how their district chooses interim assessments, and what they did after discovering that the test they were using wasn’t aligned with the year-end test in Illinois, which is used for federal accountability purposes.

They also shared their advice for other school leaders wondering about the alignment of interim assessments to their teaching.

This interview has been edited for length and clarity.

How does your district use interim assessments?

Droba: We use our interim assessments as a tool to predict how students are going to do on the state assessment. It is something that we use to identify which students need enrichment, which students need additional support in terms of: Are they going to be ready for the end-of-year benchmark?

We look at: Where’s grade-level proficiency? How close is it to our target? How many students are below that level? What do they need to do to get to the end-of-year benchmark? Which students are going to start intervention? That’s our fall assessment. By the winter, we track progress from fall to winter, and then revise any plans that we have—if we need to do more support in this area, or maybe less support in this area.

And then the spring testing session is really done to track growth from fall to spring, and we also use the spring assessment to continuously make sure that the interim assessment is aligned to the state assessment. We’re always looking at: Are these numbers showing us the same thing?

How did you figure out that your interim assessments weren’t aligned?

Stachowiak: [The assessment we were using] is not directly aligned to Illinois State Standards.

We had teachers, understandably, taking a look at some of the things on the [interim] assessment and beefing up their instruction in those areas. However, those were not target areas on the state assessment. So they were working very hard to make sure students met standards on an interim assessment that was really not aligned.

We started to look for other potential assessments that would be better aligned, which was when we made the switch [to a new interim test]. We have a data coordinator in the district who meets with our leadership team constantly. And we are looking to do a data dive to make sure that [the new interim assessment] is a better predictor for our students.

Droba: We worked with our data coordinator to take the assessment from the spring and then the [state] assessment data for the same group of kids. And he ran a correlational study to figure out what was the correlation between the two data sets. I believe that number was around 0.7 or 0.8, which is very high. He was basically saying that these numbers are correlated.

That was similar to the research that [the interim assessment provider] already presented to us. Their correlations were a little bit higher than what he found with our data set. But it was still high enough that we were like, “Yeah, let’s move forward, this is still good. It’s in alignment.”
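
For districts that want to run a similar check themselves, the comparison the data coordinator describes is a straightforward Pearson correlation between matched scores. The sketch below is only an illustration of that kind of analysis, not the district’s actual study; the file name and column names are hypothetical, and it assumes one row per student with that student’s spring interim score and state test score.

# Illustrative sketch only; the file and column names are hypothetical.
import pandas as pd
from scipy.stats import pearsonr

# One row per student: spring interim score and state test score.
scores = pd.read_csv("matched_scores.csv")
scores = scores.dropna(subset=["interim_score", "state_score"])

# Pearson correlation between the two sets of scores.
r, p_value = pearsonr(scores["interim_score"], scores["state_score"])
print(f"Pearson r = {r:.2f} (n = {len(scores)}, p = {p_value:.3g})")

A coefficient in the 0.7-to-0.8 range, like the one the district found, is generally read as a strong linear relationship, though it is worth rerunning the check each year with a new cohort of matched scores.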

What advice would you give to other districts that may be having similar questions?

Droba: I would say that you need to have clarity on your goals and priorities. First and foremost, we made it very, very clear that the state test is what’s measuring the state standards. It’s what our whole curriculum and system is built on.

[The interim assessment] is a tool to predict how kids are going to do on the state assessment. So we have very clear priorities. If you don’t have that, it can be very confusing. Which assessment? What am I looking for? What’s the purpose of the assessment? You really need to have clarity, first on the purpose of the assessment and how you want to use it.

Stachowiak: We value the state assessment, because we believe it measures what our teachers teach and what our students should learn. And then based on that value, we create goals for the school district. We share those goals with the Board of Education, obviously. We share those goals with the teachers, so that at all of our professional learning community meetings, in everything that we’re doing with our staff, that’s the goal in mind—to make sure that the students are going to achieve those goals that we expect.

If you don’t have that goal in mind, and that alignment, it’s really difficult to make sure that everyone is sharing and doing the same thing and valuing not only the state assessments, but also whatever interim assessments you’re using to measure progress.

Is there information that you would want publicly available about interim assessments that you don’t have access to?

Droba: An external review would be huge.

A lot of what we get is from the company itself. They’re going to give us this report that says, “Yes, it’s aligned to the IAR [the Illinois Assessment of Readiness]. Yes, we do that.” They do their own research. Having an external reviewer would help just make sure that their methods were valid, that everything meets the high-quality standards that you would expect.

We read through the reports that they provided to us, and then we piloted the program to make sure that in practice, it was what we wanted it to be. But having the external review would just provide a set of eyes that was not the company.

We work with teachers on our review committee, and they know the usability of it. But to have a research company explore the validity and reliability of an assessment, that just allows [teachers] to review that instead of having to actually do the review themselves.
