Assessment

Which States Expect the Most or Least From Students?

By Stephen Sawchuk — May 22, 2018 2 min read

Two new reports suggest that, on average, states now hold students to higher expectations on state tests than they did in the No Child Left Behind era.

But some states continue to expect far less from students compared to their peers in other states, potentially giving parents and teachers a skewed sense of children’s skills.

(They’re looking at you, Texas, Iowa, and Virginia.)

Both reports use a similar methodology: they compare the percentage of students deemed “proficient” in reading and math in grades 4 and 8 on the state test with the percentage deemed proficient on the National Assessment of Educational Progress for the same grade levels, then look at the gaps in proficiency rates. (If the tests were equally difficult, there would be no gap.)
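The gap calculation the reports describe is simple arithmetic; here is a minimal sketch in Python, using hypothetical proficiency rates for illustration (not figures from either report):

```python
def proficiency_gap(state_pct: float, naep_pct: float) -> float:
    """Gap in percentage points between the share of students deemed
    proficient on a state test and on NAEP, for the same grade and
    subject. A large positive gap suggests the state test sets a
    lower bar than NAEP."""
    return state_pct - naep_pct

# Hypothetical example: 75% proficient on a state test vs. 40% on
# NAEP yields a 35-point gap.
gap = proficiency_gap(75.0, 40.0)
print(gap)  # 35.0
```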

Confused? Thinking to yourself: “Why would anyone compare these two things?”

Here’s the background: For years, policy folks lamented that pressure from No Child Left Behind encouraged states to set lower “cutoff scores”—the score at which a student is deemed proficient. Basically, this made tests easier and state test results look better than they really were.

Since NAEP is taken by a sample of students in every state, it’s a convenient benchmark to compare state results.

The first report, from the journal Education Next, studies these changes in detail. In 2017, EdNext found that the average gap in proficiency rates between state tests and NAEP was about 9 percentage points—far smaller than the 37 points just a decade earlier.

As states adopted the Common Core State Standards and sought to qualify for federal Race to the Top funds, they appear to have ratcheted up expectations on the tests. That trend peaked around 2015.

It’s important to note in this context that state exams aren’t based on the same content framework as NAEP, and they have different areas of emphasis. These analyses don’t get into that topic. (Additionally, NAEP defines proficient as students mastering “challenging” content; some critics find that bar too high, and therefore misleading to use as a benchmark.)

Examining the data over time, EdNext commends states like Tennessee, Georgia, Illinois, Kansas, and Maryland, all of which closed the gap between state and NAEP scores by more than 45 points.

But there’s a catch: There doesn’t seem to be a relationship between states setting higher standards and average growth in students’ state test scores. So setting the bar higher doesn’t, in and of itself, guarantee better scores, Education Next’s analysts conclude.

The second report, released last week by the nonprofit Achieve, focuses specifically on the change from 2015 to 2017. It, too, finds that most states held to or even raised the test-score bar over this period, but there were some exceptions.

Three states—Alaska, Arkansas, and Ohio—increased the gap in one grade and subject combination by at least 10 percentage points, it states.

Echoing the Education Next tally, Iowa, Texas, and Virginia had the widest gaps overall in 2017, the report found, with those gaps spanning more than 30 percentage points for all four grade and subject-area combinations.

Texas had the single largest proficiency gap, with nearly 56 percentage points separating the percent of students who were proficient on NAEP in 8th grade math and the percent proficient on the state exam at that level.

The Bill & Melinda Gates Foundation and the Walton Family Foundation helped fund the research. They also support coverage of continuous improvement and parent choice in Education Week, which retains sole editorial control.

A version of this news article first appeared in the Curriculum Matters blog.