Sample Common-Core Test Items Inspire Worry, Hope

By Liana Loewus — January 09, 2013 3 min read

While teachers are getting to know, and growing more comfortable with, the Common Core State Standards, many still feel the presence of a dark, amorphous cloud hanging over them: the assessments.

At the What Works in Urban Schools conference last weekend (sponsored by KIPP: NYC, TNTP, Google, and Scholastic, among others), representatives from the New York City Department of Education led educators through a comparison of the old and new assessments, answering questions about what the differences mean for teachers. Surprisingly, and unlike the other common-core sessions at the conference, this presentation had a small turnout—only about 30 of the more than 1,000 conference attendees. But the group was quite vocal, firing off questions from the get-go that indicated a mix of optimism and concern.

Jenny Hanson, program manager in the NYCDOE Office of Assessment Portfolio, opened the session by asking how many people in the room had a “strong understanding” of the standards. Nearly every hand went up. She then asked how many had a “strong understanding of the PARCC [i.e., Partnership for Assessment of Readiness for College and Careers] assessments and what they propose to do and look like.” Just a few hands remained in the air.

Like some other states, New York has begun the instructional shift to the common standards and, in waiting for finalized PARCC assessments, is incorporating common-core-aligned items into state tests this year. The state will transition fully to the PARCC assessments in 2014-15. Several states will also participate in field testing the new assessments in 2013-14, said Anthony Benners, a senior psychometrician for the NYCDOE.

Attendees looked at several sample test questions from the current New York state tests alongside sample questions from the PARCC assessment. The math samples addressed elementary-level fractions. While the state test asked students to match simple fractions with pictures, the PARCC test asked students to place three fractions—two of which were improper—on an integer number line. Teachers in the room voiced concern that the PARCC question required students to employ a variety of discrete skills (rather than one isolated skill), making it much more difficult to show mastery. One teacher attendee said such high-level math questions would necessitate “a mind-shift in the way we teach.”

When asked whether a teacher would be able to look at the test data and determine which discrete skill a student struggled with, Benners said PARCC “will be providing that type of information.” But in an interview after the session, Hanson said that no one really knows yet exactly what the assessment data will look like or what teachers will be able to glean from it.

During the session, teachers also acknowledged that, though scary, the assessments would lead to deeper and more real-world-applicable learning. A special education teacher noted: “One of the positive things about it is you can’t teach tricks. They either know it or they don’t. ... However, I’m trying to figure out how realistic this would be for my population of students.”

When comparing samples from the PARCC English/language arts assessment to samples from the New York state test, attendees noticed, above all, a leap in text complexity. Several teachers reacted with surprise at the difficulty of the passage, with one pointing out that she herself had never read the full text of the work referred to in the sample, Ovid’s Metamorphoses.

“I know this makes you a little nervous and uncomfortable, and I get that,” said Hanson. “We’re definitely seeing a jump in complexity. But this change needs to happen.”


A version of this news article first appeared in the Teaching Now blog.