Assessment

Assessment Governing Board Crafts Definition of ‘Prepared for College’

By Sarah D. Sparks — August 20, 2013 4 min read

The governing board for the tests known as “the nation’s report card” has crafted its own definition of what makes a student academically prepared for college.

At a meeting here this month, members of the National Assessment Governing Board, which supervises the National Assessment of Educational Progress, voted 17-2 to adopt language that will define the new “college prepared” scores in reading and mathematics on the assessment.

The language will be used for reporting in the next 12th grade NAEP, whose results will be announced in spring 2014. Those results will include a nationally representative sample of seniors as well as state-level results for 13 states that volunteered to give the test to more students.

The final definition is fairly limited, with members voting to say that the percentages of students performing at or above 163 out of 300 in mathematics and 302 out of 500 in reading on the 12th grade assessments would be “plausible estimate[s] of the percentage of students who possess the knowledge, skills, and abilities [in those subjects] to make them academically prepared for college.”

The two members who voted against the measure Aug. 3 were Andrés A. Alonso and Rebecca Gagnon.

Mr. Alonso, a professor of practice at the Harvard Graduate School of Education and a former chief executive of the Baltimore public schools, argued that the research was not strong enough to set particular cutoff scores for college preparation. By contrast, Ms. Gagnon, a director of the Minneapolis board of education, argued that NAGB should take a firmer stance, saying the cutoff scores should be called “reasonable” estimates rather than merely “plausible” ones.

The new definitions are based on more than 30 studies, including several comparing the content and predictive value of the federally sponsored NAEP with those of college-placement assessments, such as the SAT, the ACT, and Accuplacer, as well as longitudinal studies in Florida of how students who performed at different levels on NAEP later fared in freshman-level college courses.

Researchers used the federal High School Transcript Study, a 2009 study linking outcomes between NAEP and the SAT, and a longitudinal study of Florida students to compare performance on NAEP in reading and math with the test scores considered “college-readiness benchmarks” on the ACT and the SAT in 2005 and 2009.

‘Aspirational’ Level

In both subjects, the researchers found students who met the “proficient” achievement level on NAEP—176 out of 300 in math and 302 out of 500 in reading—also scored at or above the college-readiness benchmark scores on the SAT and the ACT. In 2009, 38 percent of 12th graders scored at or above proficient in reading; only 26 percent reached proficiency in math.

Ready for Higher Education?

NAEP achievement levels are now being aligned with benchmarks on whether students are prepared for college. Seniors who score “proficient” in reading will be considered college-ready. The college-ready benchmark for math falls between proficient and “basic.”

Source: National Center for Education Statistics

Chester E. Finn Jr., who was the chairman of NAGB when the NAEP achievement levels were first approved, said at a symposium this summer in Washington that the “proficient” level was always intended to be “aspirational,” while “ ‘basic’ was supposed to show you were literate and could make your way through the subway system.”

“Now, 23 years later, when college and career readiness is on everyone’s lips, ... lo and behold, the pretty-clear conclusion reached is NAEP ‘proficient’ comes pretty darn close to college preparedness,” said Mr. Finn, the president of the Thomas B. Fordham Institute, a Washington-based research group.

To get a more nuanced look at how students of different performance levels fared in college, the researchers tracked students by using Florida’s K-20 student longitudinal system.

Based on the Florida data, students who earned at least a 298 out of 500 in reading or 162 out of 300 in math—reading nearly at the “proficient” level and math in the “basic” range—also at least met the ACT or SAT college-placement benchmarks, had a first-year college grade point average of at least 2.67, and were placed in nonremedial courses in math and literature.

In a parallel effort to set career-readiness benchmarks within NAEP, the governing board also had studied ways to connect NAEP to readiness for work as an automotive master mechanic, computer-support specialist, heating and air-conditioning technician, licensed practical nurse, and pharmacy technician, but it was not able to draw conclusions about how performance on NAEP would relate to such careers.

For example, among NAEP’s math-framework objectives, 64 percent to 74 percent were “not evident as prerequisite” in any of the training required for the careers studied, a finding Cornelia Orr, the board’s executive director, called “quite shocking.”

“This was a very hard task, but it was very revealing,” Ms. Orr said. “We found no evidence that someone prepared for job training is academically prepared for college. That said, someone prepared for college is certainly prepared for job training.”

Writing Questioned

The governing board plans to conduct more linking studies between NAEP and the SAT and the ACT; longitudinal studies in Florida, Illinois, Massachusetts, Michigan, and Texas; and a linking study with the ACT’s Explore, an assessment for 8th graders, in Kentucky and Tennessee.

However, Achieve, a Washington-based college-readiness advocacy group, wrote in a July 30 letter to NAGB that NAGB’s college-preparedness benchmarks don’t gauge how well students are prepared for college-level writing and questions in that subject released by NAEP “do not come close to assessing [the] skill set” involved in writing based on multiple sources.

NAGB has not officially responded to the letter, but there are also moves to develop more-detailed descriptions of the skills and NAEP questions that the “college-prepared” cutoff scores represent. “The one thing I’ve been concerned about from the very beginning of this research is its applicability to real life,” said board member W. James Popham, a professor emeritus of education and information studies at the University of California, Los Angeles. “Real examples for real people would be useful.”

A version of this article appeared in the August 21, 2013 edition of Education Week as Assessment Governing Board Crafts Definition of ‘Prepared for College’


