Assessment

Plans Advance to Link NAEP to College, Work Readiness

By Sean Cavanagh — December 01, 2008

Federal officials have taken a major step toward using the test known as “the nation’s report card” to judge 12th graders’ preparation for college and the job market.

The board that sets policy for that exam, the National Assessment of Educational Progress, has voted to accept a report of an expert panel with recommendations for linking student performance on the influential test with skills identified as important by college officials and employers.

Many of the details regarding how, exactly, NAEP would be used to judge students’ college and workforce preparation have yet to be worked out.

But the vote of the National Assessment Governing Board, taken Nov. 21 at the panel’s quarterly meeting here, essentially allows the board to arrange to have more detailed studies to determine how “preparedness” could be reflected on the NAEP scale. Those studies are likely to include comparing the content of NAEP with that of widely used tests of postsecondary preparation, such as the ACT and the SAT, and workplace-placement exams, such as WorkKeys. The board might also seek to collect data from two- and four-year colleges and vocational programs on their standards for placing students in regular, as opposed to remedial, courses.

The board’s goal is to have measures of student preparedness in place for 12th grade NAEP testing next year, so that information could be publicly reported in 2010.

The NAEP reports average student scores for the nation, states, some cities, and individual demographic groups. In theory, incorporating preparedness on NAEP would allow the public to look at 12th graders’ average reading or math scores, which are currently presented on a scale of 0 to 500, and judge how well equipped they are to handle the demands of college and of certain occupations.

Improved Motivation?

Governing-board officials also hope that by providing a clear assessment of students’ ability to handle postsecondary and on-the-job challenges, they can provide clearer information to parents, policymakers, and the general public about how to interpret NAEP scores, said Mary Crovo, the board’s interim executive director.

Board members have grown increasingly concerned about 12th graders’ lack of motivation to take NAEP seriously, given the distractions students face during their senior year and the competing demands placed on them by other tests and academic factors.

“The question is, how do we make the 12th grade report much more effective?” said governing-board member David P. Driscoll, the former commissioner of education in Massachusetts, after the board’s vote. Judging students’ skills for life after high school, he said, “would add meaning to our reporting, for us, for students, for their parents.”

Currently, the 12th grade NAEP is given at the national level, though 11 states have agreed to take part in an assessment that will produce individual results for those jurisdictions.

Policymakers and college officials have voiced frustration over students arriving in need of remedial coursework. Employers say too many young workers lack useful skills. But opinions differ on how exactly to define “preparedness” for college and work, in terms of knowing which academic and applied skills are most useful. (Diplomas Count, June 12, 2007.)

The panel recommended that NAEP be used as a gauge of students’ “academic preparedness for college and workplace training.” That means students have the knowledge and skills to qualify to enter a credit-bearing course on the way to a four-year degree, or to qualify for placement in a job-training program, which could include an apprenticeship, a community-college technical program, or a vocational program.

The report on preparedness was completed by a seven-member technical panel formed by the governing board. That panel was chaired by Michael W. Kirst, a professor emeritus at Stanford University.

Targeted Options

Members of the governing board also began a tentative investigation of whether it is possible to use “targeted” or “adaptive” testing on NAEP, which would mark a major departure from how the exam is administered now.

A committee of the board heard a presentation from John Mazzeo, an associate vice president of the Educational Testing Service, a Princeton, N.J.-based nonprofit research and testing organization. Targeted or adaptive testing, sometimes called tailored testing, involves giving students of different ability levels exams that are written to different levels of difficulty. Currently, NAEP gives all students within a testing population exams with the same demands.

Board members are interested in whether targeted testing could help them reduce the disparities among the numbers of students with disabilities or with limited English skills whom states and cities choose to exclude from testing or provide with special accommodations. Critics have said those disparities undermine the value of NAEP scores. (“States Struggle to Meet Achievement Standards for ELLs,” July 16, 2008.)

Targeted testing could potentially allow those students to more fully participate in the NAEP. One targeted-testing model, though a potentially costly one, would involve giving NAEP by computer, Mr. Mazzeo told the committee. Students who answered particular questions correctly would move on to more difficult ones, or, if they tripped up, to easier questions.

By tailoring the test in that way, the thinking goes, administrators can judge struggling students’ academic skills more precisely, as opposed to simply receiving a string of wrong answers. For that reason, targeted testing might help provide federal officials with more detailed information about the performance of students scoring at the lowest levels.

For instance, when students from Puerto Rico took the 2003 and 2005 NAEP in math, their scores were so low that federal officials had difficulty interpreting the results.
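The tailoring logic described above, in which a correct answer routes a student to a harder item and a miss routes to an easier one, can be sketched in a few lines. This is an illustration only, not NAEP’s or ETS’s actual algorithm; the item pool, difficulty levels, fixed test length, and scoring are invented for the example.

```python
def run_adaptive_test(items_by_level, answers_correctly, start_level=2, length=5):
    """Administer a fixed-length adaptive test.

    items_by_level: dict mapping difficulty level -> list of item ids
    answers_correctly: function(item_id) -> bool (simulates the student)
    Returns a list of (item, level, correct) tuples, one per question given.
    """
    level = start_level
    min_level, max_level = min(items_by_level), max(items_by_level)
    administered = []
    for _ in range(length):
        item = items_by_level[level].pop(0)      # next unused item at this level
        correct = answers_correctly(item)
        administered.append((item, level, correct))
        # Move up after a correct answer, down after a miss, within bounds.
        level = min(max_level, level + 1) if correct else max(min_level, level - 1)
    return administered
```

Because a struggling student is steered toward easier items rather than repeatedly missing hard ones, each response carries information about where the student’s ability actually lies, which is the precision gain the paragraph above describes.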

The board is already considering other options for dealing with exclusions and accommodations, such as setting national rules on them; using “full population estimates” to adjust for exclusions; and attaching “cautionary flags” to scores if exclusions rose above a specified level.

A version of this article appeared in the December 03, 2008 edition of Education Week as Plans Advance to Link NAEP to College, Work Readiness
