Assessment

Harvard, Boston Educators Team Up on Test-Data Book

By Lynn Olson — April 26, 2005

In an unusual collaboration, faculty members and students from Harvard University’s graduate school of education have teamed up with educators from the Boston school system to write a book on how to use data to improve instruction.

Data Wise: A Step-by-Step Guide to Using Assessment Results to Improve Teaching and Learning will be published by the Harvard Education Press in November. All royalties from the book will go to support the education school’s work with the 58,300-student Boston public schools. The book grew out of a yearlong workshop designed to help the district’s teachers and administrators learn ways of making more productive use of student test results and other data.

Experts say the partnership between researchers and educators behind the book could help bridge the gaps between testing and instruction.

“This can create a balance between the technical limitations of assessments, and the real, very specific needs of classroom teachers and administrators,” said Stanley Rabinowitz, the director of assessment and standards development services for WestEd, a federally financed regional educational laboratory based in San Francisco. He noted that writing by researchers alone often fails to address teachers’ needs, such as how to prevent overgeneralizing student performance from a few test items.

Yearlong Workshop

In the 2001-02 school year, Harvard economist Richard J. Murnane took a year off from his university teaching job to work with the Boston district. (“In the Trenches,” October 16, 2002.) During that time, Mr. Murnane helped the administration and the Boston Plan for Excellence, a local foundation, establish MyBPS, intranet software that gives principals and teachers access to state and other testing data for their own students and schools, and lets them perform basic analyses.

To help educators in schools actually examine the data to improve instruction, Mr. Murnane developed a yearlong workshop designed to bridge the worlds of academia and practice. In that graduate workshop—now led by faculty member Kathryn Parker Boudett and doctoral students Elizabeth City and Liane Moody—participants are assigned to school-based teams that include Boston educators and Harvard graduate students.

Harvard doctoral student Elizabeth City and Gerardo J. Martinez, principal of Mary E. Curley Middle School, share ideas at the Boston school about how educators can use assessment results more effectively to improve instruction.

During the 2003-04 school year, Mr. Murnane invited a contingent of Harvard faculty members, graduate students, and Boston educators—who all have a deep interest in how to make constructive use of student-assessment results—to meet regularly with the idea of writing a book for a broader audience.

“U.S. students are tested a great deal,” Mr. Murnane said. “The results, in principle, could provide information useful for improving teaching and learning. However, this happens in very few schools. We wanted to write a book aimed at helping school-based educators to learn to use student-assessment results to improve instruction and to enhance student learning.”

In addition to two other Harvard professors—Daniel M. Koretz, an assessment expert, and John B. Willett, a statistician—the group includes five Boston educators and 10 doctoral students, all of whom previously worked in schools. Some of the participants had also been involved in the yearlong workshop.

“In teaching the course, we realized there was not a book that did what we wanted to do, in terms of teaching people how to use data,” said Ms. City, who is one of the book’s editors, along with Mr. Murnane and Ms. Boudett. “We were forever creating our own lessons that we thought were more appropriate for doing the things we were trying to do.”

Improvement Cycle

At the start of the school year, educators often describe feeling overwhelmed by the amount of data and unsure where to dive in, Ms. Boudett said. So the workshop and the forthcoming book are structured around an “improvement cycle,” with tools to use at each step along the way.

While the process might look different in different schools, the cycle helps educators: identify patterns in data; choose key issues to investigate; dig deeper into multiple data sources; agree on a problem and explore its causes; examine current classroom practices; draw up a plan to change those practices; carry out that plan; and then assess the results of those actions.

To keep the discussion grounded in reality and help connect the different chapters, the book also is built around vignettes that are composites of the authors’ experiences working in real Boston elementary, middle, and high schools. Each chapter is typically co-written by some combination of a doctoral student, a Harvard faculty member, and a district principal.

Once a month, the group writing the book meets to review draft chapters and hammer out the members’ collective wisdom. At this month’s meeting, the group critiqued two draft chapters on how to devise an action plan and carry it out. Two doctoral students, who had written one of the drafts, were struggling with how such plans actually are made in schools and what they look like.

“For this chapter, I have difficulty—I think because I’ve never had this job in a school,” Mr. Murnane said. Turning to the three principals at the meeting—Gerardo J. Martinez from Mary E. Curley Middle School, Mary Russo from Richard J. Murphy Elementary School, and Jane E. King from John W. McCormack Middle School—he asked, “Can you give us some guidance about what you do?”

“There’s the formal plan, and there’s figuring out how to get it up off the paper,” explained Ms. Russo, pulling out a real example that she had brought from her own school and passing it around the room.

After the principals discussed the process they go through at their own schools, the group decided it would be helpful if the book included sample plans for two of the fictional schools described in the vignettes—Clark Elementary and Franklin High—to make such experiences concrete.

In an interview later in the evening, Ms. Russo said one advantage to the collaboration is that much of what educators do is “tacit” knowledge. “We do it,” she said. “[But] we often don’t articulate what we’re doing—we’re so involved in the doing, the action.”

‘Serious Concerns’

Some of the book’s authors are not necessarily fans of standardized testing.

Sarah E. Fiarman, who taught for nearly eight years in Cambridge, Mass., elementary schools, said: “I came into this group right out of having been a classroom teacher, and I came in rabidly opposed to standardized tests because I thought they did not provide a complete picture of students.”

“I still have serious concerns about a lot of the ways standardized tests are used,” she added, “but I’m gaining new appreciation for ways test data can be used that can be helpful.”

Similarly, doctoral student Jennifer Price had been an administrator in the 1,200-student Lincoln-Sudbury school district, which is in an affluent suburb of Boston and has been no fan of the state testing program: the Massachusetts Comprehensive Assessment System, or MCAS.

“The Lincoln-Sudbury approach was to ignore MCAS as much as possible,” she said.

While Ms. Price said she hasn’t “been converted,” the experience has given her new respect for Boston school administrators and a deeper appreciation for how to use assessment information effectively to make changes.

Finding the language that reflects both the opportunities and the dangers of using test data in schools continues to be one of the group’s greatest challenges, said Mr. Murnane, the Juliana W. and William Foss Thompson professor of education and society. Two of the implicit assumptions underlying the book are that the tests themselves measure skills important for students to learn, he said, and that the goal is to really improve teaching and learning, not just to boost scores.

At its April meeting, for example, the group decided to add two new book chapters—on analyzing student work beyond test scores, and on investigating teachers’ classroom practices—as a way to delve deeper into data.

“We’re adding more things that are not just reliant on state annual tests,” said Ms. City, who noted that the book is constantly evolving as the group learns more. “We’re still learning a tremendous amount simultaneous to the writing of it.”


