Opinion
Assessment Commentary

When 19 Heads Are Better Than One

By Kathryn Parker Boudett, Elizabeth A. City & Richard J. Murnane — December 06, 2005
Using data effectively does not mean getting good at crunching numbers. It means getting good at working together.

How can educators use the piles of student-assessment results that land on their desks to improve learning in their schools? Two years ago, a group of school leaders from the Boston public schools and faculty members and doctoral students from Harvard University’s graduate school of education agreed to meet monthly to wrestle with this topic. (“Harvard, Boston Educators Team Up on Test-Data Book,” April 27, 2005.) We began with a somewhat vague notion of producing a book that would help teachers and administrators take the high road in responding to student-assessment results—improve instruction, rather than engage in “drill and kill.” We guessed that 19 heads would be better than one in producing such a book.

In our first year, we floundered. We were a collection of polite professionals who did our best to take time from our busy schedules to talk about using data in schools. We read each other’s musings on the topic of the day and then spent many meetings trying to outline a book that would compile our respective contributions. Each of us learned something, but by the end of the year it was not clear “we” had made much progress toward our collective goal.

In our second year, everything changed. We identified three editors, who proposed reorienting the book. We scrapped the original idea of a collection of chapters about various relevant topics. The new focus would be addressing what our target audience—we identified it as school leaders, broadly defined—needed to know and be able to do in order to use data effectively. We found a publisher, who then convinced us to adopt an extremely aggressive deadline. Perhaps most important, we created a very deliberate process for involving everyone in getting the work done.

Data Wise: A Step-by-Step Guide to Using Assessment Results to Improve Teaching and Learning is the fruit of our labors. The central message of the book is that using data effectively does not mean getting good at crunching numbers. It means getting good at working together to gain insights from student-assessment results and to use the insights to improve instruction. We put our 19 heads together to come up with a set of tools and strategies for making collaborative analysis of assessment data a catalyst for improving student learning. Reflecting on our process, we realize that many of the principles we advocate in the book are those that proved critical to making our own collaborative endeavor successful.

For many years, people have argued for increased cooperation between school practitioners and researchers. But as we learned in year one of our book project, just putting a bunch of teachers, administrators, and professors in the same room won’t guarantee progress in solving knotty educational problems.

[Illustration: Nip Rogers/BRIC Archive]

We see the lessons we learned as relevant to other collaborative efforts, including the Strategic Education Research Partnership, or SERP, a new endeavor resulting from a 2003 National Academy of Sciences report. SERP aims to usher in a new paradigm for improving education by providing a structure for researchers and practitioners to collaborate in conducting rigorous studies in authentic school settings. We boil our lessons learned down to three: focus on a concrete goal, balance democracy and dictatorship, and revel in disagreement.

Focus on a concrete goal. Once we decided to create a step-by-step guide for school leaders, our collective task became clearer. We realized that we needed to distill from our collective experience a compelling process for improvement. When we looked at the areas of expertise of each group member, we began to see who might contribute to each chapter on each phase of the process.

We wanted to offer our readers a clear process for structuring the work of improvement, but we were also determined to avoid writing a preachy “how to.” After all, the vast majority of our authors had spent years working as teachers, counselors, or administrators in schools, so we knew we had real potential to write a compelling text for practitioners. To do this, we created two case-study schools and used them throughout the book to breathe life into the improvement process. Invariably, first drafts of our chapters featured vignettes of leaders from these schools devising ingenious solutions to daunting problems. To keep true to our goal of writing a book that would really resonate with educators, we eventually revised these vignettes to illustrate the challenges, tensions, failures, and general “messiness” involved in using data in schools. We then worked to help readers see how to learn from that messiness.

Balance democracy and dictatorship. By the end of our first year, members seemed to tire of the constant appeal to the group for consensus. A few even approached those of us who had convened the meetings and implored, “Just tell us what to do next!”

So, over the summer, the three of us who were to become the editors made some major decisions about where to take the work. When we reconvened in the fall, we announced the focus for the book and a formal process for getting the work done. We assigned every member of the group to a team responsible for writing a specific chapter. We instituted a rigorous review process for each chapter: The full group would review and comment on teams’ initial chapter outlines, and then offer feedback on a chapter draft later in the year. We distributed a calendar showing due dates for the different phases of each chapter. Finally, we asked that authors entrust their revised work to us editors, who would sew the story together and ensure that the book read with one voice.

Did our colleagues mutiny at being whipped into shape? Quite the contrary. Attendance at meetings soared. Deadlines were met. We became a true community of practice like the ones we were advocating in our book, with members accountable to each other.

We discovered that points of tension were our sources of greatest learning.

Revel in disagreement. How can a book with so many authors provide a coherent story line? To make this happen, we had to devote time to really hashing out the sources of our many disagreements. Even though individuals to this day disagree on some issues, all of our authors endorse what’s in our collective book. Initially, there were times when we wondered whether we could find common ground. For example, our assessment experts argued that focusing a school discussion on individual test items was a dangerous practice. Some of our school principals explained that discussing specific items was one of their most powerful strategies for engaging faculty members in looking at assessment results. After much discussion, we settled on recommending that school leaders use individual items as catalysts for discussion, not as a basis for decisions.

We also had many intense discussions about language. For example, we initially assumed that we would make use of the popular notion that a key step in data analysis is to identify the root cause of student learning difficulties. But many in our group took issue with the term “root cause.” Is a school leader’s job to uncover the deepest causes of achievement problems, or to treat the causes over which she has most control? In the end we decided to abandon the term entirely, in favor of more precise language that focused on the work school faculties could do.

We also learned that agreement on general principles did not always map to agreement on detailed practices. For example, everyone agreed that devoting a great deal of scarce instructional time to “teaching to the test” was a bad idea. However, some of our group maintained that, given the high stakes for students, familiarizing students with the format of mandatory state tests and providing opportunities to practice test-taking skills were appropriate. On this question we also discovered that “researchers” and “practitioners” aren’t monolithic categories—quite often members of these somewhat arbitrary groups held differing views.

Keeping the focus on sources of disagreement kept the sessions lively, fruitful, and intense. We learned that adopting protocols to structure conversations helped ensure that the intense discussions generated light as well as heat. We discovered that points of tension were our sources of greatest learning, and we had to change our book outline many times over the second year to accommodate what we were learning together. We realized that tensions don’t go away until you resolve them, and we were fortunate that our community of practice was strong enough to support frank conversations that led to resolutions we all could live with.

Most problems in education today are too complicated for individuals to solve alone. But when bringing together researchers and practitioners to work on tough issues, our experience is that the collaborative venture itself needs care and attention if it is to produce results. Only then are 19 heads better than one.
