Assessment Opinion

When 19 Heads Are Better Than One

By Kathryn Parker Boudett, Elizabeth A. City & Richard J. Murnane — December 06, 2005
Using data effectively does not mean getting good at crunching numbers. It means getting good at working together.

How can educators use the piles of student-assessment results that land on their desks to improve learning in their schools? Two years ago, a group of school leaders from the Boston public schools and faculty members and doctoral students from Harvard University’s graduate school of education agreed to meet monthly to wrestle with this topic. (“Harvard, Boston Educators Team Up on Test-Data Book,” April 27, 2005.) We began with a somewhat vague notion of producing a book that would help teachers and administrators take the high road in responding to student-assessment results—improve instruction, rather than engage in “drill and kill.” We guessed that 19 heads would be better than one in producing such a book.

In our first year, we floundered. We were a collection of polite professionals who did our best to take time from our busy schedules to talk about using data in schools. We read each other’s musings on the topic of the day and then spent many meetings trying to outline a book that would compile our respective contributions. Each of us learned something, but by the end of the year it was not clear “we” had made much progress toward our collective goal.

In our second year, everything changed. We identified three editors, who proposed reorienting the book. We scrapped the original idea of a collection of chapters about various relevant topics. The new focus would be addressing what our target audience—we identified it as school leaders, broadly defined—needed to know and be able to do in order to use data effectively. We found a publisher, who then convinced us to adopt an extremely aggressive deadline. Perhaps most important, we created a very deliberate process for involving everyone in getting the work done.

Data Wise: A Step-by-Step Guide to Using Assessment Results to Improve Teaching and Learning is the fruit of our labors. The central message of the book is that using data effectively does not mean getting good at crunching numbers. It means getting good at working together to gain insights from student-assessment results and to use the insights to improve instruction. We put our 19 heads together to come up with a set of tools and strategies for making collaborative analysis of assessment data a catalyst for improving student learning. Reflecting on our process, we realize that many of the principles we advocate in the book are those that proved critical to making our own collaborative endeavor successful.

For many years, people have argued for increased cooperation between school practitioners and researchers. But as we learned in year one of our book project, just putting a bunch of teachers, administrators, and professors in the same room won’t guarantee progress in solving knotty educational problems.

We see the lessons we learned as relevant to other collaborative efforts, including the Strategic Education Research Partnership, or SERP, a new endeavor resulting from a 2003 National Academy of Sciences report. SERP aims to usher in a new paradigm for improving education by providing a structure for researchers and practitioners to collaborate in conducting rigorous studies in authentic school settings. We boil our lessons learned down to three: focus on a concrete goal, balance democracy and dictatorship, and revel in disagreement.

Focus on a concrete goal. Once we decided to create a step-by-step guide for school leaders, our collective task became clearer. We realized that we needed to distill from our collective experience a compelling process for improvement. When we looked at the areas of expertise of each group member, we began to see who might contribute to the chapter on each phase of the process.

We wanted to offer our readers a clear process for structuring the work of improvement, but we were also determined to avoid writing a preachy “how to.” After all, the vast majority of our authors had spent years of their lives working as teachers, counselors, or administrators in schools, so we knew we had real potential to write a compelling text for practitioners. To do this, we created two case-study schools to use throughout the book to breathe life into the improvement process. Invariably, first drafts of our chapters featured vignettes of leaders from these schools devising ingenious solutions to daunting problems. To stay true to our goal of writing a book that would really resonate with educators, we eventually revised these vignettes to illustrate the challenges, tensions, failures, and general “messiness” involved in using data in schools. We then worked to help readers see how to learn from that messiness.

Balance democracy and dictatorship. By the end of our first year, members seemed to tire of the constant appeal to the group for consensus. A few even approached those of us who had convened the meetings and implored, “Just tell us what to do next!”

So, over the summer, the three of us who were to become the editors made some major decisions about where to take the work. When we reconvened in the fall, we announced the focus for the book and a formal process for getting the work done. We assigned every member of the group to a team responsible for writing a specific chapter. We instituted a rigorous review process for each chapter: The full group would review and comment on teams’ initial chapter outlines, and then offer feedback on a chapter draft later in the year. We distributed a calendar showing due dates for the different phases of each chapter. Finally, we asked that authors entrust their revised work to us editors, who would sew the story together and ensure that the book read with one voice.

Did our colleagues mutiny at being whipped into shape? Quite the contrary. Attendance at meetings soared. Deadlines were met. We became a true community of practice like the ones we were advocating in our book, with members accountable to each other.

We discovered that points of tension were our sources of greatest learning.

Revel in disagreement. How can a book with so many authors provide a coherent story line? To make this happen, we had to devote time to really hashing out the sources of our many disagreements. Even though individuals to this day disagree on some issues, all of our authors endorse what’s in our collective book. Initially, there were times when we wondered whether we could find common ground. For example, our assessment experts argued that focusing a school discussion on individual test items was a dangerous practice. Some of our school principals explained that discussing specific items was one of their most powerful strategies for engaging faculty members in looking at assessment results. After much discussion, we settled on recommending that school leaders use individual items as catalysts for discussion, not as a basis for decisions.

We also had many intense discussions about language. For example, we initially assumed that we would make use of the popular notion that a key step in data analysis is to identify the root cause of student learning difficulties. But many in our group took issue with the term “root cause.” Is a school leader’s job to uncover the deepest causes of achievement problems, or to treat the causes over which she has most control? In the end we decided to abandon the term entirely, in favor of more precise language that focused on the work school faculties could do.

We also learned that agreement on general principles did not always map to agreement on detailed practices. For example, everyone agreed that devoting a great deal of scarce instructional time to “teaching to the test” was a bad idea. However, some of our group maintained that, given the high stakes for students, familiarizing students with the format of mandatory state tests and providing opportunities to practice test-taking skills were appropriate. On this question we also discovered that “researchers” and “practitioners” aren’t monolithic categories—quite often members of these somewhat arbitrary groups held differing views.

Keeping the focus on sources of disagreement kept the sessions lively, fruitful, and intense. We learned that adopting protocols to structure conversations helped ensure that the intense discussions generated light as well as heat. We discovered that points of tension were our sources of greatest learning, and we had to change our book outline many times over the second year to accommodate what we were learning together. We realized that tensions don’t go away until you resolve them, and we were fortunate that our community of practice was strong enough to support frank conversations that led to resolutions we all could live with.

Most problems in education today are too complicated for individuals to solve alone. But when bringing together researchers and practitioners to work on tough issues, our experience is that the collaborative venture itself needs care and attention if it is to produce results. Only then are 19 heads better than one.

