Opinion

When 19 Heads Are Better Than One

By Kathryn Parker Boudett, Elizabeth A. City & Richard J. Murnane — December 06, 2005
Using data effectively does not mean getting good at crunching numbers. It means getting good at working together.

How can educators use the piles of student-assessment results that land on their desks to improve learning in their schools? Two years ago, a group of school leaders from the Boston public schools and faculty members and doctoral students from Harvard University’s graduate school of education agreed to meet monthly to wrestle with this topic. (“Harvard, Boston Educators Team Up on Test-Data Book,” April 27, 2005.) We began with a somewhat vague notion of producing a book that would help teachers and administrators take the high road in responding to student-assessment results—improve instruction, rather than engage in “drill and kill.” We guessed that 19 heads would be better than one in producing such a book.

In our first year, we floundered. We were a collection of polite professionals who did our best to take time from our busy schedules to talk about using data in schools. We read each other’s musings on the topic of the day and then spent many meetings trying to outline a book that would compile our respective contributions. Each of us learned something, but by the end of the year it was not clear “we” had made much progress toward our collective goal.

In our second year, everything changed. We identified three editors, who proposed reorienting the book. We scrapped the original idea of a collection of chapters about various relevant topics. The new focus would be addressing what our target audience—we identified it as school leaders, broadly defined—needed to know and be able to do in order to use data effectively. We found a publisher, who then convinced us to adopt an extremely aggressive deadline. Perhaps most important, we created a very deliberate process for involving everyone in getting the work done.

Data Wise: A Step-by-Step Guide to Using Assessment Results to Improve Teaching and Learning is the fruit of our labors. The central message of the book is that using data effectively does not mean getting good at crunching numbers. It means getting good at working together to gain insights from student-assessment results and to use the insights to improve instruction. We put our 19 heads together to come up with a set of tools and strategies for making collaborative analysis of assessment data a catalyst for improving student learning. Reflecting on our process, we realize that many of the principles we advocate in the book are those that proved critical to making our own collaborative endeavor successful.

For many years, people have argued for increased cooperation between school practitioners and researchers. But as we learned in year one of our book project, just putting a bunch of teachers, administrators, and professors in the same room won’t guarantee progress in solving knotty educational problems.


We see the lessons we learned as relevant to other collaborative efforts, including the Strategic Education Research Partnership, or SERP, a new endeavor resulting from a 2003 National Academy of Sciences report. SERP aims to usher in a new paradigm for improving education by providing a structure for researchers and practitioners to collaborate in conducting rigorous studies in authentic school settings. We boil our lessons learned down to three: focus on a concrete goal, balance democracy and dictatorship, and revel in disagreement.

Focus on a concrete goal. Once we decided to create a step-by-step guide for school leaders, our collective task became clearer. We realized that we needed to distill from our collective experience a compelling process for improvement. When we looked at the areas of expertise of each group member, we began to see who might contribute to the chapter on each phase of the process.

We wanted to offer our readers a clear process for structuring the work of improvement, but we were also determined to avoid writing a preachy “how to.” After all, the vast majority of our authors had spent years of their lives working as teachers, counselors, or administrators in schools, so we knew we had the real potential to write a compelling text for practitioners. To do this, we set about creating two case-study schools to use throughout the book to breathe life into the improvement process. Invariably, first drafts of our chapters featured vignettes of leaders from these schools devising ingenious solutions to daunting problems. To stay true to our goal of writing a book that would really resonate with educators, we eventually revised these vignettes to illustrate the challenges, tensions, failures, and general “messiness” involved in using data in schools. We then worked to help readers see how to learn from that messiness.

Balance democracy and dictatorship. By the end of our first year, members seemed to tire of the constant appeal to the group for consensus. A few even approached those of us who had convened the meetings and implored, “Just tell us what to do next!”

So, over the summer, the three of us who were to become the editors made some major decisions about where to take the work. When we reconvened in the fall, we announced the focus for the book and a formal process for getting the work done. We assigned every member of the group to a team responsible for writing a specific chapter. We instituted a rigorous review process for each chapter: The full group would review and comment on teams’ initial chapter outlines, and then offer feedback on a chapter draft later in the year. We distributed a calendar showing due dates for the different phases of each chapter. Finally, we asked that authors entrust their revised work to us editors, who would sew the story together and ensure that the book read with one voice.

Did our colleagues mutiny at being whipped into shape? Quite the contrary. Attendance at meetings soared. Deadlines were met. We became a true community of practice like the ones we were advocating in our book, with members accountable to each other.

We discovered that points of tension were our sources of greatest learning.

Revel in disagreement. How can a book with so many authors provide a coherent story line? To make this happen, we had to devote time to really hashing out the sources of our many disagreements. Even though individuals to this day disagree on some issues, all of our authors endorse what’s in our collective book. Initially, there were times when we wondered whether we could find common ground. For example, our assessment experts argued that focusing a school discussion on individual test items was a dangerous practice. Some of our school principals explained that discussing specific items was one of their most powerful strategies for engaging faculty members in looking at assessment results. After much discussion, we settled on recommending that school leaders use individual items as catalysts for discussion, not as a basis for decisions.

We also had many intense discussions about language. For example, we initially assumed that we would make use of the popular notion that a key step in data analysis is to identify the root cause of student learning difficulties. But many in our group took issue with the term “root cause.” Is a school leader’s job to uncover the deepest causes of achievement problems, or to treat the causes over which she has most control? In the end we decided to abandon the term entirely, in favor of more precise language that focused on the work school faculties could do.

We also learned that agreement on general principles did not always map to agreement on detailed practices. For example, everyone agreed that devoting a great deal of scarce instructional time to “teaching to the test” was a bad idea. However, some of our group maintained that, given the high stakes for students, familiarizing students with the format of mandatory state tests and providing opportunities to practice test-taking skills were appropriate. On this question we also discovered that “researchers” and “practitioners” aren’t monolithic categories—quite often members of these somewhat arbitrary groups held differing views.

Keeping the focus on sources of disagreements kept the sessions lively, fruitful, and intense. We learned that adopting protocols to structure conversations helped ensure that the intense discussions generated light as well as heat. We discovered that points of tension were our sources of greatest learning, and we had to change our book outline many times over the second year to accommodate what we were learning together. We realized that tensions don’t go away until you resolve them, and we were fortunate that our community of practice was strong enough to support frank conversations that led to resolutions we all could live with.

Most problems in education today are too complicated for individuals to solve alone. But when bringing together researchers and practitioners to work on tough issues, our experience is that the collaborative venture itself needs care and attention if it is to produce results. Only then are 19 heads better than one.
