School & District Management Opinion

Lessons Learned from Developing a ‘No Surprises’ Policy for Releasing Research Findings

By Urban Education Contributor — September 25, 2017 4 min read

This week we are hearing from the Madison Education Partnership (MEP). This post is by Dominique Bradley, MEP Program Manager.

The Madison Education Partnership previously blogged about research on Kindergarten for 4-year-olds and the benefits of research-practice partnership work.

Today’s post is written from the researcher perspective. Stay tuned: Thursday we will hear from a practitioner.

In our first year of full operation, the Madison Education Partnership (MEP) has been focused on research about the district’s four-year-old kindergarten program (read our original blog post in Education Week). We have been tackling questions about enrollment patterns, kindergarten readiness, and equitable access to programming in the Madison Metropolitan School District.

We are now coming to an exciting point in the evolution of our research: public release and dissemination of our first research brief. We prepared for the process of release early on by crafting a “no surprises” policy and developing internal guiding documents detailing the process of release. It turns out, however, that we had underestimated the subtle ways our organizational cultures and structures would shape the implementation of this policy. Ultimately, this experience made us more aware of our organizational assumptions and orientations and led to improvements in our release strategy for future research. In this post, we hope to share our story and the lessons learned from this experience.

Developing the “no surprises” policy

One of the challenges of creating and sustaining a research-practice partnership (RPP) is navigating the differing organizational cultures and structures of school districts and universities. Our partnership is unusual in that it is co-directed equally by university and district representatives. Given this shared leadership structure, we identified early on that a clear path for communication would be paramount to working together in a way that builds trust between the university and the district.

We sought to head off any issues generated by our research by developing a “no surprises” policy under which the district, university leadership, and MEP constituents (our Advisory Group and Steering Committee) would be notified of any MEP findings in advance of their release. We developed a detailed process map laying out how, and by whom, our research briefs would be created, vetted, and released. The “no surprises” rule was also included in other documents and memoranda of agreement. Specifically, the policy includes an embargoed release period that gives the district a chance to ask questions and raise concerns about our work in advance of publication, and gives MEP the opportunity to respond prior to public dissemination of findings. The policy also maintains that the district does not have the power to suppress MEP findings; any concerns it raises are considered advisory.

Implementing the “no surprises” policy: Surprising complexities

Once our report had been vetted for interpretation and clarity of language, we disseminated it in an embargoed release to district leadership and MEP constituents. Two things we had not previously been attuned to quickly became apparent: 1) Research briefs in academia differ from internal school district reports in both structure and purpose, which can cause confusion. 2) Our release plan had neglected to account for the district sub-units that would ultimately engage with our findings, in this case the program lead and the Office of Research and Program Evaluation.

As is common in academic writing, we posed a few questions in the conclusion of our first report that we thought were fairly innocuous, suggesting some directions for further inquiry for the district. Although the report is overwhelmingly positive in tone, some members of the district administration keyed in on these questions with a level of concern that surprised us. The approach we had used in our report, indicating further directions for inquiry, was an uncommon practice in the district and was in large part what had triggered the concerned response. District reports are generally written in response to very specific questions posed for particular policy reasons, and they do not commonly suggest additional directions for research. Further, the questions we asked implied additional work for other sub-units within the district, in this case the Office of Research and Program Evaluation, the office of our district co-director.

Lessons Learned

As Penuel and colleagues (2015) discuss in their article “Conceptualizing Research-Practice Partnerships as Joint Work at Boundaries,” forming a functional partnership requires that partners be willing to learn the intricacies and assumptions carried by each organization and to incorporate practices that blend familiar and unfamiliar elements of each. In developing our “no surprises” policy, we were focused on keeping communication clear and consistent.

However, we needed to go beyond that focus and think through the assumptions each partner organization would bring to our reports, in order to better identify where we needed to “blend” elements of our organizations in our research releases. In response to this experience, we modified our brief release plan to incorporate key district staff at earlier points in the process and clearly laid out the style of brief the district can expect from us going forward.

These were simple but important lessons for our team as we continue to implement this policy in a way that works for both organizational partners.


The opinions expressed in Urban Education Reform: Bridging Research and Practice are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.