Today’s post is the practitioner perspective on Monday’s post: Lessons Learned from Developing a ‘No Surprises’ Policy for Releasing Research Findings.
The Madison Education Partnership (MEP) — a research-practice partnership between the Madison Metropolitan School District (MMSD) and the Wisconsin Center for Education Research at the University of Wisconsin-Madison — has had a busy year. We have put organizational structures in place, taken on new research both internally and through supporting faculty projects, and engaged our two organizations and greater community in a discussion of early childhood options and school readiness. Now, we have reached an exciting time: researchers have findings to release and district staff can’t wait to learn more. We have articulated a clear “no surprises” policy and created a process map for dissemination. Should be easy, right?
In my capacity at the school district, I have seen how research dissemination works there. Over the years, I have learned how to get findings from my office, the Research & Program Evaluation Office (RPEO), to district leadership, schools, and the community. I know which decision-makers must see the work, how much ownership they need to have (from just wanting an FYI to editing text), and the various checkpoints that must be reached in a particular order to make it to the end. Navigating that process has become second nature to the RPEO team. As a co-director of the partnership, I figured the process would be that much easier for our MEP team. We have someone on the inside (me), who knows the ins and outs of getting things done in our district. No surprises? No problem — we have this covered.
And then reality set in. Research-practice partnerships are sticky business. Every time you think you have reached clarity in a process, you realize there are new challenges, unexpected hurdles, and, yes, even surprises. I have learned how “no surprises” can mean very different things to the university than to the district. Two recent situations brought this to light.
In the first situation, our MEP research team had just completed a brief and sent it to the district on embargoed release. According to our process map, this means a two-week period during which representatives from the district and the university have a chance to review the final brief, ask questions, and prepare for potential press. The MEP team assumed this would be smooth sailing. After all, our team had vetted the brief multiple times with our Steering Committee, and the findings were fairly non-controversial and positive for MMSD. However, it wasn’t quite that simple. During the embargoed release, district leadership identified the questions posed for further research at the end of the brief (in this case, which kids left the district after 4K) as ones to which MMSD needed answers prior to release. As an independent organization, MEP was under no obligation to provide those answers. But the need for them meant an emergency request of RPEO (no simple feat, given the workload there) and the district initially choosing not to sign off on the release. By using a standard writing convention in academic literature — posing questions you do not intend to answer — the partnership had accidentally stumbled into territory where it was creating unexpected work for others on quick timelines. As the person who had to head into RPEO the next day and tell my colleagues that, I was not popular, and neither was MEP.
The second situation came when a faculty member, whose research was supported by MEP, let the partnership know they planned to present preliminary findings to a city committee. MEP staff were initially thrilled: our supported project would get more publicity, and the faculty member would have a chance to get feedback on the potential impact of the work. It seemed like a win for everyone... until we considered what the spirit of “no surprises” means. For the district, it means they can count on the partnership to bring research findings to them first. For MMSD, a difficult part of working with external researchers has traditionally been that they talk about the district more than they talk with it. In creating MEP, we wanted to change that narrative by having conversations that always begin with MMSD and move outward from there. For this researcher to talk with the city first — even if only to highlight preliminary findings — meant falling back into the same old patterns before we had a chance to establish new norms.
How did we resolve these challenges? In the first, we crafted a new path for releasing briefs. We engaged in a tough internal conversation about what we needed to change (such as creating an early review for RPEO, so they could plan for the work to come) and what we would not change (such as not allowing the district to delay release because of future research questions). In the second, we looped back with our supported researchers to request that they first meet with district program staff to review findings, even if that meant building in additional time and presenting incomplete work. We plan to codify those expectations explicitly for all researchers working under MEP, to ensure that they know what’s to come and to establish this new norm.
“No surprises” policies can seem pretty straightforward: when you finish something, send it to all parties with time to review prior to release — nothing too complicated about that. But what we have learned is that the spirit of “no surprises” goes far beyond alerting people prior to a research release and not blocking releases when bad news comes. At its core, the policy embodies respecting each other’s processes, anticipating the impact the work can have on other parts of the organization, and fostering an environment where partners are valued first and foremost. We continue to learn — in surprising ways — how to make that a reality.
The opinions expressed in Urban Education Reform: Bridging Research and Practice are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.