School & District Management Opinion

Making Sense of School Improvement Program Evaluations (II): The Case of TEEM

By Marc Dean Millot — January 05, 2008 3 min read

(Readers, please note: The December 20 posting generated a great deal of email. A comment worth making is one worth posting on the blog. Emails to me that are not prefaced with “not for publication” are subject to posting.)

On December 20 I posted a piece on Edvance’s review of the Texas Early Education Model. The bottom line of that work, which covered only the first two years of a four-year effort, was equivocal:

There was considerable variation both between and within communities with regards to student performance and teacher outcomes. For about half of the communities, students in the treatment groups (with TEEM) improved more than students in the control groups (without TEEM), and for the other half of the communities students in the control groups improved more than the students in the treatment groups on the student outcome measures. TEEM did lead to overall improvement for teachers, although there was considerable variation, with teachers in both control and treatment groups obtaining both positive and negative difference scores on the teacher outcome measure.

Staci Hupp of the Dallas Morning News translated this into “no proof that most children fared better in TEEM than in conventional preschool programs.” How should policymakers and taxpayers read the results? Like Hupp’s headline - “Landmark preschool program isn’t paying off”? And how should we think about school improvement program evaluation?
Many found the evaluation’s findings discouraging. Quite a few edbizbuzz readers despise the program and the provider - and let other readers know it.

As someone with experience in the large-scale evaluation of education programs, I found this part of the Edvance report intriguing:

“For about half of the communities, students in the treatment groups (with TEEM) improved more than students in the control groups (without TEEM), and for the other half of the communities students in the control groups improved more than the students in the treatment groups on the student outcome measures.”

What was different about the two groups of communities? The Edvance evaluation tells us nothing about this. But we know from other research (for example, see here, here and here) that outcomes relate to the quality of implementation and implementation relates to the quality of teacher and agency support. This also relates to improvements for teachers - it’s quite unreasonable to expect teachers who do not buy into a program to improve by measures designed by that program. If the communities with superior performance had higher levels of program implementation and higher levels of support, it would not be accurate to imply that the program wasn’t working. However, we might infer that the program is only likely to work where it’s wanted, so the idea that it should become a statewide preschool strategy is flawed.

The advocates of TEEM are probably shooting themselves in the foot by pushing for statewide implementation, because they are almost certainly assuring mediocre results “on average.” But opponents are equally shortsighted, because it’s quite likely that teachers and district administrators who share a belief in TEEM’s efficacy will use it to the benefit of higher student performance.

There’s nothing overly complicated about this logic.

If you really believe in a diet program and find it fits your lifestyle, you are more likely to use it, and so lose weight. Maybe there’s a plan out there that would allow you to lose even more weight, but if you don’t like it you won’t use it. And if you don’t use it, you won’t lose weight.

School improvement is no different. The products and services are not pills; they are programs. If teachers don’t like them, if administrators won’t provide the support, their benefits are purely theoretical. Providers who want to demonstrate high levels of effectiveness should not be eagerly accepting clients who will merely impose their programs on teaching staffs. District administrators who think they can obtain advertised results by imposing a program on teachers are fools. Teachers who don’t protest the imposition of programs they will not implement faithfully are setting themselves up for failure.

It would be nice if more research would focus on this problem, because it lies at the core of program efficacy.

Related Tags:
Research Opinion

The opinions expressed in edbizbuzz are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.
