
Making Sense of School Improvement Program Evaluations (II): The Case of TEEM

By Marc Dean Millot — January 05, 2008 3 min read

(Readers, please note: The December 20 posting generated a great deal of email. A comment worth making is one worth posting on the blog. Emails to me that are not prefaced with “not for publication” are subject to posting.)

On December 20 I posted a piece on Edvance’s review of the Texas Early Education Model. The bottom line of that work, which covered only the first two years of a four-year effort, was equivocal:

There was considerable variation both between and within communities with regards to student performance and teacher outcomes. For about half of the communities, students in the treatment groups (with TEEM) improved more than students in the control groups (without TEEM), and for the other half of the communities students in the control groups improved more than the students in the treatment groups on the student outcome measures. TEEM did lead to overall improvement for teachers, although there was considerable variation, with teachers in both control and treatment groups obtaining both positive and negative difference scores on the teacher outcome measure.

Staci Hupp of the Dallas Morning News translated this into “no proof that most children fared better in TEEM than in conventional preschool programs.” How should policymakers and taxpayers read the results? Like Hupp’s headline - “Landmark preschool program isn’t paying off”? And how should we think about school improvement program evaluation?

Many found the evaluation’s findings discouraging. Quite a few edbizbuzz readers despise the program and the provider - and let other readers know.

As someone with experience in the evaluation of education programs on a large scale, I found this part of the Edvance report intriguing:

“For about half of the communities, students in the treatment groups (with TEEM) improved more than students in the control groups (without TEEM), and for the other half of the communities students in the control groups improved more than the students in the treatment groups on the student outcome measures.”

What was different about the two groups of communities? The Edvance evaluation tells us nothing about this. But we know from other research (for example, see here, here and here) that outcomes relate to the quality of implementation, and implementation relates to the quality of teacher and agency support. The same holds for teacher outcomes - it’s quite unreasonable to expect teachers who do not buy into a program to improve on measures designed by that program. If the communities with superior performance had higher levels of program implementation and higher levels of support, it would not be accurate to imply that the program wasn’t working. However, we might infer that the program is only likely to work where it’s wanted, so the idea that it should become a statewide preschool strategy is flawed.

The advocates of TEEM are probably shooting themselves in the foot by pushing for statewide implementation, because they are almost certainly assuring mediocre results “on average.” But opponents are equally shortsighted, because it’s quite likely that teachers and district administrators who share a belief in TEEM’s efficacy will use it to the benefit of student performance.
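
To make the arithmetic behind “mediocre on average” concrete, here is a back-of-the-envelope sketch in Python. The site counts and effect sizes are entirely made up - none of these figures come from the Edvance report - but they show how real gains in willing communities and losses in reluctant ones can wash out to a near-zero statewide average:

    # Hypothetical site-level effects, for illustration only (not Edvance data).
    # Half the communities want the program and implement it well; the other
    # half have it imposed on them and implement it poorly.
    high_buy_in_sites = [4.0, 5.0, 6.0]    # hypothetical gains over control, in points
    low_buy_in_sites = [-3.0, -4.0, -5.0]  # hypothetical losses relative to control

    def mean(xs):
        return sum(xs) / len(xs)

    print(f"Where the program is wanted:  {mean(high_buy_in_sites):+.1f}")
    print(f"Where the program is imposed: {mean(low_buy_in_sites):+.1f}")
    print(f"Statewide average:            {mean(high_buy_in_sites + low_buy_in_sites):+.1f}")

The willing sites average +5.0, the reluctant sites -4.0, and the statewide average comes out at roughly +0.5 - close enough to zero to generate a “program isn’t paying off” headline, even though the program worked well wherever it was wanted.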

There’s nothing overly complicated about this logic.

If you really believe in a diet program and find it fits your lifestyle, you are more likely to use it, and so lose weight. Maybe there’s a plan out there that will allow you to lose even more weight, but if you don’t like it you won’t use it. And if you don’t use it, you won’t lose weight.

School improvement is no different. The products and services are not pills; they are programs. If teachers don’t like them, and if administrators won’t provide the support, their benefits are purely theoretical. Providers who want to demonstrate high levels of effectiveness should not eagerly accept clients who will merely impose their programs on teaching staffs. District administrators who think they can obtain advertised results by imposing a program on teachers are fools. Teachers who don’t protest the imposition of programs they will not implement faithfully are setting themselves up for failure.

It would be nice if more research would focus on this problem, because it lies at the core of program efficacy.

The opinions expressed in edbizbuzz are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.
