
Making Sense of School Improvement Program Evaluations (II): The Case of TEEM

By Marc Dean Millot — January 05, 2008 3 min read

(Readers, please note: The December 20 posting generated a great deal of email. A comment worth making is one that should be posted on the blog. Emails to me that are not prefaced with “not for publication” are subject to posting.)

On December 20 I posted a piece on Edvance’s review of the Texas Early Education Model (TEEM). The bottom line of that work, which covered only the first two years of a four-year effort, was equivocal:

There was considerable variation both between and within communities with regards to student performance and teacher outcomes. For about half of the communities, students in the treatment groups (with TEEM) improved more than students in the control groups (without TEEM), and for the other half of the communities students in the control groups improved more than the students in the treatment groups on the student outcome measures. TEEM did lead to overall improvement for teachers, although there was considerable variation, with teachers in both control and treatment groups obtaining both positive and negative difference scores on the teacher outcome measure.

Staci Hupp of the Dallas Morning News translated this into “no proof that most children fared better in TEEM than in conventional preschool programs.” How should policymakers and taxpayers read the results? Like Hupp’s headline - “Landmark preschool program isn’t paying off”? And how should we think about school improvement program evaluation?

Many found the evaluation’s findings discouraging. Quite a few edbizbuzz readers despise the program and the provider - and let other readers know.

As someone with experience in the evaluation of education programs on a large scale, I found this part of the Edvance report intriguing:

“For about half of the communities, students in the treatment groups (with TEEM) improved more than students in the control groups (without TEEM), and for the other half of the communities students in the control groups improved more than the students in the treatment groups on the student outcome measures.”

What was different about the two groups of communities? The Edvance evaluation tells us nothing about this. But we know from other research (for example, see here, here and here) that outcomes relate to the quality of implementation, and implementation relates to the quality of teacher and agency support. This also bears on the improvements for teachers - it’s quite unreasonable to expect teachers who do not buy into a program to improve on measures designed by that program. If the communities with superior performance had higher levels of program implementation and higher levels of support, it would not be accurate to imply that the program wasn’t working. However, we might infer that the program is only likely to work where it’s wanted, so the idea that it should become a statewide preschool strategy is flawed.

The advocates of TEEM are probably shooting themselves in the foot by pushing for statewide implementation, because they are almost certainly assuring mediocre results “on average.” But opponents are equally shortsighted, because it’s quite likely that teachers and district administrators who share a belief in TEEM’s efficacy will use it to the benefit of higher student performance.

There’s nothing overly complicated about this logic.

If you really believe in a diet program and find it fits your lifestyle, you are more likely to use it, and so lose weight. Maybe there’s a plan out there that will allow you to lose even more weight, but if you don’t like it you won’t use it. And if you don’t use it, you won’t lose weight.

School improvement is no different. The products and services are not pills; they are programs. If teachers don’t like them, if administrators won’t provide the support, their benefits are purely theoretical. Providers who want to demonstrate high levels of effectiveness should not be eagerly accepting clients who will merely impose their programs on teaching staffs. District administrators who think they can obtain advertised results by imposing a program on teachers are fools. Teachers who don’t protest the imposition of programs they will not implement faithfully are setting themselves up for failure.

It would be nice if more research would focus on this problem, because it lies at the core of program efficacy.


The opinions expressed in edbizbuzz are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.
