Mich. Choice Story Needs Better Context
To the Editor:
Your front-page story on the Inkster, Mich., experience with charter schools (“Facing an Uncertain Future Under Choice,” Sept. 29, 1999) could have been far more informative, had you included information from a publication by the Center for Education Reform.
Part three of the center’s ongoing series, Charter Schools: A Progress Report, is titled “The Ripple Effect, A Cresting Wave.” This research demonstrates the impact charters are having on traditional schools and districts. Drawing on it would have put the situation Inkster faces as a result of its loss of students in better context.
The paper illustrates the positive impact charter schools have had on their host districts and neighboring traditional public schools nationwide. Curricula have been modified, public-private partnerships have been formed, and new business practices have been put in place. Public schools around the country have been taking notice and changing the way they conduct business. They do so with the clear understanding that if they don’t act, more and more parents will make the same decision hundreds of Inkster parents have made and leave for schools with a plan geared for success.
While you quote a study by Eric Rofes on the impact of charters on traditional schools, the CER’s paper is more comprehensive and builds on Mr. Rofes’ earlier work.
Inkster, Mich., has been losing students steadily for more than 30 years. In the 1980s, enrollment was down more than 50 percent from 1970 levels. The emergence of charter schools has done nothing more than illustrate that education can be and is being done better. Inkster was caught flat-footed. Only now have school officials there realized the need to prove themselves and compete to show they can do the job for the city’s children. Can you think of anything better to fight over?
Center for Education Reform
Texans Challenge ‘Exemplary’ Program
To the Editor:
Your recent article about the U.S. Department of Education’s list of “Exemplary and Promising Mathematics Programs” (“Education Department Is Set To Release Its List of Recommended Math Programs,” Oct. 6, 1999) includes Connected Mathematics, developed with National Science Foundation funding at the University of Michigan for grades 6-8.
I am writing on behalf of the parents of over 600 students in the Plano (Texas) Independent School District who have signed petitions requesting an alternative to this math program, which is currently being used in our district. We could not more vehemently disagree with the “exemplary” label and strongly oppose our tax dollars’ being spent to endorse a program we feel seriously fails to prepare our children to continue with higher math.
Parents in Plano have actively opposed the use of this program in our district for over a year and a half. Despite widespread parent opposition, the program was adopted for use in all of our middle schools, and the district refuses to provide an alternative despite the petitions asking it to do so. Six Plano parents recently filed a federal lawsuit against the district over this issue.
Considering the controversy it has generated in our city as well as in Palo Alto, Calif.; Okemos and Bloomfield Hills, Mich.; Hixson, Tenn.; and Montgomery County, Md., to name just a few, Connected Mathematics’ “exemplary” rating in Washington seems ludicrous.
When our middle school math teachers were polled on the program’s adoption last year, the vote was not exactly an overwhelming endorsement. And 23 of our middle schools’ 94 math teachers did not return to teach this year. Can this almost 25 percent turnover rate be attributed solely to Connected Math? Probably not, but it was certainly a contributing factor.
In a time of such dramatic shortages of qualified math and science teachers, it seems counterproductive to us that a program of dubious academic merit was adopted. During the three years that Connected Math was piloted in four of our middle schools, there was no significant improvement or decline on the only consistently administered test in our district, the Texas Assessment of Academic Skills.
We don’t think that the Education Department should be spending federal money to evaluate or promote “fuzzy” math for districts around the country. Neither do we think that the department’s agenda does anything to promote the ideals of parental involvement and local control--or, for that matter, the learning of real math.
Kudos for Ed. Dept. Stance on Class Size
To the Editor:
Regarding the Commentary by Casey J. Lartigue Jr. (“Politicizing Class Size,” Sept. 29, 1999), I would note the following: University of Rochester economist Eric Hanushek’s “277 separate published studies on the effect of teacher-pupil ratios and class-size averages” are, in fact, 277 effect sizes drawn from fewer than 60 separate studies; Mr. Hanushek has not made public how many effect sizes were drawn from each study; Mr. Hanushek’s summaries do not include Tennessee’s Project STAR or Wisconsin’s Project SAGE, two of the best and largest studies of the topic; and most or all of Mr. Hanushek’s studies are studies of teacher-pupil ratios for school districts--a statistic shown repeatedly to be quite different from the number of pupils in a classroom.
Yes, indeed, the U.S. Department of Education changed its position on this matter--in light of some of the most solid data ever collected in education research. They are to be commended.
‘Standardistos’: No Convert Here
To the Editor:
In “Confessions of a ‘Standardisto,’” Oct. 6, 1999, Scott Thompson writes the following:
“As the University of Pittsburgh researcher Lauren Resnick has explained, whereas the old approach holds time as the constant with results varying, a standards-based approach holds standards as the constant and time as the variable. In other words, all students are expected to meet challenging standards (a significant departure from the past, when whole categories of students were ‘tracked’ into programs with dumbed-down curricula), but when it comes to pacing, scheduling, and support inside and outside the classroom, it all becomes highly individualized (also a significant departure from the past).”
In response to Ms. Resnick’s “standards constant/time variable” theoretical approach, how long should a child be required to spend on each standard or group of standards before he is allowed to move on? What if the child becomes frustrated by a particular standard or group of standards and his level of percent-mastery starts to decrease instead of increase? If time is always variable and the standards are always constant, is the time spent on mastering the standards ever too long? Will this type of reform have the effect of limiting exposure for lower-achieving students, thus increasing the achievement gap? Furthermore, isn’t the ability to effectively “individualize instruction” highly questionable?
I have yet to find any evidence that the schools of old with desks in rows were based on the “factory model.” Yet it has been established that this new approach, “standards-based education,” is based on an industrial model, one in which our children are viewed as “products” coming off the public school assembly line to be measured for “quality” through state assessments. A former state board of education member from Vermont put it this way: “The schools produce the products for business and industry.”
I am a Vermont parent who has actively researched and written about education reform for over five years. My concern and allegiance rest solely with the children and their parents. Although Susan Ohanian and I disagree in some ways (she supports a progressive approach, I support a traditional approach), I stand firmly with her in opposition to “standards-based education” and all those education reform opportunists whose primary motivation is making big bucks off their half-baked theories, rather than serving children.