Evidence on Effect of Culture-Based Teaching Called Thin
Many educators of language-minority students say they teach more effectively when they align their instruction with their students’ culture.
And some states have teacher-credentialing policies based on a similar assumption: California requires all teachers to be trained in understanding students’ culture, for example, and Florida mandates that all elementary school teachers receive training in cross-cultural communication.
Yet few research studies have actually examined whether culture-based instruction affects the achievement of such students.
A research review by the National Literacy Panel on Language-Minority Children and Youth, commissioned by the federal Institute of Education Sciences and published by Lawrence Erlbaum Associates in 2006, concluded, for instance, that not one study showed that culture-based education improved achievement in reading and writing.
While some studies claim that student literacy improves with culture-based education, those studies have design problems, said Claude Goldenberg, an education professor at Stanford University and a co-author of the review’s chapters on the topic.
But other researchers who study culture-based instruction say providing empirical evidence of its effectiveness is difficult. They contend that research on its effects on student learning should continue to give high priority to qualitative methods as well as experimental or quasi-experimental ones.
A number of studies show “culture-based education systematically produces greater student engagement, greater parent involvement, better attendance rates, lower dropout rates, better graduation rates, and general satisfaction of all participants, as opposed to a standard, traditional program based on mainstream models,” said Roland G. Tharp, a research professor at the University of California, Berkeley, who has studied culture-based education for decades, particularly in indigenous communities in the United States.
Mr. Tharp conducted one of the studies recognized by some experts in the field as providing empirical evidence that culture-based education improves students’ reading, but he believes a lack of such evidence shouldn’t prevent school administrators from promoting such education. “If it’s a good idea and there’s no evidence it does harm, then let them do it,” Mr. Tharp said.
Mr. Goldenberg and Diane August, a senior research scientist at the Washington-based Center for Applied Linguistics, believe the field needs more studies that are better designed to examine specific cultural accommodations and focus on student outcomes. Ms. August co-edited the National Literacy Panel’s report and also co-wrote the section on culture-based education, along with Mr. Goldenberg and Robert S. Rueda, an education professor at the University of Southern California.
“Although a significant amount of work has been done on sociocultural factors involved in teaching and learning, much of it is plagued by methodological and theoretical problems,” the authors write.
The most common methodological problem in the studies, they say, is “unsubstantiated claims.” Also, they contend, the results of some studies are misinterpreted in research literature by others.
A 1981 study of reading lessons for Hawaiian-native children, conducted by Kathryn Hu-Pai Au and Jana M. Mason, for example, is often cited to show that cultural accommodations improve reading outcomes. Mr. Goldenberg considers it to be one of the best studies of culture-based education and student achievement.
In fact, though, the study measured only student engagement and participation during reading lessons—not reading outcomes, Mr. Goldenberg and his co-authors say in the 2006 report.
In an evaluation of the same program in Hawaii that Ms. Au and Ms. Mason used for studying reading lessons, Mr. Tharp reported in 1982 that the program produced positive, if modest, reading gains, the National Literacy Panel researchers say.
In an interview last month, Mr. Tharp said his study showed that culture-based education improved students’ test scores in reading.
But the National Literacy Panel researchers contend that Mr. Tharp’s evaluation shows only that the overall program sparked a rise in reading scores; it doesn’t show to what degree cultural accommodations caused the increase.
Ms. August would like to see studies that carefully separate out cultural accommodations for examination. Examples of such accommodations are teaching students with reading materials based on their culture or using education goals set by members of their cultural community, she said.
“It’s hard to figure out: Is it the cultural accommodation or the teaching that matters?” Ms. August said in an interview.
One of the major funders for research on the education of language-minority students is the U.S. Department of Education’s Institute of Education Sciences, or IES. For example, it is paying for a $5.4 million study that compares the effectiveness of bilingual and English-only methods in teaching English-language learners.
But the IES staff could point to only one group of researchers that the institute has funded to study the effects of cultural accommodations other than language on student achievement. That study focuses on math, not literacy.
The institute supported a team led by Jerry Lipka, a professor of education in the geography department at the University of Alaska Fairbanks, to examine how a math curriculum based on the Yup’ik culture of Alaska affected math achievement among students of all cultures and backgrounds in selected Alaska schools. It also recently granted Mr. Lipka’s team $1.5 million to create and test additional culturally based materials.
The institute hasn’t seen many applications for studies on the effectiveness of culture-based education, said Bruce Friedland, a spokesman for the IES. “It would not be impossible to do a culture-based-instruction type of research project with random assignment and rigorous evidence-based research,” he said.
Still, Luis C. Moll, an education professor at the University of Arizona, in Tucson, said he suspects it would be a waste of time to apply for IES funding because the institute stresses how students perform on multiple-choice tests.
“Whether kids engage more with the material, increase their participation, whether they are willing to explore new topics, whether the teacher is able to create flexibility for innovation in the classroom—all that is missed when you focus on test scores,” he said.
Mr. Moll has helped develop a framework through which educators become familiar with “funds of knowledge” in Mexican communities, familiarity that can lead to improvements in how they teach children of Mexican heritage.
William G. Demmert Jr., an education professor at Western Washington University, in Bellingham, Wash., agrees that the federal institute’s criteria for underwriting education research aren’t a good fit for the study of culture-based education in indigenous communities.
Mr. Demmert, who is a member of the Oglala Sioux and Alaska Tlingit tribes, is participating in a quasi-experimental study of language- and culture-based instruction in four schools in indigenous communities that is financed by the Princeton-based Educational Testing Service and the private Kamehameha Schools in Hawaii.
Mr. Lipka’s work, however, shows that a researcher of culture-based education can design an experimental study that meets the IES’ funding criteria.
His two-year study of a math curriculum developed in consultation with Yup’ik elders shows significant gains in math for 2nd graders in selected Alaska public schools who used two of the curriculum modules, compared with the performance of students who didn’t use them.
Mr. Goldenberg, the Stanford professor, argues that even if researchers of culture-based education don’t want to design studies to match the IES criteria, they should improve the quality of their research by including comparisons of groups of students and focusing on learning outcomes.
“Outcomes are not limited to standardized tests,” he said. “It can be student writing. You can interview students about what they’ve learned. You can examine their work products.
“The problem with these studies [examined by the National Literacy Panel] is they didn’t look at outcomes.”
Coverage of education research is supported in part by a grant from the Spencer Foundation.
Vol. 27, Issue 17, Page 8