A review of state policies by researchers from the University of California, Los Angeles, raises questions about the validity of the use of home-language surveys as a step to identify students eligible for special help in learning English.
While it’s ubiquitous in schools across the country, the practice of educators’ giving a home-language survey to parents or students who are believed to speak a language other than English at home is not mandated by the federal government. Federal law does, however, require that states somehow identify students who need extra services to learn English, and many schools use such a survey to find out whether children’s English skills should be tested.
But the wording of questions on the surveys and how the questionnaires are carried out vary so much among states, and the validity of the information gathered from them is so unproved, that the researchers suggest it might be best for home-language surveys to be abandoned. They say a study should be conducted to explore whether schools should instead use a short language-screening tool for all students.
In their study, conducted with a grant from the U.S. Department of Education, Alison L. Bailey, a professor of education at UCLA, and Kimberly R. Kelly, a doctoral student in education at the university, spell out how home-language surveys can both underidentify and overidentify students who need support to learn English. The study was published this week as a “white paper” by Evaluating the Validity of English-Language-Proficiency Assessments, or EVEA, the partnership that received the study grant.
“My two main questions to the state agencies involved are: Do you know how many students are initially misidentified each year? And how quickly do you rectify the situation for students and their families?” Ms. Bailey wrote in an e-mail message to Education Week.
The UCLA researchers contend that the quality of information taken from home-language surveys is “rarely scrutinized.” In some cases, children who need special help may be missed by the survey, leaving it up to their teachers to request further testing, which might lead to delays in students’ receiving special help. At the same time, they write, sometimes students who are already proficient in English or even are native speakers of English are identified as in need of testing by the surveys.
“The bottom line should be the use of an instrument that gives us the most valid and reliable way of identifying those students in the general K-12 population who are potentially in need of English-language services,” said Ms. Bailey. “If that proves to be a short language screener given to the entire K-12 population to be able to initially identify candidate students, then absolutely, I would advocate this.”
In their paper, the researchers list the questions on surveys in a number of states with high numbers of ELLs. For example, the questions on the survey for California, which has the most ELLs of any state, are: “Which language did your child learn when he/she first began to talk?” and “Which language is most often spoken by adults in the home (parents, guardians, grandparents, or any other adults)?” Children must take an English-language-proficiency test if the response to either question is a language other than English.
States such as California that focus their questions on the first language of students might overidentify for testing children whose first language isn’t English but who have subsequently become proficient in English, the researchers say. California’s second question may also lead to overidentification, the researchers say, because it focuses on what language any adult living in the child’s household may speak, regardless of what language the child speaks.
At the same time, the researchers say that Arizona’s recent decision to change its survey to ask only one question may lead to underidentification of students who need special help to learn English.
Arizona’s survey traditionally asked parents what primary language was spoken in the home, the language most often spoken by the student, and the student’s first language. If parents responded with a language other than English to any of those questions, a student was given an English-proficiency test to see if he or she qualified for ELL services. The new survey asks parents only one question: “What is the primary language of the student?”
A separate study by two Stanford University researchers, Claude Goldenberg and Sara Rutherford Quach, not mentioned by the UCLA researchers, seems to indicate that the new version of the home-language survey has led to underidentification of ELLs. The Stanford researchers studied students in two Arizona school districts to see if the new survey is likely missing those who need English-language services. Of the 6,234 students tested by one of the districts in grades K-5 during the 2008-09 school year, the researchers calculated that 1,540 would not have been identified by the new version of the home-language survey. And of those students the new survey failed to pick up, 72 percent were found by the researchers to be less than proficient in English and in need of services.
The fact that Arizona changed its home-language survey is an issue that has been raised in a pending federal court case about Arizona’s approach to ELLs. The U.S. Supreme Court heard the case, Horne v. Flores, in April 2009 and remanded it back to the U.S. District Court in Tucson in June 2009. An evidentiary hearing for the case is scheduled to begin in the U.S. District Court on Sept. 1. (“Researchers’ ELL Data Subpoenaed in Arizona Court Case,” this issue.)
Another problem with some home-language surveys, according to the UCLA researchers, is that questions may be based on an assumption that if a child is “dominant” in English, he or she is proficient in the language, which they say may not be the case.
“Some students, while more dominant in English than another language,” they write, “may not have received extensive exposure to English nor reached a level of English proficiency sufficient for learning academic content in English.”
The paper provides basic information about home-language-survey practices in all 50 states but recommends that a more comprehensive study of state policies be conducted. Some states mandate that all school districts use a particular home-language survey, while others provide simply a “sample” form. Some states require the use of a home-language survey but don’t specify what questions should be on it. And still others don’t mandate the use of a survey at all.
To ensure more uniformity, the paper recommends that the U.S. Department of Education provide more guidance on the issue.
It also calls for states to collect data on how accurately their home-language surveys are identifying the students who are in need of English-language services.
Jamal Abedi, a professor of education at the University of California, Davis, also has been critical of how home-language surveys have been applied. He’s on the advisory board for EVEA but wasn’t an adviser for the study by Ms. Bailey and Ms. Kelly.
In a paper published in 2008 by the National Council on Measurement in Education, Mr. Abedi noted that “parents may give inconsistent information [on home-language surveys] for a variety of reasons, including concerns over equity of opportunity for their children, citizenship issues, and poor comprehension of the survey form or interview.”
Unlike the UCLA research team, though, he doesn’t suggest the home-language survey should be abandoned. “One thing that could be done is to take a look at these variations and see what could be done to improve the quality,” he said. For example, he said, he hasn’t heard of any states that have done studies to see if parents interpret the questions on surveys as intended by the writers, a standard practice for questions on standardized student assessments. Mr. Abedi said school districts should give the home-language survey to all students, not just those perceived to speak or to be exposed to a language other than English at home.
At least one state, New York, is taking a closer look at how school districts are using its home-language survey. Pedro J. Ruiz, who oversees programs for ELLs in New York, reports that the state just added new questions to its home-language survey to gather more information about students’ educational and language needs. And New York education officials have drafted a guidance document for the whole process of identifying ELLs that will be tested this school year and finalized next summer.
Robin M. Lisboa, the president of the National Council of State Title III Directors, an association for state education officials who oversee programs for ELLs, said in an e-mail that any scrutiny that might make the information garnered from home-language surveys more accurate is welcome.
She said that additional federal guidance would be important if the goal is to identify students uniformly across states for language support services. Federal officials could help state officials “by defining what the purpose of a home-language survey is or at least what they would like to see the home-language survey accomplish,” she said.
A version of this article appeared in the August 25, 2010 edition of Education Week as Researchers: ELL Surveys Are Flawed