Weigh Proficiency, Assess Content
Students who are still working to master the English language are being held to the same reading and math proficiency targets as native English-speakers.
Susan B. Martin, who directs the English-Language Learning Program for Lewiston, Maine’s school district, thinks the federal No Child Left Behind Act is a good idea. Its application, however—especially when it comes to the testing requirements for English-learners—is another matter, she says.
“The original idea behind NCLB is that we should treat all kids the same—all kids are entitled to the same set of standards,” says Martin, whose district of 5,000 students includes hundreds of African refugees. “Where it’s gone astray is assuming that all kids, including ELL kids, can meet those standards in the same amount of time.”
The challenge for Lewiston, and for thousands of districts nationwide, is to satisfy two very different mandates of the federal law: assessing how well non-English-speakers are learning the language, while holding them to the same reading and math proficiency targets required of native English-speakers.
For Lewiston, a city of about 35,000 some 45 minutes north of Portland, that means working with refugees from Somalia and other African countries whose families began settling there in early 2001. About 17 percent of the district’s students are now English-learners, and many of them, Martin says, had no experience with formal schooling in any language.
And yet, under NCLB, English-language learners must be included in regular state mathematics and reading tests designed for native English-speakers. English-learners must also be tested annually on an English-proficiency assessment.
“Our [limited-English-proficient] kids who have been in school less than three years may have made great progress, but they aren’t going to be at grade level,” Martin says. “We’re having 20-year-olds walking into high school and wanting to go to school, and we welcomed them, but they’re not going to graduate with a high school diploma.”
Largely because of how their English-learners perform, schools in Lewiston with large immigrant populations have been labeled as unable to meet achievement goals under NCLB, says Martin.
“If I went to Africa and studied for two years and I had to take the SAT [in an African language],” she says, “I probably wouldn’t do very well, and it wouldn’t be because the school was failing or because I wasn’t very smart.”
Over the seven years since NCLB was signed into law, a growing chorus of educators and researchers has been pointing out what they see as a sometimes-glaring contradiction: requiring students still learning the basics of English to demonstrate their mastery of content on tests usually written in English.
Jamal Abedi, an education professor at the University of California, Davis, and an expert on the testing of English-learners, calls the tension between demonstrating English-language proficiency and demonstrating proficiency in reading, math, and science “one of the most fundamental issues for English-language learners.”
“If they’re not at the level of proficiency to understand assessment questions, how would you expect [the assessments] to give valid outcomes for those kids?” he asks.
Under the U.S. Department of Education’s interpretation of the law, recently arrived students whose English-proficiency tests grant them ELL status may be exempt from one administration of their state’s annual reading/language arts assessment, and their math scores do not need to be included for adequate-yearly-progress purposes for one year. But many educators see that grace period as woefully inadequate.
“You can’t wait for them to master language before you teach content,” says Jessica Loose, the lead English-as-a-second-language teacher for the Dare County school district in North Carolina, which has about 300 students identified as English-learners among its 4,700 students. “NCLB is said to be research-based, but research shows it takes five to seven years for someone to learn a language. We’re in the unfortunate position of doing the best we can.”
A recent study finds that all 50 states and the District of Columbia are providing assessment accommodations to English-language learners. Such accommodations, which are intended to reduce language-based barriers to demonstrating content knowledge, include direct linguistic support in English or a student’s native language, as well as indirect linguistic support in the form of extra time to complete a test. The majority of states offer all three forms of accommodations to ELL students.
Not that there aren’t good tools, Loose says, citing the Sheltered Instruction Observation Protocol model that her district uses to simultaneously teach ELL students academic content and English. That model was developed by California State University-Long Beach researchers Jana Echevarria and Mary Ellen Vogt with Deborah J. Short, a researcher at the Washington-based Center for Applied Linguistics.
Loose, who has taught in the district for seven years, has added some inventive instruction of her own. In addition to the 5th grade math and 1st grade language arts classes she co-teaches at Manteo Elementary School in Manteo, N.C., she leads a one-hour pullout class for 1st graders that she calls ESL Math Literacy.
“I take it as a given that the [ELL] objective of NCLB—100 percent proficiency—is close to impossible to meet, and then I put all my energy and all my work into mastering content and language objectives as best we can,” she says.
Part of the problem, Abedi says, lies in the test materials typically available.
“These assessments are mostly field-tested for mainstream students,” he notes. And some accommodations—special conditions or allowances permitted in an effort to level the assessment playing field for ELL students—“don’t really help,” he adds, because they were developed for students with disabilities, not for English-learners.
The George Washington University Center for Equity and Excellence in Education examined state assessment policies for accommodating English-language learners in content-area tests. Its study, released last fall, found that “all states have more distance to go in having ELL assessments that are responsive,” says Charlene Rivera, the executive director of the Arlington, Va.-based center.
That also goes for the English-proficiency side of the ELL-assessment equation, says H. Gary Cook, a researcher at the Wisconsin Center for Education Research, part of the University of Wisconsin-Madison.
“We, the research community, haven’t focused on the needs of these students as well as we could have or possibly should have,” says Cook, a former director of Wisconsin’s office of educational accountability and a former official at Harcourt Educational Measurement, now part of Pearson Education. “Academic language proficiency ... is a very ill-defined domain.”
That ambiguity is reflected in the hodgepodge of proficiency definitions. There are four main groupings of states, each with its own English-language-proficiency assessment.
The World-Class Instructional Design and Assessment consortium, known as WIDA, is the largest of the bunch, comprising 19 states mostly in the East and Midwest, including Illinois, New Jersey, North Carolina, and Virginia.
Several states in the South and Midwest, including Louisiana and Iowa, use versions of the English Language Development Assessment, which was developed by the American Institutes for Research for the Council of Chief State School Officers.
A handful of scattered states use Language Assessment Scales Links, which was developed by Monterey, Calif.-based test-maker CTB/McGraw-Hill. A different handful of states use some version of the Stanford English Language Proficiency Test, which was developed by Harcourt Assessment Inc., now part of Pearson.
And some states—including California and New York, which have large ELL populations—have developed their own, entirely separate assessments.
The picture is made even more complex by variations within any given school’s or state’s ELL population. Tennessee isn’t known as a hotbed of linguistic diversity, but Jan Lanier, the state’s ESL coordinator, says that in a typical year, between 115 and 130 native languages are represented there.
The challenges inherent in trying to assess these students’ content mastery are made even more difficult by the fact that Tennessee does not have native-language-instruction programs.
“They’re required to take the math and science test the first year they’re here, and that [science test] is ... very language-intensive,” Lanier says of her state’s ELL students. “It puts them at a disadvantage.”
Even within each native language, there can be a wide range of literacy skills.
“Many of our students come from excellent educational backgrounds, but they’re tested in English, so that the [Florida Comprehensive Assessment Test] becomes a measurement not of their mastery of the material, but of their ability to express it in English,” says Ann Jackman, the president of the Sunshine State TESOL of Florida association. Florida does not provide native-language assessments for ELLs.
“That’s our biggest frustration,” adds Jackman, who’s also an instructional specialist for the 173,000-student Palm Beach County school district, 14 percent of whose pupils are ELL students, representing 142 native languages or dialects.
At the other end of the native-language-proficiency spectrum are those students collected under the umbrella term “students with interrupted formal education,” or SIFE.
“We have a lot of children who fall under that acronym,” says Lanier of Tennessee, referring to refugee students who have seldom attended classes. “They’re having to learn school culture and test-taking culture at the same time.”
She says some of her teachers worry about the effect the assessments are having on students.
“Teachers are very concerned that we may be causing test anxiety by forcing these kids to take these tests when we know they’re not quite ready for [them], but we have no choice—we’re mandated to do that by federal law.”
Some Federal Leeway
Elissa Leonard, a U.S. Department of Education spokeswoman, says that the NCLB law allows the use of native-language assessments of content if a state has one. Thirteen states—California, Colorado, Delaware, Kansas, Massachusetts, Michigan, Nebraska, New Jersey, New Mexico, New York, Oregon, Pennsylvania, and Texas—offer native-language assessments in some grades or subjects, according to the Editorial Projects in Education Research Center.
Still, Leonard says, the department is constrained by the law.
“The amount of flexibility the Department of Education has in implementing the NCLB core content assessment requirements for ELLs is determined by the wording of the federal statute,” she says. The law requires that all student subgroups, including ELLs, be assessed in the core content areas of reading or language arts, mathematics, and science.
Leonard adds that the department pushed for expanded assessment options for ELL students in its NCLB reauthorization proposal, but notes that reauthorization did not happen in 2008.
Edynn Sato, the director of research and English-language-learner assessment for WestEd, a San Francisco-based research group, says that accommodations—and the educators who use them—have to strike a delicate balance.
“What are we really trying to get the student to engage with—the English language of the test ... or the language of the content?” she asks. “What kind of supports are we able to provide ELLs without misrepresenting the content to the students or simplifying the content below grade level? We don’t want to dumb down the content.”
Cook, of the Wisconsin Center for Education Research, is not a fan of everything that’s in the NCLB law, but he does credit it with throwing a light on the specific assessment needs of ELL students.
“Before NCLB, very few states had their own ELL assessments,” he says. “Now the assessments are associated with the state standards, and they’re designed for ELLs. That’s a good thing.”
Vol. 28, Issue 17, Pages 35-36. Published in Print: January 8, 2009, as Weigh Proficiency, Assess Content.