A new study from the University of Georgia shows that when test developers pair math items with graphics relevant to the question being asked, they can partly close the performance gap on those items between English-proficient students and English-language learners.
The study sampled 3,000 students, including about 400 English-language learners (ELLs), from a large suburban school district in the South. My colleague Sarah Sparks, who heard a presentation of the research findings at the American Educational Research Association conference in New Orleans, reports that English-proficient students outperformed ELLs by 7.9 percent on questions without a relevant graphic. When a useful graphic was included, that gap narrowed to 2.8 percent, and ELLs actually outperformed English-proficient students on 28 of the graphic-based questions.
I’ve already reported on this blog about how Rebecca Kopriva, a senior scientist at the Wisconsin Center for Education Research, has been developing math and science test items that rely heavily on graphics rather than text. She created them as part of a project called Obtaining Necessary Parity through Academic Rigor, or ONPAR, which has a website where you can view samples of the items.
The University of Georgia study suggests her work is on the right track.
A version of this news article first appeared in the Learning the Language blog.