Common Adaptive Tests to Address Special Needs
Questions raised about adaptive assessments
Computer-adaptive testing, in theory, should allow educators to pinpoint the achievement levels of students with disabilities more accurately and to focus on the areas where those students need help.
Because it is designed to give each student an individualized test, computer-adaptive testing presents students with disabilities more questions they can answer correctly, which prevents frustration, and it can build in supports to aid students as they take the exams, assessment experts say.
Advocates for such students say they are excited about the potential of adaptive tests to both engage students and measure their learning more accurately. But those advocates are also watching to make sure that the designers of computer-adaptive tests—which are being created by the Smarter Balanced Assessment Consortium to assess performance on the Common Core State Standards—understand that just because students may lag in one area, that doesn’t preclude them from being on grade level or advanced in another area.
“Our biggest fear is that the test will lock out kids with disabilities because of the structure,” says Lindsay E. Jones, the senior director of policy and advocacy services for the Washington-based Council for Exceptional Children. “Students with disabilities do not demonstrate typical learning patterns. Their skills may jump around.”
The Consortium for Citizens with Disabilities recently drew up a policy statement on computer-adaptive testing and emphasized that students should be tested on the full range of grade-level content regardless of their proficiency levels entering the test.
“A poorly designed adaptive test can deny students an opportunity to demonstrate their knowledge across the grade-level content,” the statement says. “It is important to keep in mind that difficulty and cognitive complexity are not the same.”
Computer-adaptive tests select each question based on a student's previous answers; if a student answers a question correctly, for example, he or she is then presented with a more difficult one. Advocates worry, though, that students who score low in one area may never be presented with harder questions in other areas.
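The selection logic described above can be sketched in a few lines of code. This is a deliberately simplified, hypothetical illustration: real adaptive engines such as Smarter Balanced's rely on far more sophisticated psychometric models (item response theory, content-coverage constraints), and every name and parameter here is illustrative.

```python
def next_difficulty(current, correct, step=1, lo=1, hi=10):
    """Raise difficulty after a correct answer, lower it after a miss,
    keeping the level within the item bank's range."""
    if correct:
        return min(current + step, hi)
    return max(current - step, lo)

def run_test(answers, start=5):
    """Simulate a sequence of responses, starting at a mid-range
    difficulty as described in the article. Returns the difficulty
    level of each item presented."""
    difficulty = start
    presented = []
    for correct in answers:
        presented.append(difficulty)
        difficulty = next_difficulty(difficulty, correct)
    return presented
```

The sketch also makes the advocates' concern concrete: a student who misses early questions is steered toward ever-easier items and, in a single-dimension design like this one, never sees harder material in any other area.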
“The test must allow students to view a range of complexity, or else you’re going to cognitively discriminate against them,” Jones says.
Font Adjustments and Translations
Magda Chia, the director of support for underrepresented students at Smarter Balanced, says the coalition is aware of the potential problem. All students will start the test in the middle of the difficulty range, she says, instead of basing the starting level of their exams on past performance.
“The computer will start off at a wide-angle lens, and as the student responds, the computer lens gets more and more focused,” she says.
Bill Stewart, the assessment and special-projects coordinator for the 2,000-student Gladstone school district in Oregon, where adaptive testing is used for special education students, says the method has worked well.
Experience has shown educators “not to be presumptive about what they can and cannot do,” he says. Adaptive testing “lets the kids and their skill determine how high and low they can go, not the teacher’s expectations.”
Most students with disabilities in states that have chosen to support Smarter Balanced adaptive testing will take those tests, Chia says. Two separate coalitions are developing common-core assessments for the 1 percent of students with the most severe cognitive disabilities.
Because a majority of students in states that have joined the Smarter Balanced coalition will take the adaptive tests, Chia says, the goal is to make them more accessible to all students, whether or not they have disabilities. The test font size will be adjustable, for example, to allow enlarged type if needed. Students will be able to highlight sections as they work through the test and may use an option eliminator, which strikes out answer choices a student has ruled out, to help with focus.
In addition, some parts of the tests will feature translations for English-language learners, glossaries may be available to define certain words, and software can vocally describe charts and graphics for students with visual impairments.
“We’re really focusing on accessibility,” Chia says. “And these are things that could help any student.”
Oregon, which has been using adaptive testing for nine years, instituted a Braille version last school year for its students with visual impairments. The process involved culling questions from the test's 18,000-item bank that would not work for those students. The result was 16,600 test items that could be rendered on a refreshable Braille display, or could be read or described to a student using special software.
“We looked through every single item and ranked each according to accessibility,” says Holly Carter, an assessment policy analyst for the Oregon Department of Education. “We were able to identify the gaps that did exist to make sure we had that full depth and breadth included in the Braille pool.”
Vol. 06, Issue 01, Pages 14-15