Adaptive Tech., Secure Browsers Aim to Curb Student Cheating
Adaptive tests, secure browsers, and plagiarism-detecting software are making it harder for students to cheat the system
When it comes to curbing student cheating, technology can be both a blessing and a curse.
The blessing: As new technologies make assessments more adaptive, students will be less likely to see the same test questions as their peers. The curse: Students' technological skills are often ahead of those of educators, putting them in a position to figure out how to use digital tools to cheat in new and different ways.
Some recent incidents show how this game of cat-and-mouse is playing out, raising questions about whether cheating will get better or worse as states and districts move to greater use of online testing as part of the new common-core assessments, as well as for classroom and district testing unrelated to the new academic standards.
In California, for example, hundreds of state assessments were flagged for review last school year after students posted photos of exams or test questions on social-networking sites like Facebook, Instagram, and Twitter. In a Michigan district, a student texted a photo of an exam answer key to friends, resulting in punishments for the students involved.
Cheating by both students and adults—like the recent scandal in the Atlanta public schools, in which educators are alleged to have erased students' incorrect answers and replaced them with correct ones to boost scores—has always been a problem, but high-tech devices are changing the landscape. Cellphones, pen cameras, and digital manipulation allow students and adults to improperly copy and distribute test questions with great speed and efficiency.
But new technology has also made it more of a challenge for those who might try to cheat on assessments.
"There's less opportunity to cheat in a computer-based environment," insisted Sharolyn Sorrels, the director of educational indicators for the 42,000-student Tulsa, Okla., district, which recently moved primarily to online testing for state assessments for grades 6 and up. "I feel much more secure with computer testing than a paper test," she said.
While several states have tested students online for years, a huge new crop of districts is moving toward computer-based testing as the Common Core State Standards requirements kick in. Those standards call for districts to test students online by the 2014-15 school year. That common-core push for online testing, experts say, is likely to lead to more online testing in general in schools.
Customizing Test Questions
The nature of adaptive testing, which chooses test questions for each student based on previous test answers, makes it likely that each student's exam will be unique, said Brandt Redd, the chief technology officer for the Smarter Balanced Assessment Consortium, one of two main coalitions developing common-core testing. Smarter Balanced's test is adaptive, tapping into a bank of 21,000 questions for exams, so students can't glance at one another's screens and expect to see the same questions and then copy answers, he said.
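The selection logic Mr. Redd describes—each answer steering which question comes next—can be illustrated with a minimal sketch. The item bank, difficulty scale, and ability-update rule below are invented for illustration and are not Smarter Balanced's actual algorithm:

```python
# Hypothetical item bank: (question_id, difficulty) pairs on a -3..3 scale.
ITEM_BANK = [(i, d / 10.0) for i, d in enumerate(range(-30, 31, 2))]

def pick_next_item(ability, asked):
    """Choose the unasked item whose difficulty is closest to the
    student's current ability estimate."""
    candidates = [item for item in ITEM_BANK if item[0] not in asked]
    return min(candidates, key=lambda item: abs(item[1] - ability))

def run_adaptive_test(answer_fn, num_items=10, step=0.5):
    """Administer num_items questions, nudging the ability estimate
    up after each correct answer and down after each incorrect one."""
    ability, asked = 0.0, set()
    for _ in range(num_items):
        qid, difficulty = pick_next_item(ability, asked)
        asked.add(qid)
        if answer_fn(qid, difficulty):
            ability += step
        else:
            ability -= step
    return ability, asked

# Two students answering differently are routed to different questions.
strong = run_adaptive_test(lambda qid, d: True)   # answers everything correctly
weak = run_adaptive_test(lambda qid, d: False)    # answers everything incorrectly
shared = strong[1] & weak[1]
print(f"questions shared by both students: {len(shared)} of 10")
```

Because the two simulated students diverge after the first question, almost none of their questions overlap—which is why glancing at a neighbor's screen is of little use on an adaptive exam.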
In addition, districts using Smarter Balanced tests are required to use a secure browser, developed with open-source code by the Washington-based American Institutes for Research, to administer the tests, Mr. Redd said. The browser locks down the digital devices students are using to take the tests, preventing them from going elsewhere on the Internet to find answers, access apps to help them cheat, or take screen shots to share with other students. The browser interfaces with the testing system, periodically checking to make sure the student-testing devices have remained secure during the testing periods, he said.
The browser has been certified to work with most of today's devices and operating systems, including Mac, Windows, and iOS. It will also work with most tablets—though those devices are more of a challenge since there is less consistency in operating systems among them, Mr. Redd said.
Smarter Balanced is also providing a certification for emerging devices as new technology develops, he said. The onus is on device-makers to make sure the browser works.
What's more, the test questions are stored using online security measures to prevent hackers from accessing them, Mr. Redd said.
The other major consortium developing common-core assessments, the Partnership for Assessment of Readiness for College and Careers, or PARCC, has also built secure browsers and, since its test is not adaptive, is making recommendations to schools and districts about how computers and testing centers should be set up to prevent cheating, said Jeffrey Nellhaus, the director of policy, research, and design for PARCC. The consortium's manual recommends, for example, putting a cardboard barrier or a curtain between computers to prevent cheating.
While adaptive tests, secure browsers, and plagiarism-detecting software are helping to make the testing process more secure, experts caution that educators should not be lulled into a false sense of security. Among the precautions they recommend:
1. Collect cellphones from students before testing begins to thwart photographs being taken of tests or questions, to prevent texting others for answers, and to block access to the Internet.
2. If a test is given online, make sure students are using a secure browser that prevents them from surfing the Internet or accessing apps.
3. Consider hiring a company that monitors social-networking sites such as Instagram or Twitter during testing windows to determine if test questions or photos of exams have been posted.
4. Use adaptive tests. Some experts say that because adaptive testing means each student's exam is likely to be different, this method makes it much more difficult to cheat.
5. If using online tests in which students all get the same questions, consider placing cloth, plastic, or cardboard barriers between computer monitors so students can't see each other's screens.
6. Provide in-depth training for proctors and do spot checks to make sure they're adhering to protocol.
"The biggest threat to test security is the cellphone," said Ray Nicosia, the executive director of the office of testing integrity for the Educational Testing Service, a test-development and -administration company based in Princeton, N.J. "There are so many things the cellphone brings into play: copying, communication, text-messaging, cameras, videos."
A 2011 survey by the San Francisco-based Common Sense Media, a nonprofit organization that studies the effects of technology and media on young people, found that 83 percent of 13- to 18-year-olds had cellphones and 35 percent of those cellphone owners said they had used their phones to cheat in school.
Schools typically request that students stow or give up cellphones during testing, and many collect the devices before a test begins, returning them afterward.
But that's no guarantee students won't sneak one in, Mr. Nicosia said. So the ETS works with companies that monitor the Internet and scan for key words included in a test to ensure questions are not copied and then posted on the Web. And some of its tests are only given in centers that are certified as secure.
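The keyword scanning those monitoring companies perform can be sketched in a few lines. The test phrases and posts below are invented for illustration; real services obviously work from actual secure items and scan at a far larger scale:

```python
# Hypothetical distinctive phrases drawn from secure test items.
TEST_PHRASES = [
    "the velocity of the second cyclist",
    "triangle pqr is reflected across",
]

def find_leaks(posts):
    """Return (post, phrase) pairs where a monitored post contains
    a distinctive phrase from a secure test item."""
    leaks = []
    for post in posts:
        text = post.lower()
        for phrase in TEST_PHRASES:
            if phrase in text:
                leaks.append((post, phrase))
    return leaks

sample_posts = [
    "so bored today lol",
    "Q7: Triangle PQR is reflected across the y-axis...",
]
print(find_leaks(sample_posts))
```

The idea is that a sufficiently distinctive phrase from a secure item almost never appears online by coincidence, so a match is a strong signal that a question has been posted.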
But the ETS is starting to experiment with new approaches that could be a harbinger of the future for districts, such as voice-recognition software to screen out impostor test-takers and bring-your-own-device models. In one pilot, test-takers were allowed to bring laptops—no tablets or other devices were permitted—to sit for the company's Test of English as a Foreign Language, or TOEFL, which evaluates English proficiency. A secure browser loaded onto the laptops blocked users from browsing the Web or opening other apps, and it succeeded in preventing cheating, Mr. Nicosia said.
Designing Better Tests
In California, where hundreds of state tests were recently called into question after test questions were posted on social-networking sites, the state education department has for the past three years hired companies to run social-media checks on testing days, making sure questions don't show up on Facebook or Instagram, said Diane Hernandez, the director of the department's assessment development and administration division. The department tests 4 million students annually and has only recently begun moving toward widespread online testing. "We want to ensure the validity and reliability of these tests," she said.
At Belding High School in Belding, Mich., during the past school year, a student used a cellphone to take a photo of an exam answer key and text it to other students.
"It was a modern version of somebody making a photocopy of the test and giving it to someone else," said Principal Brett Zuver.
The students involved were disciplined, and the school has upgraded its anti-cheating efforts. When students take quizzes on their school-issued iPads, teachers use an app to lock down the devices during testing, so they can't browse the Internet, Mr. Zuver said. "It's a whole different world these days," he said. "It's important for students to learn appropriate use on these devices."
Even as some districts are preparing to fight high-tech cheating, they're also examining the quality of assessments.
In the 651,000-student Los Angeles Unified School District, chief technology officer Themistocles Sparangis said technology provides more of a digital footprint to trace cheaters, such as catching students who post stolen test questions to their Facebook pages. Plus, students who try to tamper with an online testing system may be traced to a particular account or computer, he said.
But, Mr. Sparangis said, educators should recognize that poorly designed tests that do not measure critical thinking are likely to be easier to cheat on. If a test asks whether students have memorized the state capitals, for instance, and a student sneaks in a cellphone to look up that information online, the problem is not just the cheating but also the quality of the assessment.
"If a student can go on the Internet with their smartphone and answer some question and submit it, as wrong as that behavior is, I would question what the test is really testing," Mr. Sparangis said. "It makes you reconsider the kind of test you're giving. We should be trying to drive those higher-order thinking skills and deeper comprehension analysis."
Vol. 33, Issue 25, Pages 34-35