Science Opinion

Putting Virtual Assessments to the Test

By Pendred Noyce — June 27, 2011 5 min read

With everyone from the nation’s top CEOs to President Obama stressing the importance of learning in science, technology, engineering, and mathematics, or STEM, to prepare students for a competitive 21st-century workforce, we need better measures of how well students are mastering those subjects. Science and other complex subjects are not served well by conventional testing; answering A, B, C, D, or “all of the above” doesn’t lend itself to measuring science proficiency, scientific thinking, or deeper knowledge and understanding.

While traditional paper-and-pencil testing gauges student knowledge of discrete facts or concepts, virtual performance assessments allow students to actually practice scientific inquiry and problem-solving through interaction with a virtual environment. In a VPA, students, represented by computer-generated icons, or avatars, make a series of choices. They tackle authentic science problems, investigate causal factors, and choose which experiments to conduct in a virtual lab. The assessment is no longer focused on a single right answer, but on the result of decisions and knowledge applied by the student. This approach allows a finer measure of students’ understanding and provides a truer assessment of what students know and don’t know about complex science content.

An exciting VPA model is being developed and tested by Jody Clarke-Midura and Christopher Dede at the Harvard Graduate School of Education, with funding from the federal Institute of Education Sciences. The goal of the VPA project is to provide all states with a new model of statewide assessment in the form of valid technology-based performance assessments linked to the National Science Education Standards for middle schoolers.

In the Dede and Clarke-Midura model, a student logs on to a computer, selects an avatar, and enters the virtual world of a science experiment. She’s given an aerial view of the space, in this case a farm with several ponds. The camera then focuses on a six-legged frog, which prompts the student to wonder what could be causing such a mutation. The assessment then begins with several farmers offering the student various hypotheses about the cause of the mutation. The student is told to select her own explanation and back it up with evidence. To do this, the student must consider a hypothesis, make decisions about the type of data that would support her claim, decide whether to consult prior research, collect evidence in a virtual backpack, examine data, modify her ideas as necessary, and, finally, make an evidence-supported claim.
Students with varying proficiency will, of course, take different approaches. The beauty of the VPA model is that it’s perfectly suited for such different problem-solving strategies. A virtual performance assessment can gauge how well students reason based on available evidence. It can reveal how students gather evidence and whether they select data that are relevant or irrelevant to the hypothesis they are investigating, something a paper-and-pencil test cannot do. With VPAs, educators can literally track and analyze the trajectory of students’ thinking and generate reports that provide such data in the form of feedback to both teachers and students.

By 2014, the states that have adopted the common-core standards will be expected to implement new computer-based assessments. They’ll have to decide whether to go with simple, digitized versions of paper-and-pencil tests, or to embark on the far more complex world of VPAs. Virtual performance assessments cost more to develop, but they do not cost more to administer, and they offer a greater payoff.

VPAs provide a detailed record of student actions. Even essay questions on traditional tests can’t compare with the realistic context of VPAs for mimicking the steps required for legitimate scientific inquiry. And such assessments are largely immune to the practice of teaching test-taking strategies that can distort results of multiple-choice assessments. If teachers “teach to the test” with a VPA, they will actually be providing relevant and rigorous instruction. Moreover, because VPAs can adjust the available evidence (and therefore the valid conclusions) of each scenario for different test administrations, strict test security is not a great problem.

A logical question is whether these tests are biased toward video and computer gamers. The Harvard researchers are testing that, too, and they note that prior research in virtual immersive environments showed no correlation between computer-gaming experience and performance in the curriculum. At scale, these virtual assessments are much more practical and cost-effective than hands-on performance assessments and are on a par with other forms of computer-based testing.

In “The Road Ahead for State Assessments,” a report released in May, the Rennie Center for Education Research and Policy and the group Policy Analysis for California Education urge state education leaders to give serious consideration to implementing VPAs, especially in science. They offer the following recommendations for practical, scalable implementation of such assessments as part of comprehensive state assessment systems. State education agencies should:

  • Provide teachers and students with opportunities to try out virtual performance assessments so they become comfortable with the technology and it does not become a barrier to demonstrating knowledge;

  • Provide teachers with professional development to foster instruction that will lead to high performance on these assessments;

  • Similarly, provide opportunities for parents, school boards, and community members to try their own VPA investigation to alleviate fears that this new teaching and testing approach promotes playing video games in the classroom; and

  • Support the infrastructure to do it right, ensuring that the devices and networks deployed can fully deliver the features that make VPAs stand out as a student-assessment tool.

There are a number of ways to integrate technology into the classroom to improve teaching and learning. Virtual performance assessment is one use of technology that could yield great results. VPAs are already changing the way assessments are executed in medicine and the military. Why not in education? If we’re serious about the importance of STEM learning and adequately preparing the next generation of students for real-world careers and decisions, leveraging technology to better assess students’ knowledge can help pave the way.

A version of this article appeared in the July 13, 2011 edition of Education Week as Putting Virtual Assessments to the Test