Special Report: Assessment

Digital Simulations Emphasize Problem Solving

By Benjamin Herold — March 10, 2014

Multiple-choice exams measuring the breadth of students’ content knowledge have hardly disappeared.

But the creators of a leading international exam, the U.S. government agency behind “the nation’s report card,” and the makers of new classroom learning tools are all turning their attention to something new: digital simulations aiming to measure how students solve problems, communicate, and work with others.

Examples include simulations that ask students to collaborate with a computer avatar to keep a tankful of tropical fish alive, help a virtual village fix its broken water pump, develop a model of an atom, and help an online city reduce pollution while growing its economy.

“In the real world, there is often more than one route to the right answer, or more than one right answer,” said Michael Barber, the chief education strategist for education publishing giant Pearson and a strong backer of the new assessment trend. “With simulations, you can have complex problems and allow different students to find different ways through.”

The use of simulations for assessment isn’t new: For decades, for example, airline companies and the military have used simulators to train and screen prospective pilots. Many educators have also had a long-standing interest in more traditional forms of low-tech simulated performance assessments, in which students are required to complete a sophisticated task, such as designing and conducting a science experiment.

But questions of cost, reliability, validity, and scalability have in the past been stumbling blocks to the widespread adoption of both simulations and performance assessments, said Eva L. Baker, a research professor in the graduate school of education and information studies at the University of California, Los Angeles. She said those same challenges remain with today’s emerging technologies.

“We all want to avoid the gee-whiz factor,” said Ms. Baker, the director of the National Center for Research on Evaluation, Standards, & Student Testing, or CRESST. “We have to make sure there is adequate evidence these things work and are not simply the next big thing.”

Real-Time Feedback

Proponents maintain that digital simulations, which experts in the field say generally involve less narrative, less design, and less pure fun than digital games, can increase student engagement and provide real-time feedback to educators. They also believe the torrents of data that simulations generate can be mined for evidence of deeper learning—not just whether a student selects the correct canned response, but whether he or she asks the right questions, uncovers all the necessary information to make a smart choice, and shares that information effectively with a partner.
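
The sketch below illustrates, in rough terms, the kind of evidence such mining looks for. It is a hypothetical example only: the event names, topics, and rules are invented and are not drawn from any of the assessments described in this article.

    # Hypothetical sketch: mining a simulation's event log for evidence of
    # process skills, not just the final answer. Event names and rules are
    # invented for illustration.
    REQUIRED_INFO = {"water_temperature", "tank_size", "fish_species"}

    def score_process(events):
        """Return simple evidence indicators from a list of event dicts."""
        asked = {e["topic"] for e in events if e["type"] == "question_to_partner"}
        gathered = {e["topic"] for e in events if e["type"] == "info_viewed"}
        shared = {e["topic"] for e in events if e["type"] == "info_shared"}
        return {
            "asked_relevant_questions": bool(asked & REQUIRED_INFO),
            "gathered_all_needed_info": REQUIRED_INFO <= gathered,
            "shared_info_with_partner": bool(shared & REQUIRED_INFO),
            "reached_a_decision": any(e["type"] == "final_choice" for e in events),
        }

    # One student's (invented) log: asked about temperature, viewed two topics,
    # shared one, and made a final choice.
    log = [
        {"type": "question_to_partner", "topic": "water_temperature"},
        {"type": "info_viewed", "topic": "water_temperature"},
        {"type": "info_viewed", "topic": "tank_size"},
        {"type": "info_shared", "topic": "tank_size"},
        {"type": "final_choice", "topic": "heater_setting"},
    ]
    print(score_process(log))

A real scoring model would be far more sophisticated, but the shape is the same: the log of what a student did along the way, not just the final answer, is what gets scored.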

“Knowing the right answer is just half of what we should be looking for,” said Peggy G. Carr, the associate commissioner of the assessments division at the National Center for Education Statistics, which this spring is field-testing its new simulation-heavy technology and engineering literacy (TEL) assessment with 20,000 U.S. 8th graders.

But while the TEL exam—like the Program for International Student Assessment, or PISA, which will begin incorporating digital simulations in 2015—is a summative test that aims to provide an overview of student performance, the real power of simulations lies inside the classroom, said Christopher Dede, a professor of learning technologies in Harvard University’s Graduate School of Education.

“If we can develop better formative assessments that provide diagnostic information that is used to go back and change the course of instruction,” Mr. Dede said, “we can have a much greater impact.”

Some groups are working on exactly that goal: Pearson, which is based in London and has U.S. headquarters in New York City, is playing a key support role in the development and rollout of SimCityEDU, a classroom adaptation of the classic urban-planning computer game. And PhET, a research and development laboratory at the University of Colorado at Boulder, is currently expanding the formative-assessment tools built into its 128 digital math and science simulations.

Barriers remain: It takes about six months and $50,000 to build a single PhET simulation, for example, and parents and privacy advocates are likely to worry about how all the student data that simulations generate is secured. On a practical level, the nation’s schools are likely to be occupied for the foreseeable future with the challenge of moving even more traditional assessments online.

“I don’t think this is going to change the face of education in a two- or three- or even a five-year period,” said Mr. Barber of Pearson. “But I think in 10 years, it will be transformative.”

Digital simulations are making their way into a wide variety of assessments, used for an equally wide variety of purposes. Four of the prominent players in this emerging field:

1. National Center for Education Statistics

How do you measure students’ ability to understand, use, and improve technology?

With online “task-based scenarios,” say officials of the NCES, which administers the National Assessment of Educational Progress.

Test-takers must determine the best way to fix a broken well in a simulation for the NCES’s new technology and engineering literacy exam.

In one such task on the agency’s new technology and engineering literacy exam, students are asked to assume the role of an engineer who must figure out why a village’s water well has stopped pumping water. The item gauges test-takers’ troubleshooting skills: Do they access and use the most relevant information? Can they quickly diagnose the problem?

“It’s intended to provide an assessment that’s more authentic for students,” said Peggy G. Carr, the associate commissioner of the assessments division at the NCES, a branch of the U.S. Department of Education.

The exam is currently going through its second large field test, although no live administration has yet been scheduled. The NCES is investing in the new assessment, Ms. Carr said, in part because science, technology, engineering, and mathematics skills often cut across academic content areas and are untouched by subject-specific tests.

The reliability and validity of digital simulations remain a concern, but she said an earlier field test of the new exam helped: The NCES feels confident, for example, that it has identified the most efficient way for test-takers to troubleshoot the broken pump. Test-takers who pursue that path will get the highest score. “There has to be a way to score these tasks such that you can put them on a scale with higher or lower levels of proficiency,” Ms. Carr said.
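
The kind of path-based scoring Ms. Carr describes can be sketched roughly as follows. The steps and score bands here are hypothetical and do not reflect the NCES’s actual scoring rules; they simply show how an action sequence can be mapped onto a proficiency scale.

    # Hypothetical path-based scoring for a troubleshooting task: the closer a
    # test-taker's actions track an "efficient" diagnostic path, the higher the
    # score band. Steps and bands are invented for illustration.
    EFFICIENT_PATH = ["read_manual", "check_power", "inspect_pump", "replace_valve"]

    def score_attempt(actions):
        """Map a sequence of actions to a 0-3 proficiency band."""
        solved = bool(actions) and actions[-1] == "replace_valve"
        if not solved:
            return 0
        relevant = [a for a in actions if a in EFFICIENT_PATH]
        extra_steps = len(actions) - len(relevant)
        if actions == EFFICIENT_PATH:
            return 3   # most efficient route
        if extra_steps <= 2:
            return 2   # solved, with minor detours
        return 1       # solved, but inefficiently

    print(score_attempt(["read_manual", "check_power", "inspect_pump", "replace_valve"]))  # 3
    print(score_attempt(["check_hose", "read_manual", "check_power",
                         "inspect_pump", "replace_valve"]))                                # 2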

2. Organization for Economic Cooperation and Development

A leading global assessment used to compare the reading, mathematics, and science skills of students in 70 countries will use digital simulations to measure students’ problem-solving ability—both alone and with others—beginning in 2015.

Beginning in 2015, students taking a new PISA exam will collaborate with a computer avatar to keep a tank of virtual tropical fish alive.

In science, students taking the Program for International Student Assessment, or PISA, next year will encounter “experimental situations” in which they must manipulate multiple variables in order to perform a real-world task, such as figuring out how to make the most efficient zeer pot (a clay refrigerator that doesn’t require electricity).

In 2015, PISA will also include a new exam dedicated solely to measuring students’ collaborative problem-solving ability. In one sample item, the challenge is to keep a tankful of tropical fish alive. To succeed, students will need to interact with a computer avatar via a chat window to uncover necessary information, devise and implement a plan of action, and communicate the results of their experiment.

“Simulations test students’ thinking skills and creativity in far greater ways than we can do with pencil and paper,” said Michael Davidson, the head of the early-childhood education and schools division of the Paris-based OECD, which administers PISA. “If countries agree that these are skills 15-year-olds should have, then having a summative assessment like PISA to determine if those skills are present is useful.”

3. PhET Interactive Simulations Project

For some, digital simulations are less about determining students’ proficiency than about helping them learn.

“Our simulations are providing dynamic feedback so students can build an understanding of cause-and-effect relationships, develop mental models, ask questions, generate evidence, and engage in the scientific process at the same time they’re learning core content ideas,” said Kathy Perkins, the director of PhET, based at the University of Colorado at Boulder.

A popular simulation from PhET lets science students manipulate the structure of atoms.

Since 2004, the PhET team has created 128 interactive digital simulations in physics, biology, chemistry, earth science, and math. All told, Ms. Perkins said, the simulations have been used more than 45 million times, primarily in middle and high schools.

In one popular example, students can drag and drop protons, neutrons, and electrons into a simulated atom.
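
The immediate feedback such a simulation can give follows directly from the particle counts, as the rough sketch below shows. The lookup table is truncated and the logic is illustrative; it is not PhET’s actual implementation.

    # Hypothetical sketch of build-an-atom feedback computed from particle counts.
    ELEMENTS = {1: "Hydrogen", 2: "Helium", 3: "Lithium", 6: "Carbon", 8: "Oxygen"}

    def describe_atom(protons, neutrons, electrons):
        name = ELEMENTS.get(protons, f"element {protons}")
        charge = protons - electrons
        ion = "neutral atom" if charge == 0 else f"ion with charge {charge:+d}"
        return f"{name}: mass number {protons + neutrons}, {ion}"

    print(describe_atom(protons=6, neutrons=6, electrons=6))    # Carbon: mass number 12, neutral atom
    print(describe_atom(protons=8, neutrons=8, electrons=10))   # Oxygen: mass number 16, ion with charge -2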

Using the PhET simulations for formative assessment requires a lot from teachers, who generally must infer for themselves how students are interacting with the tools, have the skills to get students talking about what they’re learning, and know how to adjust classroom instruction based on the resulting insights.

To ease that burden, PhET is expanding its professional-development resources for teachers and collaborating with external partners such as Pearson and the Princeton, N.J.-based Educational Testing Service to explore new tools and functions that could be added to the simulations.

“We want to create tools that the broader education community can use and leverage,” Ms. Perkins said.

4. Pearson

Pearson has long been a titan in the standardized-testing industry, but officials from the London- and New York City-based company say they’re most excited about the potential of digital games and simulations to provide formative support inside classrooms. “I think [simulations] have a place in summative assessments, but you have to take their limitations into account,” said Kristen DiCerbo, a senior research scientist for the company.

Students can show their “systems thinking” skills by reducing pollution while improving the economy in the virtual city in SimCityEDU, a new classroom assessment tool from GlassLab.

Ms. DiCerbo has worked extensively with GlassLab—a team of researchers, designers, developers, and learning scientists making games for assessment, in which Pearson is a participant—on the development of the recently released SimCityEDU. A classroom adaptation of the classic urban-planning computer game, SimCityEDU—which Ms. DiCerbo describes as more game than simulation due to the level of design, narrative, and just plain fun that is built in—aims to promote and assess students’ “systems thinking,” or grasp of multiple overlapping cause-and-effect relationships.

As students try to reduce pollution while also improving the economy in a virtual city, their every digital action is tracked: 3,000 points of data or more in a 10-minute play cycle.

Ms. DiCerbo leads the effort to mine that information, searching for patterns that suggest evidence of learning. How to safely secure all that student data remains a sensitive issue, and there have been surprises—what to make, for example, of the 5 percent to 10 percent of players who decide the most efficient way to meet SimCityEDU’s goals is simply to bulldoze big sections of the city.
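
A rough sketch of that kind of pattern mining appears below: scan each player’s telemetry and flag sessions dominated by a particular strategy, in this case the unexpected bulldozing. The event names and threshold are hypothetical, not GlassLab’s or Pearson’s actual data schema.

    # Hypothetical pattern mining over play-session telemetry: flag players whose
    # sessions are dominated by bulldoze events. Names and threshold are invented.
    from collections import Counter

    def flag_bulldozers(sessions, threshold=0.5):
        """Return IDs of players for whom bulldozing exceeds the threshold share."""
        flagged = []
        for student_id, events in sessions.items():
            counts = Counter(e["action"] for e in events)
            if counts["bulldoze"] / max(len(events), 1) > threshold:
                flagged.append(student_id)
        return flagged

    sessions = {
        "s01": [{"action": "zone_residential"}, {"action": "build_bus_stop"},
                {"action": "bulldoze"}],
        "s02": [{"action": "bulldoze"}, {"action": "bulldoze"},
                {"action": "bulldoze"}, {"action": "zone_industrial"}],
    }
    print(flag_bulldozers(sessions))  # ['s02']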
