Teaching

Study: Does the ‘Testing Effect’ Work With Digital Lessons?

By Debra Viadero — July 31, 2009

I wrote a bit in June about a program of studies pointing to a “testing effect” in learning. That is, students seem to retain more of what they’re learning when teachers give them practice tests to help them review the material. This kind of testing, it turns out, seems to yield better results than simply having students restudy the material on their own.

A pair of psychologists at the University of California, Santa Barbara, wondered whether the “testing effect” still holds true when the original lesson comes via multimedia. And would it work, they also wondered, with “transfer” test questions—in other words, the kinds of questions that require students to use what they learned to solve a new problem?

To find out, researchers Cheryl I. Johnson and Richard E. Mayer showed 282 university students a computer animation teaching how lightning works. The students were randomly assigned afterward to groups in which they either:
1) Watched the video again;
2) Took a short practice test in which they were asked to write a description of how lightning works; or
3) Took a transfer test in which they answered questions such as “What could you do to decrease the intensity of lightning?”

The students who wrote about lightning—the condition the researchers refer to as the “practice-retention” test group—did better than the restudy group on a retention test taken a week later. The same general pattern held for the students who took the transfer test: They outperformed the restudy group on a transfer test the following week.

That the testing effect carries over to multimedia lessons doesn’t strike me as surprising. What’s interesting about this experiment, to my way of thinking, is the inclusion of test questions exploring whether students can transfer their new knowledge to other kinds of problems. That’s an element that’s been missing from some of the previous studies on the “testing effect.” When interviewed afterward, students who took the transfer test also rated their task to be more difficult than did students in the other two conditions.

The obvious next question is: Which kind of practice test gives more bang for the buck? The UCSB researchers tried to get at that, too, by comparing learning more directly across the different groups. They found that the students who took the practice-retention test did better than the takers of practice-transfer tests when the final test explored basic recall of the facts. The opposite was true, though, when the final test asked students to use what they had learned to solve new problems. The authors’ conclusion: “The type of questions used on practice tests should correspond to the type of question that the teacher wants students to solve on the final test and beyond.”

I say, why don’t we give students practice tests that include both types of questions?

Read the full study, “A Testing Effect With Multimedia Learning,” in the August issue of the Journal of Educational Psychology.

A version of this news article first appeared in the Inside School Research blog.