Special Report

New Windows Into a Child’s Mind

By Millicent Lawton — October 01, 1998 7 min read

Unlocking the secrets of how children think when they’re asked to solve a problem has long seemed as unrealistic as predicting the future in a crystal ball.

Students often can’t recall or explain in words how they reach an answer. And even when they turn in scratch paper or “show all work,” paper-and-pencil tests usually can’t reveal what information students used at which moment.

But what if a teacher could do the assessment on a computer? And what if a powerful database could provide not only a “map” of how each student solved a problem, but compile one class’s performance and compare it against the performance of another group of students?

It may sound like soothsaying, but it’s for real.

Welcome to the world of “interactive multimedia exercises,” or IMMEX. Originally designed a decade ago at the University of California, Los Angeles medical school to test students’ understanding of immunology, these computerized problem-solving assessments are now used regularly in more than a dozen elementary, middle, and high schools in southern California, and for a variety of subjects, including math, science, English, and social studies.

“It’s the closest I think [teachers] can get to getting inside their students’ minds,” says Kristin Hershbell, a research associate at the WestEd regional research laboratory in Menlo Park, Calif., which has been evaluating the IMMEX project. With the help of interactive multimedia exercises, a teacher can “map” the thinking processes that students use to solve problems.

Paula Dallas, a biology teacher at Palisades Charter High School here in the Pacific Palisades section of the city, experienced that feeling recently with two classes of 10th graders who ran a Windows-based IMMEX problem set called True Roots.

In this exercise, students play the part of forensic scientists asked to identify the true parents of a girl named Leucine (just like the amino acid), who suspects she may have been the victim of a mix-up in the maternity ward.

The idea is for students to solve the problem as efficiently as possible, selecting only the information that will be most useful to them. For example, they can access data from an “experts” category that includes the police, school officials, and hospital staff. By clicking on “lab tests,” they can find blood types and the results of DNA fingerprinting.

The computer tracks what choices the students make, recording each step of their thinking for later review by them and the teacher.
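In rough outline, the kind of tracking the article describes could be sketched as a simple event log; the class and method names below are hypothetical illustrations, not IMMEX’s actual implementation:

```python
# Hypothetical sketch: each category a student clicks is logged in
# order, producing the sequence that is later drawn as a
# "search-path map." Not based on IMMEX's real code.
from dataclasses import dataclass, field


@dataclass
class SearchPath:
    student: str
    steps: list = field(default_factory=list)  # ordered category visits

    def visit(self, category: str) -> None:
        # Record one click into an information category
        self.steps.append(category)

    def is_guess(self) -> bool:
        # A straight jump to an answer with no lookups at all
        return len(self.steps) == 0


path = SearchPath("student_1")
path.visit("experts/hospital staff")
path.visit("lab tests/blood typing")
path.visit("lab tests/DNA fingerprinting")
print(path.steps)
```

A teacher reviewing such a log could see, for instance, whether a pair of students dipped repeatedly into DNA fingerprinting or hopped between categories at random.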

“You can look deeper,” Dallas says of IMMEX. “You can see the process of how they go to the answer.”

Dallas is very familiar with this particular program, having helped write True Roots three summers ago when she attended the first annual teacher training institute for the IMMEX project. The institutes, funded through a four-year, $2 million teacher-enhancement grant from the National Science Foundation, have trained more than 270 K-12 teachers in 65 southern California schools.

As of last summer, these teachers had written about 30 problem sets, each of which includes multiple versions of the same basic exercise. There are four problem sets each for the elementary and middle levels, and 22 for high school students.

Dallas gives her students two class periods over three days to run the True Roots program, assigning two students to each computer. She uses the middle day to discuss how they did on their first try and what strategies they might use on their second go-around. They’ll receive a grade, worth about 50 percent of a test, on how much they improve.

Just before Dallas convenes the class period between the two IMMEX runs, Ron Stevens, the inventor of IMMEX and a professor at UCLA’s medical school, sits in her classroom and shows her the students’ “search-path maps” from the day before. For the first time, she sees a visual reconstruction of the thought processes her students used to solve the problem. She knows not only if they solved the problem, but how.

Excitedly, Dallas declares, “This is great!”

On each search-path map, lines zig and zag across the screen, showing the routes the students took in and out of the categories. She can tell if they entered, say, the blood-typing category, then jumped to another category, or dipped repeatedly into the DNA fingerprinting area.

A tangle of lines means the students hopped from category to category without a good sense of how to solve the problem. A single line from starting point to answer means they guessed, a forbidden strategy that some students try anyway.

By interpreting the patterns, Dallas sees which students were thinking analytically and which ones weren’t. She can also tell who learned their recent unit on genetics.

And finally, using a scale developed by Stevens, she can assign a number value to the students’ patterns, allowing her to compare the high schoolers’ performance to that of their peers as well as to that of undergraduates who have run the same exercise.

Programs such as IMMEX aren’t the only ways that educators believe technology can improve student assessments.

Some teachers, for example, are using software to help with the often-unwieldy task of managing portfolios of student work. Electronic portfolios make it easier to organize and retrieve documents than paper versions do, but they also pose some technical problems and can be harder to share with anyone who lacks a computer.

Other uses of technology are designed to streamline the assessment process. Researchers at the University of Colorado at Boulder, for instance, have designed software that they say can grade essays for content as well as teachers can.

In the area of computer-based assessments, some educators are using tests tailored to individual students through computerized adaptive testing, or CAT. The computer creates a unique test for each student who sits at the machine, selecting items that are appropriate for his or her ability based on the student’s responses to preceding questions.
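The item-selection idea behind CAT can be sketched in a few lines; the function names and the simple up-or-down ability update below are illustrative assumptions (real CAT systems typically use item response theory), not any particular vendor’s algorithm:

```python
# Hypothetical CAT sketch: after each response, nudge the ability
# estimate and pick the unused item whose difficulty is closest
# to it. Real systems use item response theory; this shows only
# the core adaptive-selection idea.
def next_item(items, ability, used):
    # items: {item_id: difficulty}; choose the closest unused item
    candidates = {i: d for i, d in items.items() if i not in used}
    return min(candidates, key=lambda i: abs(candidates[i] - ability))


def update_ability(ability, correct, step=0.5):
    # Move the estimate up after a correct answer, down after a miss
    return ability + step if correct else ability - step


items = {"easy": -1.0, "medium": 0.0, "hard": 1.0}
ability = 0.0
used = set()
for correct in [True, True]:
    item = next_item(items, ability, used)
    used.add(item)
    ability = update_ability(ability, correct)
print(sorted(used), ability)  # two correct answers push the test harder
```

In this toy run, two correct answers move the student from the medium item to the hard one, which is the sense in which the computer “creates a unique test for each student.”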

While all of these uses of technology in assessment seem promising, many educators are particularly intrigued by exercises like IMMEX because of the insight they offer into student learning styles.

“Eighty percent of solving a problem is thinking about it in the right way,” says Robert J. Mislevy, a distinguished research scientist at the Educational Testing Service in Princeton, N.J., who has teamed up with the IMMEX project staff on a National Science Foundation grant proposal.

“It’s that 80 percent that’s really the name of the game in education,” he adds. “Traditional assessment, at least in part because of technical limitations, hasn’t really been able to provide much information about that.”

IMMEX is “probably pretty unique and pretty intriguing,” says Alan Lesgold, a professor of psychology and intelligent systems at the University of Pittsburgh. “What Ron [Stevens] has done that is rather impressive is to realize there are clear patterns to complex problem-solving activities, and you can learn to discern the patterns exhibited by people that have different strategies for problem-solving.”

Assessment programs like IMMEX also raise the all-important question of whether students learn more, or differently, when they use computers.

Some educational researchers believe that traditional forms of assessment can’t adequately measure student achievement on performance tasks like multimedia presentations. Research on computer-based assessments, very little of which has been conducted so far, could begin to address whether they more accurately reflect students’ abilities.

In the meantime, Dallas’ students, who come from all over the city to Palisades Charter’s math and science magnet program here in this affluent neighborhood, clearly get into the True Roots program. Two boys, partners during the lesson, erupt in a triumphant “Yes!” when they solve the problem correctly.

The students say it’s faster and easier to run a problem on the computer instead of having to shuffle through papers for a traditional exercise, and they seem to appreciate the intellectual challenge of IMMEX. As Tennille Hyde, 15, says, “It puts your brain to work.”

Dallas, too, is enthusiastic about the potential of IMMEX. Her only regret is that, though she has wanted her students to use the program for years, she hasn’t had the computers to do it until now.

The 20 laptops Dallas used for this three-day lesson have been borrowed from another school, Francis Polytechnic Senior High School in Sun Valley, where teachers incorporate IMMEX problem-solving techniques throughout the curriculum.

Another potential barrier for teachers wanting to use IMMEX is the amount of preparation required. Training on how to use the electronic problems can mean a significant time commitment-one that not everyone is willing to make, IMMEX staff members acknowledge.

But when they do, and have the equipment they need, they love it, Dallas and several other teachers who have used IMMEX say.

“I think it’s fantastic, and I think it’ll be an extremely valuable tool,” Dallas says. “If I could get these computers, I’d do it once a week.”

A version of this article appeared in the October 01, 1998 edition of Education Week