Future Teachers' Skills Gauged in Classroom Simulations
In a move to provide early diagnostic testing of the skill levels of future teachers, three Pennsylvania universities have created a set of simulated classroom exercises that assess the strengths and weaknesses of college education majors.
The Pre-Teacher Assessment center, as the new testing effort is termed, is modeled after the assessment centers used in private industry. Tasks students will be asked to perform range from critiquing a school museum to delivering a 15-minute lesson.
As many as 150 college sophomores at Indiana, Millersville, and Slippery Rock Universities in Pennsylvania are scheduled to take the one-and-a-half-day exam this year.
The three former normal schools created the new center with the assistance of Development Dimensions International, one of the nation's largest management-development and personnel-selection firms.
According to Robert E. Millward, the project's director, the main purpose of the program is diagnostic--and not to screen students into or out of teacher-education programs.
"We're interested in what basic skills these sophomores have for teaching, and where they need work," he said. "I wasn't convinced that the paper-and-pencil competency tests that were rapidly being put together were going to be the answer."
Even if the competency tests could predict effective teaching, he said, they are administered too late in a student's college career to make much difference.
By measuring students' performance as sophomores, he argued, the new assessment center will provide ample time for remediation and additional coursework.
Bruce M. Ashton, a senior consultant with DDI, said the "major distinction" between the assessment center and paper-and-pencil tests is that the center "allows you to look at relatively realistic job-performance behaviors."
"We're asking students to actually perform samples or representative activities that will show us directly the skills involved in the profession," he said.
The first 12 students in the program, all volunteers, are scheduled to be tested this month at the center at Indiana University of Pennsylvania, where Mr. Millward is a professor of education.
Creation of the center parallels work in several other states--including California, Connecticut, and Georgia--to develop new methods of measuring teachers' knowledge and skills.
At Stanford University, for instance, Professor of Education Lee S. Shulman is creating sample exercises to guide the creation of proposed national teacher-certification tests.
Like the Pre-Teacher Assessment center, the Stanford project is using simulated classroom exercises to tap performance of such representative--but seldom measured--tasks as selecting a textbook or evaluating a lesson plan.
Mr. Shulman's project, however, presumes that the skills needed to teach vary by subject, with those needed for science differing from those needed for English or mathematics. He has designed his exercises accordingly.
In contrast, the Pennsylvania center focuses on more generic, "content neutral" skills that Mr. Millward insists all teachers need.
"Our contention is that you have to first dwell on very specific basic skills ... and then expand to content," he said.
He added that the minimal emphasis on content also takes into account sophomores' limited teaching skills and knowledge.
The assessment center will measure students' strengths and weaknesses on 13 different teaching "dimensions," ranging from planning and organization to oral presentation and tolerance for stress.
Each dimension consists of a cluster of related behaviors, which were chosen based on a review of the research on effective teaching. Exercises were developed to elicit students' performance on each dimension at least twice over a one-and-a-half-day period.
Eventually, Mr. Millward said, he would like to expand the student-training modules that will follow the assessment to emphasize the relationship between content-knowledge and skills.
For now, students will be assessed on their performance in the following four simulations:
School Museum. Students are given information about a "boring and outdated" museum exhibit and asked how they would improve it and justify its funding.
Each student will receive a packet containing letters, budgets, attendance records, a list of exhibits, and related data. Within a two-hour limit, they must prepare a written report making specific recommendations about the museum's future and then present those recommendations orally.
Film Vignettes. Students view a series of short films that present problems typically encountered by teachers. In one scene, for example, a teacher sees a student cheating on a science quiz. The frame freezes and the student is asked, "What would you do in this situation?" Students are expected to answer each question in two or three sentences.
Education Fair. Students receive an "in basket" that presents a series of problems surrounding the creation of a districtwide school fair, such as scheduling conflicts and the identification of resources. They have two hours to prepare a written solution.
Actual Teaching. Students receive the materials needed to plan a simple, 15-minute lesson, which they must present to the assessor.
According to Mr. Millward, the simulations will provide a much sharper focus on students' strengths and weaknesses than is normally obtained via student-teaching experiences.
Most teachers, he noted, are not trained to provide the kind of close observation and supervision that the assessment center will.
In addition, Mr. Ashton said, the structured exercises zero in on the "most critical" activities involved in teaching. Observing the same skills at the school site might take days or even weeks.
Research in industrial settings has also found, he said, that assessment centers do not have the adverse impact on minority candidates that many paper-and-pencil tests do.
Equally important, Mr. Millward suggested, the simulations avoid throwing students into actual teaching experiences too soon--and then measuring the results.
That, he argued, would be like "attempting to screen potential pilots by actually putting them into the cockpit."
Common in Industry
Although new to education, assessment centers have been used by private industry for at least 30 years. But the time and money required to operate such centers have prevented their widespread use by schools, Mr. Millward said.
The typical corporate assessment center provides one evaluator for every two examinees and can cost as much as $2,000 per person.
In contrast, the Pre-Teacher Assessment center is designed to evaluate 24 college students at once, with the aid of three to six evaluators. It will cost only $50 to $70 per student, Mr. Millward said, which will be paid by the universities.
Planning for the center--now in its third year--was supported by the U.S. Education Department and the Matsushita Foundation. Each of the three universities also agreed to contribute between $15,000 and $20,000 a year for the first three years of the project's development.
The center will employ a simplified rating instrument designed to avoid the open-ended narratives and lengthy consensus-building discussions typically required of evaluators. It will not give students a total score but will provide a profile of their skills on each of the 13 dimensions.
The assessors will rate the presence or absence of each behavior within a given dimension, along with a briefly worded description. They will then provide an overall rating for that dimension on a scale of 1 to 5.
Mr. Millward said he hoped students would be required to participate in the assessment procedure as part of their introductory methods course or early field experience.
For now, only volunteers are involved. The results will be given only to the students, who can then choose to share them with their professors.
The universities plan to spend the next four years evaluating the validity and reliability of both the assessment and the training modules that will accompany it. The consortium also hopes to market the assessment to other universities as a way to strengthen teacher-education programs.