Homegrown Tests Measure Core Critical-Reading Skills
Marcia L. Riddick, a 4th grade teacher at the F. Lyman Winship Elementary School here, remembers her initial skepticism about the new assessment tools developed by the nonprofit Boston Plan for Excellence.
“I just heard, ‘another test,’ ” she recalled. “I was thinking, would it be helpful? Was it for us or for the district?”
But today Ms. Riddick, who has taught for 11 years, is a convert. “This is a test that’s for teachers,” she said. “It guides our instruction.”
FAST-R, or Formative Assessments of Student Thinking in Reading, was developed by the BPE, a local education foundation, to provide teachers with information about what students are thinking when they try to find evidence and make inferences based on texts. Those two core reading skills are a central part of the state testing program, the Massachusetts Comprehensive Assessment System, or MCAS, and vital to youngsters’ ability to read for understanding.
“They’re not testing skills,” explained Lisa Lineweaver, a senior program officer at the foundation. “They’re core critical-reading skills. If a teacher is putting more time into teaching those, she’s going to be getting kids to be more proficient readers.”
In 2003, the Boston Plan for Excellence contracted with two doctoral students at Harvard University’s graduate school of education to develop the informal, no-stakes reading assessments. The spur came from the foundation’s work with a group of 25 principals who complained that, despite the large number of tests given in the school district, none of them helped improve instruction on a daily basis.
Piloted in 18 schools in the 2003-04 school year, with support from the Menlo Park, Calif.-based William and Flora Hewlett Foundation, FAST-R is now used in more than 50 schools in the 57,000-student Boston district.
No Fixed Test Dates
Each mini-assessment consists of a short reading passage, many drawn from released MCAS documents, followed by 10 multiple-choice questions and an open-response item.
The questions are designed to probe different types and levels of thinking, including whether students can identify evidence explicitly stated in the text, determine the implicit meaning from words in context, and gauge the meaning of a whole passage.
Students’ answers are scored in one of four categories: correct; a “near miss,” meaning the answer is true based on the text but irrelevant to the question; a “misread,” meaning it is based on a misunderstanding of the text; or “out of bounds,” meaning it is not based on the text but is plausible given the student’s own prior knowledge.
A teacher receives a color-coded graph for an entire class that shows the pattern of students’ answers next to each question. The graph includes student-level data summaries that show how individual students and the whole class did across all the sets of questions, with information on each student’s race and gender, and whether the student is receiving special education or services for English-language learners.
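As a rough illustration only (this is not BPE’s actual reporting software; the category names follow the article, but the input format is hypothetical), tallying a class’s answers into FAST-R’s four response categories for each question might look like this:

```python
from collections import Counter

# The four FAST-R response categories described in the article.
CATEGORIES = ["correct", "near_miss", "misread", "out_of_bounds"]

def tally_by_question(responses):
    """Count how many students fell into each category on each question.

    `responses` maps a student identifier to a list of category codes,
    one per question (a hypothetical input format for illustration).
    """
    n_questions = len(next(iter(responses.values())))
    tallies = [Counter() for _ in range(n_questions)]
    for answers in responses.values():
        for q, category in enumerate(answers):
            tallies[q][category] += 1
    return tallies

# A hypothetical class of three students answering two questions.
sample = {
    "student_a": ["correct", "near_miss"],
    "student_b": ["correct", "misread"],
    "student_c": ["out_of_bounds", "correct"],
}

for q, counts in enumerate(tally_by_question(sample), start=1):
    print(f"Question {q}: {dict(counts)}")
```

A per-question breakdown like this is what lets a teacher see, at a glance, whether a class is mostly misreading the passage or drifting “out of bounds” into prior knowledge.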
Teacher’s guides to each of the 55 question sets for grades 3-12 also note the grade range and describe the difficulty of the reading passages in terms of structure, purpose, vocabulary, style, and other features. Annotations highlight aspects of each passage that are key to comprehension or might give students trouble.
Schools volunteer for the program. Scores are not reported to the central office, and the data are not used to measure school progress. Instead, teams of teachers at each school work with a “data coach,” a veteran teacher employed by the Boston Plan for Excellence, on how to give the exams, analyze results, and adjust instruction.
No fixed rules say when or how teachers give or use the assessments. BPE and district staff members do all data entry, and most teachers receive their data reports one or two weeks after FAST-R is given.
Zeroing In on Learning
At a recent meeting at Winship Elementary School, a roughly 200-student school on a hilltop overlooking the traditionally working-class Brighton neighborhood, a team of teachers and Principal Antonio N. Barbosa met with FAST-R data coach Paula Morgado to discuss what actions they’d taken based on their most recent results and to plan next steps.
Ms. Riddick recounted how, after getting results from the last reading passage, “I wanted to find out what the students were thinking.” So she went over each question with them and took notes about what they said.
“A lot of them, when they went back to the passage, they saw one clue or one piece of information and, based on that, they chose an answer,” she explained.
Now, she’s trying to encourage them to base their responses on multiple sources of evidence in the text. They’ve practiced some guided reading as a group, so she can model that thinking out loud. And she’s gotten a book on how to teach inferences using higher-level thinking. “I just want to push them a little,” she said.
Ms. Morgado encouraged her and suggested that next time she pick a reading passage that includes more questions requiring students to make inferences from the text, and that is at a slightly higher reading level.
The team also discussed the progress of individual students, whose responses they had been tracking over time, and whether their skills were improving based on the interventions tried. In addition to the assessments themselves, the Boston Plan for Excellence has developed a number of tools to help teachers focus on individual students’ learning.
A “learning-profile worksheet,” for example, helps a teacher gather data about a student’s reading and thinking from at least four different sources, such as student work, observations, conversations with parents, and other assessments, in order to identify patterns.
A “learning-trajectory worksheet” encourages a teacher to anticipate how a student will handle a FAST-R passage before it’s given, identify evidence that confirms or contradicts that prediction, plan for follow-up work, and then track changes in the student’s skills.
But the key, according to Principal Barbosa and others, is not the tools themselves, but the ongoing conversations that teachers have with one another and with the data coaches.
“Just giving out the piece of paper is not going to cut it,” said Mr. Barbosa. “The conversations that take place with the teachers, particularly across grade levels, and with the support teachers there, are then beginning to zero in on student learning.”
‘Opening Up Assessment’
Richard J. Murnane, a professor at Harvard’s education school who has worked with the Boston district, agrees. “I think so much depends on how skilled the faculty is in knowing what to do with the results,” he said.
“The key is getting coaches to see that a key part of the contribution they can and should make is to help teachers make constructive use of these formative assessments,” Mr. Murnane added. “I think that’s really an enormous challenge,” particularly figuring out how to use the detailed information generated to alter instruction and make it better.
Ms. Morgado says her role, and the frequency of her site visits, changes depending on both a school’s needs and its capacity. “A school that has the principal sitting at the meetings, or the literacy coach or an assistant principal, those seem to be the schools where everyone is on the same page, and there are conversations about student work and instruction across the board, not just in a FAST-R meeting,” she said.
The Boston Plan for Excellence has also begun to offer professional development at schools—for entire staffs or smaller groups—on topics that often arise in FAST-R implementation. The foundation also intends to convene a seminar series for staff members from many schools to think about common challenges, such as working with English-language learners, and to share how they’ve been using FAST-R.
Evaluations by the Cambridge, Mass.-based Education Matters, a nonprofit evaluation firm, have found that teachers, principals, and literacy coaches are enthusiastic about the program and the BPE’s support for its use.
Even so, a big challenge is to help new users understand that FAST-R is about more than test-prep. “We really do believe that FAST-R is giving teachers information to help them get kids to the next level,” said Ellen C. Guiney, the executive director of the BPE, “to have the skills that enable kids to get better scores and that they can transfer to their other reading.”
“That said,” she added, “if you were to do a survey of teachers, a lot of them are still doing it because ‘It helps my kids on MCAS.’ When there’s such a dominance of the standardized test, this is an uphill climb.”
The New York City-based Manpower Demonstration Research Corporation, or MDRC, is conducting a multiyear evaluation of FAST-R, based on a randomly selected sample of Boston elementary schools, to track changes in student performance on the MCAS and other measures.
“I’m sure if FAST-R doesn’t show any help on MCAS, it will probably die, which would be too bad,” said Ms. Guiney, “because I do think formative assessment is absolutely a part of the answer to the puzzle of instructional improvement.”
Vol. 26, Issue 35, Pages 32-33