San Diego teacher Ashley Hermsmeier has written a provocative post in which she observes that her students have no particular reason to care about the scores they receive on standardized tests. This is a big problem for educators, whose reputations and even livelihoods increasingly depend on these scores. She writes:
The test scores do not affect teens' lives in any tangible way and therefore students do not care about the tests. Threatening that something "might" happen "next year" doesn't even register on the teenage radar. They know their score will never be seen by a college admissions counselor, will not affect their grade and will not go on their transcripts. This translates into complete apathy toward the tests. Students can bubble in whatever they want on the answer sheet with no personal consequences, making (often inappropriate) designs out of the bubbles on their answer sheets. (Yes, this happens each year.)
Her solution to this is to try to MAKE them care. She suggests:
The simplest and fastest solution is to put the scores on student transcripts. Do this and we might also be surprised to find that kids in this country are smarter than current test scores reflect and that education in this country may not be as far behind other countries as we think.
And then presumably the colleges might start taking these tests into account when they make admissions decisions? The trouble is that I suspect few of the students making original designs in the bubbles on their answer sheets spend March wondering whether they were accepted to Harvard, so this would not matter much to them anyway.
But Ms. Hermsmeier has raised an important problem. In California our students take the California Standards Tests every spring. In the fall parents receive a letter informing them of their child’s scores. Teachers receive colorful spreadsheets showing which students tested at proficient or below basic. But there is no actual consequence for the students. They advance to the next grade (or are held back) based on the grades they earn from their teachers. Their test scores are available so we can see how well they learned, and perhaps how well we taught, but they are not used for any other purpose that intersects with the students’ lives.
For me the solution Ms. Hermsmeier arrives at is half right. If test scores are such valuable pieces of information that we are willing to base a teacher’s evaluation and pay on them, then they ought to matter to the person actually responsible for generating them.
But what if the students are actually right to discount the value of these tests? What if the tests, even when taken seriously, are not an accurate reflection of what students know and can do? The more pressure is placed on these tests, the more incumbent it becomes on us to convince students to try their hardest on them. By middle school, students have taken these tests for years and know how they are likely to score. So they make a decision: I do not BELIEVE in what this test has to tell me about myself. I do not believe that I am BELOW BASIC.
My friend David Cohen wrote a fantastic post about this last week on the InterACT blog, explaining some of the troubles his students have with the tests:
Meet Burris. Poor Burris has had a rough time of it, changing schools, in and out of foster care, and probably dealing with some health or cognition issues that interfere with academic growth, but it's hard to tell because the records aren't complete and his current guardian is not pursuing any diagnoses or support. Burris is a nice enough kid, and will put forth a decent effort when the task seems relevant and non-threatening. One area where Burris is pretty smart though is in understanding schools. He's seen enough of them to know that tests are traps: all the schools use tests to find out how much you don't know yet and to punish you for it by putting you in classes you don't want to take. If you can even get him in the room with the test in front of him, he might go through the motions, but he will not risk his best effort only to be trapped by it again. Now that he's in tenth grade and new to my school, I have two options. I can tell Burris he's wrong about schools and tests and beg him to do his best - and blow any chance I have of gaining his trust, because you can't make a young adult believe something that runs counter to a decade's worth of life experience. Or I can tell Burris I understand his point of view, and maintain a relationship that will yield some results on work other than the tests. So, while I will maintain high expectations for Burris in the classroom, I honestly can't expect his test results will mean much. (Of course, I'll never see those test results anyways, because they don't come back during the school year and I won't be teaching Burris next year). Kids like Burris might make up another ten percent of my student load.
David goes on to describe some other fascinating varieties of students, each of which presents a different challenge when we want to measure their performance using that precious test.
While students like Burris make up perhaps ten percent of David’s population in affluent Palo Alto, in some Oakland schools students with this profile are far more numerous. David captures the teacher’s dilemma perfectly. Do we align our own values with those embodied in the tests, and festoon our walls with posters urging our students to invest in these measurement devices? I have a very hard time taking this path, because I do not think it will serve students like Burris well.
What we need to do is make our assessments connect to skills and knowledge that intersect with our students’ lives. I do not believe it should be our mission to get our students to care more about the tests the policy makers value. Instead, it should be our mission to get our policy makers to care more about how our students actually learn.
What do you think? Should we use all our powers to try to make students care about their test scores? Or does this do them more harm than good?