Education Opinion

Looking at Class Sets of Work with Math Mistakes

By Justin Reich — April 25, 2015 5 min read

Last week, I had the great pleasure of having Michael Pershan of Math Mistakes visit my MIT Introduction to Education class to discuss the joys of looking closely at student data. Last year, Michael helped me run a lesson using material from his site, and this year I was lucky that he was in town for the National Council of Teachers of Mathematics conference.

Most of Math Mistakes is organized around single problems, and we used three of those single problems last year. Michael has been noodling recently over the limitations of single problems and the value of class sets, so this year, just before class, he posted 14 fourth graders' responses to four fraction problems. All four problems ask which of a pair of fractions is larger: 2/3 and 3/2, 2/3 and 3/4, 2/5 and 3/10, and finally 3/7 and 2/5.

To guide our inquiry in class, we borrowed from the ATLAS Looking at Student Work Protocol developed by the National School Reform Faculty. This protocol offers a specific series of steps for analyzing student work. We amended the protocol slightly to serve our particular purposes, but the ATLAS protocol offered a foundation.

As a starting point, our protocol asks reviewers (what I'll call people looking at student work, in this case, my own MIT students) to make two important leaps. First, we ask reviewers to keep their contextual knowledge to a minimum. Reviewers look at work without a detailed knowledge of the lesson, the unit, or the composition of the classroom. The idea is to look at the student work on its own terms. Next, we ask reviewers to assume good faith of students. Assume that the student is doing their very best thinking and putting their best effort into the work. Not every student does, but when reviewers go down the path of "this kid's not really trying," it distracts from the most interesting question: "well, what is this kid trying to do?"

So we had a class set of 56 responses: 14 students completing 4 problems each. We then systematically examined the class set in four steps. Each step moves progressively up the "ladder of inference." We try to begin with low-inference observations before moving on to discussing correlations, hypotheses, and interventions.

First, we asked reviewers (working in groups of four) to notice salient details. This is an extremely difficult step without practice. The goal is simply to observe what students were doing. “This student drew a circle. This student drew a rectangle. This student started with a circle, crossed it out, then drew a rectangle.” Noticing without judging is actually quite difficult.

The second step was to observe patterns. Several groups noticed that students all started with circular area representations of fractions. With fifths and sevenths, however, circles proved difficult to divide evenly (unless you grew up with six brothers and sisters, it’s basically impossible to divide a circle into even sevenths), so students often abandoned circular representations for rectangular ones in the last two problems. This was quite a realization for groups; it wasn’t totally obvious from looking at the problems, but the pattern seems quite clear in retrospect.

Reviewers also noticed that many students answered "The Same" for problem four. Those of you who can convert fractions to decimals might note that 3/7 is in fact bigger than 2/5, but not by much. Were the students mostly right, or just wrong? If I asked you "would you like 2/5ths or 3/7ths of a candy bar?", would you care which one you got?
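For readers who want to check just how close these two fractions are, Python's standard `fractions` module makes the comparison exact. (This is a quick sketch for checking the arithmetic, not part of the original lesson.)

```python
from fractions import Fraction

# Problem four: which is larger, 3/7 or 2/5?
a = Fraction(3, 7)
b = Fraction(2, 5)

print(a > b)       # True: 3/7 is the larger fraction
print(a - b)       # 1/35, the sliver by which it wins
print(float(a), float(b))
```

Over a common denominator, 3/7 is 15/35 and 2/5 is 14/35, so the students who answered "The Same" were off by a single thirty-fifth of a candy bar.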

The third step is to hypothesize student understanding, or as the ATLAS Looking at Student Work Protocol puts it, to ask the question "From the student's perspective, what is the student working on?" Here we tried to figure out which students were applying ideas of least common denominator, which were trying to ensure accurate representations, which seemed to be following some kind of protocol, and which seemed to be winging it. The most fascinating debate of this step centered on one student who had drawn a rectangular representation of fifths, and then a second rectangular representation of tenths created by replicating the fifths and drawing a line through the middle to create tenths. Did the student go into the problem knowing that particular strategy, or did she draw a line through the second rectangle in a flash of insight? At this stage, reviewers begin to realize how easy it is to generate reasonable competing hypotheses of student thinking. We know more than we did before starting the exercise, but we still can't say for sure what students are thinking.

Finally, we began brainstorming possible interventions. We had a long discussion about what our goals were. Of course, in the end, we want students to convert to the lowest common denominator and compare with automaticity, but we generally agreed with Michael that it wasn't enough to just teach them the algorithm; we wanted them to develop some intuition about what fractions are and why the algorithm works. Generally, the group settled on the need for more arrays in student practice, and on teaching a particular graphical protocol in which students build a shared array from two fractions by drawing horizontal lines in a rectangle for one denominator and vertical lines in the same rectangle for the other.
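The grid protocol has a simple arithmetic shadow: slicing one rectangle into b rows and d columns produces b×d equal cells, of which a/b covers a×d cells and c/d covers c×b cells, so comparing the fractions becomes counting cells. A minimal sketch of that correspondence (the function name and output format are my own, not from the class):

```python
def compare_by_grid(a, b, c, d):
    """Compare a/b and c/d with the rectangle-grid model:
    b horizontal slices for the first denominator and d vertical
    slices for the second yield b*d equal cells in one rectangle."""
    cells = b * d     # total cells in the shared rectangle
    first = a * d     # cells covered by a/b (a rows of d cells each)
    second = c * b    # cells covered by c/d (c columns of b cells each)
    if first > second:
        return f"{a}/{b} is larger ({first} vs {second} of {cells} cells)"
    if first < second:
        return f"{c}/{d} is larger ({second} vs {first} of {cells} cells)"
    return f"{a}/{b} and {c}/{d} are the same ({first} of {cells} cells)"

print(compare_by_grid(3, 7, 2, 5))  # 3/7 is larger (15 vs 14 of 35 cells)
```

The cell counts are exactly the cross-products a×d and c×b, which is one way to see why the drawing protocol and the common-denominator algorithm always agree.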

We asked Michael what he planned to do next, and a playfully pained expression fell over his face. He didn’t know. There was no right answer. He had been doing lots of array work with them, with arrays of 24ths, and they still hadn’t developed a good intuition for comparing area models. It was wonderfully unclear exactly what to do next, but of course by Sunday night, he would have to pick something.

In the end, I had 25 MIT undergraduates spend 75 minutes discussing the intricacies of fourth-grade mathematical thinking by looking exclusively at answers to four problems. What the exercise suggests, more than anything else, is that the very close examination of student work reveals a rich complexity in their thinking, a complexity that raises as many questions as answers.

What the experience teaches us is that anyone who wants to engage in conversations about this rich complexity now has friends to do it with. The Math TwitterBlogoSphere, hashtag #MTBoS, is filled with people eager to join in conversations about lesson design, student thinking, and evidence from student work. Having Michael join me in class was a great introduction, for my pre-service teachers, to all of the wonderful educators out there online, excited to support each other's growth.

For regular updates, follow me on Twitter at @bjfr and for my publications, C.V., and online portfolio, visit EdTechResearcher.

The opinions expressed in EdTech Researcher are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.

