
Hard Data, Soft Skills

By Contributing Blogger — December 05, 2014 7 min read

This post is by Adam Carter, Academics Team Lead for Summit Public Schools.

This is a playlist on Electromagnetic Radiation. Here’s a picture of a portion of it:

[Image: a portion of the Electromagnetic Radiation playlist]

This playlist represents an important learning resource for all Summit Public Schools students, and, like the hundreds of other playlists our teachers have developed and curated, it is free and publicly available. Students navigate important content knowledge within these playlists. They can studiously examine every resource on a playlist, or not; they can take diagnostic assessments and checks for understanding, or not; they can choose to take a content assessment, or not; they can focus on the content knowledge most important for the cognitively rich projects they are working through, or not. Students have tremendous agency in what they learn, how they learn, and when they learn.

But as we all know, all choices are not of equal merit.

In the visual above, you can see a sample student’s view of the ninth grade curriculum and how the “Electromagnetic Radiation” playlist fits into our project-based curriculum. The red box around “Electromagnetic Radiation” indicates that the student has not yet passed that particular focus area, which is more important than the focus area below it, “Applications of Electromagnetic Waves.” You can see that both of these content focus areas are linked to the “Electromagnetic Wave Application Research Video” project, which is due on December 19th. Students have a window into how every aspect of our curriculum fits together toward college readiness.

What you can’t see--and what students can’t, either--is the important picture that emerges as they access these learning resources.

This blog is not about students acquiring content knowledge, or about students developing cognitive skills (two essential elements of college readiness).

It’s not about playlists and content assessments, or about project-based learning. It’s about something that is potentially far more powerful--about what I consider to be one of the most interesting and important questions in education today:

How do we track, assess, and report Habits of Success most effectively so that we can support each student’s development?

Habits of Success, or “non-cognitive skills” (though they’re widely acknowledged to be rooted in cognitive processes), go by many names and encompass a broad set of skills essential for success. For now, however, I’ll focus on one--the ability to drive one’s learning, or what we at Summit call “self-directed learning.” Summit, drawing on the work of David Yeager of the University of Texas at Austin, breaks self-directed learning into five key behaviors our students develop, practice, and model:

  1. Productive Persistence
  2. Appropriate Help Seeking
  3. Strategy Shifting
  4. Challenge Seeking
  5. Response to Setbacks

As you think about your own personal and professional life, you will likely recall times in which you exhibited each of these behaviors and were served well as a result. You formed a study group in what seemed like an impossible Organic Chemistry class in your junior year of college. A new boss required an entirely new approach to office communication--you had to figure it out. A particularly important but especially challenging project that you opted to lead forced you to encounter a variety of obstacles, which you met with tenacity and humility.

But how do you know when someone’s “got it”? And how do you know what to do when someone doesn’t have it?

A common answer in schools has been to survey students, families, and teachers. We have used a variety of surveys, and we continue to value the results of our Youth Truth survey. These surveys offer information essential for evaluating programs, for collegial coaching conversations and for resource allocation. But they don’t help us answer the question of how we effectively track, assess, and report Habits of Success so that we can support each student’s development.

Another answer--one that is far more involved--includes the creation of systems in which normed professionals engage in something like calibrated 360-degree reviews of each student. To what extent does this student appropriately seek challenges, and what is the evidence of this Habit of Success? While immensely valuable, this time-consuming process is difficult to operationalize across schools, and it can lead to specious conclusions if not done well and done consistently. It is likely most meaningful as a learning process--for adults and for students--rather than as a method of assessment.

Other possible solutions--ones akin to ClassDojo, Kickboard, or a student-friendly version of LinkedIn endorsements--are interesting, but rely on strict norming and consistent use. They often represent an “add-on” to the already busy job of teachers, and can lead to compliant behavior without the real development of Habits of Success. These systems, as useful as they are, can be gamed.

One relatively untapped source of information is the rich data set produced by students engaged in blended learning. This lengthy report from RAND, titled “Measuring Hard-to-Measure Student Competencies,” goes into great depth about various ways to measure Habits of Success, and includes this brief, but important, section:

Existing data could be mined to measure some intrapersonal competencies. Educational data systems that track attendance, course taking, behavior, grades, etc. are another source of data that can be used as the basis for measuring some competencies. ... Such archival data can also be used to capture competencies like persistence, educational aspirations, etc.

I believe that the most promising answer to the question of how to effectively track, assess, and report Habits of Success in order to support individual students at scale lies in understanding the existing evidence of student self-directed behaviors. Much of this evidence already exists--we simply need to use it to effectively trigger support for students.

A team at Summit, along with various partners, has begun to engage in the work of finding metrics that represent reliable evidence of self-directed learning. It must be noted that these metrics are, theoretically, proxies for the Habits of Success we want to foster in our students.

Here are two examples:

  1. If, when a student begins a playlist, her first click is on the diagnostic assessment and her second click is on a learning resource in an area the diagnostic tells her she has not yet mastered, then she is appropriately help-seeking.
  2. If a student submits all steps of a project complete, on time, and in the appropriate order (e.g., a student brainstorms a thesis, then develops a thesis, then outlines an essay, then submits a rough draft, then submits a final draft), then he is productively persisting.
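Heuristics like the first one above amount to simple pattern checks over a student’s clickstream. As a rough sketch of what such a check might look like (the event labels, field names, and data shapes here are illustrative assumptions, not Summit’s actual logging schema):

```python
# Illustrative sketch: detecting "appropriate help-seeking" from an
# ordered playlist clickstream. Event labels are hypothetical.

def is_appropriate_help_seeking(clicks, unmastered_areas):
    """clicks: ordered list of (event_type, focus_area) tuples for one
    playlist session. unmastered_areas: areas flagged by the diagnostic."""
    if len(clicks) < 2:
        return False
    first_type, _ = clicks[0]
    second_type, second_area = clicks[1]
    # First click should be the diagnostic assessment; the second should
    # be a learning resource targeting an area not yet mastered.
    return (first_type == "diagnostic_assessment"
            and second_type == "learning_resource"
            and second_area in unmastered_areas)

session = [
    ("diagnostic_assessment", None),
    ("learning_resource", "wave_properties"),
    ("content_assessment", None),
]
print(is_appropriate_help_seeking(session, {"wave_properties"}))  # True
```

A real system would of course need to validate rules like this against outcomes before treating them as reliable proxies.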

While we still have a tremendous amount of work to do, certain pictures are emerging from the data. For instance, both of the above hypotheses appear, given the information we have, to be correct. Students who perform better on content assessments tend to test themselves, then engage with a diversity of learning resources, and then take the content assessment one or two times. Students who perform much worse take the content assessment again and again with little attention to learning resources--attempts to game the system rather than to learn--or they click slavishly on every resource on the playlist in order, down the row. The most effective and efficient learners structure their learning as “assessment sandwiches”:

  • Assessment
  • Targeted Learning Resources
  • Assessment
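The sandwich pattern itself is easy to describe formally: an assessment, one or more learning resources, then another assessment, and nothing else. A toy classifier (again using hypothetical event labels, not Summit’s actual schema) might look like this:

```python
# Illustrative sketch: classifying a session as an "assessment sandwich".
# Event labels are hypothetical.

def is_assessment_sandwich(event_types):
    """True if the ordered session is: assessment, then one or more
    learning resources, then assessment -- with nothing else in between."""
    if len(event_types) < 3:
        return False
    middle = event_types[1:-1]
    return (event_types[0] == "assessment"
            and event_types[-1] == "assessment"
            and all(e == "learning_resource" for e in middle))

print(is_assessment_sandwich(
    ["assessment", "learning_resource", "learning_resource", "assessment"]
))  # True
print(is_assessment_sandwich(
    ["assessment", "assessment", "assessment"]
))  # False: repeated test-taking with no learning resources in between
```

Sessions failing the second check are exactly the retake-heavy pattern described above, which is what makes a rule this simple potentially useful as a flag for intervention.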

Obviously, we have more work to do in figuring out which metrics are most aligned to the Habits of Success that drive students to learn.

But most importantly, using metrics like these to understand the patterns of behavior that separate our most self-directed learners from those who require the most scaffolding will allow us to intervene effectively with every student. We can target our interventions to the students who need them most.

And, because students don’t see the patterns left behind by every click of the mouse, we can coach them on how to learn without focusing directly on a discrete behavior. We don’t have to tell a student to assess-read-assess. We don’t need to use the term “assessment sandwich.” We can, however, implement mindset interventions and teach academic literacy strategies that we (and researchers like Yeager and Carol Dweck) believe will guide students towards the right behaviors.

Ultimately, we can teach students how to learn.

And how will we know if these interventions work? Well, the answer’s already there, in the breadcrumb trail left by time spent on the Electromagnetic Radiation diagnostic assessment.

To explore the Summit experience, visit here.

Note: Summit is partnering with a cohort of schools to bring next generation classrooms to students across the nation through the Summit Basecamp program. We are offering our Personalized Learning Plan (PLP) platform to grade-level teams committed to fostering the Habits of Success, Cognitive Skills and Content Knowledge that will propel all students to college readiness. If you are interested in learning more, please visit here.

The opinions expressed in Learning Deeply are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.