
What HR Should Know About Value-Added Data

By Peter Prokesch — June 02, 2011

As a growing number of states move toward legislation that would institute teacher merit pay, the debate around whether and how to use student test scores in high-stakes staffing decisions has become even more hotly contested. The majority of merit pay initiatives, such as those recently proposed in Ohio and Florida, rely to some extent on value-added estimation, a method of measuring a teacher’s impact by tracking student growth on test scores from year to year.

We recently exchanged e-mails with Steven Glazerman, a Senior Fellow at the policy research group Mathematica. Glazerman specializes in teacher recruitment, performance management, professional development, and compensation. According to Glazerman, a strong understanding of the constructive uses and limitations of value-added data can prove beneficial for district-level human resources practitioners.

How can your research about value-added measures benefit human resources departments as districts implement new evaluation models?

As all school district HR professionals know, “teacher performance” is very complicated and multi-dimensional. As new ideas like value-added are introduced to enhance the way teachers are evaluated, the main question should not be: “Is this good or bad?” It should be, “How can this tool help me add to what we already know about our teachers’ performance?”

What should HR managers know about value-added measures to improve their practices?

Educators probably know all of this intuitively, but the key insight that value-added provides is that a student test score, on its own, does not tell us much about a teacher’s performance. But when we think about how that same student or group of students would have fared with other teachers or in other school settings, we can learn how the adults are doing. For example, consider a teacher who performs miracles with severely disadvantaged middle school students. She might be responsible for moving her students from a 2nd grade reading level to a 5th grade reading level in one year, a sharp growth pattern that no other teacher in the past had been able to replicate. Yet without some notion of value-added, all people would see is a teacher whose kids were still not “proficient” by state standards, and hence failing. With a value-added mindset, we can ask: How much better (or worse) would that teacher’s students have done if they had an “average” teacher?
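
In practical terms, a value-added estimate compares how a teacher’s students actually scored with how similar students typically score given where they started. Below is a minimal sketch of that idea in Python; the scores, teacher labels, and the single-prior-score regression are all invented for illustration and are far simpler than the models districts actually use.

```python
import numpy as np

# Hypothetical data: prior-year and current-year test scores for eight
# students, split between two teachers. All numbers are made up.
prior   = np.array([310, 325, 300, 340, 315, 330, 305, 320])
current = np.array([355, 360, 330, 380, 340, 365, 320, 350])
teacher = np.array(["A", "A", "A", "A", "B", "B", "B", "B"])

# Predict each student's current score from the prior score. A simple
# linear fit stands in for the richer statistical models used in practice.
slope, intercept = np.polyfit(prior, current, 1)
predicted = intercept + slope * prior

# A teacher's value-added estimate is the average amount by which her
# students beat (or missed) the scores predicted for students like them.
for t in np.unique(teacher):
    effect = (current[teacher == t] - predicted[teacher == t]).mean()
    print(f"Teacher {t}: value-added estimate = {effect:+.1f} points")
```

A teacher with a positive estimate moved students further than their starting points would predict, even if the students’ absolute scores stayed below the “proficient” bar.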

And while tenure, layoff decisions, and compensation are receiving a lot of attention as policy levers, there may be other HR practices where value-added data can be very useful. These include targeting extra resources, assigning teachers to teams and grades, and identifying which grade-level or subject teams are working well together and which ones are not.

In what areas should HR be cautious about using value-added data?

HR should exercise the most caution with policies where the consequences of being wrong are the greatest. Or they should calibrate those policies so that the risks of misclassifying a teacher are mitigated as much as possible.

The bottom line is that policymakers have to ask their data analysts: What is the confidence interval? What is the probability that someone we classify as exceptional is not exceptional? What is the probability that someone we failed to classify as exceptional truly is exceptional? The major caveat: Don’t just accept a single number without some measure of certainty associated with that number.
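
To make those questions concrete, here is a minimal sketch using a hypothetical teacher whose value-added estimate, standard error, and district cutoff are invented numbers; it relies on a simple normal approximation, which is only a stand-in for how real models report uncertainty.

```python
from scipy.stats import norm

# Hypothetical numbers: a teacher's value-added point estimate, its
# standard error, and the cutoff a district uses to label "exceptional".
estimate, std_error, cutoff = 0.15, 0.10, 0.20

# A 95% confidence interval around the point estimate.
low, high = norm.interval(0.95, loc=estimate, scale=std_error)
print(f"95% confidence interval: ({low:.2f}, {high:.2f})")

# Probability that the teacher's true effect is above the cutoff even
# though the point estimate falls below it.
p_above = 1 - norm.cdf(cutoff, loc=estimate, scale=std_error)
print(f"P(true effect > {cutoff}) = {p_above:.0%}")
```

With these made-up numbers, a teacher who falls just below the cutoff still has roughly a 30 percent chance of truly being above it, which is exactly the kind of misclassification risk policymakers should ask about before attaching consequences.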

How will value-added data affect recruitment, staffing, and retention?

This depends on how the data are used. If the measures are misused, then teachers will be frustrated and policies based on these data will hurt recruitment, morale, and retention. On the other hand, if the data are used to help educators identify strengths and weaknesses, to target professional development, and to reward teachers whose work might otherwise go unnoticed, they could have strong positive effects throughout the system. Teachers would seek out opportunities to prove themselves and work with the students they are best able to reach, even if it means working with the most disadvantaged students. The most effective teachers would have the strongest incentive to stay and continue to post gains, while struggling teachers would have to find ways to generate better results or be assigned to a position where they have less influence over tested grades and subjects.

What trends, if any, exist in recruitment, staffing, and retention in districts using value-added data?

There are many initiatives going on, but I can describe one that we’re implementing in 10 districts around the country called the Talent Transfer Initiative. In this initiative, districts use value-added data to identify the highest-performing teachers and offer them $20,000 incentives to transfer to their lowest-performing schools for at least two years. The same program offers $10,000 in retention incentives for high value-added teachers already in those schools to remain there. This is one example, but there are programs in many places including Washington, D.C., Charlotte-Mecklenburg, N.C., and Houston, Tex., that use such data for recruitment and/or retention.

In what ways, if any, can HR prepare for these types of initiatives that use value-added data?

The most important step to take is to invest in data systems that match the complexity of instructional delivery. This requires integrating HR data and student data into a common source, or at least a secure but linkable database that is updated several times per year to capture student and teacher mobility as well as the complicated ways that teachers collaborate and that students take courses.
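
As one illustration of what a linkable database can look like at its simplest, the sketch below joins a hypothetical HR roster to a hypothetical student enrollment extract on shared keys; the field names, table layouts, and the use of pandas are assumptions for the example, not a prescribed design.

```python
import pandas as pd

# Hypothetical extracts, refreshed several times a year: an HR roster of
# teaching assignments and a student course-enrollment file.
roster = pd.DataFrame({
    "teacher_id": [101, 102],
    "school":     ["Adams MS", "Adams MS"],
    "assignment": ["Grade 6 Math", "Grade 7 Math"],
    "term":       ["Fall", "Fall"],
})
enrollment = pd.DataFrame({
    "student_id": [9001, 9002, 9003],
    "teacher_id": [101, 101, 102],
    "course":     ["Math 6", "Math 6", "Math 7"],
    "term":       ["Fall", "Fall", "Fall"],
})

# Joining on teacher_id and term keeps mid-year moves by students or
# teachers attached to the right classroom in the right part of the year.
linked = enrollment.merge(roster, on=["teacher_id", "term"], how="left")
print(linked)
```

Refreshing both extracts each term, rather than once a year, is what lets the linked table reflect the student and teacher mobility described above.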

HR professionals should have a say in this process and not just treat it as a technical function for the IT department. Those in charge of data collection and analysis need curriculum experts and HR experts to provide the specifications needed to set up the data systems so that the systems will focus on what matters.

What external support systems or programs can HR leverage to transition into, or improve upon, their use of value-added data?

An entire field of researchers, consultants, and tech companies is opening up with the capacity to assist districts and states in building this type of data infrastructure. Mathematica Policy Research, my organization, primarily focuses on research and evaluation, but we have been assisting some districts in their efforts to use value-added data successfully. Other research organizations, like the RAND Corporation, the Value Added Research Center at the University of Wisconsin, and the Center for Education Policy Research at Harvard, also work with individual districts or states. Then there are software firms like the SAS Institute and consulting firms that develop, package, and market off-the-shelf models for districts and states.

You did extensive research on the impact of the TAP system, which uses value-added measures, in Chicago. What did you learn from this model?

The TAP system requires that student performance be an important factor in the evaluation of teacher performance, but it does not require a specific value-added model or level of granularity (teacher, team, or school). In the case of Chicago TAP, the district was not able to link student performance to specific teachers well enough to feel ready to compute teacher-specific value-added estimates. Therefore, I wouldn’t call the Chicago TAP experience a good test of this concept, at least not in its first few years. I don’t know why it has taken so many years to solve this problem, but perhaps that just indicates the importance of the points I made earlier about investing in data systems that can track the complexity of student course-taking and teacher collaboration.

What advice would you give HR practitioners figuring out how to best leverage value-added data?

Most researchers who have been working on value-added models for many years agree on two things: 1) Value-added results are just imperfect estimates of true performance. As such, we can use them to characterize the probability that a teacher’s “true” performance was above or below some threshold. It is up to policymakers to decide how high that probability must be to trigger a consequence. 2) Value-added information attempts to measure one dimension of teacher performance and cannot be any better than the tests that are used to measure student progress. As such, value-added scores must be combined with other types of evidence to gain a complete picture of a teacher’s performance.
