Opinion Blog


Rick Hess Straight Up

Education policy maven Rick Hess of the American Enterprise Institute think tank offers straight talk on matters of policy, politics, research, and reform.

Education Opinion

The 2019 RHSU Edu-Scholar Public Influence Scoring Rubric

By Rick Hess — January 08, 2019

Tomorrow, I’ll be unveiling the 2019 RHSU Edu-Scholar Public Influence Rankings, honoring the 200 university-based scholars who had the biggest influence on educational practice and policy last year. Today, I want to run through the methodology used to generate those rankings.

Given that more than 20,000 university-based faculty in the U.S. are researching education, simply making the Edu-Scholar list is an accomplishment in its own right. So, who made the list? Eligibility was limited to university-based scholars (those with a formal university affiliation) who focus primarily on educational questions. Scholars who lacked a university affiliation (e.g., were no longer identified as members of the university community on an institution’s website) were not eligible.

The top 150 finishers from last year automatically qualified for a spot in the Top 200, so long as they accumulated at least 10 “active points” in last year’s scoring. (Active points are those that reflect current activity, so they exclude Google Scholar points and points for number of books published.) That is, to automatically qualify, a scholar had to be in last year’s top 150 and earn at least 10 active points in the 2018 Rankings. The automatic qualifiers were then augmented by “at-large” additions chosen by the RHSU Selection Committee, a disciplinarily, methodologically, and ideologically diverse collection of respected scholars. (The committee was composed entirely of individuals who had already automatically qualified for this year’s rankings.)

I’m indebted to the RHSU Selection Committee for its assistance and want to acknowledge the 2019 members: Deborah Ball (University of Michigan), Camilla Benbow (Vanderbilt), Tressie McMillan Cottom (Virginia Commonwealth University), Linda Darling-Hammond (Stanford), Susan Dynarski (University of Michigan), Dan Goldhaber (University of Washington), Sara Goldrick-Rab (Temple), Jay Greene (University of Arkansas), Eric Hanushek (Stanford), Shaun Harper (USC), Douglas N. Harris (Tulane), Jeff Henig (Columbia Teachers College), Tom Kane (Harvard), Robert Kelchen (Seton Hall), Sunny Ladd (Duke), Gloria Ladson-Billings (University of Wisconsin), Susanna Loeb (Brown), Bridget Terry Long (Harvard), Pedro Noguera (UCLA), Robert Pianta (University of Virginia), Jonathan Plucker (Johns Hopkins), Stephen Raudenbush (University of Chicago), Barbara Schneider (Michigan State), Marcelo Suarez-Orozco (UCLA), Jacob Vigdor (University of Washington), Kevin Welner (CU Boulder), Martin West (Harvard), Yong Zhao (University of Kansas), and Jonathan Zimmerman (University of Pennsylvania).

Okay, so that’s how the list was compiled. How were the actual rankings determined? Each scholar was scored in nine categories, yielding a maximum possible score of 200. Scores are calculated as follows:

Google Scholar Score: This figure gauges the number of articles, books, or papers a scholar has authored that are widely cited. A useful, popular way to measure the breadth and impact of a scholar’s work is to rank works in descending order of how often each is cited and then identify the largest number h such that the scholar has h works that have each been cited at least h times. (This is known in the field as a scholar’s “h-index.”) For instance, a scholar who had 20 works that were each cited at least 20 times, but whose 21st most-frequently cited work was cited just 10 times, would score a 20. The measure recognizes that bodies of scholarship matter greatly for influencing how important questions are understood and discussed. The search was conducted using the advanced search “author” filter in Google Scholar. A hand search culled out works by other, similarly named individuals. For those scholars who have created a Google Scholar account, their h-index was available at a glance. While Google Scholar is less precise than more specialized citation databases, it has the virtue of being multidisciplinary and publicly accessible. Points were capped at 50. (This search was conducted on December 12.)
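
To make the arithmetic concrete, here is a minimal sketch of that h-index tally; it illustrates the calculation described above rather than the actual tooling, and the citation counts are hypothetical.

```python
def google_scholar_score(citation_counts, cap=50):
    """Compute an h-index from per-work citation counts, capped per the rubric."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return min(h, cap)

# Hypothetical example: 20 works cited 20+ times, the 21st cited only 10 times -> 20.
example = [25] * 20 + [10, 3]
print(google_scholar_score(example))  # 20
```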

Book Points: A search on Amazon tallied the number of books a scholar has authored, coauthored, or edited. Scholars received 2 points for a single-authored book, 1 point for a coauthored book in which they were the lead author, a half-point for coauthored books in which they were not the lead author, and a half-point for any edited volume. The search was conducted using an “Advanced Books Search” for the scholar’s first and last name. (On a few occasions, a middle initial or name was used to avoid duplication with authors who had the same name, e.g., “David Cohen” became “David K. Cohen.”) We only searched for “Printed Books” (one of several searchable formats) so as to avoid double-counting books which are also available as e-books. This means that books released only as e-books are omitted. However, few scholars on this list pen books that are published solely as e-books (this will likely change before long, but we’ll cross that bridge when we come to it). “Out of print” volumes were excluded, as were reports, commissioned studies, and special editions of magazines or journals. This measure reflects the conviction that books can influence public discussion in an outsized fashion. Book points were capped at 20. (This search was conducted on December 12.)
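
The book-point weights amount to a simple weighted sum. The sketch below is illustrative only, not the committee's actual tooling, and the sample publication record is hypothetical.

```python
# Weights from the rubric: 2 for a single-authored book, 1 for a lead-authored
# coauthored book, 0.5 for a non-lead coauthored book, 0.5 for an edited volume.
WEIGHTS = {"single": 2.0, "lead_coauthor": 1.0, "coauthor": 0.5, "edited": 0.5}

def book_points(books, cap=20.0):
    """Sum the weights for each book's authorship role, capped at 20 points."""
    return min(sum(WEIGHTS[role] for role in books), cap)

# Hypothetical record: 3 single-authored books, 2 lead-coauthored, 1 edited volume.
print(book_points(["single"] * 3 + ["lead_coauthor"] * 2 + ["edited"]))  # 8.5
```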

Highest Amazon Ranking: This reflects the author’s highest-ranked book on Amazon. The highest-ranked book was subtracted from 400,000 and the result was divided by 20,000 to yield a maximum score of 20. The nature of Amazon’s ranking algorithm means that this score can be volatile and favors more recent sales. The result is an imperfect measure, but one that conveys real information about whether a scholar has penned a book that is influencing contemporary discussion. (This search was conducted on December 12.)
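
Put as a formula, that conversion is a linear rescaling of the best sales rank. In the sketch below, flooring the score at zero for books ranked worse than 400,000 is my assumption; the rubric itself does not address that case.

```python
def amazon_points(best_sales_rank):
    """(400,000 - rank) / 20,000, capped at 20; floored at 0 by assumption for very high ranks."""
    return max(0.0, min((400_000 - best_sales_rank) / 20_000, 20.0))

# A book ranked 100,000 on Amazon earns (400,000 - 100,000) / 20,000 = 15 points.
print(amazon_points(100_000))  # 15.0
```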

Syllabus Points: This seeks to measure long-term academic impact on what is being read by the rising generation of university students. A search of OpenSyllabusProject.org (a website that collects over one million syllabi from American, British, Canadian, and Australian universities) was used to gauge how widely the works of various authors are used. A search of the “Open Syllabus Explorer,” using the scholar’s name, was used to identify their top-ranked text. The score reflects the number of times that text appeared on syllabi, with the tally then divided by 5. The score was capped at 10 points. (This search was conducted on December 14.)

Education Press Mentions: This measures the total number of times the scholar was quoted or mentioned in Education Week, the Chronicle of Higher Education, or Inside Higher Ed during 2018. Searches were conducted using each scholar’s first and last name. If applicable, we also searched names using a common diminutive and both with and without middle initials. In each instance, the highest result was recorded. The numbers of appearances in the Chronicle and Inside Higher Ed were averaged, and that figure was added to the number of times a scholar appeared in Education Week. (This was done to avoid overweighting higher education.) The resulting figure was multiplied by two, with total Ed Press points then capped at 30. (Education Week and Inside Higher Ed were searched on December 13. The Chronicle of Higher Education was searched on December 14.)
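
Expressed as arithmetic, the Ed Press calculation looks like the following; the mention counts are hypothetical.

```python
def ed_press_points(ed_week, chronicle, inside_higher_ed, cap=30.0):
    """Average the two higher-ed outlets, add Education Week, double, then cap."""
    return min((ed_week + (chronicle + inside_higher_ed) / 2) * 2, cap)

# Hypothetical scholar: 6 Education Week mentions, 4 in the Chronicle, 2 in Inside Higher Ed.
print(ed_press_points(6, 4, 2))  # (6 + 3) * 2 = 18.0
```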

Web Mentions: This reflects the number of times a scholar was referenced, quoted, or otherwise mentioned online in 2018. The intent is to use a “wisdom of crowds” metric to gauge a scholar’s influence on the public discourse last year. The search was conducted using Google. The search terms were each scholar’s name and university affiliation (e.g., “Bill Smith” and “Rutgers University”). Using affiliation served a dual purpose: It avoided confusion due to common names and increased the likelihood that mentions were related to university-affiliated activity. If a scholar was mentioned sans affiliation, that mention was omitted. As with the Education Press category, searches included common diminutives and were run with and without middle initials. For each scholar, we used the single highest score from among these various name configurations. (We didn’t sum them, as that produces complications and potential duplication.) Points were calculated by dividing total mentions by 30. Scores were capped at 25. (This search was conducted on December 13.)

Newspaper Mentions: A LexisNexis search was used to determine the number of times a scholar was quoted or mentioned in U.S. newspapers. Again, searches used a scholar’s name and affiliation, included common diminutives, and were run with and without middle initials. In each instance, the highest result was recorded. To avoid double counting, the scores do not include any mentions from Education Week, the Chronicle of Higher Education, or Inside Higher Ed. Points were calculated by dividing the total number of mentions by two, and were capped at 30. (The search was conducted on December 12.)

Congressional Record Mentions: We conducted a simple name search in the Congressional Record for 2018 to determine whether a scholar had testified or had their work referenced by a member of Congress. Qualifying scholars received five points. (This search was conducted on December 12.)

Kred Score: Since Klout was discontinued in 2018, we sought out a new metric to gauge online presence and influence. Kred Influence uses an individual’s Twitter handle to calculate a score, between 0 and 1,000, that accounts for mentions, retweets, and replies. We first determined whether a given scholar had a Twitter account, with a hand search ruling out similarly named individuals. For scholars who had an account, we then obtained their Kred Influence score. The Kred score was divided by 100, yielding a maximum score of 10. (This search was conducted on December 14.)
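
The remaining categories (Syllabus, Web Mentions, Newspaper Mentions, Congressional Record, and Kred) are simple divide-and-cap conversions, so one sketch can cover them and show how the nine categories sum toward the 200-point maximum. The inputs below are hypothetical, and the helper functions from the earlier sketches are assumed to be in scope.

```python
def syllabus_points(top_text_count):      # appearances of top text / 5, capped at 10
    return min(top_text_count / 5, 10.0)

def web_points(mentions):                 # Google mentions / 30, capped at 25
    return min(mentions / 30, 25.0)

def newspaper_points(mentions):           # LexisNexis mentions / 2, capped at 30
    return min(mentions / 2, 30.0)

def congressional_points(mentioned):      # flat 5 points for a Congressional Record mention
    return 5.0 if mentioned else 0.0

def kred_points(kred_influence):          # Kred Influence (0-1,000) / 100, max 10
    return min(kred_influence / 100, 10.0)

# Hypothetical scholar, reusing the earlier helpers; the nine category caps sum to 200.
total = (
    google_scholar_score([30] * 28 + [5] * 40)   # 28
    + book_points(["single"] * 4 + ["edited"])   # 8.5
    + amazon_points(150_000)                     # 12.5
    + syllabus_points(35)                        # 7.0
    + ed_press_points(6, 4, 2)                   # 18.0
    + web_points(450)                            # 15.0
    + newspaper_points(22)                       # 11.0
    + congressional_points(True)                 # 5.0
    + kred_points(720)                           # 7.2
)
print(total)  # 112.2
```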

The rankings seek to acknowledge scholars who are actively engaged in public discourse and whose work has an impact on practice and policy. That’s why the scoring discounts, for instance, rarely cited publications or books that are unread or out of print. Generally, the scholars who rank highest are both influential researchers and influential public voices.

There are obviously lots of provisos when it comes to the Edu-Scholar results. Different disciplines approach books and articles differently. Senior scholars have had more opportunity to build a substantial body of work and influence (for what it’s worth, the results unapologetically favor sustained accomplishment). And readers may care more for some categories than others. That’s all well and good. The point is to spur discussion about the nature of constructive public influence: who’s doing it, how valuable it is, and how to gauge a scholar’s contribution.

A few notes regarding questions that arise every year: First, there are some academics who dabble (quite successfully) in education, but for whom education is only a sideline. They are not included in these rankings. For a scholar to be included, education must constitute a substantial slice of their scholarship. This helps ensure that the rankings serve as something of an apples-to-apples comparison. Second, scholars sometimes change institutions in the course of a year. My policy is straightforward: For the categories where affiliation is used, searches are conducted using a scholar’s year-end affiliation. This avoids concerns about double-counting and reduces the burden on my overworked RAs. Scholars do get dinged a bit in the year they move. But that’s life. Third, it goes without saying that tomorrow’s list represents only a sliver of the faculty doing this work. For those interested in scoring additional scholars, it’s a simple task to do so using the scoring rubric enumerated above. Indeed, the exercise was designed so that anyone can generate a comparable rating for a given scholar in a half-hour or less.

Finally, a note of thanks: For the hard work of coordinating the selection committee, finalizing the 2019 list, and then spending dozens of hours crunching and double-checking all of this data for 200 scholars, I owe a big shout-out to my gifted, diligent, and wholly remarkable research assistants RJ Martin, Amy Cummings, Sofia Gallo, and Connor Kurtz.

The opinions expressed in Rick Hess Straight Up are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.
