Ranking Academic Impact

February 17th, 2020 / By Deborah J. Merritt

Paul Heald and Ted Sichelman have published a new ranking of the top U.S. law schools by academic impact. Five distinguished scholars comment on their ranking in the same issue of Jurimetrics Journal in which the ranking appears. But neither the authors of this ranking nor their distinguished commentators notice a singular result: The Heald/Sichelman rankings include a law school that does not exist.

According to Heald and Sichelman, Oregon State ranks 53d among U.S. law schools for its SSRN downloads; 35th for its citations in the Hein database; and 46th in a combined metric. Oregon State, however, does not have a law school. The University of Oregon has a law school, but it appears separately in the Heald/Sichelman rankings. So Heald and Sichelman have not simply fumbled the name of Oregon’s only public law school.

Instead, it appears that my own law school (Ohio State) has been renamed Oregon State. I can’t be sure without seeing Heald and Sichelman’s underlying data; even the “open” database posted in Dropbox refers to the nonexistent Oregon State. But Ohio State, currently tied for 34th in the US News survey, seems conspicuously absent from the Heald/Sichelman ranking.

I’m sure that my deans will contact Heald and Sichelman to request a correction–assuming that Oregon State actually is Ohio State. Oregon State Law School’s administrators probably will not complain. They can’t celebrate either, of course, because they don’t exist. But apart from that correction, let’s ruminate on this error. What does it have to say about rankings?

Reliability

A mistake like this obviously raises doubts about the reliability of the Heald/Sichelman ranking. If an error of this magnitude exists, what other errors lurk in the data? Even if you like the Heald/Sichelman method, how do you know it was carried out faithfully?

Some errors plague any type of large quantitative study, but an error of this nature is unusual. One of the key rules of quantitative analysis is to step back from the data periodically to ask if the patterns make sense. Surprising results may represent genuine, novel insights–but they can also be signs of underlying errors.

Heald and Sichelman studied the correlations between their rankings and several other measures. Didn’t they notice that one school produced a missing value? And when discussing schools that had highly discrepant rankings, didn’t they notice that one school in their scheme did not appear at all in other ranking schemes?

It’s possible, of course, that Heald and Sichelman misnamed Ohio State throughout their database so that they compared Oregon State’s Heald/Sichelman rank with the same misnamed school’s US News rank. Or perhaps the error slipped in near the end when they or an assistant changed “Ohio” to “Oregon” in the article’s spreadsheets.

Quantitative researchers who have their hands deeply in the data, however, should catch errors like this. Even after Heald and Sichelman banish Oregon State from their ranking, I will retain doubts about the reliability of their data. And the episode has raised my doubts about other rankings, no matter how “scientific.”

For What Purpose?

Heald and Sichelman’s error is troubling, but I am equally concerned that none of their readers spotted the mistake. How could five commentators, as well as numerous other readers and workshop participants, blithely skim over the nonexistent Oregon State? Even if they weren’t familiar with Oregon’s law schools, weren’t they surprised to see Oregon State ranked 35th for Hein citations? The three schools in that state currently rank 83d, 104th, and in the unranked fourth tier of the US News survey. Shouldn’t someone have noticed the surprising strength of Oregon State’s faculty?

I suspect that no one noticed the presence of Oregon State because most faculty read rankings primarily to see where their own school ranks. That’s what I did: I was curious where my own faculty ranked and, when Ohio State was absent, I looked more closely. It was only then that I noticed a law school that doesn’t exist.

But if that is the use of these academic impact rankings, to give faculty comfort or angst about where their law school ranks, are these rankings worth producing? They require a great deal of work and number crunching, as Heald and Sichelman make clear. Even with their presumably careful work, a substantial mistake occurred. Is the pay-off (including mistakes) worth the effort?

More worrisome, I think these rankings will harm the legal profession and its clients. Legal educators are key stewards of the legal profession. We are the profession’s primary gatekeepers: Few people become lawyers without first earning our diplomas. We are also responsible for giving students the foundation they need to serve clients competently and ethically.

Rankings of academic impact almost certainly will incentivize schools to invest still more of their resources in faculty scholarship—which, in turn, will raise tuition, reduce student discounts, and/or divert money from preparing students for their essential professional roles.

Scholarship is part of our commitment to the profession, clients, and society, but only one part. Over the last 20 years, I have seen law schools shift increasing resources to scholarship, while reducing teaching loads and raising tuition rapaciously. We produced excellent scholarship before 2000–scholarship that created fields like critical race theory, law and economics, feminist theory, and social science analyses of law-related issues. There is much still to explore, but why does today’s scholarship demand so many more resources? And will rankings further accelerate that trend?
