Ranking Academic Impact

February 17th, 2020

Paul Heald and Ted Sichelman have published a new ranking of the top U.S. law schools by academic impact. Five distinguished scholars comment on their ranking in the same issue of Jurimetrics Journal in which the ranking appears. But neither the authors of this ranking nor their distinguished commentators notice a singular result: The Heald/Sichelman rankings include a law school that does not exist.

According to Heald and Sichelman, Oregon State ranks 53d among U.S. law schools for its SSRN downloads; 35th for its citations in the Hein database; and 46th in a combined metric. Oregon State, however, does not have a law school. The University of Oregon has a law school, but it appears separately in the Heald/Sichelman rankings. So Heald and Sichelman have not simply fumbled the name of Oregon’s only public law school.

Instead, it appears that my own law school (Ohio State) has been renamed Oregon State. I can’t be sure without seeing Heald and Sichelman’s underlying data; even the “open” database posted in Dropbox refers to the nonexistent Oregon State. But Ohio State, currently tied for 34th in the US News survey, seems conspicuously absent from the Heald/Sichelman ranking.

I’m sure that my deans will contact Heald and Sichelman to request a correction–assuming that Oregon State actually is Ohio State. Oregon State Law School’s administrators probably will not complain. They can’t celebrate either, of course, because they don’t exist. But apart from that correction, let’s ruminate on this error. What does it have to say about rankings?

Reliability

A mistake like this obviously raises doubts about the reliability of the Heald/Sichelman ranking. If an error of this magnitude exists, what other errors lurk in the data? Even if you like the Heald/Sichelman method, how do you know it was carried out faithfully?

Some errors plague any type of large quantitative study, but an error of this nature is unusual. One of the key rules of quantitative analysis is to step back from the data periodically to ask if the patterns make sense. Surprising results may represent genuine, novel insights–but they can also be signs of underlying errors.

Heald and Sichelman studied the correlations between their rankings and several other measures. Didn’t they notice that one school produced a missing value? And when discussing schools that had highly discrepant rankings, didn’t they notice that one school in their scheme did not appear at all in other ranking schemes?
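To make the point concrete, here is a minimal sketch of the kind of cross-check that would have flagged the problem. The tables, names, and ranks below are invented for illustration; the authors' actual files surely look different.

```python
import pandas as pd

# Invented stand-ins for the two data sources; the real files differ.
hs = pd.DataFrame({"school": ["Oregon State", "Oregon", "Lewis & Clark"],
                   "hs_rank": [46, 60, 75]})
usn = pd.DataFrame({"school": ["Ohio State", "Oregon", "Lewis & Clark"],
                    "usn_rank": [34, 83, 104]})

# An outer merge with an indicator column surfaces any school that
# appears in one source but not the other.
merged = hs.merge(usn, on="school", how="outer", indicator=True)
print(merged.loc[merged["_merge"] != "both", ["school", "_merge"]])
# "Oregon State" (left_only) and "Ohio State" (right_only) both surface
# here -- exactly the anomaly a pre-publication check should catch.
```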

It’s possible, of course, that Heald and Sichelman misnamed Ohio State throughout their database so that they compared Oregon State’s Heald/Sichelman rank with the same misnamed school’s US News rank. Or perhaps the error slipped in near the end when they or an assistant changed “Ohio” to “Oregon” in the article’s spreadsheets.

Quantitative researchers who have their hands deeply in the data, however, should catch errors like this. Even after Heald and Sichelman banish Oregon State from their ranking, I will retain doubts about the reliability of their data. And my doubts about the reliability of other rankings, no matter how “scientific,” have been aroused.

For What Purpose?

Heald and Sichelman’s error is troubling, but I am equally concerned that none of their readers spotted the mistake. How could five commentators, as well as numerous other readers and workshop participants, blithely skim over the nonexistent Oregon State? Even if they weren’t familiar with Oregon’s law schools, weren’t they surprised to see Oregon State ranked 35th for Hein citations? The three schools in that state currently appear as 83d, 104th, and in the unranked fourth tier of the US News survey. Shouldn’t someone have noticed the surprising strength of Oregon State’s faculty?

I suspect that no one noticed the presence of Oregon State because most faculty read rankings primarily to see where their own school ranks. That’s what I did: I was curious where my own faculty ranked and, when Ohio State was absent, I looked more closely. It was only then that I noticed a law school that doesn’t exist.

But if that is the use of these academic impact rankings, to give faculty comfort or angst about where their law school ranks, are these rankings worth producing? They require a great deal of work and number crunching, as Heald and Sichelman make clear. Even with their presumably careful work, a substantial mistake occurred. Is the pay-off (including mistakes) worth the effort?

More worrisome, I think these rankings will harm the legal profession and its clients. Legal educators are key stewards of the legal profession. We are the profession’s primary gatekeepers: Few people become lawyers without first earning our diplomas. We are also responsible for giving students the foundation they need to serve clients competently and ethically.

Rankings of academic impact almost certainly will incentivize schools to invest still more of their resources in faculty scholarship—which, in turn, will raise tuition, reduce student discounts, and/or divert money from preparing students for their essential professional roles.

Scholarship is part of our commitment to the profession, clients, and society, but only one part. Over the last 20 years, I have seen law schools shift increasing resources to scholarship, while reducing teaching loads and raising tuition rapaciously. We produced excellent scholarship before 2000–scholarship that created fields like critical race theory, law and economics, feminist theory, and social science analyses of law-related issues. There is much still to explore, but why does today’s scholarship demand so many more resources? And will rankings further accelerate that trend?


Engines of Anxiety

April 29th, 2016

Two sociologists, Wendy Nelson Espeland and Michael Sauder, have published a book that examines the impact of US News rankings on legal education. The book, titled Engines of Anxiety, is available as an e-book through Project Muse. If your university subscribes to Project Muse (as mine does), you can download the book and read it for free on your laptop or tablet. If you don’t have access to a university library, some public libraries also subscribe to books through Project Muse. It’s a great way to read academic books and journals. H/t to TaxProf for noting publication of this book.


The U.S. News Rankings Are Horrible. Stop Paying Attention.

March 11th, 2016

Note: A version of this piece was published last year on Law.com, but the U.S. News rankings remain as toxic an influence as ever. This year’s version was published on Above the Law.

Next week, the law school world will overreact to slightly shuffled U.S. News rankings. Proud alumni and worried students will voice concerns. Provosts will threaten jobs. Prospective students will confuse the annual shuffle with genuine reputational change.

Law school administrators will react predictably. They’ll articulate methodological flaws and lament negative externalities, but will nevertheless commit to the rankings game through their statements and actions. Reassuring stakeholders bearing pitchforks has become part of the job description.


Why Ranking Law Schools Nationally Is Nonsensical

January 19th, 2016

This piece was originally published on Bloomberg.

Earlier this month, at the Association of American Law Schools’ annual meeting in New York, the AALS’s Section for the Law School Dean hosted a panel on law school rankings. During a Q&A, Nebraska Law School Dean Susan Poser posed a series of questions to Bob Morse, chief architect of the U.S. News law school rankings.

“I don’t know anything about schools except the one I went to and the one I’m at now. How do you justify asking us to rank the prestige of other schools, and how do you justify giving this component such a large weight?”

Blake Edwards, writing for Big Law Business, has more details on the panel here. I want to spark a discussion about some ways to improve the reputation metric.


How To Fix The U.S. News Law School Rankings

January 13th, 2016

This was originally published on Above the Law.

To put it mildly, I’m not a fan of the U.S. News law school rankings. They poison the decision-making process for law students and law schools alike. For students, they cause irrational choices about where to attend or how much to pay. For schools, they produce a host of incentives that do not align with the goal of providing an accessible, affordable legal education.

Because of their undeniable influence, it makes sense to seek methodological changes that nudge schools in a better direction.



ATL Rankings: The Bad and the Maybe

June 5th, 2015

I’ve already discussed the positive aspects of Above the Law (ATL)’s law school rankings. Here I address the poorly constructed parts of the ranking scheme. Once again, I use ATL to provoke further thought about all law school rankings.

Quality Jobs Score

ATL complements its overall employment score, which is one of the scheme’s positive features, with a “quality jobs score.” The latter counts only “placement with the country’s largest and best-paying law firms (using the National Law Journal’s ‘NLJ 250’) and the percentage of graduates embarking on federal judicial clerkships.”

I agree with ATL’s decision to give extra weight to some jobs; even among jobs requiring bar admission, some are more rewarding to graduates than others. This category, however, is unnecessarily narrow–and too slanted towards private practice.

Using ATL’s own justification for the category’s definition (counting careers that best support repayment of law school debt), it would be easy to make this a more useful category. Government and public interest jobs, which qualify graduates for full loan forgiveness after ten years, also enable repayment of student loans. Given the short tenure of many BigLaw associates, the government/public interest route may be more reliable than the BigLaw one.

I would expand this category to include all government and public interest jobs that qualify graduates for loan forgiveness at the ten-year mark, excluding only those that are school financed. Although ATL properly excludes JD-advantage jobs from its general employment score, I would include them here–as long as the jobs qualify for public-service loan forgiveness. A government job requiring bar admission, in other words, would count toward both employment measures, while a JD-advantage government position would count just once.
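To make the proposed counting rule concrete, here is a minimal sketch in Python. The job records and field names are invented for illustration; nothing here reflects ATL’s actual data.

```python
def counts_for_employment(job):
    # General employment score (full-time, long-term assumed here):
    # requires bar passage; excludes solo and school-funded positions.
    return job["bar_required"] and not job["solo"] and not job["school_funded"]

def counts_for_quality(job):
    # The expanded quality-jobs score proposed above: NLJ 250 firms,
    # federal clerkships, and PSLF-qualifying government/public-interest
    # jobs (JD-advantage included), still excluding school-funded ones.
    if job["school_funded"]:
        return False
    return job["nlj250"] or job["federal_clerkship"] or job["pslf_qualifying"]

# A government job requiring bar admission counts toward both measures;
# a JD-advantage government job counts toward the quality score only.
gov_bar = {"bar_required": True, "solo": False, "school_funded": False,
           "nlj250": False, "federal_clerkship": False, "pslf_qualifying": True}
gov_jd_advantage = {**gov_bar, "bar_required": False}
print(counts_for_employment(gov_bar), counts_for_quality(gov_bar))  # True True
print(counts_for_employment(gov_jd_advantage),
      counts_for_quality(gov_jd_advantage))  # False True
```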

Making this change would reduce this factor’s bias toward private practice, while incorporating information that matters to a wider range of prospective students.

SCOTUS Clerks and Federal Judges

Many observers have criticized this component, which counts “a school’s graduates as a percentage of (1) all U.S. Supreme Court clerks (since 2010) and (2) currently sitting Article III judges.” For both of these, ATL adjusts the score for the size of the school. What’s up with that?

ATL defends the criterion as useful for students “who want to be [federal] judges and academics.” But that’s just silly. These jobs constitute such a small slice of the job market that they shouldn’t appear in a ranking designed to be useful for a large group of users. If ATL really embraces that broader goal, there’s an appropriate way to modify this factor.

First, get rid of the SCOTUS clerk count. That specialized information is available elsewhere (including on ATL) for prospective students who think that’s relevant to their choice of law school. Second, expand the count of sitting Article III judges to include counts of (a) current members of Congress; (b) the President and Cabinet members; and (c) CEOs and General Counsels at all Fortune 500 companies. Finally, don’t adjust the counts for school size.

These changes would produce a measure of national influence in four key areas: the judiciary, executive branch, legislature, and corporate world. Only a small percentage of graduates will ever hold these very prestigious jobs, but the jobholders improve their school’s standing and influence. That’s why I wouldn’t adjust the counts for school size. If you’re measuring the power that a school exerts through alumni in these positions, the absolute number matters more than the percentage.
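A toy illustration of the size-adjustment point, with all alumni counts invented: on a per-capita basis the smaller school looks stronger, but the raw counts show which school places more alumni in positions of national influence.

```python
# Invented alumni counts in the four arenas for two hypothetical schools.
big_school   = {"judges": 14, "congress": 5, "executive": 2,
                "fortune500": 12, "grads_per_year": 600}
small_school = {"judges": 9, "congress": 3, "executive": 1,
                "fortune500": 7, "grads_per_year": 200}

def influence(school):
    # Raw sum, deliberately not divided by school size: the argument is
    # that absolute numbers, not percentages, measure a school's reach.
    return (school["judges"] + school["congress"] +
            school["executive"] + school["fortune500"])

# Per graduate, the small school looks stronger (0.10 vs. roughly 0.055),
# but the big school holds more of these positions outright.
print(influence(big_school), influence(small_school))  # 33 20
```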

Leaders in private law firms, state governments, and public interest organizations also enhance a school’s alumni network–and one could imagine adding those to this component. Those organizations, however, already receive recognition in the two factors that measure immediate graduate employment. It seems more important to add legislative, executive, and corporate influence to the rankings. As a first step, therefore, I would try to modify this component as I’ve outlined here.

Component Sorting

A major flaw in ATL’s scheme is that it doesn’t allow users to sort schools by component scores. The editors have published the top five schools in most categories, but that falls far short of full sorting. Focused-purpose rankings are most useful if readers can sort schools based on each component. One reader may value alumni ratings above all other factors, while another reader cares about quality jobs. Adding a full-sort feature to the ranking would be an important step.

Why Rank?

Like many educators, I dislike rankings. The negative incentives created by US News far outweigh the limited value it offers prospective students. Rankings can also mislead students into making decisions based solely on those schemes, rather than using rank as one tool in a broader decisionmaking process. Even if modified in the ways I suggest here, both of these drawbacks may affect the ATL rankings.

As Law School Transparency has shown, it is possible to give prospective students useful information about law schools without adding the baggage of rankings. Above the Law could perform a greater public service by publishing its data as an information set rather than as an integrated ranking.

But rankings draw attention and generate revenue; they are unlikely to disappear. If we’re going to have rankings, then it’s good to have more than one. Comparing schemes may help us see the flaws in all ranking systems; perhaps eventually we’ll reject rankings in favor of other ways to organize information.


ATL Rankings: The Good, the Bad, and the Maybe

June 4th, 2015

In my last post I used Above the Law (ATL)’s law school rankings to explore three types of ranking schemes. Now it’s time to assess the good, bad, and maybe of ATL’s system. In this column I explore the good; posts on the bad and maybe will follow shortly. ATL’s metrics are worth considering both to assess that system and to reflect on all ranking schemes.

Employment Score

ATL’s ranking gives substantial weight to employment outcomes, a factor that clearly matters to students. I agree with ATL that “full-time, long-term jobs requiring bar passage (excluding solos and school-funded positions)” offer the best measure for an employment score. Surveys show that these are the jobs that most graduates want immediately after law school. Equally important, these are the jobs that allow law schools to charge a tuition premium for entry to a restricted profession. Since schools reap the premium, they should be measured on their ability to deliver the outcome.

For a focused-purpose ranking, finally, simple metrics make the most sense. Prospective law students who don’t want to practice can ignore or adjust the ATL rankings (which assume practice as a desired outcome). A student admitted to Northwestern’s JD-MBA program, for example, will care more about that program’s attributes than about the ATL rank. For most students, ATL’s employment score offers a useful starting point.

Alumni Rating

This metric, like the previous one, gives useful information to prospective students. If alumni like an institution’s program, culture, and outcomes, prospective students may feel the same. Happy alumni also provide stronger networks for career support. The alumni rating, finally, may provide a bulwark against schools gaming other parts of the scheme. If a school mischaracterizes jobs, for example, alumni may respond negatively.

It’s notable that ATL surveys alumni, while US News derives reputation scores from a general pool of academics, lawyers, and judges. The former offers particularly useful information to prospective students, while the latter focuses more directly on prestige.

Debt Per Job

This is a nice way of incorporating two elements (cost and employment) that matter to students. The measure may also suggest how closely the institution focuses on student welfare. A school that keeps student costs low, while providing good outcomes, is one that probably cares about students. Even a wealthy student might prefer that institution over one with a worse ratio of debt to jobs.

The best part of this metric is that it gives law schools an incentive to award need-based scholarships. Sure, schools could try to improve this measure by admitting lots of wealthy students–but there just aren’t that many of those students to go around. Most schools have already invested in improving employment outcomes, so the best way to further improve the “debt per job” measure is for the school to award scholarships to students who would otherwise borrow the most.
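ATL doesn’t spell out its formula in anything quoted here, so treat the following as one plausible construction rather than ATL’s method: divide average graduate debt by the qualifying-employment rate. On that reading, need-based scholarships improve the ratio even when job outcomes stay constant.

```python
def debt_per_job(avg_debt, employed, graduates):
    # One plausible construction -- not necessarily ATL's: average debt
    # divided by the qualifying-employment rate.
    return avg_debt / (employed / graduates)

# Identical job outcomes; need-based aid lowers average debt, and the
# "debt per job" figure falls with it.
print(round(debt_per_job(120_000, 180, 200)))  # 133333
print(round(debt_per_job(90_000, 180, 200)))   # 100000
```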

Over the last twenty years, US News has pushed schools from need-based scholarships to LSAT-based ones. What a refreshing change it would be if a ranking scheme led us back to need-based aid.

Education Cost

Cost is another key factor for 0Ls considering law schools and, under the current state of the market, I support ATL’s decision to use list-price tuition for this measure. Many students negotiate discounts from list price, but schools don’t publish their net tuition levels. The whole negotiation system, meanwhile, is repugnant. Why are schools forcing young adults to test their bargaining skills in a high-stakes negotiation that will affect their financial status for up to a quarter century?

We know that in other contexts, race and gender affect negotiation outcomes. (These are just two of many possible citations.) How sure are we that these factors don’t affect negotiations for tuition discounts? Most of the biases that taint negotiations are unconscious rather than conscious. And even if law school administrators act with scrupulous fairness, these biases affect the students seeking aid: Race and gender influence a student’s willingness to ask for more.

In addition to these biases, it seems likely that students from disadvantaged backgrounds know less about tuition negotiation than students who have well-educated helicopter parents. It’s no answer to say that economically disadvantaged students get some tuition discounts; the question is whether they would have gotten bigger discounts if they were armed with more information and better negotiating skills.

Negotiation over tuition is one of the most unsavory parts of our current academic world. I favor any component of a ranking scheme that pushes schools away from that practice. If schools don’t want to be ranked based on an inflated list-price tuition, then they can lower that tuition (and stop negotiating) or publish their average net tuition. My co-moderator made the same point last year, and it’s just as valid today.

The Bad and Maybe

Those are four strengths of the ATL rankings. Next up, the weaknesses.


More on Rankings: Three Purposes

June 1st, 2015

I want to continue my discussion of the law school rankings published by Above the Law (ATL). But before I do, let’s think more generally about the purpose of law school rankings. Who uses these rankings, and for what reason? Rankings may serve one or more of three purposes:

1. Focused-Purpose Rankings

Rankings in this first category help users make a specific decision. A government agency, for example, might rate academic institutions based on their research productivity; this ranking could then guide the award of research dollars. A private foundation aiming to reward innovative teaching might develop a ranking scheme more focused on teaching prowess.

US News and Above the Law advertise their rankings as focused-purpose ones: Both are designed to help prospective students choose a law school. One way to assess these rankings, accordingly, is to consider how well they perform this function.

Note that focused-purpose rankings can be simple or complex. Some students might choose a law school based solely on the percentage of graduates who secure jobs with the largest law firms. For those students, NLJ’s annual list of go-to law schools is the only ranking they need.

Most prospective students, however, consider a wider range of factors when choosing a law school. The same is true of people who use other types of focused-purpose rankings. The key function of these rankings is that they combine relevant information in a way that helps a user sort that information. Without assistance, a user could focus on only a few bits of information at a time. Focused-purpose rankings overcome that limit by aggregating some of the relevant data.

This doesn’t mean that users should (or will) make decisions based solely on a ranking scheme. Although a good scheme combines lots of relevant data, the scheme is unlikely to align precisely with each user’s preferences. Most people who look at rankings use them as a starting point. The individual adds relevant information omitted by the ranking scheme, or adjusts the weight given to particular components, before making a final decision.

A good ranking scheme in the “focused purpose” category supports this process through four features. The scheme (a) incorporates factors that matter to most users; (b) omits other, irrelevant data; (c) uses unambiguous metrics as components; and (d) allows users to disaggregate the components.

2. Prestige Rankings

Some rankings explicitly measure prestige. Others implicitly offer that information, although they claim another purpose. In either case, the need for “prestige” rankings is somewhat curious. Prestige does not inhere in institutions; it stems from the esteem that others confer upon the institution. Why do we need a ranking system to tell us what we already believe?

One reason is that our nation is very large. People from the West Coast may not know the prestige accorded Midwestern institutions. Newcomers to a profession may also seek information about institutional prestige. Some college students know very little about the prestige of different law schools.

For reasons like these, prestige rankings persist. It is important to recognize, however, that prestige rankings differ from the focused-purpose schemes discussed above. Prestige often relates to one of those focused purposes: A law school’s prestige, for example, almost certainly affects the employability of its graduates. A ranking of schools based on prestige, however, is different from a ranking that incorporates factors that prospective students find important in selecting a school.

Prestige rankings are more nebulous than focused-purpose ones. The ranking may depend simply on a survey of the relevant audience. Alternatively, the scheme may incorporate factors that traditionally reflect an institution’s prestige. For academic institutions, these include the selectivity of admissions, the qualifications of the entering class, and institutional wealth.

3. Competition Rankings

Competition rankings have a single purpose: to confer honor. A competition ranking awards gold, silver, bronze, and other medals according to specific criteria. Unlike the two previous categories, these rankings exist solely to honor those who win the competition.

Many athletic honors fall into this category. We honor Olympic gold medalists because they were the best at their event on a particular day, even if their prowess diminishes thereafter.

Competition rankings are most common in athletics and the arts, although they occasionally occur in academia. More commonly, as I discuss below, people misinterpret focused-purpose rankings as if they were competition ones.

US News

As noted above, US News promotes its law school ranking for a focused purpose: to help prospective students choose among law schools. Over time, however, the ranking has acquired aspects of both a prestige scheme and a competition one. These characteristics diminish the rankings’ usefulness for potential students; they also contribute to much of the mischief surrounding the rankings.

Many professors, academic administrators, and alumni view their school’s US News rank as a general measure of prestige, not simply as a tool for prospective students to use when comparing law schools. Some of the US News metrics contribute to this perception. Academic reputation, for example, conveys relatively little useful information to potential students. It is much more relevant to measuring an institution’s overall prestige.

Even more troublesome, some of these audiences have started to treat the US News rankings as a competition score. Like Olympic athletes, schools claim honor simply for achieving a particular rank. Breaking into the top fourteen, top twenty, or top fifty becomes cause for excessive celebration.

If the US News rankings existed simply to aid students in selecting a law school, they would cause much less grief. Imagine, for example, if deans could reassure anxious alumni by saying something like: “Look, these rankings are just a tool for students to use when comparing law schools. And they’re not the only information that these prospective students use. We supplement the rankings by pointing to special features of our program that the rankings don’t capture. We have plenty of students who choose our school over ones ranked somewhat above us because they value X, Y, and Z.”

Deans can’t offer that particular reassurance, and listeners won’t accept it, because we have all given the US News rankings the status of prestige or competition scores. It may not matter much if a school is number 40 or 45 on a yardstick that 0Ls use as one reference in choosing a law school. Losing 5 prestige points, on the other hand, ruins everyone’s day.

Above the Law

I’ll offer a more detailed analysis of the ATL rankings in a future post. But to give you a preview: One advantage of these rankings over US News is that they focus very closely on the particular purpose of aiding prospective students. That focus makes the rankings more useful for their intended audience; it also avoids the prestige and competition auras that permeate the US News product.


ATL Law School Rankings

May 29th, 2015

Above the Law (ATL) has released the third edition of its law school rankings. Writing about rankings is a little like talking about intestinal complaints: We’d rather they didn’t exist, and it’s best not to mention such things in polite company. Rankings, however, are here to stay–and we already devote an inordinate amount of time to talking about them. In that context, there are several points to make about Above the Law‘s ranking scheme.

In this post, I address an initial question: Who cares about the ATL rankings? Will anyone read them or follow them? In my next post, I’ll explore the metrics that ATL uses and the incentives they create. In a final post, I’ll make some suggestions to improve ATL’s rankings.

So who cares? And who doesn’t?

Prospective Students

I think potential law students are already paying attention to the ATL rankings. Top-Law-Schools.com, a source used by many 0Ls, displays the Above the Law rankings alongside the US News (USN) list. Prospective students refer to both ranking systems in the site’s discussion forum. If prospective students don’t already know about ATL and its rankings, they will soon.

If I were a prospective student, I would pay at least as much attention to the ATL rankings as to the USN ones. Above the Law, after all, incorporates measures that affect students deeply (cost, job outcomes, and alumni satisfaction). US News includes factors that seem more esoteric to a potential student.

Also, let’s face it: Above the Law is much more fun to read than US News. Does anyone read US News for any purpose other than rankings? 0Ls read Above the Law for gossip about law schools and the profession. If you like a source and read it regularly, you’re likely to pay attention to its recommendations–including recommendations in the form of rankings.

Alumni

Deans report that their alumni care deeply about the school’s US News rank. Changes in that number may affect the value of a graduate’s degree. School rank also creates bragging rights among other lawyers. We don’t have football or basketball teams at law schools, so what other scores can we brag about?

I predict that alumni will start to pay a lot of attention to Above the Law‘s ranking scheme. Sure, ATL is the site we all love to hate: Alumni, like legal educators, cringe at the prospect of reading about their mistakes on the ever-vigilant ATL. But the important thing is that they do read the site–a lot. They laugh at the foibles of others, nod in agreement with some reports, and keep coming back for more. This builds a lot of good will for Above the Law.

Equally important, whenever Above the Law mentions a law school in a story, it appends information about the school’s ATL rank. For an example, see this recent story about Harvard Law School. (I purposely picked a positive story, so don’t get too excited about following the link.)

Whenever alumni read about their law school–or any law school–in Above the Law, they will see information about ATL’s ranking. This is true even for the 150 schools that are “not ranked” by Above the Law. For them, a box appears reporting that fact along with information about student credentials and graduate employment.

This is an ingenious (and perfectly appropriate) marketing scheme. Alumni who read Above the Law will constantly see references to ATL’s ranking scheme. Many will care about their school’s rank and will pester the school’s dean for improvement. At first, they may not want to admit publicly that they care about an ATL ranking, but that reservation will quickly disappear. US News is a failed magazine; Above the Law is a very successful website. Which one do you think will win in the end?

US News, moreover, has no way to combat this marketing strategy. We’ve already established that no one reads US News for any reason other than the rankings. So US News has no way to keep its rankings fresh in the public’s mind. Readers return to Above the Law week after week.

Law Professors

Law professors will not welcome the ATL rankings. We don’t like any rankings, because they remind us that we’re no longer first in the class. And we certainly don’t like Above the Law, which chronicles our peccadilloes.

Worst of all, ATL rankings don’t fit with our academic culture. We like to think of ourselves as serious-minded people, pursuing serious matters with great seriousness. How could we respect rankings published by a site that makes fun of us and all of our seriousness? Please, be serious.

Except…professors spent a long time ignoring the US News rankings. We finally had to pay attention when everyone else started putting so much weight on them. Law faculty are not leaders when it comes to rankings; we are followers. If students and alumni care about ATL’s rankings, we eventually will pay attention.

University Administrators

People outside academia may not realize how much credence university presidents, provosts, and trustees give the US News rankings. The Board of Trustees at my university has a scorecard for academic initiatives that includes these two factors: (1) rank among public colleges, as determined by USN, and (2) number of graduate or professional programs in the USN top 25. On the first, we aim to improve our rank from 18 to 10. On the second, we hope to increase the number of highly ranked departments from 49 to 65.

These rank-related goals are no longer implicit; they are quite explicit at universities. And, although academic leaders once eschewed US News as a ranking source, they now embrace the system.

Presidents and provosts are likely to laugh themselves silly if law schools clamor to be judged by Above the Law rather than US News. At least for the immediate future, this will restrain ATL’s power within academia.

On the other hand, I remember a time (in the late 1990s) when presidents and provosts laughed at law schools for attempting to rely upon their US News rank. “Real” academic departments had fancier ranking schemes, like those developed by the National Research Council. But US News was the kudzu of academic rankings: It took over faster than anyone anticipated.

Who’s to say that the Above the Law rankings won’t have their day, at least within legal education?

Meanwhile

Even if US News retains its primary hold on academic rankings, Above the Law may have some immediate impact within law schools. High US News rank, after all, depends upon enrolling talented students. If prospective students pay attention to Above the Law–as I predict they will–then law schools will have to do the same. To maintain class size and student quality, we need to know what students want. For that, Above the Law offers essential information.


How Deans Should Game the Above the Law Rankings

May 1st, 2014

The popular legal news website Above the Law just announced its 2014 Top 50 Law School Rankings. ATL’s methodology focuses exclusively on outcomes: only jobs, total cost, and alumni satisfaction matter.

I generally disfavor rankings. Ranking systems appeal to a desire for clear answers even when clear answers don’t exist. Through a simple list format, rankings project the appearance of authority and value even when they provide neither.

Inherent issues aside, ATL’s rankings at least focus on elements that should and do matter to prospective students. As a result, the ATL rankings incentivize schools to act in ways that measurably help students. That’s a welcome change.

If you’re a law school dean who wants to improve your school’s standing in the ATL rankings, here are the two most critical steps:

1. Lower Tuition

The ATL rankings factor in total educational cost, which combines living expenses, tuition, inflation, and the interest accumulated during law school. Unless a law school moves across the country, student living expenses are relatively inflexible. To compete on the education cost metric, schools must either lower tuition or convince ATL to use net price instead of sticker price.
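ATL’s precise formula isn’t given here, but a rough sketch of how inflation and in-school interest compound a sticker price, under assumed loan terms that are mine rather than ATL’s, shows why the tuition lever matters so much.

```python
def total_cost(tuition, living, years=3, rate=0.07, inflation=0.03):
    # Rough sketch only -- not ATL's actual formula. Each year's borrowing
    # rises with inflation and accrues simple interest until graduation
    # (the 7% rate is an assumed GradPLUS-style figure, not ATL's input).
    total = 0.0
    for year in range(years):
        borrowed = (tuition + living) * (1 + inflation) ** year
        total += borrowed * (1 + rate * (years - year))
    return total

# Cutting $10,000 from sticker tuition saves roughly $35,000 in total cost.
print(round(total_cost(50_000, 20_000)))  # 246355 on these assumptions
print(round(total_cost(40_000, 20_000)))  # 211162
```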

In using sticker price, ATL penalizes schools that use a high-tuition, high-discount model. That describes nearly every school (though that may be changing). Schools that shift to a more transparent pricing model will benefit in next year’s rankings without taking in less tuition revenue.

2. Maintain or Reduce Class Size

Although class sizes are not directly measured by the ATL rankings, each employment metric either controls for graduating class size (SCOTUS clerkships; Article III judges) or relies on an employment percentage for which graduating class size is the denominator. Graduating class size is a function of incoming class size, net transfers, and students dropping out or taking longer to finish school than anticipated.
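In sketch form, with all numbers invented, the denominator effect looks like this: adding net transfers without adding jobs dilutes the employment percentage.

```python
def employment_pct(employed, incoming, net_transfers, attrition):
    # Graduating class size as the post describes it: incoming class,
    # plus net transfers in, minus students who leave or delay.
    graduating = incoming + net_transfers - attrition
    return employed / graduating

# Taking 20 net transfers without adding a single job dilutes the score.
print(round(employment_pct(150, 200, 0, 10), 3))   # 0.789
print(round(employment_pct(150, 200, 20, 10), 3))  # 0.714
```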

Smaller incoming classes demonstrate a modicum of social and professional responsibility in a visible manner. This buys trust from incoming students. But the temptation to take more transfers to generate more revenue must be strong these days, as schools try to make up for lost 1L revenue. After all, transfers pay more, do not impact LSAT or GPA medians, have low marginal cost, and integrate rather silently. Large transfer classes also seem appealing if you believe that enrollment cuts have been too deep–an increasingly common, yet disturbing belief.

Due to ATL’s methodology, schools cannot hide from enrollment levels that adversely affect employment outcomes. Neither can schools make up for over-enrollment by funding jobs for graduates. As such, resisting the temptation to grow enrollment will benefit schools on rankings that unapologetically penalize schools for graduating too many students into a crowded entry-level market.

* * *

Schools game rankings. That’s just a basic fact about modern higher education. At least with ATL’s scheme, gaming the rankings produces measurable, positive results for students and the profession. It sure beats an incentive to burn money on Blackacre to secure a higher ranking.

