ATL Rankings: The Bad and the Maybe

June 5th, 2015

I’ve already discussed the positive aspects of Above the Law (ATL)’s law school rankings. Here I address the poorly constructed parts of the ranking scheme. Once again, I use ATL to provoke further thought about all law school rankings.

Quality Jobs Score

ATL complements its overall employment score, which is one of the scheme's positive features, with a "quality jobs score." The latter counts only "placement with the country's largest and best-paying law firms (using the National Law Journal's 'NLJ 250') and the percentage of graduates embarking on federal judicial clerkships."

I agree with ATL’s decision to give extra weight to some jobs; even among jobs requiring bar admission, some are more rewarding to graduates than others. This category, however, is unnecessarily narrow–and too slanted towards private practice.

Using ATL’s own justification for the category’s definition (counting careers that best support repayment of law school debt), it would be easy to make this a more useful category. Government and public interest jobs, which grant full loan forgiveness after ten years, also enable repayment of student loans. Given the short tenure of many BigLaw associates, the government/public interest route may be more reliable than the BigLaw one.

I would expand this category to include all government and public interest jobs that qualify graduates for loan forgiveness at the ten-year mark, excluding only those that are school-financed. Although ATL properly excludes JD-advantage jobs from its general employment score, I would include them here–as long as the jobs qualify for public-service loan forgiveness. A government job requiring bar admission, in other words, would count toward both employment measures, while a JD-advantage government position would count only toward the quality jobs score.
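To make that counting rule concrete, here is a minimal Python sketch of how the two measures would treat the same set of graduates. The job records, field names, and the omission of federal clerkships are my own simplifications for illustration, not ATL's actual methodology.

```python
# Sketch of the proposed counting rule (my reading of the post, not
# ATL's methodology). The job records are invented, and federal
# clerkships are omitted for brevity.

jobs = [
    {"sector": "government", "bar_required": True,  "pslf_eligible": True,  "nlj250": False},
    {"sector": "government", "bar_required": False, "pslf_eligible": True,  "nlj250": False},  # JD-advantage
    {"sector": "law_firm",   "bar_required": True,  "pslf_eligible": False, "nlj250": True},
]

# General employment score: only jobs requiring bar admission count.
employment_count = sum(j["bar_required"] for j in jobs)

# Expanded quality jobs score: NLJ 250 placement plus any job eligible
# for public-service loan forgiveness, bar-required or JD-advantage alike.
quality_count = sum(j["pslf_eligible"] or j["nlj250"] for j in jobs)

print(employment_count, quality_count)  # 2 3
```

As the tallies show, the bar-required government job counts toward both measures, while the JD-advantage position counts only in the quality score.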

Making this change would reduce this factor’s bias toward private practice, while incorporating information that matters to a wider range of prospective students.

SCOTUS Clerks and Federal Judges

Many observers have criticized this component, which counts “a school’s graduates as a percentage of (1) all U.S. Supreme Court clerks (since 2010) and (2) currently sitting Article III judges.” For both of these, ATL adjusts the score for the size of the school. What’s up with that?

ATL defends the criterion as useful for students “who want to be [federal] judges and academics.” But that’s just silly. These jobs constitute such a small slice of the job market that they shouldn’t appear in a ranking designed to be useful for a large group of users. If ATL really embraces the latter goal, there’s an appropriate way to modify this factor.

First, get rid of the SCOTUS clerk count. That specialized information is available elsewhere (including on ATL) for prospective students who think it's relevant to their choice of law school. Second, expand the count of sitting Article III judges to include counts of (a) current members of Congress; (b) the President and Cabinet members; and (c) CEOs and General Counsel at all Fortune 500 companies. Finally, don't adjust the counts for school size.

These changes would produce a measure of national influence in four key areas: the judiciary, executive branch, legislature, and corporate world. Only a small percentage of graduates will ever hold these very prestigious jobs, but the jobholders improve their school’s standing and influence. That’s why I wouldn’t adjust the counts for school size. If you’re measuring the power that a school exerts through alumni in these positions, the absolute number matters more than the percentage.
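For readers who want to see the mechanics, here is a hedged sketch of the proposed influence measure. The school names and alumni tallies are invented; a real version would draw on public rosters of Article III judges, members of Congress, the President and Cabinet, and Fortune 500 CEOs and General Counsel.

```python
# Hypothetical sketch of the proposed "national influence" measure.
# All tallies below are invented for illustration.

schools = {
    "School A": {"judiciary": 12, "legislature": 4, "executive": 1, "corporate": 9},
    "School B": {"judiciary": 3,  "legislature": 7, "executive": 0, "corporate": 15},
}

def influence_count(tallies):
    """Sum alumni across the four domains, deliberately NOT divided
    by class size; absolute reach is what matters here."""
    return sum(tallies.values())

for name in sorted(schools, key=lambda n: influence_count(schools[n]), reverse=True):
    print(name, influence_count(schools[name]))  # School A 26, School B 25
```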

Leaders in private law firms, state governments, and public interest organizations also enhance a school’s alumni network–and one could imagine adding those to this component. Those organizations, however, already receive recognition in the two factors that measure immediate graduate employment. It seems more important to add legislative, executive, and corporate influence to the rankings. As a first step, therefore, I would try to modify this component as I’ve outlined here.

Component Sorting

A major flaw in ATL's scheme is that it doesn't allow users to sort schools by component scores. The editors have published the top five schools in most categories, but that falls far short of full sorting. Focused-purpose rankings are most useful if readers can sort schools based on each component. One reader may value alumni ratings above all other factors, while another cares most about quality jobs. Adding a full-sort feature, as sketched below, would be an important step.
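Here is a minimal sketch of what that full-sort feature could look like, using invented scores and component names:

```python
# Each school carries its component scores, and the reader re-sorts
# by whichever component matters most to them. Scores are invented.

schools = [
    {"name": "School A", "quality_jobs": 38.0, "alumni_rating": 8.1, "debt_per_job": 210_000},
    {"name": "School B", "quality_jobs": 22.5, "alumni_rating": 9.4, "debt_per_job": 150_000},
]

def rank_by(component, lower_is_better=False):
    """Return school names ordered by one component, best first."""
    ordered = sorted(schools, key=lambda s: s[component], reverse=not lower_is_better)
    return [s["name"] for s in ordered]

print(rank_by("alumni_rating"))                       # ['School B', 'School A']
print(rank_by("quality_jobs"))                        # ['School A', 'School B']
print(rank_by("debt_per_job", lower_is_better=True))  # ['School B', 'School A']
```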

Why Rank?

Like many educators, I dislike rankings. The negative incentives created by US News far outweigh the limited value it offers prospective students. Rankings can also mislead students into making decisions based solely on those schemes, rather than using rank as one tool in a broader decision-making process. Even if the ATL rankings were modified in the ways I suggest here, both of these drawbacks might still affect them.

As Law School Transparency has shown, it is possible to give prospective students useful information about law schools without adding the baggage of rankings. Above the Law could perform a greater public service by publishing its data as an information set rather than as an integrated ranking.

But rankings draw attention and generate revenue; they are unlikely to disappear. If we’re going to have rankings, then it’s good to have more than one. Comparing schemes may help us see the flaws in all ranking systems; perhaps eventually we’ll reject rankings in favor of other ways to organize information.


ATL Rankings: The Good, the Bad, and the Maybe

June 4th, 2015

In my last post I used Above the Law (ATL)’s law school rankings to explore three types of ranking schemes. Now it’s time to assess the good, bad, and maybe of ATL’s system. In this column I explore the good; posts on the bad and maybe will follow shortly. ATL’s metrics are worth considering both to assess that system and to reflect on all ranking schemes.

Employment Score

ATL’s ranking gives substantial weight to employment outcomes, a factor that clearly matters to students. I agree with ATL that “full-time, long-term jobs requiring bar passage (excluding solos and school-funded positions)” offer the best measure for an employment score. Surveys show that these are the jobs that most graduates want immediately after law school. Equally important, these are the jobs that allow law schools to charge a tuition premium for entry to a restricted profession. Since schools reap the premium, they should be measured on their ability to deliver the outcome.

For a focused-purpose ranking, finally, simple metrics make the most sense. Prospective law students who don’t want to practice can ignore or adjust the ATL rankings (which assume practice as a desired outcome). A student admitted to Northwestern’s JD-MBA program, for example, will care more about that program’s attributes than about the ATL rank. For most students, ATL’s employment score offers a useful starting point.

Alumni Rating

This metric, like the previous one, gives useful information to prospective students. If alumni like an institution’s program, culture, and outcomes, prospective students may feel the same. Happy alumni also provide stronger networks for career support. The alumni rating, finally, may provide a bulwark against schools gaming other parts of the scheme. If a school mischaracterizes jobs, for example, alumni may respond negatively.

It’s notable that ATL surveys alumni, while US News derives reputation scores from a general pool of academics, lawyers, and judges. The former offers particularly useful information to prospective students, while the latter focuses more directly on prestige.

Debt Per Job

This is a nice way of incorporating two elements (cost and employment) that matter to students. The measure may also suggest how closely the institution focuses on student welfare. A school that keeps student costs low, while providing good outcomes, is one that probably cares about students. Even a wealthy student might prefer that institution over one with a worse ratio of debt to jobs.
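ATL doesn't spell out its formula in the post, so the sketch below assumes one plausible version of the ratio: average graduate debt divided by the share of graduates landing qualifying jobs. Lower values are better.

```python
# Illustrative only: this is one plausible reading of a "debt per job"
# ratio, not ATL's published formula. The figures are invented.

def debt_per_job(avg_debt, employment_rate):
    """Average graduate debt divided by the qualifying-job rate."""
    return avg_debt / employment_rate

# $120,000 of debt at a 60% placement rate looks worse than
# $90,000 of debt at a 75% placement rate:
print(debt_per_job(120_000, 0.60))  # 200000.0
print(debt_per_job(90_000, 0.75))   # 120000.0
```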

The best part of this metric is that it gives law schools an incentive to award need-based scholarships. Sure, schools could try to improve this measure by admitting lots of wealthy students–but there just aren’t that many of those students to go around. Most schools have already invested in improving employment outcomes, so the best way to further improve the “debt per job” measure is for the school to award scholarships to students who would otherwise borrow the most.

Over the last twenty years, US News has pushed schools from need-based scholarships to LSAT-based ones. What a refreshing change it would be if a ranking scheme led us back to need-based aid.

Education Cost

Cost is another key factor for 0Ls considering law schools and, under the current state of the market, I support ATL’s decision to use list-price tuition for this measure. Many students negotiate discounts from list price, but schools don’t publish their net tuition levels. The whole negotiation system, meanwhile, is repugnant. Why are schools forcing young adults to test their bargaining skills in a high-stakes negotiation that will affect their financial status for up to a quarter century?

We know that in other contexts, race and gender affect negotiation outcomes. How sure are we that these factors don't affect negotiations for tuition discounts? Most of the biases that taint negotiations are unconscious rather than conscious. And even if law school administrators act with scrupulous fairness, these biases affect the students seeking aid: Race and gender influence a student's willingness to ask for more.

In addition to these biases, it seems likely that students from disadvantaged backgrounds know less about tuition negotiation than students who have well-educated helicopter parents. It's no answer to say that economically disadvantaged students get some tuition discounts; the question is whether they would have gotten bigger discounts if they had been armed with more information and better negotiating skills.

Negotiation over tuition is one of the most unsavory parts of our current academic world. I favor any component of a ranking scheme that pushes schools away from that practice. If schools don’t want to be ranked based on an inflated list-price tuition, then they can lower that tuition (and stop negotiating) or publish their average net tuition. My co-moderator made the same point last year, and it’s just as valid today.

The Bad and Maybe

Those are four strengths of the ATL rankings. Next up, the weaknesses.


More on Rankings: Three Purposes

June 1st, 2015

I want to continue my discussion of the law school rankings published by Above the Law (ATL). But before I do, let’s think more generally about the purpose of law school rankings. Who uses these rankings, and for what reason? Rankings may serve one or more of three purposes:

1. Focused-Purpose Rankings

Rankings in this first category help users make a specific decision. A government agency, for example, might rate academic institutions based on their research productivity; this ranking could then guide the award of research dollars. A private foundation aiming to reward innovative teaching might develop a ranking scheme more focused on teaching prowess.

US News and Above the Law advertise their rankings as focused-purpose ones: Both are designed to help prospective students choose a law school. One way to assess these rankings, accordingly, is to consider how well they perform this function.

Note that focused-purpose rankings can be simple or complex. Some students might choose a law school based solely on the percentage of graduates who secure jobs with the largest law firms. For those students, NLJ’s annual list of go-to law schools is the only ranking they need.

Most prospective students, however, consider a wider range of factors when choosing a law school. The same is true of people who use other types of focused-purpose rankings. The key function of these rankings is that they combine relevant information in a way that helps a user sort that information. Without assistance, a user could focus on only a few bits of information at a time. Focused-purpose rankings overcome that limit by aggregating some of the relevant data.

This doesn’t mean that users should (or will) make decisions based solely on a ranking scheme. Although a good scheme combines lots of relevant data, the scheme is unlikely to align precisely with each user’s preferences. Most people who look at rankings use them as a starting point. The individual adds relevant information omitted by the ranking scheme, or adjusts the weight given to particular components, before making a final decision.

A good ranking scheme in the “focused purpose” category supports this process through four features. The scheme (a) incorporates factors that matter to most users; (b) omits other, irrelevant data; (c) uses unambiguous metrics as components; and (d) allows users to disaggregate the components.

2. Prestige Rankings

Some rankings explicitly measure prestige. Others implicitly offer that information, although they claim another purpose. In either case, the need for “prestige” rankings is somewhat curious. Prestige does not inhere in institutions; it stems from the esteem that others confer upon the institution. Why do we need a ranking system to tell us what we already believe?

One reason is that our nation is very large. People from the West Coast may not know the prestige accorded Midwestern institutions. Newcomers to a profession may also seek information about institutional prestige. Some college students know very little about the prestige of different law schools.

For reasons like these, prestige rankings persist. It is important to recognize, however, that prestige rankings differ from the focused-purpose schemes discussed above. Prestige often relates to one of those focused purposes: A law school's prestige, for example, almost certainly affects the employability of its graduates. A ranking of schools based on prestige, however, is different from a ranking that incorporates the factors prospective students find important in selecting a school.

Prestige rankings are more nebulous than focused-purpose ones. The ranking may depend simply on a survey of the relevant audience. Alternatively, the scheme may incorporate factors that traditionally reflect an institution's prestige. For academic institutions, these include the selectivity of admissions, the qualifications of the entering class, and the institution's wealth.

3. Competition Rankings

Competition rankings have a single purpose: to confer honor. A competition ranking awards gold, silver, bronze, and other medals according to specific criteria. These rankings differ from the previous categories because they exist solely to honor the winners of the competition.

Many athletic honors fall into this category. We honor Olympic gold medalists because they were the best at their event on a particular day, even if their prowess diminishes thereafter.

Competition rankings are most common in athletics and the arts, although they occasionally occur in academia. More commonly, as I discuss below, people misinterpret focused-purpose rankings as if they were competition ones.

US News

As noted above, US News promotes its law school ranking for a focused purpose: to help prospective students choose among law schools. Over time, however, the ranking has acquired aspects of both a prestige scheme and a competition one. These characteristics diminish the rankings' usefulness for potential students; they also contribute to much of the mischief surrounding the rankings.

Many professors, academic administrators, and alumni view their school’s US News rank as a general measure of prestige, not simply as a tool for prospective students to use when comparing law schools. Some of the US News metrics contribute to this perception. Academic reputation, for example, conveys relatively little useful information to potential students. It is much more relevant to measuring an institution’s overall prestige.

Even more troublesome, some of these audiences have started to treat the US News rankings as a competition score. Like Olympic athletes, schools claim honor simply for achieving a particular rank. Breaking into the top fourteen, top twenty, or top fifty becomes cause for excessive celebration.

If the US News rankings existed simply to aid students in selecting a law school, they would cause much less grief. Imagine, for example, if deans could reassure anxious alumni by saying something like: "Look, these rankings are just a tool for students to use when comparing law schools. And they're not the only information that these prospective students use. We supplement the rankings by pointing to special features of our program that the rankings don't capture. We have plenty of students who choose our school over ones ranked somewhat above us because they value X, Y, and Z."

Deans can’t offer that particular reassurance, and listeners won’t accept it, because we have all given the US News rankings the status of prestige or competition scores. It may not matter much if a school is number 40 or 45 on a yardstick that 0Ls use as one reference in choosing a law school. Losing 5 prestige points, on the other hand, ruins everyone’s day.

Above the Law

I’ll offer a more detailed analysis of the ATL rankings in a future post. But to give you a preview: One advantage of these rankings over US News is that they focus very closely on the particular purpose of aiding prospective students. That focus makes the rankings more useful for their intended audience; it also avoids the prestige and competition auras that permeate the US News product.


