A Milestone for Legal Education

December 15th, 2016

For the first time ever, women constitute a majority of JD students at ABA-accredited law schools. 50.32% of JD students studying for fall exams are women.*

It’s a milestone to celebrate–but also one to view with caution.

As Kyle McEntee and I reported last month, female law students remain clustered at the least prestigious law schools. You can find a graphic representation of these data, along with a podcast in which Kyle and I discuss the numbers, here.

After crunching the latest disclosures, there remains a strong (and statistically significant) correlation between a law school’s US News rank and its percentage of female students: On average, the better ranked schools enroll a significantly smaller percentage of women students. The correlation remains when we look at schools’ placement outcomes. Men are significantly more likely than women to attend schools that place a large percentage of their graduates in full-time, long-term jobs requiring a law license. Women are more likely to attend schools with weak employment outcomes.

When we looked at last year’s data, we found a correlation of .381 between a school’s US News rank and the percentage of women it enrolled. This year, the correlation is almost as high, at .357. The story is similar for the relationship between percentage of female students and good job outcomes. Last year’s data showed a correlation of -.520, while the updated data yield an association of -.508. All of these relationships are statistically significant: the odds of them occurring by chance are less than one in a thousand.
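For readers who want to reproduce this kind of calculation, here is a minimal sketch using standard statistical tools; a p-value below .001 corresponds to the "less than one in a thousand" odds mentioned above. The file name and column names are placeholders, not the actual ABA spreadsheet we analyzed.

```python
# Minimal sketch of the correlation analysis described above.
# "law_schools_2016.csv" and its column names are placeholders,
# not the actual ABA data release we analyzed.
import pandas as pd
from scipy import stats

# Quick check of the headline figure: women as a share of all JD students.
women, men = 55766, 55059
print(f"{women / (women + men):.2%}")  # 50.32%

schools = pd.read_csv("law_schools_2016.csv")

# Pearson correlation between US News rank and percentage of women,
# with a two-sided p-value for statistical significance.
r, p = stats.pearsonr(schools["usnews_rank"], schools["pct_women"])
print(f"rank vs. % women: r = {r:.3f}, p = {p:.5f}")

# Correlation between percentage of women and the share of graduates in
# full-time, long-term jobs requiring a law license.
r, p = stats.pearsonr(schools["pct_women"], schools["pct_ftlt_bar_passage"])
print(f"% women vs. job outcomes: r = {r:.3f}, p = {p:.5f}")
```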

Women now outnumber men in law schools, but our pipeline is still broken. Let’s do more to recognize and correct gender bias in the profession. You can start with Law School Transparency’s podcast series on Women In the Law.

* Source: The ABA’s annual data release. These totals include students from Penn State’s two campuses, which seem to have been omitted from the “All Schools” spreadsheet on the ABA site. 55,059 of this year’s students are men, while 55,766 are women.


Engines of Anxiety

April 29th, 2016

Two sociologists, Wendy Nelson Espeland and Michael Sauder, have published a book that examines the impact of US News rankings on legal education. The book, titled Engines of Anxiety, is available as an e-book through Project Muse. If your university subscribes to Project Muse (as mine does), you can download the book and read it for free on your laptop or tablet. If you don’t have access to a university library, some public libraries also subscribe to books through Project Muse. It’s a great way to read academic books and journals. H/t to TaxProf for noting publication of this book.


How To Fix The U.S. News Law School Rankings

January 13th, 2016

This was originally published on Above the Law.

To put it mildly, I’m not a fan of the U.S. News law school rankings. They poison the decision-making process for law students and law schools alike. For students, they cause irrational choices about where to attend or how much to pay. For schools, they produce a host of incentives that do not align with the goal of providing an accessible, affordable legal education.

Because of their undeniable influence, it makes sense to seek methodological changes that nudge schools in a better direction.



More on Rankings: Three Purposes

June 1st, 2015

I want to continue my discussion of the law school rankings published by Above the Law (ATL). But before I do, let’s think more generally about the purpose of law school rankings. Who uses these rankings, and for what reason? Rankings may serve one or more of three purposes:

1. Focused-Purpose Rankings

Rankings in this first category help users make a specific decision. A government agency, for example, might rate academic institutions based on their research productivity; this ranking could then guide the award of research dollars. A private foundation aiming to reward innovative teaching might develop a ranking scheme more focused on teaching prowess.

US News and Above the Law advertise their rankings as focused-purpose ones: Both are designed to help prospective students choose a law school. One way to assess these rankings, accordingly, is to consider how well they perform this function.

Note that focused-purpose rankings can be simple or complex. Some students might choose a law school based solely on the percentage of graduates who secure jobs with the largest law firms. For those students, NLJ’s annual list of go-to law schools is the only ranking they need.

Most prospective students, however, consider a wider range of factors when choosing a law school. The same is true of people who use other types of focused-purpose rankings. The key function of these rankings is that they combine relevant information in a way that helps a user sort that information. Without assistance, a user could focus on only a few bits of information at a time. Focused-purpose rankings overcome that limit by aggregating some of the relevant data.

This doesn’t mean that users should (or will) make decisions based solely on a ranking scheme. Although a good scheme combines lots of relevant data, the scheme is unlikely to align precisely with each user’s preferences. Most people who look at rankings use them as a starting point. The individual adds relevant information omitted by the ranking scheme, or adjusts the weight given to particular components, before making a final decision.

A good ranking scheme in the “focused purpose” category supports this process through four features. The scheme (a) incorporates factors that matter to most users; (b) omits other, irrelevant data; (c) uses unambiguous metrics as components; and (d) allows users to disaggregate the components.
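To make feature (d) concrete, here is a small sketch of what a disaggregable, reweightable ranking might look like; the component names and weights are illustrative assumptions, not any publisher's actual formula.

```python
# Sketch of a focused-purpose ranking that keeps its components
# disaggregated so a user can inspect or reweight them.
# Component names and default weights are illustrative only.

DEFAULT_WEIGHTS = {"employment": 0.5, "cost": 0.3, "alumni_rating": 0.2}

def composite_score(components, weights=DEFAULT_WEIGHTS):
    """Weighted average of a school's component scores (each 0-100)."""
    total = sum(weights.values())
    return sum(components[name] * w for name, w in weights.items()) / total

school = {"employment": 82.0, "cost": 60.0, "alumni_rating": 74.0}

# A user who cares more about cost can substitute her own weights...
print(composite_score(school))
print(composite_score(school, {"employment": 0.4, "cost": 0.5, "alumni_rating": 0.1}))

# ...or ignore the composite and look at a single component directly.
print(school["employment"])
```

The point is simply that a user can see each component and adjust the mix before making a final decision, rather than taking the composite number on faith.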

2. Prestige Rankings

Some rankings explicitly measure prestige. Others implicitly offer that information, although they claim another purpose. In either case, the need for “prestige” rankings is somewhat curious. Prestige does not inhere in institutions; it stems from the esteem that others confer upon the institution. Why do we need a ranking system to tell us what we already believe?

One reason is that our nation is very large. People from the West Coast may not know the prestige accorded Midwestern institutions. Newcomers to a profession may also seek information about institutional prestige. Some college students know very little about the prestige of different law schools.

For reasons like these, prestige rankings persist. It is important to recognize, however, that prestige rankings differ from the focused-purpose schemes discussed above. Prestige often relates to one of those focused purposes: A law school’s prestige, for example, almost certainly affects the employability of its graduates. A ranking of schools based on prestige, however, is different than a ranking that incorporates factors that prospective students find important in selecting a school.

Prestige rankings are more nebulous than focused-purpose ones. The ranking may depend simply on a survey of the relevant audience. Alternatively, the scheme may incorporate factors that traditionally reflect an institution’s prestige. For an academic institution, these include the selectivity of its admissions, the qualifications of its entering class, and the institution’s wealth.

3. Competition Rankings

Competition rankings have a single purpose: to confer honor. A competition ranking awards gold, silver, bronze, and other medals according to specific criteria. These rankings differ from the previous categories because the honor of winning is an end in itself.

Many athletic honors fall into this category. We honor Olympic gold medalists because they were the best at their event on a particular day, even if their prowess diminishes thereafter.

Competition rankings are most common in athletics and the arts, although they occasionally occur in academia. More commonly, as I discuss below, people misinterpret focused-purpose rankings as if they were competition ones.

US News

As noted above, US News promotes its law school ranking for a focused purpose: to help prospective students choose among law schools. Over time, however, the ranking has acquired aspects of both a prestige scheme and a competition one. These characteristics diminish the rankings’ usefulness for potential students; they also contribute to much of the mischief surrounding the rankings.

Many professors, academic administrators, and alumni view their school’s US News rank as a general measure of prestige, not simply as a tool for prospective students to use when comparing law schools. Some of the US News metrics contribute to this perception. Academic reputation, for example, conveys relatively little useful information to potential students. It is much more relevant to measuring an institution’s overall prestige.

Even more troublesome, some of these audiences have started to treat the US News rankings as a competition score. Like Olympic athletes, schools claim honor simply for achieving a particular rank. Breaking into the top fourteen, top twenty, or top fifty becomes cause for excessive celebration.

If the US News rankings existed simply to aid students in selecting a law school, they would cause much less grief. Imagine, for example, if deans could reassure anxious alumni by saying something like: “Look, these rankings are just a tool for students to use when comparing law schools. And they’re not the only information that these prospective students use. We supplement the rankings by pointing to special features of our program that the rankings don’t capture. We have plenty of students who choose our school over ones ranked somewhat above us because they value X, Y, and Z.”

Deans can’t offer that particular reassurance, and listeners won’t accept it, because we have all given the US News rankings the status of prestige or competition scores. It may not matter much if a school is number 40 or 45 on a yardstick that 0Ls use as one reference in choosing a law school. Losing 5 prestige points, on the other hand, ruins everyone’s day.

Above the Law

I’ll offer a more detailed analysis of the ATL rankings in a future post. But to give you a preview: One advantage of these rankings over US News is that they focus very closely on the particular purpose of aiding prospective students. That focus makes the rankings more useful for their intended audience; it also avoids the prestige and competition auras that permeate the US News product.


ATL Law School Rankings

May 29th, 2015

Above the Law (ATL) has released the third edition of its law school rankings. Writing about rankings is a little like talking about intestinal complaints: We’d rather they didn’t exist, and it’s best not to mention such things in polite company. Rankings, however, are here to stay–and we already devote an inordinate amount of time to talking about them. In that context, there are several points to make about Above the Law‘s ranking scheme.

In this post, I address an initial question: Who cares about the ATL rankings? Will anyone read them or follow them? In my next post, I’ll explore the metrics that ATL uses and the incentives they create. In a final post, I’ll make some suggestions to improve ATL’s rankings.

So who cares? And who doesn’t?

Prospective Students

I think potential law students are already paying attention to the ATL rankings. Top-Law-Schools.com, a source used by many 0Ls, displays the Above the Law rankings alongside the US News (USN) list. Prospective students refer to both ranking systems in the site’s discussion forum. If prospective students don’t already know about ATL and its rankings, they will soon.

If I were a prospective student, I would pay at least as much attention to the ATL rankings as to the USN ones. Above the Law, after all, incorporates measures that affect students deeply (cost, job outcomes, and alumni satisfaction). US News includes factors that seem more esoteric to a potential student.

Also, let’s face it: Above the Law is much more fun to read than US News. Does anyone read US News for any purpose other than rankings? 0Ls read Above the Law for gossip about law schools and the profession. If you like a source and read it regularly, you’re likely to pay attention to its recommendations–including recommendations in the form of rankings.

Alumni

Deans report that their alumni care deeply about the school’s US News rank. Changes in that number may affect the value of a graduate’s degree. School rank also creates bragging rights among other lawyers. We don’t have football or basketball teams at law schools, so what other scores can we brag about?

I predict that alumni will start to pay a lot of attention to Above the Law‘s ranking scheme. Sure, ATL is the site we all love to hate: Alumni, like legal educators, cringe at the prospect of reading about their mistakes on the ever-vigilant ATL. But the important thing is that they do read the site–a lot. They laugh at the foibles of others, nod in agreement with some reports, and keep coming back for more. This builds a lot of good will for Above the Law.

Equally important, whenever Above the Law mentions a law school in a story, it appends information about the school’s ATL rank. For an example, see this recent story about Harvard Law School. (I purposely picked a positive story, so don’t get too excited about following the link.)

Whenever alumni read about their law school–or any law school–in Above the Law, they will see information about ATL’s ranking. This is true even for the 150 schools that are “not ranked” by Above the Law. For them, a box appears reporting that fact along with information about student credentials and graduate employment.

This is an ingenious (and perfectly appropriate) marketing scheme. Alumni who read Above the Law will constantly see references to ATL’s ranking scheme. Many will care about their school’s rank and will pester the school’s dean for improvement. At first, they may not want to admit publicly that they care about an ATL ranking, but that reservation will quickly disappear. US News is a failed magazine; Above the Law is a very successful website. Which one do you think will win in the end?

US News, moreover, has no way to combat this marketing strategy. We’ve already established that no one reads US News for any reason other than the rankings. So US News has no way to keep its rankings fresh in the public’s mind. Readers return to Above the Law week after week.

Law Professors

Law professors will not welcome the ATL rankings. We don’t like any rankings, because they remind us that we’re no longer first in the class. And we certainly don’t like Above the Law, which chronicles our peccadilloes.

Worst of all, ATL rankings don’t fit with our academic culture. We like to think of ourselves as serious-minded people, pursuing serious matters with great seriousness. How could we respect rankings published by a site that makes fun of us and all of our seriousness? Please, be serious.

Except…professors spent a long time ignoring the US News rankings. We finally had to pay attention when everyone else started putting so much weight on them. Law faculty are not leaders when it comes to rankings; we are followers. If students and alumni care about ATL’s rankings, we eventually will pay attention.

University Administrators

People outside academia may not realize how much credence university presidents, provosts, and trustees give the US News rankings. The Board of Trustees at my university has a scorecard for academic initiatives that includes these two factors: (1) rank among public colleges, as determined by USN, and (2) number of graduate or professional programs in the USN top 25. On the first, we aim to improve our rank from 18 to 10. On the second, we hope to increase the number of highly ranked departments from 49 to 65.

These rank-related goals are no longer implicit; they are quite explicit at universities. And, although academic leaders once eschewed US News as a ranking source, they now embrace the system.

Presidents and provosts are likely to laugh themselves silly if law schools clamor to be judged by Above the Law rather than US News. At least for the immediate future, this will restrain ATL’s power within academia.

On the other hand, I remember a time (in the late 1990s) when presidents and provosts laughed at law schools for attempting to rely upon their US News rank. “Real” academic departments had fancier ranking schemes, like those developed by the National Research Council. But US News was the kudzu of academic rankings: It took over faster than anyone anticipated.

Who’s to say that the Above the Law rankings won’t have their day, at least within legal education?

Meanwhile

Even if US News retains its primary hold on academic rankings, Above the Law may have some immediate impact within law schools. High US News rank, after all, depends upon enrolling talented students. If prospective students pay attention to Above the Law–as I predict they will–then law schools will have to do the same. To maintain class size and student quality, we need to know what students want. For that, Above the Law offers essential information.


More on the ABA Questionnaire

July 9th, 2013

Legal educators on several blogs have been discussing the ABA’s decision to eliminate expenditure data from the annual questionnaire completed by law schools. I called Scott Norberg, Deputy Consultant to the ABA’s Section of Legal Education and Admissions to the Bar, to find out more about the change.

Professor Norberg noted that the expenditure elimination is part of a larger project to slim down the annual questionnaire. Most of the changes went into effect last year, but the Section’s Council waited a year to implement elimination of the expenditure section. No objections arose to the proposed change, so the Council adopted it for this fall’s questionnaire.

Although the annual questionnaire will no longer ask explicitly about expenditures, it does request information about a law school’s reserve funds and debt (p. 7). These questions will allow the ABA to identify schools that may be in financial trouble, without needing more detailed expenditure data every year.

That’s a relief from a consumer protection perspective. But do we have to worry now that US News will incorporate financial reserves or debt level into its ranking scheme? I’m not sure I even want to think about that one.


Notable Change in the ABA Questionnaire

July 8th, 2013

Last week the ABA notified law school deans that it will no longer request annual information about each school’s expenditures. Schools will report three years of expenditures in connection with site visits, but the annual reporting of expenditures has been eliminated (see p. 4).

H/t to TaxProf and Brian Leiter for this breaking news. Now, what does the change mean for ABA data collection, legal education, and the US News rankings?

Background: The Annual Questionnaire

The ABA collects data from law schools every year through its annual questionnaire. That instrument, revised annually by the Council’s Data Policy & Collection Committee, gathers information about enrollment, courses, faculty composition, and other issues related to legal education. At least within recent years, the questionnaire has asked schools about both revenues and expenditures. The 2013 questionnaire will ask only about overall revenues, not overall expenditures.

The revised instrument still asks about two specific expenditures: money spent on library operations and money spent for student scholarships, grants, or loans. It does not, however, require schools to report other expenditures–such as money spent on salaries, conferences, coffee, and all of the other matters that make up a law school budget.

Going Forward: Data, the ABA, and Legal Education

I’m puzzled that the ABA has chosen to eliminate expenditures from the annual questionnaire, especially given the contemporary budget crunch at many law schools. Responding to the questionnaire tormented me when I was an associate dean, so I don’t advocate mindless data collection. The information collected by the ABA, however, seems to serve numerous valuable purposes. Questionnaire results help track the overall health of legal education, inform accreditation standards, and offer perspectives on policy issues related to law schools. The instructions to the fiscal portion of the questionnaire also suggest that the ABA uses this information to monitor the fiscal health of individual schools. Given the ABA’s role in protecting students, that is an important goal.

Given this range of objectives, why will the ABA continue to collect annual information about law school revenues, but not expenditures? Law schools seem to be facing unprecedented budgetary strain. In times like this, wouldn’t the ABA want to know both revenues and expenditures–so that it could gauge the financial course of legal education? As the Task Force on the Future of Legal Education finalizes its recommendations, wouldn’t it want to know how badly law schools are hurting? And as the Standards Review Committee considers the costs imposed by some accreditation measures, wouldn’t it be useful to know whether law schools are operating in the red?

I’m not suggesting that the ABA should distribute scorecards revealing the financial health of each law school. But wouldn’t aggregate data on revenue, expenditures, and the gap between the two be particularly useful right now? Annual reports of revenue give us some measure of our industry’s health, but expenditure figures are just as important. How else will we know whether schools are able to adapt to flat or declining revenues?

There’s also the matter of protecting students at individual schools. Each school will have to demonstrate its financial health during site visits, but those visits occur every seven years. Seven years is a long time–plenty long enough for a school to sustain significant financial damage and endanger the education of enrolled students. If the ABA is going to monitor anything, shouldn’t it check both revenues and expenditures on an annual basis?

I understand that many educators are celebrating elimination of the expenditures section, largely because of the US News effect discussed below. I assume, however, that the questionnaire once served purposes other than generating data for US News. Are we sure that we want to reduce our information about the financial health of legal education? Now?

Going Forward: US News

Against all reason, US News has long used expenditures as a significant component of its law school rankings. Expenditures currently account for 11.25% of the ranking formula. This component of the rankings has rightly provoked criticism from dozens, if not hundreds, of legal educators. The ABA’s elimination of expenditures from its annual questionnaire might be an attempt to discourage US News from incorporating this information.

If that’s the ABA’s motive, will the gambit work? It seems to me that US News has at least four options:

1. Continue to ask law schools to supply expenditure data. US News already asks for information that the ABA doesn’t request; it has no obligation to track the ABA’s questionnaire. Calculating expenditures takes time if you’re trying to game the system (or at least keep up with other schools that are gaming the system); the school has to think of every possible expenditure to include. Gamesmanship aside, however, it would be hard for a dean to claim with a straight face that a request for expenditures was too burdensome to meet. If a school isn’t tracking its annual expenditures, and doesn’t have a computer program that will spit those numbers out on demand, that’s really all we need to know about the school.

I hope US News doesn’t pursue this approach. I agree with all of the critics that expenditures serve no useful purpose in a ranking of law schools (even assuming that a ranking itself serves some useful purpose). It seems to me, however, that US News could easily maintain its ranking system without the ABA’s question on school expenditures.

2. Reconfigure the ranking formula to include just library and student aid expenditures. The ABA questionnaire, rather curiously, continues to ask for data on library and student aid expenditures. US News, therefore, could decide to plug just these expenditures into its ranking formula. The formula already counts student aid expenditures separately, so there’s precedent for that.

This approach would be even worse than the first option. Giving library expenditures extra weight would tempt law schools to increase spending in a part of the budget that many critics already think is too large. Creating incentives for additional student aid sounds beneficent, but it would fuel the already heated arms race to snare credentials with scholarship money. We need to wind that race down in legal education, not extend it further.

3. Replace expenditures with revenues. Since the ABA questionnaire still asks for each school’s annual revenue, US News could incorporate that figure into its ranking formula. This approach might be marginally more rational than the focus on expenditures: Schools with more money may be able to provide more opportunities to their students. Focusing on revenues, furthermore, would not penalize schools that saved some of their revenue for a rainy day.

On the other hand, this criterion would continue to bias the rankings in favor of wealthy, well established, and private schools. It would also invite the same type of gamesmanship that schools have demonstrated when reporting expenditures.

4. Eliminate money as a factor. This is my preferred outcome, and I assume that it is the one most educators would prefer. Expenditures don’t have a role in judging the quality of a law school, and they’re a source of endless manipulation. Both law schools and their consumers would be better off if we rid the rankings of the expenditures factor.
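For concreteness, here is a rough sketch of what eliminating the money factor would involve arithmetically: remove the 11.25% expenditures component and rescale whatever remains. Only the 11.25% figure comes from the published weighting; the other category names and weights below are placeholders, not the actual US News formula.

```python
# Sketch: removing the expenditures component (11.25% of the formula)
# and renormalizing the remaining weights so they again sum to 1.
# Only the 11.25% figure comes from the post; the other weights are
# placeholders, not the actual US News methodology.

weights = {
    "reputation": 0.40,
    "selectivity": 0.25,
    "placement": 0.2375,
    "expenditures": 0.1125,
}

def drop_and_renormalize(weights, component):
    remaining = {k: v for k, v in weights.items() if k != component}
    scale = sum(remaining.values())
    return {k: v / scale for k, v in remaining.items()}

print(drop_and_renormalize(weights, "expenditures"))
# Each surviving weight grows by a factor of 1 / (1 - 0.1125), about 1.127.
```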

Conclusion

US News will do whatever it chooses to do. Years of entreaties, rants, and denunciation haven’t stopped it from incorporating expenditures into its law school ranking. I’m doubtful that the ABA’s change will suddenly bring US News to its senses. Meanwhile, I’m very worried about how we’re going to inform legal educators, regulators, and potential students about the financial health of law schools. Revenues are fun to count, but running a law school requires expenditures as well.


Deluged by Debt

March 13th, 2013

As Brian Tamanaha writes at Balkinization, law school debt levels continue their relentless climb. The latest figures from US News show that, among 2012 graduates, the average amount borrowed for law school exceeded $150,000 at six law schools. Only one of those schools (Northwestern) ranks among the top fifteen law schools; one other (American) ranks 56th. The other four (Thomas Jefferson, California Western, Phoenix, and New York Law School) lie in the unranked fourth tier.

As Brian’s post shows, the job outcomes at five of these schools (all but Northwestern) are dismal. Fewer than 40% of the graduates of these schools obtained full-time jobs that required bar admission and would last at least one year. Even at Northwestern, only 77% of the class met that mark. How can all of the graduates with part-time, temporary, or non-lawyering jobs possibly pay off more than $150,000 in debt–plus all of the accrued interest on that debt? What calculations can justify attending most law schools at that debt-to-outcome ratio?
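To put that question in concrete terms, here is a rough repayment sketch under a standard ten-year plan. The 7.9% interest rate is an assumption, roughly in line with federal Grad PLUS rates of the period, not a figure from Brian’s post.

```python
# Rough sketch: standard 10-year amortized payment on $150,000 of debt.
# The 7.9% annual rate is an assumption (roughly the Grad PLUS rate of
# that era), not a figure taken from the post.

def monthly_payment(principal, annual_rate, years):
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # number of payments
    return principal * r / (1 - (1 + r) ** -n)

pay = monthly_payment(150_000, 0.079, 10)
print(f"monthly payment: ${pay:,.0f}")        # roughly $1,800 per month
print(f"total repaid:    ${pay * 120:,.0f}")  # well over $215,000
```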

The problem, of course, reaches far beyond these six schools. They are at the top of the debt ladder, but most other law schools are close behind. 123 law schools, well over half of the 193 listed schools, reported average amounts borrowed that exceeded $100,000. Even Irvine law school’s first graduates, who paid no tuition for their three years of law school, reported debt. More than two-thirds (68%) of Irvine’s initial class incurred debt, borrowing an average of $49,602 for their “free” law school ride. Remarkably, that figure gave Irvine the second lowest average debt load among the 193 law schools.

When students borrow almost $50,000 to attend law school, even without paying tuition, we have to re-think the way we structure legal education.


US News and Employment Outcomes

March 12th, 2013

Over the last two years, pressure has mounted for more transparent information about the jobs that law school graduates obtain. US News traditionally used very coarse measures of employment, most recently focusing on the percentage of graduates who reported any type of job nine months after graduation. Those nine-month employment rates included part-time jobs, temporary positions, and employment with little relationship to a law degree. A part-time sales clerk at Macy’s was just as “employed” as a law firm associate on the partnership track.

This measure allowed law schools to claim very high employment rates, both in the US News tables and in their own promotional materials. The dazzling nine-month percentages–97%, 98%, 99%!–implied that law school was still a sure road to secure, professional, and well paid employment. Applicants had to seek other information, often buried in complex websites, to understand how many of those “employed” graduates were working in part-time, short-term jobs–sometimes funded by the law schools themselves.

We’ve made progress over the last year. Law School Transparency led the way by publishing more detailed job information about every ABA-accredited law school. The ABA followed suit by requesting more nuanced information from law schools and publishing that data. The ABA also revised its accreditation standards to insist that law schools disclose more complete information to students. But still, those tables in US News, with all of those high employment rates, were very, very appealing.

Today US News joined the push for more accurate employment information. The 2014 rankings include a new measure of employment outcomes. US News now weights jobs according to whether they are JD-related, part-time or full-time, and short-term or long-term. The online magazine is not disclosing the full formula, but notes that “[f]ull weight was given for graduates who had a full-time job lasting at least a year where bar passage was required or a J.D. degree was an advantage.” At the other end of the spectrum, “[t]he lowest weight applied to jobs categorized as both part-time and short-term.”
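US News has not disclosed its weights, but the general shape of such a measure is easy to sketch. In the snippet below the weights are hypothetical stand-ins chosen only to match the published description (full weight at one end, lowest weight at the other); they are not the magazine’s actual figures.

```python
# Sketch of a weighted employment measure along the lines US News
# describes. The weights are hypothetical stand-ins; the magazine has
# not disclosed its actual formula.

HYPOTHETICAL_WEIGHTS = {
    ("full_time", "long_term"): 1.00,   # bar required or JD advantage
    ("full_time", "short_term"): 0.50,
    ("part_time", "long_term"): 0.50,
    ("part_time", "short_term"): 0.25,  # lowest weight
}

def weighted_employment_rate(job_counts, class_size):
    """job_counts maps (hours, duration) tuples to numbers of graduates."""
    weighted = sum(HYPOTHETICAL_WEIGHTS[k] * n for k, n in job_counts.items())
    return weighted / class_size

example = {
    ("full_time", "long_term"): 150,
    ("full_time", "short_term"): 20,
    ("part_time", "long_term"): 10,
    ("part_time", "short_term"): 20,
}
print(f"{weighted_employment_rate(example, class_size=250):.1%}")  # 68.0%
```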

Perhaps most important, US News has published for each law school the percentage of its 2011 graduates who obtained jobs falling into the first category–jobs that were full-time, long-term, and related to the JD. Those percentages are available, free of charge, for all law school applicants to ponder.

The results aren’t pretty. At the top eight schools, more than 90% of graduates are still finding full-time, long-term jobs that use their law degrees. Some of those jobs may not justify the cost of attendance, and we might still wonder about some of the graduates who didn’t obtain full-time, long-term, law-related work within nine months of graduation. But law-related employment rates of 90% or more might justify three years of expensive, intensive professional education.

Outside the elite eight, however, job outcomes plummet sharply. Berkeley and Michigan, two premier public schools, tie for ninth place in the new ranking. Yet only 82.6% and 85.8% of their graduates, respectively, found full-time, long-term employment for which the JD conferred an advantage. Conversely, by nine months after graduation, 14-17% of their graduates were still marking time in part-time, short-term, or non-legal positions. Those aren’t outcomes for which students should pay top tuition dollars.

Further down the list, the outcomes are even more bleak. Minnesota and Washington University in St. Louis round out the top twenty law schools with a tie for nineteenth place. Yet nine months after graduation, only two thirds (66.3% and 66.6%) of the graduates from these schools were working in long-term, full-time jobs related to the JD. A full third of each class failed to achieve employment that used their expensive and hard-won degrees.

The percentages vary after that, climbing as high as 88.0% (for George Washington) and falling as low as 23.6% (Whittier). Over the next few days, bloggers will analyze the factors that contributed to higher employment rates (school-funded positions, geography, a large percentage of JD Advantage jobs) and those that produced lower outcomes. No amount of analysis, however, can conceal the overall pattern. No school outside the top eight placed more than 90% of its graduates in full-time, long-term, law-related work. Only 13 schools, including that top eight, exceeded the 85% mark. And only 34 schools, out of the 195 supplying employment information, managed to place as many as three-quarters of their graduates in a full-time, long-term, law-related job within nine months of graduation.

The new employment measure devised by US News is far from perfect. Its greatest flaw lies in equating all “JD Advantage” jobs with positions requiring bar admission. Statistics gathered by NALP show that law graduates are far less satisfied with JD Advantage jobs than with ones requiring a law license. Among 2011 graduates, 46.8% of those in JD Advantage jobs were still seeking other employment; just 16.5% of those in bar-admission-required jobs were doing so. Those statistics appear only in NALP’s Jobs and JDs book, and they do not distinguish full-time, long-term jobs from part-time, short-term ones, but I will ask NALP if they can provide more information on those distinctions.

Even more worrisome, I don’t believe that any organization audits the claims that graduates and law schools make about which jobs carry the “JD Advantage” tag. NALP counts the jobs reported in that category but, apparently, does not ask schools to identify the positions that count as “JD Advantage.” Nor, to my knowledge, does the ABA or US News. Does a job as a court assignment clerk count as a “JD Advantage” position? What about a job as a middle school social studies teacher? Or one as a bail agent, debt collector, or police officer? I can imagine a JD assisting workers in any of these fields–but the jobs are ones that the majority of job holders perform quite well without the training or expense of a JD.

These are issues that we need to address very soon, both for purposes of the US News ranking scheme and with respect to the information that law schools provide their applicants. But for now, the generous definition of “employed,” which includes any job carrying the “JD Advantage” label, makes the outcomes reported by US News especially troubling. Even allowing for a very liberal definition of law-related jobs, even including all of those “alternative” careers that schools have touted, law schools are leaving a remarkably large percentage of their graduates without jobs that use their degrees. Short-term and part-time jobs are not good outcomes for students who have spent over $100,000–often borrowed at high interest rates–for a legal education. Neither are jobs unrelated to the JD, ones for which schools don’t even dare claim a “JD advantage.”

Prospective students and law schools need to take note of these outcomes and take them to heart. If we can’t provide even solid “JD Advantage” jobs to a substantial number of our graduates, then the value of our degree is in serious question.


Take This Job and Count It

January 19th, 2013

In an article in the Journal of Legal Metrics, two Law School Transparency team members outline LST’s methodology for the LST Score Reports, an online tool designed to improve decisions by prospective law students. LST uses employment outcomes, projected costs, and admissions stats to help prospective students navigate their law school options.

Kyle McEntee and Derek Tokaz, the authors of both this paper and the online tool, resist the urge to rank schools on a national scale. Instead, they sort schools by where their graduates work post-graduation, allowing applicants to consider schools by geographic profile. The reports then use reader-friendly terms, like the percentage of graduates who secured full-time legal jobs, to help prospective students make educated decisions about which schools, if any, can meet their needs.
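A stripped-down sketch of that kind of geographic sort appears below; the data structure, field names, and numbers are invented for illustration and are not LST’s actual implementation.

```python
# Sketch: grouping schools by the state where most graduates work and
# reporting the full-time legal employment rate. Field names and data
# are invented for illustration; this is not LST's implementation.
from collections import defaultdict

schools = [
    {"name": "School A", "top_state": "OH", "ft_legal_jobs": 120, "grads": 200},
    {"name": "School B", "top_state": "OH", "ft_legal_jobs": 90,  "grads": 180},
    {"name": "School C", "top_state": "NY", "ft_legal_jobs": 210, "grads": 300},
]

by_state = defaultdict(list)
for s in schools:
    rate = s["ft_legal_jobs"] / s["grads"]
    by_state[s["top_state"]].append((s["name"], rate))

for state, rows in sorted(by_state.items()):
    for name, rate in sorted(rows, key=lambda r: r[1], reverse=True):
        print(f"{state}  {name}: {rate:.0%} in full-time legal jobs")
```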

McEntee and Tokaz designed the reports to help prospective law students, but this article has important information for legal educators as well. The U.S. News rankings won’t disappear any time soon, but I think prospective students will begin looking at LST’s Score Reports in addition to the rankings. The reports contain more nuanced information, which prospective applicants will value; they also try to direct applicants into deeper exploration of their law school options.

As McEntee and Tokaz show, employment scores correlate imperfectly with U.S. News rank. As applicants begin to consider these scores, together with more transparent employment information on the schools’ websites, some schools will benefit while others suffer. Schools that under-perform their U.S. News score in job placement may want to explore why. Prospective students certainly will.

The other lesson for educators is that the vast majority of legal hiring is local. Students tend to stay in the city, state, and general region where they earned their law degree. As employers increasingly demand internships and unpaid apprenticeships, this trend may become even more dominant. It is hard to work part-time for a firm in one city while attending class in another. It’s far from impossible these days, with internet commuting, but students who lack face-time with prospective employers will be at a disadvantage. It’s also daunting to relocate after law school without a job in hand.

Law schools may find this information discouraging; most schools cherish their “national reputation” and want to extend it. It’s important to recognize, however, that the best job opportunities for graduates may be local ones. Time that a school spends promoting its national brand may deliver less return for graduates than time spent at local bar meetings.

On the bright side, schools should understand that a “national reputation” can co-exist with primarily local placement rates. That, in fact, is the reality for a vast number of law schools today. People around the country have heard about many law schools, even when those schools place most of their graduates locally. National reputation takes many forms and can pay off in many ways–even for graduates in later years. One lesson that I take from McEntee and Tokaz’s paper, however, is that schools should focus more diligently on their local, state, and regional reputations. That’s where the majority of job opportunities for graduates will lie.

