
Financial Returns to Legal Education

July 21st, 2013 / By

I was busy with several projects this week, so I didn’t have a chance to comment on the new paper by Michael Simkovic and Frank McIntyre. With the luxury of weekend time, I have some praise, some caveats, and some criticism for the paper.

First, in the praise category, this is a useful contribution to both the literature and the policy debates surrounding the value of a law degree. Simkovic and McIntyre are not the first to analyze the financial rewards of law school–or to examine other aspects of the market for law-related services–but their paper adds to this growing body of work.

Second, Simkovic and McIntyre have done all of us a great service by drawing attention to the Survey of Income and Program Participation. This is a rich dataset that can inform many explorations, including other studies related to legal education. The survey, for example, includes questions about grants, loans, and other assistance used to finance higher education. (See pp. 307-08 of this outline.) I hope to find time to work with this dataset, and I hope others will as well.

Now I move to some caveats and criticisms.

Sixteen Years Is Not the Long Term

Simkovic and McIntyre frequently refer to their results as representing “long-term” outcomes or “historic norms.” A central claim of the study, for example, is that the earnings premium from a law degree “is stable over the long term, with short term cyclical fluctuations.” (See slide 26 of the PowerPoint overview.) These representations, however, rest on a “term” of just sixteen years, from 1996-2011. Sixteen years is less than half the span of a typical law graduate’s career; it is too short a period to embody long-term trends.

This is a different caveat from the one that Simkovic and McIntyre express, that we can’t know whether contemporary changes in the legal market will disrupt the trends they’ve identified. We can’t, in other words, know that the period from 2012-2027 will look like the one from 1996-2011. Equally important, however, the study doesn’t tell us anything about the years before 1996. Did the period from 1980-1995 look like the one from 1996-2011? What about the period from 1964-1979? Or 1948-1963?

The SIPP data can’t tell us about those periods. The survey began during the 1980s, but the instrument changed substantially in 1996. Nor do other surveys, to my knowledge, give us the type of information we need to perform those historical analyses. Simkovic and McIntyre didn’t overlook relevant data, but they claim too much from the data they do have.

Note that SIPP does contain data about law graduates of all ages. This is one of the strengths of the database, and of the Simkovic/McIntyre analysis. This study shows us the earnings of law graduates who have been practicing for decades, not just those of recent graduates. That analysis, however, occurs entirely within the sixteen-year window of 1996-2011. Putting aside other flaws or caveats for now, Simkovic and McIntyre are able to describe the earnings premium for law graduates of all ages during that sixteen-year window. They can say, as they do, that the premium has fluctuated within a particular band over that period. That statement, however, is very different from saying that the premium has been stable over the “long term” or that this period sets “historic norms.” To measure the long term, we’d want to know about a longer period of time.

This matters, because saying something has been “stable over the long term” sounds very reassuring. Sixteen years, however, is less than half the span of a typical law graduate’s career. It’s less, even, than the time that many graduates will devote to repaying their law school loans. The widely touted Pay As You Earn program extends payments over twenty years, while other plans structure payments over twenty-five years. Simkovic and McIntyre’s references to the “long term” suggest a stability that their sixteen years of data can’t support.

What would a graph of truly long-term trends show? We can’t know for sure without better data. The data might show the same pattern that Simkovic and McIntyre found for recent years. On the other hand, historic data might reveal periods when the economic premium from a law degree was small or declining. A study of long-term trends might also identify times when the JD premium was rising or higher than the one identified by Simkovic and McIntyre. A lot has changed in higher education, legal education, and the legal profession over the last 25, 50, or 100 years. That past may or may not inform the future, but it’s important to recognize that Simkovic and McIntyre tell us only about the recent past–a period that most recognize as particularly prosperous for lawyers–not about the long term.

Structural Shifts

Simkovic and McIntyre discount predictions that the legal market is undergoing a structural shift that will change lawyer earnings, the JD earnings premium, or other aspects of the labor market. Their skepticism does not stem from examination of particular workplace trends; instead it rests largely on the data they compiled. This is where Simkovic and McIntyre’s claim of stability “over the long term” becomes most dangerous.

On pp. 36-37, for example, Simkovic and McIntyre list a number of technological changes that have affected law practice, from “introduction of the typewriter” to “computerized and modular legal research through Lexis and Westlaw; word processing; electronic citation software; electronic document storage and filing systems; automated document comparison; electronic document search; email; photocopying; desktop publishing; standardized legal forms; will-making and tax-preparing software.” They then conclude (on p. 37) that “[t]hrough it all, the law degree has continued to offer a large earnings premium.”

That’s clearly hyperbole: We have no idea, based on the Simkovic and McIntyre analysis, how most of these technological changes affected the value of a law degree. Today’s JD, based on a three-year curriculum, didn’t exist when the typewriter was introduced. Lexis, Westlaw, and word processing have been around since the 1970s; photocopying dates back even further. A study of earnings between 1996 and 2011 can’t tell us much about how those innovations affected the earnings of law graduates.

It is true (again, assuming for now no other flaws in the analysis) that legal education delivered an earnings premium during the period 1996-2011, which occurred after all of these technologies had entered the workforce. Neither typewriters nor word processors destroyed the earnings that law graduates, on average, enjoyed during those sixteen years. That is different, however, from saying that these technologies had no structural effect on lawyers’ earnings.

The Tale of the Typewriter

The lowly typewriter, in fact, may have contributed to a major structural shift in the legal market: the creation of three-year law schools and formal schooling requirements for bar admission. Simkovic and McIntyre (at fn 84) quote a 1901 statement that sounds like a melodramatic indictment of the typewriter’s impact on law practice. Francis Miles Finch, the Dean of Cornell Law School and President of the New York State Bar Association, told the bar association in 1901 that “current conditions are widely and radically different from those existing fifty years ago . . . the student in the law office copies nothing and sees nothing. The stenographer and the typewriter have monopolized what was his work . . . and he sits outside of the business tide.”

Finch, however, was not wringing his hands over new technology or the imminent demise of the legal profession; he was pointing out that law office apprentices no longer had the opportunity to absorb legal principles by copying the pleadings, briefs, letters, and other work of practicing lawyers. Finch used this change in office practices to support his argument for new licensing requirements: He proposed that every lawyer should finish four years of high school, as well as three years of law school or four years of apprenticeship, before qualifying to take the bar. These were novel requirements at the turn of the last century, although a movement was building in that direction. After Finch’s speech, the New York State Bar Association unanimously endorsed his proposal.

Did the typewriter single-handedly lead to the creation of three-year law schools and academic prerequisites for the bar examination? Of course not. But the changing conditions of apprentice work, which grew partly from changes in technology, contributed to that shift. This structural shift, in turn, almost certainly affected the earnings of aspiring lawyers.

Some would-be lawyers, especially those of limited economic means, may not have been able to delay paid employment long enough to satisfy the requirements. Those aspirants wouldn’t have become lawyers, losing whatever financial advantage the profession might have conferred. Those who complied with the new requirements, meanwhile, lost several years of earning potential. If they attended law school, they also transferred some of their future earnings to the school by paying tuition. In these ways, the requirements reduced earnings for potential lawyers.

On the other hand, by raising barriers to entry, the requirements may have increased earnings for those already in the profession–as well as for those who succeeded in joining. Finch explicitly noted in his speech that “the profession is becoming overcrowded” and it would be a “benefit” if the educational requirements reduced the number of lawyers. (P. 102.)

The structural change, in other words, probably created winners and losers. It may also have widened the gap between those two groups. It is difficult, more than a century later, to trace the full financial effects of the educational requirements that our profession adopted during the first third of the twentieth century. I would not, however, be as quick as Simkovic and McIntyre to dismiss structural changes or their complex economic impacts.

Summary

I’ve outlined here both my praise for Simkovic and McIntyre’s article and my first two criticisms. The article adds to a much-needed literature on the economics of legal education and the legal profession; it also highlights a particularly rich dataset for other scholars to explore. On the other hand, the article claims too much by referring to long-term trends and historic norms; it examines labor market returns for law school graduates during a relatively short (and perhaps distinctive) recent period of sixteen years. The article also dismisses too quickly the impact of structural shifts. That is not really Simkovic and McIntyre’s focus, as they concede. Their data, however, do not provide the type of long-term record that would refute the possibility of structural shifts.

My next post related to this article will pick up where I left off, with winners and losers. My policy concerns with legal education and the legal profession focus primarily on the distribution of earnings, rather than on the profession’s potential to remain profitable overall. Why did law school tuition climb aggressively from 1996 through 2011, if the earnings premium was stable during that period? Why, in other words, do law schools reap a greater share of the premium today than they did in earlier decades?

Which students, meanwhile, don’t attend law school at all, forgoing any share in law school’s possible premium? For those who do attend, how is that premium distributed? Are those patterns shifting? I’ll explore these questions of winners and losers, including what we can learn about the issues from Simkovic and McIntyre, in a future post.

Crucial Weaknesses

July 19th, 2013 / By

Clearly, Simkovic and McIntyre’s article has given new life to those who would defend the status quo. However, even assuming the statistical methodology is sound (which I do, as I have no reason to believe otherwise and no time to recreate it), the study suffers from a number of crucial weaknesses.

First, Part IV makes the assumption that current market challenges reflect no more than the historically cyclical nature of the legal market. If you do not agree with this assumption (and I do not–I think Susskind’s view on this issue is far more sound), then the entire study is fundamentally flawed. However, even if you buy this assumption, there remain further issues with the study.

The title itself, “The Million-Dollar Law Degree,” is misleading at best. The million-dollar figure reflects the mean value, and the mean is skewed significantly higher than the median. Thus, it overstates the value for significantly more than half of all JD grads. It also reflects “pre-tax” value, a point that the authors do not address until near the end of the article, at Part V.C. There, the authors acknowledge that their calculated benefit must be divided between private “after-tax” earnings and public tax revenues. (more…)


New Study on Economic Value of Law Degree

July 17th, 2013 / By

I won’t spend much time summarizing the new paper by Michael Simkovic, an associate law professor at Seton Hall University School of Law, and Frank McIntyre, an assistant professor of finance and economics at Rutgers University Business School. Inside Higher Ed summarized the report just fine.

Instead, I want to comment on what I see as a misguided attempt to quell critics claiming that the law school investment is not a sound choice for many people. I hope Professor Simkovic and Professor McIntyre are correct that, on average and even down to the 25th percentile, the law school investment makes financial sense.

Even if they are correct, the study completely misses the point and grossly underappreciates the human element.

The authors describe their approach this way: “Rather than viewing law degree holders in isolation, we can get better estimates of the causal effect of education by comparing the earnings of individuals with law degrees to the earnings of similar individuals with bachelor’s degrees while being mindful of the statistical effects of selection into law school.”

Unfortunately, law degree holders are individuals who are sometimes (perhaps often) hurt by going to law school. Talking about groups necessarily smooths over the stories underneath the data—the ones that make you feel good and the ones that make you sick to your stomach. The reality is that many people have been hurt, and are hurting right now, as a direct consequence of the costs associated with entering the legal profession (or trying to). These graduates very well may make more money in the long run. But this is hardly comforting to those considering law school and those who care about the people who do.

As I told Inside Higher Ed, law schools have made a habit of capturing as much value from their students as possible—and for a long time, used deceptive and immoral marketing tactics to do so. The dynamics are changing and should change because of the outrageously high price of obtaining a legal education. Even if an analysis shows an investment has a positive net present value in the long run, people are not like corporations. The short term matters more for real people. Tens of thousands of law graduates leave school each year wondering how they’re going to manage to pay off their six-figure loans. That’s what motivates critics and frightens prospective law students.

Long-term value is not irrelevant, but using it to argue that education isn’t priced too high troubles me. If we think our society and our country are better for having an educated population, as these two authors do, then we had better stop pricing people out of education.


New Salary Data: Arkansas Law Schools

July 15th, 2013 / By

I wrote last week about a group of states that are using a “linked-records” method to collect detailed salary information for graduates of higher education. The method has some flaws, but it is improving rapidly. The databases, meanwhile, already contain information about graduates of fifteen law schools spread over five states. Let’s take a look, starting alphabetically with Arkansas.

Arkansas has two ABA-accredited law schools: the University of Arkansas at Fayetteville and the University of Arkansas at Little Rock. Both schools place a substantial majority of their graduates with employers in Arkansas, making them excellent candidates for the linked-records system. For the class of 2012, according to ABA data, 81 of Fayetteville’s 119 employed graduates (68.1%) took their first jobs in Arkansas. For the Little Rock campus, the figure was 85.3% (93 out of 109 employed graduates).

Average Salaries in Law

What salaries did those graduates earn? The College Measures database doesn’t have figures yet for 2012 graduates–or even for 2011 ones in Arkansas. But it does report the average first-year earnings of graduates from the classes of 2006 through 2010 who stayed in-state to work. For the Fayetteville campus, the average was $45,745, and for the Little Rock campus it was $47,060.

Those averages come with all of the caveats I mentioned in my earlier post: They exclude graduates working out of state, graduates holding federal jobs, and self-employed graduates. Perhaps most important, those averages include the legal market’s boom years of 2006 through 2009, along with just one down year. When the database incorporates salaries for the classes of 2011 through 2013, the averages may be lower.

Comparisons with Other Programs

Even including those boom years, however, the salaries of Arkansas law graduates suffer in comparison to starting salaries in other advanced degree programs. The database contains sufficient salary data for three of the Little Rock campus’s PhD programs: higher education administration, educational leadership, and physical sciences. The average starting salary in each of those programs was higher than in law, ranging from $52,726 in physical sciences to $72,134 in educational leadership.

To be fair, doctoral candidates in educational leadership or higher education administration often have significant workplace experience; they’re less likely than law students to move directly from college to graduate school. The salaries for these PhD’s, therefore, may partly reflect their workplace experience–not just the value of the degree. Still, eight of Little Rock’s undergraduate programs produced higher starting salaries than its law school did, topping out at $65,978 for registered nurses.

The story is similar at the Fayetteville main campus. There, five of seven doctoral programs produced higher starting salaries than law–and a sixth came within $500 of law. I was surprised to see that the starting salaries of Arkansas law graduates compare unfavorably with those of graduates holding doctorates in adult and continuing education (average starting salary of $58,013), educational leadership ($85,245), and public policy analysis ($68,425). Even a master’s degree in political science produced an average starting salary ($44,202) within shouting distance of a law salary.

Equally depressing comparisons come from the University of Arkansas’s medical sciences campus. Dental hygienists with just an associate’s degree averaged higher starting salaries ($49,644) than law graduates from either Arkansas campus. A master’s in public health garnered, on average, $56,074. And doctors of pharmacy out-earned almost everyone with an average starting salary of $104,977.

Some of these careers, of course, may reach salary plateaus; it’s possible that Arkansas’s law graduates will earn more as their experience mounts. Even at the entry level, an Arkansas law degree continues to produce higher earnings than most undergraduate degrees. College graduates from the Fayetteville campus averaged just $33,956 during their first year in the workforce.

NALP Data

How do the linked-records salaries compare to ones reported to NALP? I couldn’t find salary information on either Arkansas law school’s website, but NALP’s Jobs and JDs book, available in hard cover, offers some interesting data. In 2007, law graduates working full-time in Arkansas reported an average salary of $49,966. That’s higher than the rolling averages compiled through the linked-records method, but not too far off. (Note that the NALP figures refer to all law graduates working in Arkansas, while the linked-records data include all Arkansas law graduates working in Arkansas. The salary pools, however, should be comparable.)

For 2011, on the other hand, NALP’s reported salaries seem quite high for Arkansas jobs. The reported mean is $52,806–more than six thousand dollars higher than the linked-records average for the boom years. It’s possible that the highest paying legal jobs in Arkansas are going to graduates of out-of-state schools. But it’s also quite likely, as NALP and law schools acknowledge, that the NALP-reported salaries skew high. That’s a good reason to support continued development of other methods for tracking salaries.

Below Minimum Wage

The last piece of information from the Arkansas linked-records database is particularly interesting. When calculating average salaries, Arkansas excluded any graduates who earned less than $13,195 per year, the state’s minimum-wage threshold. (That figure works out to 52 weeks of 35-hour work at the $7.25 federal minimum wage.) Most employees earning less than that threshold are part-time or temporary workers. Including those salaries in a calculation of average full-time earnings would unfairly depress the average, so the researchers excluded these “below minimum wage” workers from the calculations.

Arkansas, however, does report the number of these “below minimum wage” workers for each degree program. Those numbers are depressingly high for the two law schools. Fifty-two of Little Rock’s graduates, 8.4% of all students who graduated between 2006 and 2010, earned less than $13,195 for the year that started six months after their graduation date. The percentage was the same for the Fayetteville campus: fifty-five graduates, or 8.4% of those who graduated between 2006 and 2010, earned less than minimum wage once they entered the workforce. That’s one in every twelve law graduates.

A few of these graduates may have worked in Arkansas for a few months and then moved to another state; that would produce a small amount of earnings in the Arkansas database. Others may have worked part-time for employers to supplement a solo practice or freelance work. The one in twelve figure, on the other hand, doesn’t include graduates who subsisted entirely on freelance wages or who found no paying work at all; those graduates don’t appear at all in the linked-records database.

Observations

What do we make of these data? The linked-records databases, like other sources of employment information, are incomplete. It is particularly difficult to distinguish unemployed graduates from those who have moved to other states–or to determine salary levels for the latter group of graduates. If researchers ultimately link databases across states, those connections would greatly improve the available information.

This brief examination of Arkansas data, meanwhile, illustrates the kind of comparisons facilitated by linked-records databases. Starting salaries for law graduates exceed those for most (although not all) college majors, but they lag behind salaries for many other advanced-degree holders. As we continue to debate reforms in legal education, we have to remember the options available to prospective students. Starting salaries are an important element in that calculus, one that students will be able to track more easily with databases like the ones available through College Measures.


More on the ABA Questionnaire

July 9th, 2013 / By

Legal educators on several blogs have been discussing the ABA’s decision to eliminate expenditure data from the annual questionnaire completed by law schools. I called Scott Norberg, Deputy Consultant to the ABA’s Section of Legal Education and Admissions to the Bar, to find out more about the change.

Professor Norberg noted that the expenditure elimination is part of a larger project to slim down the annual questionnaire. Most of the changes went into effect last year, but the Section’s Council waited a year to implement elimination of the expenditure section. No objections arose to the proposed change, so the Council adopted it for this fall’s questionnaire.

Although the annual questionnaire will no longer ask explicitly about expenditures, it does request information about a law school’s reserve funds and debt (p. 7). These questions will allow the ABA to identify schools that may be in financial trouble, without needing more detailed expenditure data every year.

That’s a relief from a consumer protection perspective. But do we have to worry now that US News will incorporate financial reserves or debt level into its ranking scheme? I’m not sure I even want to think about that one.


Notable Change in the ABA Questionnaire

July 8th, 2013 / By

Last week the ABA notified law school deans that it will no longer request annual information about each school’s expenditures. Schools will report three years of expenditures in connection with site visits, but the annual reporting of expenditures has been eliminated (see p. 4).

H/t to TaxProf and Brian Leiter for this breaking news. Now, what does the change mean for ABA data collection, legal education, and the US News rankings?

Background: The Annual Questionnaire

The ABA collects data from law schools every year through its annual questionnaire. That instrument, revised annually by the Council’s Data Policy & Collection Committee, gathers information about enrollment, courses, faculty composition, and other issues related to legal education. At least within recent years, the questionnaire has asked schools about both revenues and expenditures. The 2013 questionnaire will ask only about overall revenues, not overall expenditures.

The revised instrument still asks about two specific expenditures: money spent on library operations and money spent for student scholarships, grants, or loans. It does not, however, require schools to report other expenditures–such as money spent on salaries, conferences, coffee, and all of the other matters that make up a law school budget.

Going Forward: Data, the ABA, and Legal Education

I’m puzzled that the ABA has chosen to eliminate expenditures from the annual questionnaire, especially given the contemporary budget crunch at many law schools. Responding to the questionnaire tormented me when I was an associate dean, so I don’t advocate mindless data collection. The information collected by the ABA, however, seems to serve numerous valuable purposes. Questionnaire results help track the overall health of legal education, inform accreditation standards, and offer perspectives on policy issues related to law schools. The instructions to the fiscal portion of the questionnaire also suggest that the ABA uses this information to monitor the fiscal health of individual schools. Given the ABA’s role in protecting students, that is an important goal.

Given this range of objectives, why will the ABA continue to collect annual information about law school revenues, but not expenditures? Law schools seem to be facing unprecedented budgetary strain. In times like this, wouldn’t the ABA want to know both revenues and expenditures–so that it could gauge the financial course of legal education? As the Task Force on the Future of Legal Education finalizes its recommendations, wouldn’t it want to know how badly law schools are hurting? And as the Standards Review Committee considers the costs imposed by some accreditation measures, wouldn’t it be useful to know whether law schools are operating in the red?

I’m not suggesting that the ABA should distribute scorecards revealing the financial health of each law school. But wouldn’t aggregate data on revenue, expenditures, and the gap between the two be particularly useful right now? Annual reports of revenue give us some measure of our industry’s health, but expenditure figures are just as important. How else will we know whether schools are able to adapt to flat or declining revenues?

There’s also the matter of protecting students at individual schools. Each school will have to demonstrate its financial health during site visits, but those visits occur every seven years. Seven years is a long time–plenty long enough for a school to sustain significant financial damage and endanger the education of enrolled students. If the ABA is going to monitor anything, shouldn’t it check both revenues and expenditures on an annual basis?

I understand that many educators are celebrating elimination of the expenditures section, largely because of the US News effect discussed below. I assume, however, that the questionnaire once served purposes other than generating data for US News. Are we sure that we want to reduce our information about the financial health of legal education? Now?

Going Forward: US News

Against all reason, US News has long used expenditures as a significant component of its law school rankings. Expenditures currently account for 11.25% of the ranking formula. This component of the rankings has rightly provoked criticism from dozens, if not hundreds, of legal educators. The ABA’s elimination of expenditures from its annual questionnaire might be an attempt to discourage US News from incorporating this information.

If that’s the ABA’s motive, will the gambit work? It seems to me that US News has at least four options:

1. Continue to ask law schools to supply expenditure data. US News already asks for information that the ABA doesn’t request; it has no obligation to track the ABA’s questionnaire. Calculating expenditures takes time if you’re trying to game the system (or at least keep up with other schools that are gaming the system); the school has to think of every possible expenditure to include. Gamesmanship aside, however, it would be hard for a dean to claim with a straight face that a request for expenditures was too burdensome to meet. If a school isn’t tracking its annual expenditures, and doesn’t have a computer program that will spit those numbers out on demand, that’s really all we need to know about the school.

I hope US News doesn’t pursue this approach. I agree with all of the critics that expenditures serve no useful purpose in a ranking of law schools (even assuming that a ranking itself serves some useful purpose). It seems to me, however, that US News could easily maintain its ranking system without the ABA’s question on school expenditures.

2. Reconfigure the ranking formula to include just library and student aid expenditures. The ABA questionnaire, rather curiously, continues to ask for data on library and student aid expenditures. US News, therefore, could decide to plug just these expenditures into its ranking formula. The formula already does count student aid expenditures separately, so there’s precedent for that.

This approach would be even worse than the first option. Giving library expenditures extra weight would tempt law schools to increase spending in a part of the budget that many critics already think is too large. Creating incentives for additional student aid sounds beneficent, but it would fuel the already heated arms race to snare credentials with scholarship money. We need to wind that race down in legal education, not extend it further.

3. Replace expenditures with revenues. Since the ABA questionnaire still asks for each school’s annual revenue, US News could incorporate that figure into its ranking formula. This approach might be marginally more rational than the focus on expenditures: Schools with more money may be able to provide more opportunities to their students. Focusing on revenues, furthermore, would not penalize schools that saved some of their revenue for a rainy day.

On the other hand, this criterion would continue to bias the rankings in favor of wealthy, well established, and private schools. It would also invite the same type of gamesmanship that schools have demonstrated when reporting expenditures.

4. Eliminate money as a factor. This is my preferred outcome, and I assume that it is the one most educators would prefer. Expenditures don’t have a role in judging the quality of a law school, and they’re a source of endless manipulation. Both law schools and their consumers would be better off if we rid the rankings of the expenditures factor.

Conclusion

US News will do whatever it chooses to do. Years of entreaties, rants, and denunciation haven’t stopped it from incorporating expenditures into its law school ranking. I’m doubtful that the ABA’s change will suddenly bring US News to its senses. Meanwhile, I’m very worried about how we’re going to inform legal educators, regulators, and potential students about the financial health of law schools. Revenues are fun to count, but running a law school requires expenditures as well.


New Salary Data

July 7th, 2013 / By

Law school critics have pressed schools to produce better information about the salaries earned by their graduates. Existing sources, as we know, provide incomplete or biased information. The Bureau of Labor Statistics (BLS) gathers data about lawyers’ salaries, but those reports omit solo practitioners, law firm partners, and law graduates who don’t practice law. Nor can we break down the BLS data to identify earnings by new lawyers or by graduates of particular schools.

The salary information gathered by NALP, in contrast, focuses on new graduates, includes graduates in non-practice jobs, and can be tied to particular schools (if a school chooses to publish its data). But these figures suffer from significant selection bias; NALP warns that these salaries “are biased upwards.”

Better salary information, however, is on the way. Researchers in other fields have found a new way to gather salary data about graduates of degree programs. The method hinges on the fact that employers pay unemployment taxes for each individual they employ. These taxes fund the pools used to support unemployment compensation. The government wants to make sure that it gathers its fair share of taxes, so employers report the wages they pay each individual. State unemployment compensation agencies, therefore, possess databanks of social security numbers linked to wages.

Educational institutions, similarly, possess the social security numbers of their graduates. It is possible, therefore, to use SSNs to link graduates with their salaries. The researchers doing this, of course, don’t examine the salaries of individual graduates. Instead, this “linked-records” approach allows them to generate aggregate salary data about graduates by college, major, year of degree, and several other criteria. The method also allows researchers to track salaries over time, both to see how entry-level salaries change and to track income as graduates gain workplace experience. For a brief overview of the method, see this paper from Berkeley’s Center for Studies in Higher Education.
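For readers who want to see the mechanics, here is a rough sketch in Python (using the pandas library) of the kind of linkage involved. Everything in it is invented for illustration; the names, wages, and column labels are hypothetical, not the researchers’ actual code or data.

import pandas as pd

# Hypothetical roster held by a school: graduates and their degrees.
graduates = pd.DataFrame({
    "ssn":       ["111-11-1111", "222-22-2222", "333-33-3333"],
    "school":    ["State U Law", "State U Law", "State U Law"],
    "degree":    ["JD", "JD", "JD"],
    "grad_year": [2009, 2009, 2010],
})

# Hypothetical wage records from the state unemployment-insurance system.
# A graduate with two employers appears twice; wages are summed below.
wages = pd.DataFrame({
    "ssn":   ["111-11-1111", "222-22-2222", "222-22-2222"],
    "year":  [2010, 2010, 2010],
    "wages": [48000, 21000, 9000],
})

# Total each person's wages across employers, then link to the roster by
# SSN. Graduates with no in-state wage record (unemployed, or working in
# another state) simply drop out of the join, a limit discussed below.
annual = wages.groupby(["ssn", "year"], as_index=False)["wages"].sum()
linked = graduates.merge(annual, on="ssn", how="inner")

# Only aggregates are reported, never individual salaries.
print(linked.groupby(["school", "degree", "grad_year"])["wages"]
            .agg(["count", "mean"]))

The point of the sketch is the final line: researchers publish counts and averages by school, degree, and graduating class, never the underlying wage records.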

The linked-record approach has the potential to generate very nuanced information about the financial pay-off of different educational programs. Salary information, in fact, is already available for several law schools. Before we get to that, however, let’s look more closely at the method’s wider application and its current limits.

Applications

California has used this research method to generate an extensive database of salary outcomes for graduates of its community college programs. Using the online “salary surfer,” you can discover that the highest earning graduates from those programs are individuals who earn a certificate in electrical systems and power transmission. Those graduates average $93,410 two years after certification and $123,174 five years out.

If you’re not willing to climb utility poles or hang out with high voltage wires, a plumbing certificate also pays off reasonably well in California, generating an average salary of $65,080 two years after graduation. That certificate, however, doesn’t seem to add more value with time–at least not during the early years of a career. Average salary for certified plumbers rises to just $65,299 five years after graduation.

Community college degrees in health-related work also generate substantial salaries. Degrees in the humanities, fine and applied arts, cosmetology, and travel services, on the other hand, are poor bets financially. Paralegal training falls in the middle: A paralegal degree from a California school yields an average salary of $38,191 two years after graduation and $42,332 five years out. Paralegal certificates, notably, generate higher wages. Those paralegals average $41,546 two years after certification and $47,674 after five years. I suspect that premium occurs because the certificate earners already hold four-year college degrees; they combine the paralegal certificate with a BA to earn more in the workplace.

You can spend hours with the California database, exploring the many subjects that community colleges teach and the varied financial pay-offs for those degrees. Let’s move on, however, to a much broader database.

The research organization College Measures is working with several states to identify salary outcomes for all types of post-secondary degrees. This database, like the one for California community colleges, relies upon the linked-records data collection method described above. The College Measures site currently includes schools in Arkansas, Colorado, Tennessee, Texas, and Virginia–with Florida and Nevada coming soon. The database doesn’t include every school or degree program in these states, but coverage is growing. Here are just a few findings to illustrate the detail available on the site:

* Chicken farming is a staple of the Arkansas economy, and the University of Arkansas’s main campus offers a BA in poultry science. Those degree holders average $37,251 during their first year after college–a little more than accounting BA’s from the same campus can expect to earn ($36,681).

* Arkansas, however, teaches much more than poultry science and accounting. Some of the highest earning graduates major in chemical engineering ($56,655), physics ($48,820), computer engineering ($45,589), and economics ($43,739). If you want to maximize income after graduation, on the other hand, stay away from majors in audiology ($20,417), classics ($20,842), and drama ($22,629).

* Moving to the Texas portion of the site, you won’t be surprised to discover that the most remunerative BA offered by the University of Texas at Austin is in Petroleum Engineering. Those graduates average $115,777 during their first year out of school.

* The least financially rewarding BA’s from the UT-Austin campus, at least initially, are general music performance ($11,098), Arabic Language and Literature ($17,192), and General Visual and Performing Arts ($17,749).

You can find similar results for other majors and schools in these states, as well as for schools in Colorado, Tennessee, and Virginia. Before continuing, however, let’s examine several key limits on the currently available data.

Limits

1. One State at a Time. The linked-records databases currently operate only within a single state: they can only identify salaries for graduates who work in the same state where they attended school. The Colorado database, for example, includes both of the state’s ABA-accredited law schools–but it reports only salaries for graduates who worked in Colorado the year after graduation.

This constraint will understate salaries for law schools that send a large number of graduates to other states for high-paying jobs. If Connecticut creates a database, for example, Yale Law School will receive no credit for the salaries of graduates who work in Massachusetts, New York, the District of Columbia, and other states. The University of Texas’s law school, currently included in the College Measures database, receives credit for salaries earned at BigLaw firms in Dallas or Houston–but not for those earned in Chicago, Los Angeles, or New York.

Researchers are working to overcome this limit by linking databases nationally. I suspect that will happen within the next year or two, making the linked-records method much more comprehensive. Meanwhile, the “one state” limit casts doubt on salary results for schools with a large number of graduates who leave the state.

For many law schools, however, even single-state salary reports can yield useful information. Most law schools place the majority of their graduates in entry-level jobs within the same state. All of the Texas law schools place more than half of their graduates with Texas employers. The same is true for the Arkansas law schools, Colorado schools, and two of the three Tennessee schools. Among the states for which linked-records data are currently available, only the Virginia law schools send a majority of their graduates out of state.

For law schools that place a majority of their graduates in-state, the linked-record databases provide a welcome perspective on a wide range of salaries. These databases include jobs with small law firms, local government, and small businesses. They will also identify law graduates with jobs outside of law practice. That’s a much wider scope than the salaries reported to NALP, which disproportionately represent large law firm jobs. Even if some of a school’s graduates leave the state, this in-state salary slice is likely to give prospective students a realistic perspective on the range of salaries earned by a school’s graduates.

2. Rolling Five-Year Averages. The linked-records databases report five-year averages, rather than average salaries for a single graduating class. This feature preserves anonymity in small programs and makes the data less “noisy.” The technique, however, can also mask dramatic market shifts.

This is particularly problematic in law, because average salaries rose dramatically from 2005 through 2009, and then plunged just as precipitously. Most of the states included in the College Measures database report the average salary for students who graduated in 2006 through 2010. For law graduates, those years include at least three high-earning years (2007 through 2009) and just one post-recession year (2010). The outdated averages on the College Measures site almost certainly overstate the amounts earned by more recent law school classes.
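A toy example (with invented numbers) shows how much those boom years can dominate a pooled average:

# Invented average first-year salaries by graduating class.
by_class = {2006: 46000, 2007: 49000, 2008: 51000, 2009: 50000, 2010: 41000}

# The figure reported in the database pools all five classes
# (assuming equal class sizes for simplicity).
pooled = sum(by_class.values()) / len(by_class)
print(pooled)  # 47400.0, more than $6,000 above the 2010 class's $41,000

A prospective student reading the pooled figure would overestimate the most recent class’s earnings by roughly fifteen percent.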

This problem, in my opinion, makes the salaries currently reported for law schools unreliable as predictors of current salaries. On the other hand, the data could be useful for other purposes. It would be instructive, for example, to compare each school’s linked-record average with an average of the salaries that school reported to NALP over the same five years. That comparison might indicate the extent to which NALP-reported salaries skew high. Within a few years, meanwhile, the linked-records databases will offer more useful salary projections for students considering law school. They will also help us see the extent to which salaries for law graduates have shifted over time.

3. Un- and Under-Employed Graduates. The linked-records databases do not reveal how many graduates are unemployed. Graduates who are missing from a state’s records may be unemployed or they may be working in another state. Researchers currently have no way to distinguish those two statuses.

As the research becomes more sophisticated, and especially if researchers are able to link records nationally, this problem will decrease. For now, users of the database have to remember that salaries reflect averages for employed graduates. Users need to search separately for the number of a school’s unemployed graduates.

For law schools, those figures are relatively easy to obtain because they appear on each school’s ABA employment summary. By combining that resource with the College Measures information, prospective students and others can determine the percentage of a law school’s graduates who were employed nine months after graduation, as well as the average salaries earned by graduates who worked in the same state as the school.

Underemployed graduates, those working in part-time or temporary jobs, do appear in most of the linked-record databases. This is a major advantage of the linked-record method: the method calculates each graduate’s annual earnings, even if those wages came from part-time or temporary work. If a graduate worked at more than one job, the linked records will aggregate wages from each of those jobs. The results won’t reveal how hard graduates had to work to generate their income, but database users will be able to tell how much on average they earned.

4. Excluded Workers. In addition to the caveats discussed above, the linked-records databases omit two important categories of workers. Most lack information about federal employees, although some states have started adding that information. Within a year or two, federal salaries should be fully integrated with other wages. For law school graduates, meanwhile, salaries for the most common federal jobs are already well known.

More significant, the linked-record databases do not include information about the self-employed. This omission matters more in some fields than others. Utility companies employ the workers who repair high-voltage power lines; you won’t find many freelancers climbing utility poles. Plumbers, on the other hand, are more likely to set up shop for themselves.

For recent law graduates, the picture is mixed. Relatively few of them open solo practices immediately after graduation, but a growing number may work as independent contractors. The latter group, notably, may include graduates who receive career exploration grants from their schools. Depending on how those grants are structured, the graduates may not count as “employees” of either the school or the organization where they work; instead, they may be independent contractors. If that’s the case, their wages will not appear in the linked-record databases.

As experience grows with linked-record databases, it will be possible to determine how many law graduates fall outside of those records. It should be possible, for example, to compare the number of graduates who report in-state jobs to their schools with the number of in-state salaries recorded in a linked-record database. The difference between the two numbers will represent graduates who work as solos or independent contractors. The researchers creating these databases may also find ways to incorporate earnings data about self-employed graduates.

What About Law Schools?

Tomorrow, I will discuss salary information reported for the fifteen law schools currently included in the College Measures database. If you’re impatient, just follow the links. Those specific results, however, matter less than the overall scope of this salary-tracking method. The linked-record method promises much more sophisticated salary information than educational institutions have ever gathered on their own. The salaries can be tied to specific schools and degree programs. We (as well as prospective students and policymakers) will be able to compare financial outcomes across fields, schools, and states. As the databases grow in size, we will also be able to track salaries five, ten, fifteen, or twenty years after graduation. That amount of information is breathtaking–and a little scary.


Bar Passage and Accreditation

July 4th, 2013 / By

The Standards Review Committee of the ABA’s Section of Legal Education has been considering a change to the accreditation standard governing graduates’ success on the bar examination. The heart of the current standard requires schools to demonstrate that 75% of graduates who attempt the bar exam eventually pass that exam. New Standard 315 would require schools to show that 80% of their graduates (of those who take the bar) pass the exam by “the end of the second calendar year following their graduation.”

I support the new standard, and I urge other academics to do the same. The rule doesn’t penalize schools for graduates who decide to use their legal education for purposes other than practicing law; the 80% rate applies only to graduates who take the bar exam. The rule then gives those graduates more than two years to pass the exam. Because the rule measures time by calendar year, May graduates would have five opportunities to pass the bar before their failure would count against accreditation. As a consumer protection provision, this is a very lax rule. A school that can’t meet this standard is not serving its students well: It is either admitting students with too little chance of passing the bar or doing a poor job of teaching the students that it admits.
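The calendar arithmetic behind that “five opportunities” count is easy to verify. Here is a short sketch, assuming the standard February and July administrations, that enumerates the exams available to a May graduate before the deadline:

from datetime import date

def administrations_before_deadline(grad_year, grad_month=5):
    # Bar administrations (February and July) falling between graduation
    # and the end of the second calendar year following graduation,
    # per proposed Standard 315 as described above.
    admins = []
    for year in range(grad_year, grad_year + 3):
        for month in (2, 7):
            if date(year, month, 1) > date(grad_year, grad_month, 1):
                admins.append((year, "Feb" if month == 2 else "July"))
    return admins

print(administrations_before_deadline(2013))
# [(2013, 'July'), (2014, 'Feb'), (2014, 'July'),
#  (2015, 'Feb'), (2015, 'July')]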

The proposal takes on added force given the plunge in law school applications. As schools attempt to maintain class sizes and revenue, there is a significant danger that they will admit students with little chance of passing the bar exam. Charging those students three years of professional-school tuition, when they have little chance of joining the profession, harms the students, the taxpayers who support their loans, and the economy as a whole. Accreditation standards properly restrain schools from overlooking costs like those.

Critics of the proposal rightly point out that a tougher standard may discourage schools from admitting minority students, who pass the bar at lower rates than white students. This is a serious concern: Our profession is still far too white. On the other hand, we won’t help diversity by setting minority students up to fail. Students who borrow heavily to attend law school, but then repeatedly fail the bar exam, suffer devastating financial and psychological blows.

How can we maintain access for minority students while protecting all students from schools with low bar-passage rates? I discuss three ideas below.

The $30,000 Exception

When I first thought about this problem, I considered suggesting a “$30,000” exception to proposed Standard 315. Under this exception, a school could exclude from the accreditation measure any student who failed the bar exam but paid less than $10,000 per year ($30,000 total) in law school tuition and fees.

An exception like this would encourage schools to give real opportunities to minority students whose credentials suggest a risk of bar failure. Those opportunities would consist of a reasonably priced chance to attend law school, achieve success, and qualify for the bar. Law schools can’t claim good karma for admitting at-risk students who pay high tuition for the opportunity to prove themselves. That opportunity benefits law schools as much, or more, than the at-risk students. If law schools want to support diversification of our profession–and we should–then we should be willing to invest our own dollars in that goal.

A $30,000 exception would allow schools to make a genuine commitment to diversity, without worrying about an accreditation penalty. The at-risk students would also benefit by attending school at a more reasonable cost. Even if those students failed the bar, they could more easily pay off their modest loans with JD Advantage work. A $30,000 exception could be a win-win for both at-risk students and schools that honestly want to create professional access.

I hesitate to make this proposal, however, because I’m not sure how many schools genuinely care about minority access–rather than about preserving their own profitability. A $30,000 exception could be an invitation to admit a large number of at-risk students and then invest very little in those students. Especially with declining applicant pools, schools might conclude that thirty students paying $10,000 apiece is better than thirty empty seats. Since those students would not count against a school’s accreditation, no matter how many of them failed the bar exam, schools might not invest the educational resources needed to assist at-risk students.

If schools do care about minority access, then a $30,000 exception to proposed Standard 315 might give us just the leeway we need to admit and nurture at-risk students. If schools care more about their profitability, then an exception like that would be an invitation to take advantage of at-risk students. Which spirit motivates law schools today? That’s a question for schools to reflect upon.

Adjust Bar Passing Scores

One of the shameful secrets of our profession is that we raised bar-exam passing scores during the last three decades, just as a significant number of minority students were graduating from law school. More than a dozen states raised the score required to pass their bar exam during the 1990s. Other states took that path in more recent years: New York raised its passing score in 2005; Montana has increased the score for this month’s exam takers; and Illinois has announced an increase that will take effect in July 2015.

These increases mean that it’s harder to pass the bar exam today than it was ten, twenty, or thirty years ago. In most states, grading techniques assure that scores signal the same level of competence over time. This happens, first, because the National Conference of Bar Examiners (NCBE) “equates” the scores on the Multistate Bar Exam (MBE) from year to year. That technique, which I explain further in this paper, assures that MBE scores reflect the same level of performance each year. An equated score of 134 on the February 2013 MBE reflects the same performance as a score of 134 did in 1985.

Most states, meanwhile, grade their essay questions in a way that similarly guards against shifting standards. These states scale essay scores to the MBE scores achieved by examinees during the same test administration. This means that the MBE (which is equated over time) sets the distribution of scores available for the essay portion of the exam. If the July 2013 examinees in Ohio average higher MBE scores than the 2012 test-takers, the bar examiners will allot them correspondingly higher essay scores. Conversely, if the 2013 examinees score poorly on the MBE (compared to earlier testing groups in Ohio), they will receive lower essay scores as well. You can read more about this process in the same paper cited above.
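For readers who want the arithmetic, here is a simplified sketch of that scaling step. It rescales raw essay scores so that their mean and spread match the equated MBE scores from the same administration; real bar examiners use more refined procedures, so treat this only as an illustration of the idea, with invented numbers.

import statistics

def scale_essays_to_mbe(essay_raw, mbe_scores):
    # Linear scaling: match the essay mean and standard deviation
    # to those of the (equated) MBE scores from the same sitting.
    e_mean, e_sd = statistics.mean(essay_raw), statistics.stdev(essay_raw)
    m_mean, m_sd = statistics.mean(mbe_scores), statistics.stdev(mbe_scores)
    return [m_mean + (x - e_mean) / e_sd * m_sd for x in essay_raw]

# Invented administration: raw essay points and equated MBE scores.
essay_raw = [55, 60, 65, 70, 75]
mbe_2013  = [128, 134, 140, 146, 152]
print(scale_essays_to_mbe(essay_raw, mbe_2013))
# [128.0, 134.0, 140.0, 146.0, 152.0]

Because the essay scores inherit the MBE distribution, a cohort that posts stronger MBE scores receives correspondingly higher essay scores, which is exactly the behavior described above for Ohio.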

These two techniques mean that scores neither inflate nor deflate over time; the measuring stick within each state remains constant. A score of 264 on the July 2013 Illinois bar exam will represent the same level of proficiency as a score of 264 did in 2003 or 1993.

When a state raises its passing score, therefore, it literally sets a higher hurdle for new applicants. Beginning in 2015, Illinois will no longer admit test-takers who score 264 on the exam; instead it will require applicants to score 272–eight points more than applicants have had to score for at least the last twenty years.

Why should that be? Why do today’s racially diverse applicants have to achieve higher scores than the largely white applicants of the 1970s? Law practice may be harder today than it was in the 1970s, but the bar exam doesn’t test the aspects of practice that have become more difficult. The bar exam doesn’t measure applicants on their mastery of the latest statutes, their ability to interact with clients and lawyers from many cultures, or their adeptness with new technologies. The bar exam tests basic doctrinal principles and legal analysis. Why is the minimum level of proficiency on those skills higher today than it was thirty or forty years ago?

If we want to diversify the profession, we have to stop raising the bar as the applicant pool diversifies. I do not believe that states acted with racial animus when increasing their passing scores; instead, the moves seem more broadly protectionist, occurring during times of recession in the legal market and as the number of law school graduates has increased. Those motives, however, deserve no credit. The bottom line is that today’s graduates have to meet a higher standard than leaders of the profession (those of us in our fifties and sixties) had to satisfy when we took the bar.

Some states have pointed to the low quality of bar exam essays when voting to raise their passing score. As I have explained elsewhere, these concerns are usually misplaced. Committees convened to review a state’s passing score often harbor unrealistic expectations about how well any lawyer–even a seasoned one–can read, analyze, and write about a new problem in 30 minutes. Bad statistical techniques have also tainted these attempts to recalibrate minimum passing scores.

Let’s roll back passing scores to where they stood in the 1970s. Taking that step would diversify the profession by allowing today’s diverse graduates to qualify for practice on the same terms as their less-diverse elders. Preserving accreditation of schools that produce a significant percentage of bar failures, in contrast, will do little to promote diversity.

Work Harder to Support Students’ Success

Teaching matters. During my time in legal education, I have seen professors improve skills and test scores among students who initially struggled with law school exams or bar preparation. These professors, notably, usually were not tenure-track faculty who taught Socratic classes or research seminars. More often, they were non-tenure-track instructors who were willing to break out of the law school box, to embrace teaching methods that work in other fields, to give their students more feedback, and to learn from their own mistakes. If one teaching method didn’t work, they would try another one.

If we want to improve minority access to the legal profession, then more of us should be willing to commit time to innovative teaching. Tenure-track faculty are quick to defend their traditional teaching methods, but slow to pursue rigorous tests of those methods. How do we know that the case method or Socratic questioning is the best way to educate students? Usually we “know” this because (a) it worked for us, (b) it feels rigorous and engaging when we stand at the front of the classroom, (c) we’ve produced plenty of good lawyers over the last hundred years, and (d) we don’t know what else to do anyway. But if our methods leave one in five graduates unable to pass the bar (the threshold set by proposed Standard 315), then maybe there’s something wrong with those methods. Maybe we should change our methods rather than demand weak accreditation standards?

Some faculty will object that we shouldn’t have to “teach to the bar exam,” that schools must focus on skills and knowledge that the bar doesn’t test. Three years, however, is a long time. We should be able to prepare students effectively to pass the bar exam, as well as build a foundation in other essential skills and knowledge. The sad truth is that these “other” subjects and skills are more fun to teach, so we focus on them rather than on solid bar preparation.

It is disingenuous for law schools to disdain rigorous bar preparation, because the bar exam’s very existence supports our tuition. Students do not pay premium tuition for law school because we teach more content than our colleagues who teach graduate courses in history, classics, mathematics, chemistry, or dozens of other subjects. Nor do we give more feedback than those professors, supervise more research among our graduate students, or conduct more research of our own. Students pay more for a law school education than for graduate training in most other fields because they need our diploma to sit for the bar exam. As long as lawyers limit entry to the profession, and as long as law schools serve as the initial gatekeeper, we will be able to charge premium prices for our classes. How can we eschew bar preparation when the bar stimulates our enrollments and revenue?

If we want to diversify the legal profession, then we should commit to better teaching and more rigorous bar preparation. We shouldn’t simply give schools a pass if more than a fifth of their graduates repeatedly fail the bar. If the educational deficit is too great to overcome in three years, then we should devote our energy to good pipeline programs.

Tough Standards

Some accreditation standards create unnecessary costs; they benefit faculty, librarians, or other educational insiders at the expense of students. Comments submitted to the ABA Task Force on the Future of Legal Education properly question many of those standards. The Standards Review Committee likewise has questioned onerous standards of that type.

Proposed Standard 315, however, is tough in a different way. That standard holds schools accountable in order to protect students, lenders, and the public. Private law schools today charge an average of $120,000 for a JD. At those prices, schools should be able to assure that at least 80% of graduates who choose to take the bar exam will pass that exam within two calendar years. If schools can’t meet that standard, then they shouldn’t bear the mark of ABA accreditation.


Understanding Fisher

July 2nd, 2013 / By

[We are pleased to present a guest post by Ruth Colker, Distinguished University Professor and Heck-Faust Chair in Constitutional Law, Moritz College of Law, The Ohio State University. This discussion is cross-posted from Professor Colker’s blog.]

What can a law school admissions officer learn from a close reading of Fisher v. University of Texas? A bit. (more…)

