Compared to What?

April 7th, 2015

Some legal educators have a New Yorker’s view of the world. Like the parochial Manhattanite in Saul Steinberg’s famous illustration, these educators don’t see much beyond their own fiefdom. They see law graduates out there in the world, practicing their profession or working in related fields. And there are doctors, who (regrettably) make more money than lawyers do. But really, what else is there? What do people do if they don’t go to law school?

Michael Simkovic takes this position in a recent post, declaring (in bold) that: “The question everyone who decides not to go to law school . . . must answer is–what else out there is better?” In a footnote, Simkovic concedes that “[a]nother graduate degree might be better than law school for a particular individual,” but he clearly doesn’t think much of the idea.

People, of course, work in hundreds of occupations other than law. Some of them even enjoy their work. Simkovic’s concern lies primarily with the financial return on college and graduate degrees. Even here, though, the contemporary options are much broader than many legal educators realize.

Time Was: The 1990s

Financially, the late twentieth century was a good time to be a lawyer. When the Bureau of Labor Statistics (BLS) published its first Occupational Employment Statistics (OES) in 1997, the four occupations with the highest salaries were medicine, dentistry, podiatry, and law. Those four occupations topped the salary list (in that order) whether sorted by mean or median salary. [Note that OES collects data only on salaries; it does not include self-employed individuals like solo practitioners or partners–whether in law or medicine. For more on that point, see the end of this post.]

Law was a pretty good deal in those days. The graduate program was just three years, rather than four. There were no college prerequisites and no post-graduate internships. Knowledge of math was optional, and exposure to bodily fluids minimal. Imagine earning a median salary of $109,987 (in 2014 dollars) without having to examine feet! (Although a willingness to spend four years of graduate school studying feet, along with a lifetime of treating them, would have netted you a 28% increase in median salary.)

But let’s not dally any longer in the twentieth century.

Time Is: 2014

BLS just released its latest survey of occupational wages, and the results show how much the economy has changed. Law practice has slipped to twenty-second place in a listing of occupations by mean salary, and twenty-sixth place when ranked by median. One subset of lawyers, judges and magistrates, holds twenty-fifth place on the list of median salaries, but practicing lawyers have slipped a notch lower.

About half the slippage in law’s salary prominence stems from the splintering of medical occupations, both in the real world and as measured by BLS. We no longer visit “doctors,” we see pediatricians, general practitioners, internists, obstetricians, anesthesiologists, surgeons, and psychiatrists–often in that order. These medical specialists, along with the dentists and podiatrists, all enjoy a higher median salary than lawyers.

Two other health-related professions, meanwhile, have moved ahead of lawyers in wages: nurse anesthetists and pharmacists. Both fields require substantial graduate education: at least two years for nurse anesthetists and two to four years for pharmacists. But the training pays off with a median salary of $153,780 for nurse anesthetists and $120,950 for pharmacists.

Today’s college graduates, furthermore, don’t have to deal with teeth, airways, or medications to earn more than lawyers do. The latest BLS survey includes nine other occupations that top lawyers’ median salary: financial managers, airline pilots, natural sciences managers, air traffic controllers, marketing managers, computer and information systems managers, petroleum engineers, architectural and engineering managers, and chief executives.

How much do salaried lawyers earn in their more humble berth on the OES list? They collected a median salary of $114,970 in 2014. That’s good, but it’s only 4.5% higher (in inflation-adjusted dollars) than the median salary in 1997. Pharmacists enjoyed a whopping 28% increase in median real wages to reach $120,950 in 2014. And the average nurse anesthetist earned a full third more than the average lawyer that year.
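
The comparisons in the last two paragraphs are simple arithmetic on the BLS medians quoted in this post; a quick Python check:

```python
# Median salaries from the OES figures cited above (all in 2014 dollars)
LAWYER_1997 = 109_987
LAWYER_2014 = 114_970
NURSE_ANESTHETIST_2014 = 153_780

def real_growth(old, new):
    """Percent change between two figures stated in the same dollars."""
    return (new - old) / old * 100

lawyer_growth = real_growth(LAWYER_1997, LAWYER_2014)
# ~4.5% real growth over seventeen years

anesthetist_premium = NURSE_ANESTHETIST_2014 / LAWYER_2014
# ~1.34: the median nurse anesthetist earned about a third more
```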

If you’re a college student willing to set your financial sights just a bit lower than the median salary in law practice, there are lots of other options. Here are some of the occupations with a 2014 median salary falling between $100,000 and $114,970: sales manager, physicist, computer hardware engineer, computer and information research scientist, compensation and benefits manager, purchasing manager, astronomer, aerospace engineer, political scientist, mathematician, software developer for systems software, human resources manager, training and development manager, public relations and fundraising manager, optometrist, nuclear engineer, and prosthodontist (those are the folks who will soon be fitting baby boomers for their false teeth).

Law graduates could apply their education to some of these jobs; with a few more years of graduate education, a savvy lawyer could offer the aging boomers a package deal on a will and a new pair of choppers. But the most common themes in these salary-leading occupations do not revolve around law. Instead, the themes are math, science, and management–none of which we teach very well in law school.

Twenty-first Century Humility

Lawyers will not disappear. Even Richard Susskind, who asked about “The End of Lawyers?” in a provocative book title, doesn’t think lawyers are done for. We still need lawyers to fill both traditional roles and new ones. Lawyers, however, will not have the same economic and social dominance that they enjoyed in the late twentieth century.

Some lawyers will still make a lot of money. As the American Lawyer proclaimed last year, the “super rich” are getting richer. But the prospects for other lawyers are less certain, and the appeal of competing fields has increased.

If law schools want to understand their decline in talented applicants, they need to look more closely at the competition. What do today’s high school students and middle schoolers think about law? Those students will choose their majors soon after arriving at college. Once they choose engineering, computer science, business, or health-related courses, a legal career will seem even less appealing. If we want potential students to find law attractive, we need to know more about their alternatives and preferences.

We also need to be realistic about how many students ultimately will–or should–pursue a law degree. As citizens of a healthy economy, we need doctors, nurse anesthetists, pharmacists, managers, and software developers. We even need the odd astronomer or two. Law is just one of the many occupations that make a society thrive. The twenty-first century is a time of interdependence that should bring a sense of humility.

Notes

Here are some key points about the method behind the OES survey. For more information, see this FAQ page, which includes the information I summarize here:

1. OES obtains wage data directly from establishments. This method eliminates bias that may occur when individuals report their own wages. The survey, however, includes only wage data for salaried employees. Solo practitioners (in any field) are excluded, as are individuals who draw their income entirely from partnerships or other forms of profit sharing.

2. “Wages” include production bonuses and tips, but not end-of-year bonuses, profit-sharing, or benefits.

3. Although BLS publishes OES data every year, the data are gathered on a rolling basis. Income for “1997” or “2014” reflects data gathered over three years, including the reference year. BLS adjusts wage figures for the two older years, using the Employment Cost Index, so the reported wages appear in then-current dollars. The three-year collection period, however, can mask sudden shifts in employment trends.

4. BLS cautions against using OES data to compare changes in employment data over time, unless the user offers necessary context. In particular, it is important for readers to understand that short-term comparisons are difficult (because of the point in the previous paragraph) and that occupational categories change frequently. For those reasons, I have limited my cross-time comparisons and have noted the splintering of occupational categories. The limited comparison offered here, however, seems helpful in understanding the relationship of law practice to other high-paying occupations.

5. For the data used in this post, follow this link and download the spreadsheets. The HTML versions are prettier, but they do not include all of the data.

ExamSoft and NCBE

April 6th, 2015

I recently found a letter that Erica Moeser, President of the National Conference of Bar Examiners (NCBE), wrote to law school deans in mid-December. The letter responds to a formal request, signed by 79 law school deans, that NCBE “facilitate a thorough investigation of the administration and scoring of the July 2014 bar exam.” That exam suffered from the notorious ExamSoft debacle.

Moeser’s letter makes an interesting distinction. She assures the deans that NCBE has “reviewed and re-reviewed” its scoring, equating, and scaling of the July 2014 MBE. Those reviews, Moeser attests, revealed no flaw in NCBE’s process. She then adds that, to the extent the deans are concerned about “administration” of the exam, they should “note that NCBE does not administer the examination; jurisdictions do.”

Moeser doesn’t mention ExamSoft by name, but her message seems clear: If ExamSoft’s massive failure affected examinees’ performance, that’s not our problem. We take the bubble sheets as they come to us, grade them, equate the scores, scale those scores, and return the numbers to the states. It’s all the same to NCBE if examinees miss points because they failed to study, law schools taught them poorly, or they were groggy and stressed from struggling to upload their essay exams. We only score exams, we don’t administer them.

But is the line between administration and scoring so clear?

The Purpose of Equating

In an earlier post, I described the process of equating and scaling that NCBE uses to produce final MBE scores. The elaborate transformation of raw scores has one purpose: “to ensure consistency and fairness across the different MBE forms given on different test dates.”

NCBE thinks of this consistency with respect to its own test questions; it wants to ensure that some test-takers aren’t burdened with an overly difficult set of questions–or conversely, that other examinees don’t benefit from unduly easy questions. But substantial changes in exam conditions, like the ExamSoft crash, can also make an exam more difficult. If they do, NCBE’s equating and scaling process actually amplifies that unfairness.

To remain faithful to its mission, it seems that NCBE should at least explore the possible effects of major blunders in exam administration. This is especially true when a problem affects multiple jurisdictions, rather than a single state. If an incident affects a single jurisdiction, the examining authorities in that state can decide whether to adjust scores for that exam. When the problem is more diffuse, as with the ExamSoft failure, individual states may not have the information necessary to assess the extent of the impact. That’s an even greater concern when nationwide equating will spread the problem to states that did not even contract with ExamSoft.
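
To see how equating can bake an administration failure into everyone’s scores, consider a toy sketch. This is not NCBE’s actual algorithm, which is far more elaborate; it is a bare-bones mean-equating illustration on common “anchor” items, with made-up numbers, showing how a uniform shock to performance gets read as lower ability:

```python
import statistics

def equating_gap(ref_anchor, cur_anchor):
    # Toy mean equating on common ("anchor") items: because the items
    # are identical across years, the entire gap in anchor performance
    # is attributed to examinee ability, never to test conditions.
    return statistics.mean(ref_anchor) - statistics.mean(cur_anchor)

# Reference year: anchor-item raw scores for a small cohort
ref = [30, 32, 28, 31, 29]

# Current year: identical true ability, but an administration failure
# (say, a sleepless night spent uploading essays) costs ~2 points each
cur = [score - 2 for score in ref]

gap = equating_gap(ref, cur)  # 2.0
# The model reads the current cohort as 2 points "less able," so the
# raw-to-scaled conversion offers no correction -- and everyone equated
# against this cohort, including examinees in states untouched by the
# software failure, absorbs the shock in their scaled scores.
```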

What Should NCBE Have Done?

NCBE did not cause ExamSoft’s upload problems, but it almost certainly knew about them. Experts in exam scoring also understand that defects in exam administration can interfere with performance. With knowledge of the ExamSoft problem, NCBE had the ability to examine raw scores for the extent of the ExamSoft effect. Exploration would have been most effective with cooperation from ExamSoft itself, revealing which states suffered major upload problems and which ones experienced more minor interference. But even without that information, NCBE could have explored the raw scores for indications of whether test takers were “less able” in ExamSoft states.

If NCBE had found a problem, there would have been time to consult with bar examiners about possible solutions. At the very least, NCBE probably should have adjusted its scaling to reflect the fact that some of the decrease in raw scores stemmed from the software crash rather than from other changes in test-taker ability. With enough data, NCBE might have been able to quantify those effects fairly precisely.

Maybe NCBE did, in fact, do those things. Its public pronouncements, however, have not suggested any such process. On the contrary, Moeser seems to studiously avoid mentioning ExamSoft. This reveals an even deeper problem: we have a high-stakes exam for which responsibility is badly fragmented.

Who Do You Call?

Imagine yourself as a test-taker on July 29, 2014. You’ve been trying for several hours to upload your essay exam, without success. You’ve tried calling ExamSoft’s customer service line, but can’t get through. You’re worried that you’ll fail the exam if you don’t upload the essays on time, and you’re also worried that you won’t be sufficiently rested for the next day’s MBE. Who do you call?

You can’t call the state bar examiners; they don’t have an after-hours call line. If they did, they probably would reassure you on the first question, telling you that they would extend the deadline for submitting essay answers. (This is, in fact, what many affected states did.) But they wouldn’t have much to offer on the second question, about getting back on track for the next day’s MBE. Some state examiners don’t fully understand NCBE’s equating and scaling process; those examiners might even erroneously tell you “not to worry because everyone is in the same boat.”

NCBE wouldn’t be any more help. They, as Moeser pointed out, don’t actually administer exams; they just create and score them.

Many distressed examinees called law school staff members who had helped them prepare for the bar. Those staff members, in turn, called their deans–who contacted NCBE and state bar examiners. As Moeser’s letters indicate, however, bar examiners view deans with some suspicion. The deans, they believe, are too quick to advocate for their graduates and too worried about their own bar pass rates.

As NCBE and bar examiners refused to respond, or shifted responsibility to the other party, we reached a stand-off: no one was willing to take responsibility for flaws in a very high-stakes test administered to more than 50,000 examinees. That is a failure as great as the ExamSoft crash itself.

The Ethics of Academia

April 2nd, 2015

What obligations, if any, do academic institutions owe potential students? When soliciting these “customers,” how candid should schools be in discussing graduation rates, scholarship conditions, or the employment outcomes of recent graduates? Do the obligations differ for a professional school that will teach students about the ethics of communicating with their own future customers?

New Marketing/New Concerns

Once upon a time, we marketed law schools with a printed brochure or two. That changed with the advent of the new century and the internet. Now marketing is pervasive: web pages, emails, blog posts, and forums.

With increased marketing, some educators began to worry about how we presented ourselves to students. As a sometime social scientist, I was particularly concerned about the way in which some law schools reported median salaries without disclosing the number of graduates supplying that information. A school could report that it had employment information from 99% of its graduates, that 60% were in private practice, and that the median salary for those private practitioners was $120,000. Nowhere did the reader learn that only 45% of the graduates reported salary information. [This is a hypothetical example; it does not represent any particular law school.]

I also noticed that, although law schools know only the average “amount borrowed” by their students, schools and the media began to represent that figure as the average “debt owed.” Interest, unfortunately, accumulates while a student is in law school, so the “amount borrowed” significantly understates the “debt owed” when loans fall due.
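
The gap between “amount borrowed” and “debt owed” is easy to quantify. Here is a sketch with hypothetical figures ($30,000 borrowed per year at 6.8%), assuming the federal unsubsidized-loan pattern of simple interest accruing on each disbursement until graduation:

```python
def debt_at_graduation(annual_loan, rate, years):
    """Amount borrowed vs. debt owed when simple interest accrues on
    each annual disbursement from disbursement until graduation."""
    borrowed = annual_loan * years
    accrued = sum(annual_loan * rate * (years - y) for y in range(years))
    return borrowed, borrowed + accrued

# Hypothetical student: $30,000 per year for three years at 6.8%
borrowed, owed = debt_at_graduation(30_000, 0.068, 3)
# borrowed = 90,000; owed ≈ 102,240 -- the reported "amount borrowed"
# understates the debt due at graduation by roughly 14% in this example
```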

Other educators worried about a lack of candor when schools offered scholarships to students. A school might offer an attractive three-year scholarship to an applicant, with the seemingly easy condition that the student maintain a B average. The school knew that it tightly controlled curves in first-year courses, so that a predictable number of awardees would fail that condition, but the applicants didn’t understand that. This isn’t just a matter of optimism bias; undergraduates literally do not understand law school curves. A few years ago, one law school hopeful said to me: “What’s the big deal about grade competition in law school? It’s not like there’s a limit on the number of A’s or anything.” When I explained the facts of law school life, she went off to pursue a Ph.D. in botany.

And then there was the matter of nested statistics. Schools would report the number of employed graduates, then identify percentages of those graduates working in particular job categories. Categories spawned sub-categories, and readers began to lose sight of the denominator. Even respected scholars like Steven Solomon get befuddled by these statistics. Yesterday, Solomon misinterpreted Georgetown’s 2013 employment statistics due to this type of nesting: he mistook 60% of employed graduates for 60% of the graduating class. (Georgetown, to its credit, provides clearer statistics on a different page than the one Solomon used.)
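
The denominator problem behind that misreading is easy to demonstrate. A sketch with hypothetical numbers:

```python
# Hypothetical graduating class
class_size = 200
employed_share = 0.85              # fraction of the class employed
category_share_of_employed = 0.60  # the "60%" a report might quote

actual = class_size * employed_share * category_share_of_employed  # 102
naive = class_size * category_share_of_employed                    # 120
# Applying the 60% to the wrong denominator (the whole class rather
# than employed graduates) overstates the group by 18 graduates here.
```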

Educators, of course, weren’t the only ones who noticed these problems. We were slow–much too slow–to address our lapses, and we suffered legitimate criticism from the media and organizations like Law School Transparency. Indeed, the criticisms continue, as professors persist in making misleading statements.

For me, these are ethical issues. I believe that educators do have a special obligation to prospective students; they are not just “customers,” they are people who depend upon us for instruction and wise counsel. At law schools, prospective students are also future colleagues in the legal profession; even while we teach, we are an integral part of the profession.

With that in mind, I communicate with prospective students as I would talk to a colleague asking about an entry-level teaching position or a potential move to another school. I tell students what I would want to know if I were in their position. And, consistent with my role as a teacher and scholar, I try to present the information in a manner that is straightforward and easy to understand. For the last few years, most law schools have followed the same golden rules–albeit with considerable prodding from Law School Transparency, the ABA, and the media.

Revisionist History

Now that law schools have become more careful in their communications with potential students, revisionist history has appeared. Ignoring all of the concerns discussed above (although they appear in sources he cites), Michael Simkovic concludes that “The moral critique against law schools comes down to this: The law schools used the same standard method of reporting data as the U.S. Government.”

Huh? When the government publishes salaries in SIPP, a primary source for Simkovic’s scholarship, I’m pretty sure they disclose how many respondents refused to provide that information. Reports on the national debt, likewise, include interest accrued rather than just the original amounts borrowed–although I will concede that there’s plenty of monkey business in that reporting. I’ll also concede that welfare recipients probably don’t fully understand the conditions in the contracts they sign.

Simkovic, of course, doesn’t mean to set the government up as a model on these latter points. Instead, he ignores those issues and pretends that the ethical critique of law schools focused on just one point: calculation of the overall employment rate. On this, Simkovic has good news for law schools: they can ethically count a graduate as employed as long as the graduate was paid for a single hour of work during the reporting week–because that’s the way the government does it.

I don’t think any law school has ever been quite that audacious, and the ABA certainly would not approve. The implications of Simkovic’s argument, however, illuminate a key point: law schools communicate for a different purpose, and to a different audience, than the Bureau of Labor Statistics. The primary consumers of our employment statistics are current and potential students. We draft our employment statistics for that audience, and the information should be tailored to them.

As for scholarship, I will acknowledge that the U.S. government owns the word “unemployment.” I used a non-standard definition of that concept in a recent paper, and clearly designated it as such. But this seems to distract some readers, so I’ll refer to those graduates as “not working.” I suspect it’s all the same to them.

About Law School Cafe

Cafe Manager & Co-Moderator
Deborah J. Merritt

Cafe Designer & Co-Moderator
Kyle McEntee

ABA Journal Blawg 100 Honoree

Law School Cafe is a resource for anyone interested in changes in legal education and the legal profession.
