Wood R. Foster, Jr., a Minneapolis lawyer and former president of the Minnesota State Bar Association, has written a striking review of recent changes in the legal profession. Foster spent his career as a commercial litigator with Siegel Brill, a small Minneapolis firm. Relatively few lawyers from that background have written about changes in the legal profession, and Foster does so eloquently.
Foster covers the growing surplus of lawyers, which he dates to 2000; the fracturing of the profession; stalled diversity efforts; the high cost of legal education; BigLaw and its equally big shadow; and the impact of technology.
With some irony, Foster quotes a column that he wrote in 2000 after holding a series of focus groups with lawyers. “I have found,” he wrote then, “that lawyers are generally reluctant to visualize the profession’s future.” The future, however, arrived anyway. Today, he reflects, “a good argument can be made that the legal profession has changed more in the last 15 years than it did in the 150 years from 1849 to 1999.”
Foster’s views echo those I hear from many practitioners in their 60s and 70s. While academics continue to debate the existence of change, these lawyers have lived it. Their vantage point makes them particularly sympathetic to the newest generation of lawyers. “There really can be no doubt,” Foster concludes, “that it has been a rough ride for lawyers graduating from law school since 2000. . . . [The facts] add up to an unflattering picture of why so many young lawyers are finding it so hard to get the kind of start in their chosen profession that older lawyers like me were able to take for granted during the last half of the twentieth century.”
Give Foster a read. His featured series of articles fills much of this issue of Minnesota’s Bench and Bar journal.
An increasing number of law schools are creating online courses, certificate offerings, and degree programs. As newcomers to online education, we should look to existing programs for inspiration. One of those is Harvard Business School’s successful CORe program, an online certificate course in business basics. I wrote about CORe’s suitability for law students several weeks ago. Here, I examine three lessons that the program offers to law schools interested in online education.
At least three law firms have now adopted ROSS, a legal artificial intelligence system based on IBM’s pathbreaking Watson technology. The firms include two legal giants, Latham & Watkins and BakerHostetler, along with the Wisconsin firm vonBriesen. Commitments by these firms seem likely to spur interest among their competitors. Watch for ROSS and other forms of legal AI to spread over the next few years.
What is ROSS, what does it do, and what does it mean for lawyers and legal educators? Here are a few preliminary thoughts.
The National Conference of Bar Examiners (NCBE) has released the March 2016 issue of its quarterly publication, the Bar Examiner. The issue includes annual statistics about bar passage rates, as well as several other articles. For those who lack time to read the issue, here are a few highlights:
In his Letter from the Chair, Judge Thomas Bice sounds a disappointingly hostile note towards law students. Quoting Justice Edward Chavez of the New Mexico Supreme Court, Bice suggests that “those who attend law school have come to have a sense of entitlement to the practice of law simply as a result of their education.” Against this sentiment, he continues, bar examiners “are truly the gatekeepers of this profession.” (P. 2)
NCBE President Erica Moeser, who has recently tangled with law school deans, offers a more conciliatory tone on her President’s Page. After noting the importance of the legal profession and the challenges facing law schools, she concludes: “In many ways, we are all in this together, and certainly all of us wish for better times.” (P. 5)
A federal judge has tentatively approved settlement of consolidated class action lawsuits brought by July 2014 bar examinees against ExamSoft. The lawsuits arose out of the well-known difficulties that test-takers experienced when they tried to upload their essay answers through ExamSoft’s software. I have written about this debacle, and its likely impact on bar scores, several times. For the most recent post in the series, see here.
Looking at this settlement, it’s hard to know what the class representatives were thinking. Last summer, examinees paid between $100 and $150 for the privilege of using ExamSoft software. When the uploads failed to work, they were unable to reach ExamSoft’s customer service lines. Many endured hours of anxiety as they tried to upload their exams or contact customer service. The snafu distracted them from preparing for the next day’s exam and from getting some much-needed sleep.
What are the examinees who suffered through this “barmageddon” getting for their troubles? $90 apiece. That’s right, they’re not even getting a full refund on the fees they paid. The class action lawyers, meanwhile, will walk away with up to $600,000 in attorneys’ fees.
I understand that damages for emotional distress aren’t awarded in contract actions. I get that (and hopefully got that right on the MBE). But agreeing to a settlement that awards less than the amount exam takers paid for this shoddy service? ExamSoft clearly failed to perform its side of the bargain; the complaint stated a very straightforward claim for breach of contract. In addition, the plaintiffs invoked federal and state consumer laws that might have awarded other relief.
What were the class representatives thinking? Perhaps they used the lawsuit as a training ground to enter the apparently lucrative field of representing other plaintiffs in class action suits. Now that they know how to collect handsome fees, they’re not worried about the pocket change they paid to ExamSoft.
I believe in class actions–they’re a necessary procedure to enforce some claims, including the ones asserted in this case. But the field now suffers from so much abuse, with attorneys collecting the lion’s share of awards and class members receiving relatively little. It’s no wonder that the public, and some appellate courts, have become so cynical about class actions.
From that perspective, there’s a great irony in this settlement. People who wanted to be lawyers, and who suffered a compensable breach of contract while engaged in that quest, have now been shortchanged by the very professionals they seek to join.
In a series of posts (here, here, and here) I’ve explained why I believe that ExamSoft’s massive computer glitch lowered performance on the July 2014 Multistate Bar Exam (MBE). I’ve also explained how NCBE’s equating and scaling process amplified the damage to produce a 5-point drop in the national bar passage rate.
We now have a final piece of evidence suggesting that something untoward happened on the July 2014 bar exam: The February 2015 MBE did not produce the same type of score drop. This February’s MBE was harder than any version of the test given over the last four decades; it covered seven subjects instead of six. Confronted with that challenge, the February scores declined somewhat from the previous year’s mark. The mean scaled score on the February 2015 MBE was 136.2, 1.8 points lower than the February 2014 mean scaled score of 138.0.
The contested July 2014 MBE, however, produced a drop of 2.8 points compared to the July 2013 test. The February drop, in other words, was 35.7% smaller than the July decline. The July 2014 shift was also larger than any other year-to-year change (positive or negative) recorded during the last ten years. (I treat the February and July exams as separate categories, as NCBE and others do.)
The shift in February 2015 scores, on the other hand, is similar in magnitude to five other changes that occurred during the last decade. Scores dropped, but not nearly as much as in July–and that’s despite taking a harder version of the MBE. Why did the July 2014 examinees perform so poorly?
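Because the relative size of the two drops carries much of the weight here, the arithmetic is worth laying out explicitly. This is a minimal sketch using only the score figures quoted above:

```python
# Simple arithmetic check, using only the score figures quoted in this post.
feb_2014, feb_2015 = 138.0, 136.2        # mean scaled February MBE scores
jul_drop = 2.8                           # July 2014 vs. July 2013 decline (points)

feb_drop = feb_2014 - feb_2015           # February 2015 vs. February 2014 decline
shortfall = (jul_drop - feb_drop) / jul_drop

print(f"February drop: {feb_drop:.1f} points")
print(f"The February drop was {shortfall:.1%} smaller than the July drop")
# prints: February drop: 1.8 points
#         The February drop was 35.7% smaller than the July drop
```

And that 1.8-point February decline came on a harder, seven-subject exam, which makes the 2.8-point July drop stand out all the more.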
It can’t be a change in the quality of test takers, as NCBE’s president, Erica Moeser, has suggested in a series of communications to law deans and the profession. The February 2015 examinees started law school at about the same time as the July 2014 ones. As others have shown, law student credentials (as measured by LSAT scores) declined only modestly for students who entered law school in 2011.
We’re left with the conclusion that something very unusual happened in July 2014, and it’s not hard to find that unusual event: a software problem that occupied test-takers’ time, aggravated their stress, and interfered with their sleep.
On its own, my comparison of score drops does not show that the ExamSoft crisis caused the fall in July 2014 test performance. The other evidence I have already discussed is more persuasive. I offer this supplemental analysis for two reasons.
First, I want to forestall arguments that February’s performance proves that the July test-takers must have been less qualified than previous examinees. February’s mean scaled score did drop, compared to the previous February, but the drop was considerably less than the sharp July decline. The latter drop remains the largest score change during the last ten years. It clearly is an outlier that requires more explanation. (And this, of course, is without considering the increased difficulty of the February exam.)
Second, when combined with other evidence about the ExamSoft debacle, this comparison adds to the concerns. Why did scores fall so precipitously in July 2014? The answer seems to be ExamSoft, and we owe that answer to test-takers who failed the July 2014 bar exam.
One final note: Although I remain very concerned about both the handling of the ExamSoft problem and the equating of the new MBE to the old one, I am equally concerned about law schools that admit students who will struggle to pass a fairly administered bar exam. NCBE, state bar examiners, and law schools together stand as gatekeepers to the profession, and we all owe a duty of fairness to those who seek to join it. More about that soon.
I recently found a letter that Erica Moeser, President of the National Conference of Bar Examiners (NCBE), wrote to law school deans in mid-December. The letter responds to a formal request, signed by 79 law school deans, that NCBE “facilitate a thorough investigation of the administration and scoring of the July 2014 bar exam.” That exam suffered from the notorious ExamSoft debacle.
Moeser’s letter makes an interesting distinction. She assures the deans that NCBE has “reviewed and re-reviewed” its scoring, equating, and scaling of the July 2014 MBE. Those reviews, Moeser attests, revealed no flaw in NCBE’s process. She then adds that, to the extent the deans are concerned about “administration” of the exam, they should “note that NCBE does not administer the examination; jurisdictions do.”
Moeser doesn’t mention ExamSoft by name, but her message seems clear: If ExamSoft’s massive failure affected examinees’ performance, that’s not our problem. We take the bubble sheets as they come to us, grade them, equate the scores, scale those scores, and return the numbers to the states. It’s all the same to NCBE if examinees miss points because they failed to study, law schools taught them poorly, or they were groggy and stressed from struggling to upload their essay exams. We only score exams, we don’t administer them.
But is the line between administration and scoring so clear?
The Purpose of Equating
In an earlier post, I described the process of equating and scaling that NCBE uses to produce final MBE scores. The elaborate transformation of raw scores has one purpose: “to ensure consistency and fairness across the different MBE forms given on different test dates.”
NCBE thinks of this consistency with respect to its own test questions; it wants to ensure that some test-takers aren’t burdened with an overly difficult set of questions–or conversely, that other examinees don’t benefit from unduly easy questions. But substantial changes in exam conditions, like the ExamSoft crash, can also make an exam more difficult. If they do, NCBE’s equating and scaling process actually amplifies that unfairness.
To remain faithful to its mission, it seems that NCBE should at least explore the possible effects of major blunders in exam administration. This is especially true when a problem affects multiple jurisdictions, rather than a single state. If an incident affects a single jurisdiction, the examining authorities in that state can decide whether to adjust scores for that exam. When the problem is more diffuse, as with the ExamSoft failure, individual states may not have the information necessary to assess the extent of the impact. That’s an even greater concern when nationwide equating will spread the problem to states that did not even contract with ExamSoft.
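NCBE’s actual process relies on item-response-theory equating, but a deliberately simplified linear-scaling sketch shows the direction of the effect. Assume, hypothetically, a jurisdiction that scales its essay raw scores to its examinees’ MBE distribution (a common state practice); every number below is invented for illustration:

```python
# Deliberately simplified sketch: a state linearly scales essay raw scores
# onto its examinees' MBE distribution. All numbers are hypothetical.

def scale_to_mbe(essay_raw, essay_mean, essay_sd, mbe_mean, mbe_sd):
    """Linearly map an essay raw score onto the cohort's MBE scale."""
    return mbe_mean + (essay_raw - essay_mean) * (mbe_sd / essay_sd)

essay_raw, essay_mean, essay_sd = 62.0, 60.0, 8.0   # hypothetical essay stats
mbe_sd = 16.0                                       # hypothetical MBE sd

# Same examinee, same essay performance, two scenarios for the cohort's
# MBE mean: a normal administration vs. one depressed 2.8 points by a
# software failure that equating attributed to lower examinee ability.
normal    = scale_to_mbe(essay_raw, essay_mean, essay_sd, mbe_mean=143.0, mbe_sd=mbe_sd)
depressed = scale_to_mbe(essay_raw, essay_mean, essay_sd, mbe_mean=140.2, mbe_sd=mbe_sd)

print(f"{normal - depressed:.1f}")   # prints 2.8
```

The full dip in the cohort’s MBE mean passes straight through to the scaled essay score, even though the examinee’s essay performance never changed. That is the sense in which scaling can spread, rather than contain, a problem in exam administration.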
What Should NCBE Have Done?
NCBE did not cause ExamSoft’s upload problems, but it almost certainly knew about them. Experts in exam scoring also understand that defects in exam administration can interfere with performance. With knowledge of the ExamSoft problem, NCBE had the ability to examine raw scores for the extent of the ExamSoft effect. Exploration would have been most effective with cooperation from ExamSoft itself, revealing which states suffered major upload problems and which ones experienced more minor interference. But even without that information, NCBE could have explored the raw scores for indications of whether test takers were “less able” in ExamSoft states.
If NCBE had found a problem, there would have been time to consult with bar examiners about possible solutions. At the very least, NCBE probably should have adjusted its scaling to reflect the fact that some of the decrease in raw scores stemmed from the software crash rather than from other changes in test-taker ability. With enough data, NCBE might have been able to quantify those effects fairly precisely.
Maybe NCBE did, in fact, do those things. Its public pronouncements, however, have not suggested any such process. On the contrary, Moeser seems to studiously avoid mentioning ExamSoft. This reveals an even deeper problem: we have a high-stakes exam for which responsibility is badly fragmented.
Who Do You Call?
Imagine yourself as a test-taker on July 29, 2014. You’ve been trying for several hours to upload your essay exam, without success. You’ve tried calling ExamSoft’s customer service line, but can’t get through. You’re worried that you’ll fail the exam if you don’t upload the essays on time, and you’re also worried that you won’t be sufficiently rested for the next day’s MBE. Who do you call?
You can’t call the state bar examiners; they don’t have an after-hours call line. If they did, they probably would reassure you on the first question, telling you that they would extend the deadline for submitting essay answers. (This is, in fact, what many affected states did.) But they wouldn’t have much to offer on the second question, about getting back on track for the next day’s MBE. Some state examiners don’t fully understand NCBE’s equating and scaling process; those examiners might even erroneously tell you “not to worry because everyone is in the same boat.”
NCBE wouldn’t be any more help. They, as Moeser pointed out, don’t actually administer exams; they just create and score them.
Many distressed examinees called law school staff members who had helped them prepare for the bar. Those staff members, in turn, called their deans–who contacted NCBE and state bar examiners. As Moeser’s letters indicate, however, bar examiners view deans with some suspicion. The deans, they believe, are too quick to advocate for their graduates and too worried about their own bar pass rates.
When NCBE and bar examiners refused to respond, or shifted responsibility to each other, we reached a standoff: no one was willing to take responsibility for flaws in a very high-stakes test administered to more than 50,000 examinees. That is a failure as great as the ExamSoft crash itself.
The Berkman Center for Internet and Society at Harvard Law has a live webcast on July 9th at 12:30 pm eastern called “Creating a Law School e-Curriculum.” The speaker will be Oliver R. Goodenough, a fellow at The Berkman Center and a Professor of Law at the Vermont Law School.
Here’s the description:
Legal practice and legal education both face disruptive change. Much of how and what we do as lawyers and how and what we have taught as legal educators is under scrutiny. Legal technology is an important factor in driving these challenges. As law schools reform their curriculums, law and technology is an area that is ripe for expansion in our teaching. It also provides ample room for scholarly examination. Creating opportunities for learning how technology is shaping legal practice should be a priority for any school looking to provide a useful education for the lawyers of the present, let alone the future.
To watch the webcast, simply visit this page at 12:30 pm eastern on July 9th. If you’re in Boston, the same page provides a form for you to RSVP to the luncheon.
A “Coalition of Concerned Colleagues,” which includes me, has submitted a letter to the ABA Task Force on the Future of Legal Education. Although I can claim no credit for drafting the letter, I think it offers a succinct statement of the economic distress faced by law students and recent graduates: tuition has climbed dramatically, scholarships rarely address need, entry-level jobs have contracted, and salaries in those jobs have declined. The combination is oppressive for students and unsustainable for schools.
The brief letter notes a number of changes that might ameliorate this burden. All of those deserve exploration; I have posted on several already and will explore others in upcoming weeks. The letter, however, leaves a key point unstated: tenured professors at most schools will have to change their expectations if we hope to address this crisis. Faculty salaries and other perks account for a substantial share of the budget at most law schools. We can try to cut corners in other ways, by trimming staff and begging central administration to leave us a higher share of each tuition dollar. But in the end, we have to ask ourselves hard questions about the professional lives we’ve designed and the pay we demand.
Law professors earn high salaries, considerably higher than the pay drawn by most of our colleagues across the academy. Much of that money comes from the tuition paid by our students. With job and salary prospects down for lawyers, and with more transparency about those outcomes, fewer students are willing to pay our tuition. Faculty are going to have to adjust their financial expectations–and I think we should. We have enjoyed artificially high tuition and salaries for many years, due largely to our powerful economic status as gatekeepers to the legal profession. States didn’t create those licensing restraints to enrich law schools, and we have served few interests (other than our own) by aggressively raising tuition and salaries over the last three decades.
In addition to lowering our financial expectations, faculty most likely will have to adjust the courses they teach, the ways in which they teach, and other professional activities. Distance education, for example, can help reduce the cost of legal education–but only if faculty are willing to use those techniques and then to consolidate courses across schools. One faculty member can teach Antitrust or Remedies to students at several law schools, but the faculty at those other schools must be willing to shift to other courses.
Adding apprenticeships and externships, similarly, will affect what current faculty do. We can’t expect students to pay for the full range of courses and scholarship our faculties now support plus the cost of apprenticeships or externships. These hands-on experiences will have to replace some of our current offerings, with traditional doctrinal faculty downsizing or taking on new duties.
Changes of this type are implicit in the letter from Concerned Colleagues, although I haven’t discussed these specifics with other signatories. Schools may find alternatives to the particular changes I’ve mentioned here; we need creativity to address the challenges before us. But it’s essential to avoid magical thinking when confronting those problems. The key difficulty for our graduates, students, and prospective students is that legal education has become too expensive for the career paths it supports. There is no magic solution to that problem in which we all become richer.
Coursera offers a platform for high-quality university courses delivered online. The company launched its first course less than a year ago, but has already reached 2.7 million participants. The platform now includes 62 universities located across four continents. Courses span a wide range of subjects and several languages. Imagine taking a course on Early Renaissance Architecture in Italy–from a well-regarded “Professore” at Sapienza University of Rome. Now imagine taking that course in your own living room, in English, for free. That’s Coursera.
As an educator, I’ve been curious about Coursera–but also vaguely uneasy. What does this type of massive online course mean for the future of education? Can it reduce the cost of legal education? Or will it further diminish the demand for our product? Does the Coursera pedagogy deploy techniques that we could borrow for smaller distance-learning initiatives? What happens when you mix high-quality educators with a type of education that some of us still associate with black-and-white televisions in the corner of a third grade classroom?
Now is your chance to explore some of those questions, while pursuing two other intellectual inquiries at the same time. I just signed up for Scott E. Page’s Coursera offering on Model Thinking. The course began on Monday, so you have time to join us.
Page is the Leonid Hurwicz Collegiate Professor of Complex Systems, Political Science, and Economics at the University of Michigan. He’s a distinguished scholar in the fields of game theory, organizational behavior, and institutional design. Some of you may know his book, The Difference, which offers an intriguing account of when diversity improves decision making. Page is also, as I’ve discovered from the first few online classes, an engaging lecturer.
Participating in Page’s course is giving me some ideas about online education. I’m tracking how he integrates lectures with readings. So far, so good: each stands alone but adds to the other. (The readings, by the way, are both free and easily downloadable from the course site). I’m experiencing the impact of hypotheticals that I answer during each lecture; I think they do engage me more in the material and add to my understanding. I’m also noting, of course, that every student is answering these questions; we’re not just listening to another student respond as we might in a lower-tech law classroom. I look forward to checking out the discussion forum and taking the quizzes.
Page’s offering is designed for tens of thousands of students; it’s a massive open online course (MOOC). The techniques used in that type of course won’t translate wholesale to every type of online offering. But I’m getting a sense of the possibilities–and some ideas for any online courses I design. That’s the first benefit of taking this course, learning something about online education.
The second benefit lies in learning about a series of social science models that touch upon legal issues. If you’ve wanted to know about Schelling’s segregation model, Granovetter’s collective behavior model, and others of their ilk, this course offers an excellent overview. So far, the lectures and readings are both comprehensible and focused; you’ll learn a lot with little wasted time. Page is especially skilled at illustrating the models in commonsense ways.
That brings me to my third, overriding reason for taking Page’s course. Legal education rests on the premise that we teach students how to think like lawyers, and that this analytic frame adds value to many professional paths. Contemporary challenges to legal education question even that premise: Do we succeed in teaching students to think? I personally have little doubt that law school teaches students to think more critically. But do we offer special value compared to other graduate (or even undergraduate) programs? What analytic models do students learn in those other fields? Are those models as valuable as the ones we teach in law? Are they more useful in a wider array of applications? Should we be teaching a wider variety of ways to think in law school? Or acknowledging that we offer just one of many valuable paths to success as a critical thinker?
I plan to use Page’s course as a way to think about thinking–how successful thinkers approach problems, how educators teach those approaches, and how law schools stack up compared to other disciplines. I’ll post from time to time about my reflections. Meanwhile, I hope to see you in class.
Have something you think our audience would like to hear about? Interested in writing one or more guest posts? Send an email to the cafe manager at email@example.com. We are interested in publishing posts from practitioners, students, faculty, and industry professionals.