You are currently browsing archives for the Technology category.

You Too Can Check Out Quimbee

July 30th, 2018

Since I posted about Quimbee, several colleagues have asked if there are ways to check out this new study aid. That’s easy: just sign up for a free 7-day trial. The trial really is free. You don’t need to enter credit card information; nor will Quimbee hound you to purchase the service when the week ends. I signed up for a free trial before writing my blog post and, other than a polite email noting that my trial was about to expire, Quimbee did nothing to pressure me for money. Nor have I gotten spam since my trial ended. So, if you wonder what your students are reading on Quimbee, go ahead and check the site out.


The View from Minnesota: A Profession on Edge

July 25th, 2016

Wood R. Foster, Jr., a Minneapolis lawyer and former president of the Minnesota State Bar Association, has written a striking review of recent changes in the legal profession. Foster spent his career as a commercial litigator with Siegel Brill, a small Minneapolis firm. Relatively few lawyers from that background have written about changes in the legal profession, and Foster does so eloquently.

Foster covers the growing surplus of lawyers, which he dates to 2000; fracturing of the profession; stalled diversity efforts; the high cost of legal education; BigLaw and its equally big shadow; and the impact of technology.

With some irony, Foster quotes a column that he wrote in 2000 after holding a series of focus groups with lawyers. “I have found,” he wrote then, “that lawyers are generally reluctant to visualize the profession’s future.” The future, however, arrived anyway. Today, he reflects, “a good argument can be made that the legal profession has changed more in the last 15 years than it did in the 150 years from 1849 to 1999.”

Foster’s views echo those I hear from many practitioners in their 60s and 70s. While academics continue to debate the existence of change, these lawyers have lived it. Their vantage point makes them particularly sympathetic to the newest generation of lawyers. “There really can be no doubt,” Foster concludes, “that it has been a rough ride for lawyers graduating from law school since 2000. . . . [The facts] add up to an unflattering picture of why so many young lawyers are finding it so hard to get the kind of start in their chosen profession that older lawyers like me were able to take for granted during the last half of the twentieth century.”

Give Foster a read. His featured series of articles fills much of this issue of Minnesota's Bench and Bar journal.

 


Lessons for Online Legal Education

June 5th, 2016

An increasing number of law schools are creating online courses, certificate offerings, and degree programs. As newcomers to online education, we should look to existing programs for inspiration. One of those is Harvard Business School’s successful CORe program, an online certificate course in business basics. I wrote about CORe’s suitability for law students several weeks ago. Here, I examine three lessons that the program offers to law schools interested in online education.

(more…)


Artificially Intelligent Legal Research

May 24th, 2016

At least three law firms have now adopted ROSS, an artificial intelligence system for legal research built on IBM's pathbreaking Watson technology. The firms include two legal giants, Latham & Watkins and BakerHostetler, along with the Wisconsin firm vonBriesen. Commitments by these firms seem likely to spur interest among their competitors. Watch for ROSS and other forms of legal AI to spread over the next few years.

What is ROSS, what does it do, and what does it mean for lawyers and legal educators? Here are a few preliminary thoughts.

(more…)


The Latest Issue of the Bar Examiner

March 15th, 2016

The National Conference of Bar Examiners (NCBE) has released the March 2016 issue of its quarterly publication, the Bar Examiner. The issue includes annual statistics about bar passage rates, as well as several other articles. For those who lack time to read the issue, here are a few highlights:

Bar-Academy Relationships

In his Letter from the Chair, Judge Thomas Bice sounds a disappointingly hostile note towards law students. Quoting Justice Edward Chavez of the New Mexico Supreme Court, Bice suggests that “those who attend law school have come to have a sense of entitlement to the practice of law simply as a result of their education.” Against this sentiment, he continues, bar examiners “are truly the gatekeepers of this profession.” (P. 2)

NCBE President Erica Moeser, who has recently tangled with law school deans, offers a more conciliatory tone on her President’s Page. After noting the importance of the legal profession and the challenges facing law schools, she concludes: “In many ways, we are all in this together, and certainly all of us wish for better times.” (P. 5)

(more…)


ExamSoft Settlement

May 20th, 2015

A federal judge has tentatively approved a settlement of the consolidated class action lawsuits brought by July 2014 bar examinees against ExamSoft. The lawsuits arose out of the well-known difficulties that test-takers experienced when they tried to upload their essay answers through ExamSoft's software. I have written about this debacle, and its likely impact on bar scores, several times. For the most recent post in the series, see here.

Looking at this settlement, it's hard to know what the class representatives were thinking. Last summer, examinees paid between $100 and $150 for the privilege of using ExamSoft software. When the uploads failed to work, they were unable to reach ExamSoft's customer service lines. Many endured hours of anxiety as they tried to upload their exams or contact customer service. The snafu distracted them from preparing for the next day's exam and from getting some much-needed sleep.

What are the examinees who suffered through this “barmageddon” getting for their troubles? $90 apiece. That’s right, they’re not even getting a full refund on the fees they paid. The class action lawyers, meanwhile, will walk away with up to $600,000 in attorneys’ fees.

I understand that damages for emotional distress aren’t awarded in contract actions. I get that (and hopefully got that right on the MBE). But agreeing to a settlement that awards less than the amount exam takers paid for this shoddy service? ExamSoft clearly failed to perform its side of the bargain; the complaint stated a very straightforward claim for breach of contract. In addition, the plaintiffs invoked federal and state consumer laws that might have awarded other relief.

What were the class representatives thinking? Perhaps they used the lawsuit as a training ground to enter the apparently lucrative field of representing other plaintiffs in class action suits. Now that they know how to collect handsome fees, they’re not worried about the pocket change they paid to ExamSoft.

I believe in class actions–they're a necessary procedure to enforce some claims, including the ones asserted in this case. But the field now suffers from widespread abuse, with attorneys collecting the lion's share of awards while class members receive relatively little. It's no wonder that the public, and some appellate courts, have become so cynical about class actions.

From that perspective, there’s a great irony in this settlement. People who wanted to be lawyers, and who suffered a compensable breach of contract while engaged in that quest, have now been shortchanged by the very professionals they seek to join.


ExamSoft Update

April 21st, 2015

In a series of posts (here, here, and here) I’ve explained why I believe that ExamSoft’s massive computer glitch lowered performance on the July 2014 Multistate Bar Exam (MBE). I’ve also explained how NCBE’s equating and scaling process amplified the damage to produce a 5-point drop in the national bar passage rate.

We now have a final piece of evidence suggesting that something untoward happened on the July 2014 bar exam: the February 2015 MBE did not produce the same type of score drop. This February's MBE was harder than any version of the test given over the last four decades, because it covered seven subjects instead of six. Confronted with that challenge, February's examinees scored somewhat lower than the previous year's test-takers: the mean scaled score on the February 2015 MBE was 136.2, 1.8 points lower than the February 2014 mean of 138.0.

The contested July 2014 MBE, however, produced a drop of 2.8 points compared to the July 2013 test. That drop was a full point, or roughly 56%, larger than the February drop. The July 2014 shift was also larger than any other year-to-year change (positive or negative) recorded during the last ten years. (I treat the February and July exams as separate categories, as NCBE and others do.)
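For readers who want to check the arithmetic, here is a minimal sketch using only the figures quoted above. (The July figures appear in this post only as a year-to-year drop, so that value is entered directly.)

```python
# Sketch of the score-drop comparison, using only the figures quoted in this post.

feb_2014_mean = 138.0    # mean scaled MBE score, February 2014
feb_2015_mean = 136.2    # mean scaled MBE score, February 2015
feb_drop = feb_2014_mean - feb_2015_mean     # 1.8 points

july_drop = 2.8          # drop in mean scaled score from July 2013 to July 2014

extra_points = july_drop - feb_drop          # 1.0 point
relative_size = extra_points / feb_drop      # about 0.56, i.e. roughly 56% larger

print(f"February drop: {feb_drop:.1f} points")
print(f"July drop:     {july_drop:.1f} points")
print(f"The July drop was {extra_points:.1f} points ({relative_size:.0%}) larger.")
```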

The shift in February 2015 scores, on the other hand, is similar in magnitude to five other changes that occurred during the last decade. Scores dropped, but not nearly as much as in July–and that's despite examinees taking a harder version of the MBE. Why did the July 2014 examinees perform so poorly?

It can’t be a change in the quality of test takers, as NCBE’s president, Erica Moeser, has suggested in a series of communications to law deans and the profession. The February 2015 examinees started law school at about the same time as the July 2014 ones. As others have shown, law student credentials (as measured by LSAT scores) declined only modestly for students who entered law school in 2011.

We’re left with the conclusion that something very unusual happened in July 2014, and it’s not hard to find that unusual event: a software problem that occupied test-takers’ time, aggravated their stress, and interfered with their sleep.

On its own, my comparison of score drops does not show that the ExamSoft crisis caused the fall in July 2014 test performance. The other evidence I have already discussed is more persuasive. I offer this supplemental analysis for two reasons.

First, I want to forestall arguments that February’s performance proves that the July test-takers must have been less qualified than previous examinees. February’s mean scaled score did drop, compared to the previous February, but the drop was considerably less than the sharp July decline. The latter drop remains the largest score change during the last ten years. It clearly is an outlier that requires more explanation. (And this, of course, is without considering the increased difficulty of the February exam.)

Second, when combined with other evidence about the ExamSoft debacle, this comparison adds to the concerns. Why did scores fall so precipitously in July 2014? The answer seems to be ExamSoft, and we owe that answer to test-takers who failed the July 2014 bar exam.

One final note: Although I remain very concerned about both the handling of the ExamSoft problem and the equating of the new MBE to the old one, I am equally concerned about law schools that admit students who will struggle to pass a fairly administered bar exam. NCBE, state bar examiners, and law schools together stand as gatekeepers to the profession, and we all owe a duty of fairness to those who seek to join it. More about that soon.


ExamSoft and NCBE

April 6th, 2015

I recently found a letter that Erica Moeser, President of the National Conference of Bar Examiners (NCBE), wrote to law school deans in mid-December. The letter responds to a formal request, signed by 79 law school deans, that NCBE “facilitate a thorough investigation of the administration and scoring of the July 2014 bar exam.” That exam suffered from the notorious ExamSoft debacle.

Moeser’s letter makes an interesting distinction. She assures the deans that NCBE has “reviewed and re-reviewed” its scoring, equating, and scaling of the July 2014 MBE. Those reviews, Moeser attests, revealed no flaw in NCBE’s process. She then adds that, to the extent the deans are concerned about “administration” of the exam, they should “note that NCBE does not administer the examination; jurisdictions do.”

Moeser doesn’t mention ExamSoft by name, but her message seems clear: If ExamSoft’s massive failure affected examinees’ performance, that’s not our problem. We take the bubble sheets as they come to us, grade them, equate the scores, scale those scores, and return the numbers to the states. It’s all the same to NCBE if examinees miss points because they failed to study, law schools taught them poorly, or they were groggy and stressed from struggling to upload their essay exams. We only score exams, we don’t administer them.

But is the line between administration and scoring so clear?

The Purpose of Equating

In an earlier post, I described the process of equating and scaling that NCBE uses to produce final MBE scores. The elaborate transformation of raw scores has one purpose: “to ensure consistency and fairness across the different MBE forms given on different test dates.”

NCBE thinks of this consistency with respect to its own test questions; it wants to ensure that some test-takers aren’t burdened with an overly difficult set of questions–or conversely, that other examinees don’t benefit from unduly easy questions. But substantial changes in exam conditions, like the ExamSoft crash, can also make an exam more difficult. If they do, NCBE’s equating and scaling process actually amplifies that unfairness.
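To make that mechanism concrete, here is a deliberately oversimplified sketch. It is not NCBE's actual psychometric procedure, and every number in it is hypothetical; it only encodes the logic described above, in which a condition-driven drop on the common "equator" items is read as a drop in examinee ability.

```python
# Toy illustration only; this is NOT NCBE's actual equating or scaling formula.
# It encodes the assumption described above: the equating step treats any change
# in performance on the common "equator" items as a change in examinee ability.

def toy_scaled_score(raw_score, cohort_equator_mean, reference_equator_mean):
    """Simplified linear conversion from a raw score to a 'scaled' score."""
    estimated_ability_shift = cohort_equator_mean - reference_equator_mean
    return 100.0 + raw_score + estimated_ability_shift  # 100 is an arbitrary offset

# Same examinee, same raw score of 130. In the second year, the cohort's mean on
# the equator items fell by 2 points because of an exam-night disruption,
# not because the examinees were less able.
print(toy_scaled_score(130, cohort_equator_mean=120.0, reference_equator_mean=120.0))  # 230.0
print(toy_scaled_score(130, cohort_equator_mean=118.0, reference_equator_mean=120.0))  # 228.0
```

In this toy model the examinee submits identical work in both years, yet the second scaled score is lower because the cohort's depressed equator performance has been folded into the conversion. That is the sense in which a problem of administration can leak into scoring.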

To remain faithful to its mission, it seems that NCBE should at least explore the possible effects of major blunders in exam administration. This is especially true when a problem affects multiple jurisdictions, rather than a single state. If an incident affects a single jurisdiction, the examining authorities in that state can decide whether to adjust scores for that exam. When the problem is more diffuse, as with the ExamSoft failure, individual states may not have the information necessary to assess the extent of the impact. That’s an even greater concern when nationwide equating will spread the problem to states that did not even contract with ExamSoft.

What Should NCBE Have Done?

NCBE did not cause ExamSoft’s upload problems, but it almost certainly knew about them. Experts in exam scoring also understand that defects in exam administration can interfere with performance. With knowledge of the ExamSoft problem, NCBE had the ability to examine raw scores for the extent of the ExamSoft effect. Exploration would have been most effective with cooperation from ExamSoft itself, revealing which states suffered major upload problems and which ones experienced more minor interference. But even without that information, NCBE could have explored the raw scores for indications of whether test takers were “less able” in ExamSoft states.
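The sort of exploration I have in mind could be quite simple. The sketch below is hypothetical in every particular (the groupings and the numbers are invented); it merely shows that comparing raw-score patterns in ExamSoft and non-ExamSoft jurisdictions is a routine calculation, not an exotic one.

```python
# Hypothetical sketch: compare mean raw MBE scores in jurisdictions that used
# ExamSoft against those that did not. Every number below is invented.
from statistics import mean, stdev

examsoft_state_means = [138.4, 139.1, 137.8, 138.9, 137.5]      # hypothetical
non_examsoft_state_means = [140.2, 141.0, 139.8, 140.6, 140.1]  # hypothetical

gap = mean(non_examsoft_state_means) - mean(examsoft_state_means)
print(f"ExamSoft states:     mean {mean(examsoft_state_means):.1f}, "
      f"sd {stdev(examsoft_state_means):.2f}")
print(f"Non-ExamSoft states: mean {mean(non_examsoft_state_means):.1f}, "
      f"sd {stdev(non_examsoft_state_means):.2f}")
print(f"Gap: {gap:.1f} raw points")
```

A real analysis would, of course, also control for differences in examinee credentials across jurisdictions.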

If NCBE had found a problem, there would have been time to consult with bar examiners about possible solutions. At the very least, NCBE probably should have adjusted its scaling to reflect the fact that some of the decrease in raw scores stemmed from the software crash rather than from other changes in test-taker ability. With enough data, NCBE might have been able to quantify those effects fairly precisely.

Maybe NCBE did, in fact, do those things. Its public pronouncements, however, have not suggested any such process. On the contrary, Moeser seems to studiously avoid mentioning ExamSoft. This reveals an even deeper problem: we have a high-stakes exam for which responsibility is badly fragmented.

Who Do You Call?

Imagine yourself as a test-taker on July 29, 2014. You’ve been trying for several hours to upload your essay exam, without success. You’ve tried calling ExamSoft’s customer service line, but can’t get through. You’re worried that you’ll fail the exam if you don’t upload the essays on time, and you’re also worried that you won’t be sufficiently rested for the next day’s MBE. Who do you call?

You can’t call the state bar examiners; they don’t have an after-hours call line. If they did, they probably would reassure you on the first question, telling you that they would extend the deadline for submitting essay answers. (This is, in fact, what many affected states did.) But they wouldn’t have much to offer on the second question, about getting back on track for the next day’s MBE. Some state examiners don’t fully understand NCBE’s equating and scaling process; those examiners might even erroneously tell you “not to worry because everyone is in the same boat.”

NCBE wouldn’t be any more help. They, as Moeser pointed out, don’t actually administer exams; they just create and score them.

Many distressed examinees called law school staff members who had helped them prepare for the bar. Those staff members, in turn, called their deans–who contacted NCBE and state bar examiners. As Moeser’s letters indicate, however, bar examiners view deans with some suspicion. The deans, they believe, are too quick to advocate for their graduates and too worried about their own bar pass rates.

As NCBE and state bar examiners refused to respond, or shifted responsibility to one another, we reached a stand-off: no one was willing to take responsibility for flaws in a very high-stakes test administered to more than 50,000 examinees. That is a failure as great as the ExamSoft crash itself.


Berkman Center Webcast on “Creating a Law School e-Curriculum”

June 27th, 2013

The Berkman Center for Internet and Society at Harvard Law School will host a live webcast on July 9th at 12:30 pm eastern called "Creating a Law School e-Curriculum." The speaker will be Oliver R. Goodenough, a fellow at the Berkman Center and a Professor of Law at Vermont Law School.

Here’s the description:

Legal practice and legal education both face disruptive change. Much of how and what we do as lawyers and how and what we have taught as legal educators is under scrutiny. Legal technology is an important factor in driving these challenges. As law schools reform their curriculums, law and technology is an area that is ripe for expansion in our teaching. It also provides ample room for scholarly examination. Creating opportunities for learning how technology is shaping legal practice should be a priority for any school looking to provide a useful education for the lawyers of the present, let alone the future.

To watch the webcast, simply visit this page at 12:30 pm eastern on July 9th. If you’re in Boston, the same page provides a form for you to RSVP to the luncheon.


Letter to the ABA

March 10th, 2013

A “Coalition of Concerned Colleagues,” which includes me, has submitted a letter to the ABA Task Force on the Future of Legal Education. Although I can claim no credit for drafting the letter, I think it offers a succinct statement of the economic distress faced by law students and recent graduates: tuition has climbed dramatically, scholarships rarely address need, entry-level jobs have contracted, and salaries in those jobs have declined. The combination is oppressive for students and unsustainable for schools.

The brief letter notes a number of changes that might ameliorate this burden. All of those deserve exploration; I have posted on several already and will explore others in upcoming weeks. The letter, however, leaves a key point unstated: tenured professors at most schools will have to change their expectations if we hope to address this crisis. Faculty salaries and other perks account for a substantial share of the budget at most law schools. We can try to cut corners in other ways, by trimming staff and begging central administration to leave us a higher share of each tuition dollar. But in the end, we have to ask ourselves hard questions about the professional lives we’ve designed and the pay we demand.

Law professors earn high salaries, considerably higher than the pay drawn by most of our colleagues across the academy. Much of that money comes from the tuition paid by our students. With job and salary prospects down for lawyers, and with more transparency about those outcomes, fewer students are willing to pay our tuition. Faculty are going to have to adjust their financial expectations–and I think we should. We have enjoyed artificially high tuition and salaries for many years, due largely to our powerful economic status as gatekeepers to the legal profession. States didn’t create those restraints to enrich law schools, and we have served few interests (other than our own) by aggressively raising tuition and salaries over the last three decades.

In addition to lowering our financial expectations, faculty most likely will have to adjust the courses they teach, the ways in which they teach, and other professional activities. Distance education, for example, can help reduce the cost of legal education–but only if faculty are willing to use those techniques and then to consolidate courses across schools. One faculty member can teach Antitrust or Remedies to students at several law schools, but the faculty at those other schools must be willing to shift to other courses.

Adding apprenticeships and externships, similarly, will affect what current faculty do. We can’t expect students to pay for the full range of courses and scholarship our faculties now support plus the cost of apprenticeships or externships. These hands-on experiences will have to replace some of our current offerings, with traditional doctrinal faculty downsizing or taking on new duties.

Changes of this type are implicit in the letter from Concerned Colleagues, although I haven't discussed these specifics with other signatories. Schools may find alternatives to the particular changes I've mentioned here; we need creativity to address the challenges before us. But it's essential to avoid magical thinking when confronting those problems. The key difficulty for our graduates, students, and prospective students is that legal education has become too expensive for the career paths it supports. There is no magic solution to that problem in which we all become richer.


About Law School Cafe

Cafe Manager & Co-Moderator
Deborah J. Merritt

Cafe Designer & Co-Moderator
Kyle McEntee

ABA Journal Blawg 100 Honoree

Law School Cafe is a resource for anyone interested in changes in legal education and the legal profession.

Participate

Have something you think our audience would like to hear about? Interested in writing one or more guest posts? Send an email to the cafe manager at merritt52@gmail.com. We are interested in publishing posts from practitioners, students, faculty, and industry professionals.
