
The Bot Takes a Bow

November 16th, 2023

Late last month I wrote about a sample NextGen question that GPT-4 discovered was based on an outdated, minority rule of law. NCBE has now removed the question from its website, although it is still accessible (for those who are curious) through the Wayback Machine. While the Bot takes a small bow for assisting NCBE on this question, I’ll offer some reflections.

We hear a lot about mistakes that GPT-4 makes, but this is an example of GPT-4 correcting a human mistake. Law is a vast, complex field, especially considering state-to-state variations in the United States. Both humans and AI will make mistakes when identifying and interpreting legal rules within this large universe. This story shows that AI can help humans correct their mistakes: We can partner with AI to increase our knowledge and better serve clients.

At the same time, the partnership requires us to acknowledge that AI is also fallible. That’s easier said than done because we rely every day on technologies that are much more accurate than humans. If I want to know the time, my phone will give a much more accurate answer than my internal clock. The speedometer in my car offers a more accurate measure of the car’s speed than my subjective sense. We regularly outsource many types of questions to highly reliable technologies.

AI is not the same as the clocks on our phones. It knows much more than any individual human, but it still makes mistakes. In that sense, AI is more “human” than digital clocks, odometers, or other technologies. Partnering with AI is a bit like working with another human: we have to learn this partner’s strengths and weaknesses, then structure our working relationship around those characteristics. We may also have to think about our own strengths and weaknesses to get the most out of the working relationship.

GPT-4’s review of the NextGen question suggests that it may be a useful partner in pretesting questions for exams. Professors read over their exam questions before administering them, looking for ambiguities and errors. But we rarely have the opportunity to pretest questions on other humans–apart from the occasional colleague or family member. Feeding questions to GPT-4 could allow us to double-check our work. For open-ended questions that require a constructed response, GPT-4 could help us identify issues raised by the question that we might not have intended to include. Wouldn’t it be nice to know about those before we started grading student answers?

I hope that NCBE and other test-makers will also use AI as an additional check on their questions. NCBE subjects questions to several rounds of scrutiny–and it pretests multiple-choice questions as unscored questions on the MBE–but AI can offer an additional check. Security concerns might be addressed by using proprietary AI.

Moving beyond the testing world, GPT-4 can offer a double-check for lawyers advising clients. In some earlier posts, I suggested that new lawyers could ask GPT-4 for pointers as they begin working on a client problem. But GPT-4 can assist later in the process as well. Once a lawyer has formulated a plan for addressing a problem, why not ask GPT-4 if it sees any issues with the plan or additional angles to consider? (Be sure, of course, to redact client identifying information when using a publicly accessible tool like GPT-4.)

Our partnership with GPT-4 and other types of AI is just beginning. We have much to learn–and many potential benefits to reap.


GPT-4 Beats the Bar Exam

October 26th, 2023

In the first three posts in this series, I used a bar exam question as an example of the type of problem a new lawyer might confront in practice. I then explored how GPT-4 might help a new lawyer address that practice problem. In this post, I’ll work with another sample question that NCBE has released for the NextGen bar exam. On this question, GPT-4 beats the bar exam. In other words, a new lawyer using GPT-4 would obtain better answers than one who remembered material studied for the bar exam.

Take a look at the seventh multiple-choice question on the NextGen list of sample questions. In this question, the client is a state college that wants to fire a tennis coach. The question format parallels the one we discussed in my previous posts: It asks which “legal topics are the most important for you to research before advising the client” and instructs the test-taker to choose two topics from a list of six.

When prompted with the multiple-choice question, GPT-4 quickly identifies the two correct answers: employment at will and procedural due process. The Bot also offers succinct explanations about why those topics are the most important of the six to research. But that’s not the interesting part–we already know that GPT-4 can easily pass the bar exam.

Instead, I prompted GPT-4 (on a different day and with a new conversation) with the fact pattern in this MCQ and asked: “What legal topics are most important for you to research before advising the client?” Here’s how GPT-4 responded:

When representing a state college considering terminating an employee, several legal considerations come into play. Given the provided facts, here are the most important legal topics to research before advising the client:

1. At-Will Employment vs. Contractual Employment:

In many U.S. jurisdictions, unless there is an employment contract stating otherwise, employment is typically considered “at-will.” This means that either the employer or the employee can terminate the employment relationship at any time for any reason (that isn’t illegal). However, if the coach and college had an oral contract, its terms will need to be considered.

2. Oral Contracts:

While written contracts are easier to enforce, oral contracts can still be binding. The challenge is proving their terms and existence. Research would be needed to determine the enforceability of oral contracts in your jurisdiction and what evidence would be required to prove the terms of such an agreement.

3. Implied Contract:

Even if there was no formal oral agreement about the contract’s duration, the actions and communications between the college and the coach might have implied a certain term of employment. Research would be needed to determine if an implied contract was potentially formed and its implications.

4. Due Process Rights for Public Employees:

Since the college is a state institution, the coach may have constitutional due process rights. If the coach has a property interest in their job (i.e., a reasonable expectation of continued employment), the state cannot deprive them of this without due process. Depending on local laws and court rulings, this might entail the need for a hearing before termination.

5. Public Policy and Whistleblower Protections:

The coach cannot be terminated for reasons that violate public policy (e.g., reporting illegal activities). Ensure that the budget cuts are the true and only reason for the contemplated termination.

6. Discrimination Laws:

Federal and state laws prohibit employment discrimination based on race, color, religion, sex, national origin, age, disability, and sometimes sexual orientation and gender identity. Ensure the coach’s termination isn’t grounded in any discriminatory reasons.

7. Breach of Good Faith and Fair Dealing:

In some jurisdictions, there’s an implied covenant of good faith and fair dealing in employment relationships. This means that even in at-will employment, employers cannot terminate employees to avoid paying earned benefits or for other bad faith reasons.

8. Potential Damages:

If the coach were to bring a wrongful termination suit and win, what would the potential damages be? This might include lost wages, emotional distress, punitive damages, and attorney’s fees.

9. Alternative Dispute Resolution:

Does the state college have any policies or practices regarding mediation or arbitration before proceeding with termination?

10. Release and Settlement Options:

If the college wants to avoid potential litigation, they might consider offering the coach a severance package in exchange for a release of claims.

After researching these topics, it would be crucial to provide a well-informed opinion to the client about the potential risks and benefits of terminating the coach, as well as any alternative strategies that might minimize legal exposure.

This answer includes the two points tested by the MCQ, but it is much more comprehensive. A new lawyer would do well following the leads offered by GPT-4. In contrast, a lawyer who simply remembered material tested on the bar exam would fall woefully short of serving this client competently. Comparing these two responses points out three problems with the bar exam–and with legal education more generally.

First, the subjects tested by the bar exam–and that we teach as “foundational” in law school–do not track contemporary law practice. The discrimination laws cited by GPT-4 are at least as critical to an employment law practice as the notion of employment at will. One of the first questions I would ask my client in this situation is whether the coach is a member of any protected class. But discrimination laws are not part of the subject matter tested on the bar exam, so the question doesn’t embrace them.

Second, both law school and the bar exam teach aspiring lawyers to think narrowly. Courses, classroom hours, and exam questions (especially multiple-choice ones) march students through hundreds of isolated doctrinal rules. Except in clinics, we rarely ask students to solve whole legal problems. GPT-4 is absolutely right that the terms of the oral contract, the possibility of an implied contract, and the extent of damages are as important for the lawyer to consider as the concept of employment at will. Yet we rarely encourage students to consider multiple doctrines within a single subject area, much less across subjects.

Students may need to learn some legal principles in isolation, but I think we, as legal educators, need to recognize how atomized our teaching and testing have become. Over the last few decades, we have squeezed ever more doctrine into each credit hour, losing opportunities to help students integrate those pieces. Our shift towards multiple-choice testing, meanwhile, reinforces the notion that law consists of discrete legal principles operating in isolation.

Third, our teaching and licensing focus too much on reciting legal rights–and too little on helping clients solve problems more holistically. Even when a client asks only about the “legal repercussions” of an action, as the client did in this bar exam question, an effective lawyer would explore the problem more broadly with the client. Is abrupt termination the only solution to the client’s problem? Could the client offer the coach a severance package that would avoid litigation? Even if the client has no legal obligation to offer the coach a hearing, would a hearing help ease the blow to the coach? Could a hearing even lead to a new solution to the client’s underlying problem (a budget shortfall)? Maybe the coach would be willing to cut back their hours and continue serving the team.

Even without special prompting, GPT-4 recognized one of these more holistic solutions (a severance package) as an issue for the new attorney to consider. The bar exam question, in contrast, forces the aspiring lawyer to focus single-mindedly on the “legal topics.” Who is the machine in this comparison? Could working with GPT-4, ironically, lead us back to more holistic problem-solving with our clients?


Caste Revisited

August 13th, 2021

I’ve written several times about the caste system in legal education: a hierarchy that favors professors who teach torts, contracts, and other legal “doctrine” over those who teach legal writing, clinics, and other legal “skills.” This favoritism includes higher pay, more job security, and greater respect. Many schools maintain third and fourth classes that rank even lower than the second-class citizens of clinics and legal writing. Academic support professors, teaching fellows, contract faculty, adjuncts, librarians, and other staff members often occupy those lowest rungs of the academic hierarchy.

California Western Steps Up

I’m returning to this topic because several related items recently hit my inbox. First, I received a press release from the California Western School of Law announcing that it had adopted a unitary tenure track that “creates opportunities for its clinical, Legal Skills, and other skills professors who were hired as full-time faculty to achieve tenure, with the same faculty governance and voting rights that come with an existing tenure-stream faculty position.” Kudos!

The press release, however, leaves several open questions. Will pay be equalized for professors on this unitary tenure track? Or will some professors still be more equal than others? How much research will be required for professors to join this unitary tenure track? Will the currently tenured professors turn their noses up at the scholarly focus of their new colleagues? And what about professors who choose not to join the unitary tenure track? Will the school recognize their ongoing contributions through higher pay and respect?

I’m not trying to rain on California Western’s parade: they have taken a hard step that many other schools are still resisting. I hope they will also find answers to these remaining questions, which schools face whether or not they embrace a unitary tenure track. What type of distinctions are appropriate among employees in a single organization? How do we value different types of contributions to the overall enterprise? Are the answers different for an academic institution and a manufacturing plant?


What’s Your Story?

October 6th, 2018

I’ve been attending the SALT Teaching Conference, hosted by Penn State Law in the aptly named Happy Valley, Pennsylvania. It was a great conference, with many thought-provoking ideas: I hope to share several of them over the coming days.

Here’s my first pass-along idea: Mariela Olivares from Howard University School of Law told us that she asks students in her Immigration Law course to write their personal immigration stories. When did members of their families arrive in the United States? Was the immigration voluntary or forced? What challenges did they face? What opportunities? Mariela allows students to choose whether these stories are confidential (for her eyes only) or can be shared with the rest of the class.

What a great way to engage students in the course content! As the course proceeds, students can reflect on how the laws affected their own family’s experience–and how that experience might differ under contemporary regulations. Even a class of 20 students will generate a rich set of stories that, if students are willing to share, could illuminate many corners of the course content.

This is also a wonderful way to build empathy in a doctrinal classroom. Empathy begins with self-knowledge, and Mariela’s exercise requires students to confront their own history and feelings about the immigration system. Then, as students share their stories with others, they can begin to experience the system from a variety of perspectives.

I think it would be easy to expand this technique to almost every course in the curriculum. A Torts professor could ask students to write about an incident in which they or a family member suffered a physical or emotional injury. As with immigration stories, the exercise probably would generate stories relevant to every legal principle covered in the course. Which injuries could have been addressed by the tort system? Which ones were left out? Why? If the student/family member did not seek redress, why not?

Even courses about procedural rules could incorporate stories. Next time I teach Evidence, I may begin the semester by asking students to write about an incident in which a piece of evidence contributed to a decision they made–and they later discovered that the evidence was false or misleading. I won’t be looking for stories about the courtroom, but about everyday life. I suspect that the everyday stories may help me illuminate the problems with character evidence, hearsay, eyewitness identifications, and other evidence challenges.

So what’s your story? And how could personal stories kick off a course that you teach?


You Too Can Check Out Quimbee

July 30th, 2018

Since I posted about Quimbee, several colleagues have asked if there are ways to check out this new study aid. That’s easy: just sign up for a free 7-day trial. The trial really is free. You don’t need to enter credit card information; nor will Quimbee hound you to purchase the service when the week ends. I signed up for a free trial before writing my blog post and, other than a polite email noting that my trial was about to expire, Quimbee did nothing to pressure me for money. Nor have I gotten spam since my trial ended. So, if you wonder what your students are reading on Quimbee, go ahead and check the site out.


Can We Teach Students to Read Cases?

July 24th, 2018

I suggested in my last three posts that law students don’t learn how to read judicial opinions as carefully and thoughtfully as they should. Can we fix this? Can we modify legal education so that JDs develop stronger case-reading skills? Solving this problem is important in itself: most lawyers interpret appellate opinions at least some of the time. Considering how to fix this problem, furthermore, can shed light on other pedagogical challenges.


But Can They Read Cases?

July 8th, 2018

I recently suggested that the case method fails to achieve one of its central goals: teaching students how to read and synthesize judicial opinions effectively. I identified three reasons for this shortfall: the format of law school exams, a growing emphasis on teaching doctrine, and the impact of contemporary study aids. But is it true? Are law students failing at case analysis?

An empirical study led by education scholar Dorothy Evensen suggests that they are. Evensen collaborated with Laurel Oates, an internationally recognized expert on legal analysis, and two other empiricists (James Stratman and Sarah Zappe) to examine the case reading skills of more than 300 students at five different law schools. The four published their study ten years ago, but it is just as relevant today. Let’s take a look at the study’s method, findings, and import.



July 2nd, 2018

What is Quimbee? It’s a database of more than 13,800 case briefs summarizing the cases presented in almost 200 casebooks. The covered casebooks include all of the bar subjects plus many more: antitrust, copyright, counterterrorism, cyberspace, education law, health, insurance, oil and gas, sports law, and others. The briefs seem thoughtful, complete, and well organized. If users discover an error, Quimbee invites them to submit a correction.

About two dozen law schools (including heavy hitters like Yale, Berkeley, and Northwestern) have signed up for Quimbee; students at those schools use Quimbee for free. Students at other schools can access Quimbee for a modest fee: just $15 per month for all of the site’s case briefs. For $24 per month, students can use the briefs, flashcards, video lessons, multiple choice questions, essays with model answers, and professional development courses.

Even the ABA has lent its endorsement; students who elect Quimbee’s platinum subscription receive an ABA premium membership as part of the deal. For a one-time payment of $499, these students get three full years of access to Quimbee’s case briefs and other services, downloads of course outlines, and the ABA membership. That’s a law school career of case briefs and other study aids (plus a bit of networking) for the price of two casebooks.

Plenty of other publishers have developed attractive study-aid subscriptions. There are also websites that offer case briefs and law school outlines for little or no charge. I focus here on Quimbee because it’s a good example of the resources that today’s law students use. These easily tapped sources raise the question: How much pedagogic value does the case method deliver when most students are using canned case briefs and course outlines?


The Strange Case of the Case Method

June 29th, 2018

The case method is legal education’s signature pedagogy. Law professors point to the method with pride, and that pride has considerable foundation. In theory, the case method accomplishes at least five pedagogic goals:

  1. It demonstrates that law is not static; law evolves through judicial interpretation. On some topics, students also see how the law evolves through legislation and administrative regulations.
  2. It teaches students how to read and synthesize judicial interpretations. Depending on the subject, students also learn how to read statutes and harmonize them with judicial opinions.
  3. It prepares students to advocate for changes in the law–primarily in the courts, but with some approaches that can be used with legislators and other decision-makers.
  4. It develops critical thinking skills (careful reading, analogical reasoning, identification of patterns and distinctions) that are transferable to many other contexts.
  5. It instructs students on the doctrinal principles discussed in the cases and accompanying statutes.

Can the case method accomplish all of these goals–especially when it is used in a large classroom with a single end-of-semester exam? I doubt that the method ever achieved as much as it claims, except perhaps for the highest achieving students in a classroom. Today, the method has been quietly subverted to accomplish primarily the fifth goal: instructing students on doctrinal principles. Law schools stake their value on teaching the other four cognitive skills listed above, but we deliver less of that learning than we believe.


Professional Skills

May 2nd, 2018

Robert Kuehn has written a thoughtful review of the history of professional skills education in legal education. As Bob notes, the ABA has been notably reluctant to require law schools to educate students on the skills they will use in law practice. Our accrediting body did not require any instruction in professional skills until 2005 and, even then, the accreditors required only “one solid credit” of that training. More recently, the ABA mandated six credits of experiential work for every law student–a total that still seems grudging for skills that lawyers use heavily in practice.

Students and some law schools have been more foresighted. As Bob documents, one-fifth of law schools now require all students to complete a clinic or externship; ninety percent have enough clinic or externship spots to accommodate all of their students. Students, meanwhile, show increasing interest in learning professional skills: enrollments in clinics, externships, and simulation courses have all climbed during the last decade.

This is a good news/bad news report. Student demand for professional training has increased, schools have shown an ability to meet that demand, and the ABA has finally imposed a meaningful requirement for experiential education. At the same time, tenure-track faculty continue to distance themselves from these educational experiences, and the six-credit requirement is unrealistically light for students who will build their professional success on their skills.


About Law School Cafe

Cafe Manager & Co-Moderator
Deborah J. Merritt

Cafe Designer & Co-Moderator
Kyle McEntee

ABA Journal Blawg 100 Honoree

Law School Cafe is a resource for anyone interested in changes in legal education and the legal profession.
