But Can They Read Cases?

July 8th, 2018

I recently suggested that the case method fails to achieve one of its central goals: teaching students how to read and synthesize judicial opinions effectively. I identified three reasons for this shortfall: the format of law school exams, a growing emphasis on teaching doctrine, and the impact of contemporary study aids. But is that true? Are law students actually failing at case analysis?

An empirical study led by education scholar Dorothy Evensen suggests that they are. Evensen collaborated with Laurel Oates, an internationally recognized expert on legal analysis, and two other empiricists (James Stratman and Sarah Zappe) to examine the case reading skills of more than 300 students at five different law schools. The four published their study ten years ago, but it is just as relevant today. Let’s take a look at the study’s method, findings, and import.


Students at five different law schools volunteered to participate in the Evensen study; they represented a mix of first-semester, second-semester, third-semester, and sixth-semester students. Each subject received $50 for 1-2 hours of work. During those hours, the subjects read three court opinions on a topic that was rarely taught in law school. They then responded to fourteen multiple choice questions testing their ability to analyze and synthesize the opinions.

The questions required students to perform these basic tasks of case analysis:

  • Identify the issue and holding articulated by a court
  • Describe the reasoning supporting the court’s decision
  • Recognize the importance of specific facts to the court’s reasoning
  • Evaluate the manner in which the decision drew upon precedent
  • Identify issues common to several cases
  • Recognize similarities/differences in fact patterns of cases addressing the same issue
  • Understand the similarities/differences in reasoning used by two courts analyzing the same issue
  • Recognize ambiguities in a judicial opinion
  • Use those ambiguities to craft arguments for a client
  • Recognize arguments that opposing counsel might raise

Evensen’s team devoted considerable time to constructing and evaluating the questions used in their assessment. They also developed two parallel sets of case opinions and questions, which strengthened the validity of their study.


Student performance in the Evensen study was distinctly mediocre. On average, first-semester law students answered just eight out of fourteen questions (57%) correctly; second-semester students fared no better. Nor did case-reading skills improve after the first year: Students in the third and sixth semesters of law school continued to average less than 60% on this test of case-reading skills. As Evensen and her colleagues concluded, “it looks as though our students’ case-reading abilities need attention.” [p. 39]

What factors contributed to this lackluster performance? In a supplemental study, Evensen’s team used think-aloud sessions to track students’ reasoning as they responded to the test questions. They found that “[t]he most frequent reason that students missed an item was their over-reliance on memory and a failure to return to the available texts to reread.” [p. 36] Students also failed to read carefully, seizing words out of context or ignoring broader implications of a court’s language. [Id.] Some subjects, finally, tended to respond as judges or policymakers, substituting their policy preferences for the rulings in the cases. [Id.]


Can we trust the results of the Evensen study? Evensen and her team voiced several caveats about their work: Although they tested more than 300 students at five schools, they used only two testing instruments. Those instruments were newly developed and might need further refinement. The testing design, finally, was stronger for some aspects of the research than others.

I would add three caveats to the ones cited by the researchers. First, the study relied upon volunteers, which could have skewed their results. Less accomplished students, for example, might have volunteered for the study to improve their performance; that would have depressed the test results. It seems equally likely, however, that more accomplished students volunteered, seizing any opportunity to hone their skills. It’s hard to determine the direction of any skew in the results.

Second, the test did not have any real-world implications for the subjects; they might not have tried their hardest to answer the questions. That’s always an issue with laboratory studies, but the subjects in this study had several strong incentives: They received $50 apiece for just 1-2 hours of work, and the study offered them an opportunity to practice desirable law school skills. Law students are also known for their achievement orientation and competitive nature. Those factors should have produced a serious effort on the Evensen exam, even if not the all-out effort evoked by a real law school exam.

Evensen and her colleagues, finally, conducted their tests between 2003 and 2006. Would they obtain the same results today? Both law schools and their students have changed during the last twelve years, so that’s difficult to predict. Increased attention to legal writing classes might have enhanced case-reading abilities over the last decade. Students, on the other hand, have increased access to the study aids I discussed in my last column. At many schools, moreover, student credentials have fallen.


Despite these caveats, the results of the Evensen study are alarming. The ability to read cases is a basic skill that all law schools purport to teach: We devote a considerable portion of the curriculum to reading and analyzing cases. Yet the average student in the Evensen study reached only a basic level of case-reading proficiency, and failed to improve over three years of law school.

Other studies, unfortunately, suggest that law schools fall short in teaching other types of critical thinking:

  • David Herring and Collin Lynch found that students failed to improve their ability to link legal principles with hypotheticals during the first six weeks of law school.
  • Stefan Krieger reported that second- and third-year students failed to identify key facts in a simulated client problem.
  • In a different study, Krieger found that third-year students proposed more legal rules than first- and second-year students when exploring a client problem, but a majority of those rules were irrelevant. “At the end of their third year,” Krieger concluded, students “seem prone to generate indiscriminately a large number of rules, many of which are irrelevant.” [p. 349]

The subjects in these studies included students from the University of Pittsburgh, Hofstra Law School, and the University of Chicago; problems in our pedagogy do not seem limited to a single law school or group of schools.

We need to update and expand upon these studies. If this type of mediocrity marks contemporary legal education, we need to know that, and we have to figure out ways to improve. Law schools will not serve clients, or continue to attract students, if we don’t live up to our claim of teaching students how to “think like a lawyer.”






About Law School Cafe

Cafe Manager & Co-Moderator
Deborah J. Merritt

Cafe Designer & Co-Moderator
Kyle McEntee

ABA Journal Blawg 100 Honoree

Law School Cafe is a resource for anyone interested in changes in legal education and the legal profession.
