More on the Bar Exam: Correlation and Competence

May 31st, 2017

Derek Muller has identified an intriguing study of alternative ways to assess bar applicants. In 1980, the California bar examiners worked with a research team to explore the desirability of testing a wider range of lawyering skills on the bar exam. The researchers designed a two-day supplement to the bar exam and invited all July test-takers to participate in the supplemental exercise. More than 4,000 test-takers volunteered and, using appropriate sampling methods, the researchers chose 500 to participate. A few volunteers were unable to complete the exercise due to illness, so the final sample included 485 bar examinees.

These examinees completed the supplemental exercises in August 1980, shortly after taking the regular July exam. For two days, the examinees interviewed clients, drafted discovery plans, prepared letters, wrote trial briefs, cross-examined witnesses, and made arguments to mock juries. Each day’s work involved 5-6 tasks focused on a single client matter. Professional actors played the role of clients, and the researchers developed elaborate protocols for scoring the exercises.

How did results on the supplemental exam compare to those on the conventional test?


Both Professor Muller and the original research report stress the “correlation” between conventional exam scores and those awarded during the client-focused exercises. The correlation coefficient for the two sets of scores, the researchers found, was .72. That correlation was “about as strong” as the one between essay and MBE scores on the traditional bar exam (which was .78 for the subjects in their study). P. 21.

This correlation, the researchers acknowledged, wasn’t strong enough to suggest that the two exams tested the same abilities. Instead, they concluded that the two assessments “appear to be measuring similar but not identical abilities.” Id. They also noted that adding a “clinical skills” component to the bar exam would “provide some unique information about an applicant’s legal skills.” P. 37.

Despite these caveats, the words “strong” and “correlation” dominate the report. They suggest to lay readers that there wasn’t much difference between scores on the traditional bar exam and performance on the more client-focused assessment. On average, examinees who did well on multiple-choice and essay questions also performed well on client interviewing, writing briefs, and other practice tasks. If that’s true, the report implies, why make the bar more complicated by incorporating practice-oriented tasks?

Correlation, however, is not the same as competence. The language in this research report glossed over some key findings.


Of the 485 examinees who took part in the study, just 172 (35.5%) passed the conventional bar exam. California demanded a high passing score even in 1980, and the research study included all types of bar examinees (repeaters, graduates of California-accredited schools, graduates of unaccredited schools, etc.). It was not limited to first-time takers from ABA-accredited law schools.

Of those 172 subjects who passed the conventional bar exam, 69 failed the client-focused exercises. They weren’t just mediocre; they failed. A whopping 40.1% of conventional bar passers, in other words, were incompetent at performing basic practice tasks.

The high failure rate didn’t stem from idiosyncratic, overly harsh examiners. At least two (and more often three) examiners scored each portion of the practice assessment. They participated in training sessions before beginning their work and followed detailed guidelines while grading candidates. An expert panel, finally, provided an independent check on their work.

Nor is it likely that the failures stemmed from lack of effort by the candidates. In return for participating in the alternative assessments, bar examiners promised the subjects that they could use their alternative score to compensate for either a low MBE score or a deficient essay one. This incentive, which gave participants an opportunity to “substantially increase but . . . not decrease the likelihood” of passing the bar, drew the large number of applicants to the study and assured their enthusiastic participation.

Instead, it seems that 40% of the examinees who passed the conventional bar–and were eligible to represent clients immediately–lacked minimal competence in tasks like interviewing clients, drafting briefs, arguing to a jury, or writing a client letter. That is a scary finding.

It is also disconcerting that 20.7% of the study subjects who passed the client-focused test failed the conventional bar exam (at least before substitution of their alternative scores). Consumers lose when the bar admits incompetent attorneys, but they also lose when the bar fails to admit competent ones. As a consumer of legal services, I would much rather hire a lawyer who knows how to gather facts, research the law, prepare a brief, and argue to a jury–all skills tested on the alternative exam–than hire one who is able to pass a multiple-choice and essay exam.


As Professor Muller notes, this study is a bit old; today’s examinees might fare quite differently than those from 1980. The alternative assessment also featured more trial-related tasks than I would prefer. Pass/fail rates might have been more congruent if the experimenters had focused on negotiations, contract drafting, or deal making.

Perhaps most important, the researchers’ sampling frame produced a set of subjects with lower-than-average scores on the conventional bar exam. That might have exaggerated the differences between the two assessments; pass/fail rates swing more dramatically when subjects are near the passing line. The researchers also scaled the alternative scores to MBE scores; that limited the percentage of participants who could “pass” the alternative assessment.

Still, the study suggests the disturbing conclusion that a significant percentage of conventional bar passers (about two of every five) lack basic practice skills that are essential in representing clients. Clients don’t want a lawyer whose score on a multiple-choice/essay test “correlates” with the ability to write a trial brief or draft a letter; they want a lawyer who has shown minimal competence on those lawyering essentials.

The Multistate Performance Test (MPT), notably, incorporated some of the skills tested in this 1980 experiment. Examinees who take the MPT write briefs, memos, contracts, and other practice documents based on a library of materials provided to them. These tasks are quite similar to the written assignments in the 1980 alternative assessments. With the MPT as part of the bar, additional lawyering tasks might change outcomes less than the alternative assessments did in this case.

We need more contemporary research on the bar exam and its alternatives. If this 1980 study is any guide, however, we will be shocked by the disconnect between the current exam and lawyering competence.

A Note on the Numbers

Many of the numbers I cite above do not appear explicitly in the researchers’ report of the 1980 study; I had to calculate them from the numbers that were provided. Here is what I did:

  • Table 2.2 on page 6 reports the total number of subjects within each of four racial groups.
  • Table 5.5 on page 23 reports the percentage of subjects within each racial group who obtained specific outcomes on the two assessments. E.g., 39% of “Anglo” subjects failed both the conventional exam and the alternative assessment; 9% of “Hispanic” subjects passed both versions.
  • By combining the numbers in Table 2.2 with the percentages reported in the “Fail AC/Pass GBX” column of Table 5.5, I determined how many members of each racial group passed the conventional bar exam but failed the alternative one. E.g., Table 5.5 shows that 16% of Anglo subjects fell in that category, and Table 2.2 reports that there were 268 Anglo subjects in all. Forty-three Anglo subjects, therefore, passed the conventional bar exam but failed the alternative assessment.
  • I then summed the numbers from the four racial groups to find that a total of 69 subjects failed the alternative assessment after passing the conventional exam.
  • Similarly, I combined the far right-hand column of Table 5.5 with figures reported in Table 2.2 to calculate the total number of subjects who passed the conventional bar exam. That number was 172.
  • I then divided 69 by 172 to determine that 40.1% of those who passed the conventional exam failed the client-centered exercises.
  • I followed an analogous approach to determine the percentage of conventional bar failers who passed the alternative assessment.
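For readers who want to check the arithmetic, the steps above can be sketched in a few lines of Python. Only the Anglo figures (268 subjects; 16% in the “Fail AC/Pass GBX” cell) and the summed totals (69 and 172) appear in this post; the remaining groups’ counts and percentages live in Tables 2.2 and 5.5 of the original report and are not reproduced here.

```python
# One worked group: Anglo subjects who passed the conventional bar exam (GBX)
# but failed the alternative, client-focused assessment (AC).
anglo_subjects = 268            # Table 2.2: total Anglo subjects
anglo_fail_ac_pass_gbx = 0.16   # Table 5.5: "Fail AC/Pass GBX" column

anglo_count = round(anglo_subjects * anglo_fail_ac_pass_gbx)
print(anglo_count)  # 43 Anglo subjects passed the bar but failed the alternative

# Summing the analogous counts across all four racial groups yields the totals
# reported in the post (the other three groups' figures are omitted here).
total_fail_ac_pass_gbx = 69     # passed conventional exam, failed alternative
total_pass_gbx = 172            # passed the conventional exam

rate = total_fail_ac_pass_gbx / total_pass_gbx
print(f"{rate:.1%}")  # 40.1% of conventional bar passers failed the exercises
```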

Why didn’t the researchers report these figures? That’s a good question. They did allude to the fact that “[a]bout 76% of the [subjects] had their pass/fail status classified the same way” by the two tests–which means that 24% had results that differed. But they did not feature the latter finding; nor did they report the more detailed numbers I give above. As statisticians know, the same numbers can tell multiple stories.



About Law School Cafe

Cafe Manager & Co-Moderator
Deborah J. Merritt

Cafe Designer & Co-Moderator
Kyle McEntee

ABA Journal Blawg 100 Honoree

Law School Cafe is a resource for anyone interested in changes in legal education and the legal profession.
