Improving Bar Passage

March 13th, 2016

Scott Johns, Professor of Practice and Director of the Bar Success Program at the University of Denver Sturm College of Law, has posted a thoughtful empirical analysis of the college’s bar preparation program. Johns analyzed 642 students who graduated from the college in 2008–2010 and then immediately sat for the Colorado bar exam. He knew the exam score for each graduate, rather than simply pass-fail status, which allowed for a particularly nuanced analysis. Using multiple linear regression, Johns found the following associations with bar exam score:

  • Law school GPA showed the strongest association. An increase of one point in GPA was associated, on average, with an increase of 46.5 points in bar exam score.
  • LSAT score was the next strongest predictor. A one-point increase on the LSAT correlated with a 1.1-point increase in bar exam score.
  • Participation in two of the college’s bar success programs each correlated with higher bar exam scores. A third program did not show a significant correlation.
  • Neither sex nor minority status correlated significantly with bar exam scores.
  • Age correlated negatively with bar exam scores; on average, older students achieved lower scores.
  • Participation in the college’s part-time program likewise correlated significantly with lower bar exam scores.

All of these associations held while controlling for the other variables listed above. Participation in one of the successful bar preparation programs, for example, remained significantly correlated with a higher bar exam score after controlling for LSAT score, law school grades, sex, minority status, age, and the other factors in the model.
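For readers curious about what this kind of analysis looks like in practice, here is a minimal sketch of a multiple linear regression on graduate records. The data file and column names (gpa, lsat, program_a, and so on) are assumptions made purely for illustration; they are not Johns' actual variables, coding scheme, or dataset.

```python
# Minimal sketch of a multiple regression like the one described above.
# The file name and column names below are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical dataset: one row per graduate, with bar exam score,
# law school GPA, LSAT score, program participation flags (0/1),
# and demographic controls.
df = pd.read_csv("graduates.csv")

# Regress bar score on GPA, LSAT, program participation, and controls.
model = smf.ols(
    "bar_score ~ gpa + lsat + program_a + program_b + program_c"
    " + C(sex) + C(minority) + age + C(part_time)",
    data=df,
).fit()

# Each coefficient estimates the average change in bar score for a
# one-unit change in that predictor, holding the others constant.
# For example, a coefficient of 46.5 on gpa would correspond to the
# GPA association Johns reports.
print(model.summary())
```

The formula interface mirrors how the study's findings are stated: each coefficient is interpreted as the association with bar score after holding the other predictors constant.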

Successful Programs

Johns notes that “a significant number of graduates . . . either barely pass or fail Colorado’s Bar Exam” (p. 16). Bar scores, in other words, are not bimodal. They follow a normal distribution, with the passing score falling on the left side of the bell curve. For students lying along that part of the curve, a single point spells the difference between passage and failure. Given that reality, it seems worth investing in programs that significantly affect bar scores, even if the effects are relatively modest.

Johns’ analysis suggests that two of Denver’s programs have that desirable effect. The first is Legal Analysis Strategies, a course offered during the final semester of law school. According to the online catalog, the course focuses substantively on core bar exam subjects. Students review these subjects by completing essays, multiple-choice questions, and performance tests. In short, Johns concludes, the course “provides a jump-start to bar review.”

The second program, a post-graduate course that complements commercial bar review courses, is the “heart” of Denver’s bar success program. This course gives graduates multiple opportunities to write answers to essay questions or performance test hypotheticals. Contract faculty and local attorneys offer individualized feedback on each of these exercises. Graduates, notably, are allowed to refer to notes and other materials while writing their answers. Although the bar exam itself is closed-book, the course organizers believe that this approach better prepares graduates for both the bar exam and law practice.

Johns acknowledges that his empirical study of Denver graduates cannot “prove” that the two bar preparation courses help students succeed. There was no control group for the study, and other factors might account for the associations detected through multiple regression. The results, however, are suggestive enough that other schools might want to take a serious look at Denver’s approach.

Other Recommendations

In a final section of the article, Johns turns to the high correlation of bar exam scores with both LSAT scores and law school grades. He wonders whether, given these relationships, law schools can identify “the specific skills” that produce those achievements. If so, perhaps schools could help more students achieve greater success throughout law school and on the bar exam. As one approach, Johns suggests that schools might incorporate more writing exercises throughout the first year. If first-year students repeatedly wrote essay answers, and received immediate feedback on those answers, perhaps they would develop more of the analytic skills that law school and the bar exam reward.

That type of education, based on intensive writing and feedback, is one that some legal educators have advocated for years. Faculty pushback has focused on the time required for this type of education. But perhaps, with both class sizes and entering credentials falling at many schools, it’s time to reopen this question. It’s hard to believe that more writing and feedback would harm first-year students; perhaps that educational investment would pay off throughout law school, on the bar exam, and in practice.
