The ABA Section of Legal Education’s Standards Review and Data Policy Committee voted unanimously today to recommend that the Section’s Council approve revisions to Standards 501 and 316.
This recommendation comes on the heels of a multi-month notice-and-comment period, which drew numerous comments on the proposed revisions.
The committee recommended that the revised standards be adopted as proposed.
By taking this action, the committee acknowledges that its primary responsibilities are protecting the public and students, not law schools.
To many, late October signals nothing more than fall in full swing, pumpkins, or costumes. In late May, we look forward to the Memorial Day holiday and long weekends. Yet on the last weekend of every October and May, Georgia bar takers anxiously await exam results. Some stalk the postman. Most spend the day refreshing a webpage, hoping and praying their name appears on the public pass list.
A person who fails a state bar exam experiences the stages of grief: denial, anger, bargaining, depression, and acceptance. Imagine discovering that a family member is alive after grieving their death for ten months. This week, 90 Georgia bar takers—45 from July 2015 and 45 from February 2016—were informed that the thing they grieved was, in fact, alive. Though their names failed to appear on that very public pass list, they indeed passed the Georgia bar exam.
In the memo announcing results from the July 2016 MBE, Erica Moeser also notified law school deans about an upcoming change in the test. For many years the 200-question exam has included 190 scored items and 10 pre-test questions. Starting in February 2017, the numbers will shift to 175 scored items and 25 pre-test ones.
Pre-testing is an important feature of standardized exams. The administrator uses pre-test answers to gauge a question’s clarity, difficulty, and usefulness for future exams. When examinees answer those questions, they improve the design of future tests.
From the test-taker’s perspective, these pre-test questions are indistinguishable from scored ones. Like other test-makers, NCBE scatters its pre-test questions throughout the exam. Examinees answer each question without knowing whether it is a “real” item that will contribute to their score or a pre-test one that will not.
So what are the implications of NCBE’s increase in the number of pre-test items? The shift is relatively large, from 10 questions (5% of the exam) to 25 (12.5% of the exam). I have three concerns about this change: fair treatment of human research subjects, reliability of the exam, and the possible impact on bar passage rates. I’ll explore the first of these concerns here and turn to the others in subsequent posts.
Erica Moeser, President of the National Conference of Bar Examiners, sent a memo to law school deans today. The memo reported the welcome, but surprising, news that the national mean score on the MBE was higher in July 2016 than in July 2015. Last year, the national mean was just 139.9. This year, it’s 140.3.
That’s a small increase, but it’s nonetheless noteworthy. LSAT scores for entering law students have been falling for several years. The drop between fall 2012 and fall 2013 was especially noticeable: seventy percent of ABA-accredited law schools saw a drop in the 25th percentile LSAT score of their entering class. At 19 schools, that score fell 3 points; at another five, it fell 4 points.
LSAT scores correlate with MBE scores, so many observers expected July 2016 MBE scores to be lower than those recorded in 2015. Moeser, for example, has repeatedly stressed the link between LSAT scores and MBE ones. She recently declared: “What would surprise me is if LSAT scores dropped and bar pass rates didn’t go down.”
Moeser just received that surprise: Students who began law school in fall 2013 had lower LSAT scores than those who began a year earlier. The former students, however, beat the latter on the MBE after graduation.
So What Happened?
Unpacking this news will take more time and data. Moeser mentions in her memo that the mean MBE score increased in 22 jurisdictions, fell in 26, and remained stable in two. Teasing apart those jurisdiction-level results will provide insights. School-specific results will be even more informative in exploring why the overall score rose.
For now, I offer four hypotheses in descending order of likelihood (from my perspective):
The Law School Admissions Council has thrown its latest tantrum.
In a letter to admissions professionals around the country, LSAC’s president, Daniel Bernstine, signaled that LSAC would stop certifying the accuracy of each law school’s LSAT and undergraduate GPA statistics. The certification is a joint effort between LSAC and the ABA to prevent law schools from lying about their admissions statistics.
LSAC agreed to certify admissions statistics in 2012 after months of roundly dismissing calls for certification. The group had claimed that certification would be cost-prohibitive, despite earning nearly $60 million in total revenue in 2011 and running a $10.7 million surplus in 2012. The group also claimed that certification was outside the scope of its organizational mission, despite its member law schools saying that LSAC was best positioned to protect the integrity of the admissions process.
Pressure mounted in 2011 and 2012 for LSAC to help the ABA after two law schools intentionally reported fraudulent data to the ABA and elsewhere, including to U.S. News and World Report for its annual law school rankings. In February 2011, Villanova University School of Law disclosed that a school official had knowingly fabricated LSAT and GPA statistics for an unknown number of years prior to 2010. Later that year, the University of Illinois College of Law admitted to intentionally fabricating the same statistics over a seven-year period. The school’s assistant dean for admissions and financial aid, Paul Pless, resigned as a result of the controversy.
This tantrum is LSAC’s second one this year. Both came after the University of Arizona James E. Rogers College of Law announced that the school would allow applicants to submit GRE scores in place of LSAT scores.
At that time, LSAC threatened to strip Arizona of its membership, which would eliminate access to a variety of services. LSAC walked back the threat in May after pressure from its membership and anti-trust concerns.
So why is the ABA now the latest recipient of LSAC’s retribution?
In response to law schools hoping to use the GRE as a non-exclusive alternative to the LSAT, which LSAC designs and administers, the ABA is examining whether the GRE meets Standard 503. That standard provides that schools must use a “valid and reliable admission test to assist the school and the applicant in assessing the applicant’s capability of satisfactorily completing the school’s program of legal education.” As of now, the LSAT is the only nationally validated test, though Arizona independently validated the GRE and other schools are working to do the same.
Earlier this summer, a federal panel recommended suspending the ABA’s power to accredit new law schools for one year. The transcript for that meeting has now been published, so we can examine in detail what happened. It’s clear that the panel intended its action to “send a signal” to the ABA Council that accredits law schools. All of us in legal education need to hear that signal: It affects the standards we adopt for accrediting law schools, as well as the eligibility of our students to take the bar exam.
I just finished reading a transcript of the meeting during which the National Advisory Committee on Institutional Quality and Integrity recommended that the Department of Education suspend the ABA’s power to accredit new law schools for one year. The transcript reveals some interesting details about the committee’s concerns; I will summarize those soon.
But before I do that, I can’t resist reporting the views of two “third party commenters” who spoke during the hearing. Committee rules gave each of these individuals 3 minutes to share their views.
Wood R. Foster, Jr., a Minneapolis lawyer and former president of the Minnesota State Bar Association, has written a striking review of recent changes in the legal profession. Foster spent his career as a commercial litigator with Siegel Brill, a small Minneapolis firm. Relatively few lawyers from that background have written about changes in the legal profession, and Foster does so eloquently.
Foster covers the growing surplus of lawyers, which he dates to 2000; the fracturing of the profession; stalled diversity efforts; the high cost of legal education; BigLaw and its equally big shadow; and the impact of technology.
With some irony, Foster quotes a column that he wrote in 2000 after holding a series of focus groups with lawyers. “I have found,” he wrote then, “that lawyers are generally reluctant to visualize the profession’s future.” The future, however, arrived anyway. Today, he reflects, “a good argument can be made that the legal profession has changed more in the last 15 years than it did in the 150 years from 1849 to 1999.”
Foster’s views echo those I hear from many practitioners in their 60s and 70s. While academics continue to debate the existence of change, these lawyers have lived it. Their vantage point makes them particularly sympathetic to the newest generation of lawyers. “There really can be no doubt,” Foster concludes, “that it has been a rough ride for lawyers graduating from law school since 2000. . . . [The facts] add up to an unflattering picture of why so many young lawyers are finding it so hard to get the kind of start in their chosen profession that older lawyers like me were able to take for granted during the last half of the twentieth century.”
Give Foster a read. His featured series of articles fills much of this issue of Minnesota’s Bench and Bar journal.
Casebooks are shockingly expensive. The latest edition of Stone, Seidman, Sunstein, Tushnet, and Karlan’s Constitutional Law has a list price of $242. It’s even more shocking when you consider where the money goes. Not to pay for the cases and other primary materials that make up most of a casebook’s contents: they’re public domain and free to all. Mostly not to cover printing costs: the paperback edition of The Power Broker (to pick a book with the same word count and heft as a casebook) has a list price of $26, and you can buy it on Amazon for $18. Mostly not to authors: royalty rates are typically 10% to 20%. No, most of that money ends up in the pockets of the casebook publishers and other middlemen in the casebook chain. This is a tax on legal education, sucking money from law students and from the taxpayers underwriting their student loans.
In a perceptive and persuasive recent essay, Choosing a Criminal Procedure Casebook: On Lesser Evils and Free Books, Ben Trachtenberg runs through these numbers and reaches the obvious conclusion: law schools shouldn’t be asking students to shell out the big bucks to read public-domain legal materials. Casebooks should be cheap or free.
Trachtenberg’s preferred solution is that law schools, alone or together, fund the creation of “top-quality casebooks” which could then be made available to students for the cost of printing. Here at Law School Cafe, Kyle McEntee endorsed Trachtenberg’s suggestion and added that “it may make more sense to do this through an external organization funded through grants” to save students even more.
Originally published on Above The Law.
Law students spend between $3,000 and $4,000 on books during law school. For those who borrow, add another $1,000 in interest on the 10-year repayment plan or $2,000 on the 20-year plan. While a drop in the bucket compared to tuition and living expenses, $4,000 to $6,000 for books is not insignificant.
Shaving these costs down to the cost of printing is a common suggestion, but it does not appear to have been done at scale. In a new article in the Saint Louis University Law Journal, Professor Ben Trachtenberg from the University of Missouri School of Law outlines how to actually do it with the goal of encouraging action.
The question is: will it happen?
Have something you think our audience would like to hear about? Interested in writing one or more guest posts? Send an email to the cafe manager at firstname.lastname@example.org. We are interested in publishing posts from practitioners, students, faculty, and industry professionals.