The ABA Section of Legal Education’s Standards Review and Data Policy Committee voted unanimously today to recommend that the Section’s Council approve revisions to Standards 501 and 316.
This comes on the heels of a multi-month notice and comment period, which drew numerous comments on the revisions.
The committee recommended that the revised standards be adopted as proposed.
By taking this action, the committee acknowledges that its primary responsibilities are protecting the public and students, not law schools.
In the memo announcing results from the July 2016 MBE, NCBE President Erica Moeser also notified law school deans about an upcoming change in the test. For many years the 200-question exam has included 190 scored items and 10 pre-test questions. Starting in February 2017, the numbers will shift to 175 scored items and 25 pre-test ones.
Pre-testing is an important feature of standardized exams. The administrator uses pre-test answers to gauge a question’s clarity, difficulty, and usefulness for future exams. When examinees answer those questions, they improve the design of future tests.
From the test-taker’s perspective, these pre-test questions are indistinguishable from scored ones. Like other test-makers, NCBE scatters its pre-test questions throughout the exam. Examinees answer each question without knowing whether it is a “real” item that will contribute to their score or a pre-test one that will not.
So what are the implications of NCBE’s increase in the number of pre-test items? The shift is relatively large, from 10 questions (5% of the exam) to 25 (12.5% of the exam). I have three concerns about this change: fair treatment of human research subjects, reliability of the exam, and the possible impact on bar passage rates. I’ll explore the first of these concerns here and turn to the others in subsequent posts.
The Law School Admissions Council has thrown its latest tantrum.
In a letter to admissions professionals around the country, LSAC’s president, Daniel Bernstine, signaled that LSAC would stop certifying the accuracy of each law school’s LSAT and undergraduate GPA statistics. The certification is a joint effort between LSAC and the ABA to prevent law schools from lying about their admissions statistics.
LSAC agreed to certify admissions statistics in 2012 after months of roundly dismissing calls for certification. The group had claimed that certification would be cost prohibitive, despite nearly $60 million in total revenue in 2011 and a $10.7 million surplus in 2012. The group also claimed that certification was outside the scope of its organizational mission, despite its member law schools saying that LSAC was best positioned to protect the integrity of the admissions process.
Pressure mounted in 2011 and 2012 for LSAC to help the ABA after two law schools intentionally reported fraudulent data to the ABA and elsewhere, including to U.S. News and World Report for its annual law school rankings. In February 2011, Villanova University School of Law disclosed that a law school official had intentionally reported fabricated LSAT and GPA statistics for an unknown number of years prior to 2010. Later that year, the University of Illinois College of Law admitted to intentionally fabricating the same statistics over a seven-year period. The school's assistant dean for admissions and financial aid, Paul Pless, resigned as a result of the controversy.
This is LSAC's second tantrum this year. Both came after the University of Arizona James E. Rogers College of Law announced that the school would allow applicants to submit GRE scores in place of LSAT scores.
At that time, LSAC threatened to strip Arizona of its membership, which would eliminate access to a variety of services. LSAC walked back the threat in May after pressure from its membership and antitrust concerns.
So why is the ABA now the latest recipient of LSAC’s retribution?
In response to law schools hoping to utilize the GRE as a non-exclusive alternative to the LSAT, which is designed and administered by LSAC, the ABA is examining whether the GRE meets Standard 503. That standard provides that schools must use a "valid and reliable admission test to assist the school and the applicant in assessing the applicant's capability of satisfactorily completing the school's program of legal education." The LSAT is currently the only nationally validated test, though Arizona independently validated the GRE and other schools are attempting to do the same.
In a recent column, Professor Stephen Davidoff Solomon observes that the legal job market “is a world of haves and have-nots.” With BigLaw firms raising entry-level salaries from $160,000 to $180,000, he concludes, “[t]op law graduates are doing better than ever.” Conversely, “it is clear that it is harder out there for the lower-tier law schools and their graduates.”
I agree with Professor Solomon about the divided nature of our profession; that reality has haunted American lawyers for decades. Solomon, however, significantly overstates the percentage of law graduates who fall within his world of “haves” (those whose salaries recently climbed from $160,000 to $180,000).
Originally published on Above The Law.
Deborah Merritt, a law professor at the Ohio State University, published an informative analysis on her blog yesterday about the new market rate salary for large law firms, which has been extensively covered here on ATL.
To her and virtually every other observer, the increase to $180,000 signals that many large firms are prospering. In part, the raise reflects a small but steady rise in associate productivity since 2008, which has returned to roughly the levels that preceded the last market-rate increase in 2007. The following chart is from the 2016 Report on the State of the Legal Market, issued by Georgetown Law's Center for the Study of the Legal Profession:
Associates are consistently more productive by this measure than any other category of worker, although at lower billable rates than partners. Interestingly, the gap in productivity between associates and other groups is significantly greater post-recession.
Originally published on Above the Law
Welcome to the second installment of Caveat Venditor, a series that assesses claims made by law schools to separate truth from fiction. This week we look at Brooklyn Law School’s employment rate of 92.2% posted on its “By The Numbers” infographic.
I noticed this claim on Brooklyn's website after investigating the concern of a prelaw advisor. At the quadrennial Pre-Law Advisor National Council conference, this prelaw advisor asked what to do when a law school fails to meet accreditation requirements by not publishing the required disclosures. Indeed, Brooklyn was still publishing an old report nearly six months after the ABA required it to publish a new one. Brooklyn remedied this problem on Monday, citing an "oversight due to transitions in several administrative departments in the last year." According to a spokesperson from the law school, the ABA did not follow up with the law school to make sure it published the materials on time or at all.
This piece was originally published on Above the Law.
Welcome to Caveat Venditor, a new series that assesses claims made by law schools to separate truth from fiction. This week, we look at a threatening letter sent to a documentary filmmaker by Tom Clare, a lawyer for The InfiLaw System.
InfiLaw owns three law schools — Arizona Summit, Charlotte School of Law, and Florida Coastal — and several legal education-related management companies. These are three of the six for-profit law schools approved by the ABA, although two of the other three are transitioning to non-profit status. InfiLaw also tried and failed to purchase Charleston School of Law after faculty, alumni, students, and the local legal community revolted.
Hat tip to Paul Campos for the full text of the letter:
I write on behalf of my client, The InfiLaw System (“InfiLaw”), regarding your inquiry into interviews with Florida Coastal School of Law officials for a documentary you are making. I write to caution you as you proceed with fact-finding and information gathering associated with your planned documentary.
Prior reporting on the issues you plan to address, including law school attrition rates and student success, has been plagued by gross misinformation, factual errors, and a general misuse and distortion of available data and analysis. This is especially true as they have been applied to InfiLaw schools such as Florida Coastal. Individuals, such as Paul Campos, have distorted facts and data and engaged in nefarious and inappropriate investigative tactics in order to accomplish a false agenda attacking law school admissions and career advancement policies. As such, I caution you to carefully assess any information and facts you gather from Mr. Campos and any other purported “authorities” on law school success metrics and the risks and rewards of attending law school in this day and age. InfiLaw and its affiliated schools will carefully analyze and assess any statements made about them and will not be afraid to pursue legal recourse to protect its reputation against any false and reckless statements.
In addition, InfiLaw requests that you notify me immediately upon any decisions to include any references to or subject matter about InfiLaw or any of its affiliate schools in your documentary, and provide InfiLaw the opportunity to review and comment on them prior to any public dissemination.
I wrote earlier this week about employment trends for doctors and lawyers. There is a third occupation that now vies with these professions for the affections of talented college graduates: software developer. Examining this occupation explains where some might-have-been lawyers are headed.
What Is a Software Developer?
Software developers, who are also called software engineers, are not programmers. They have a deep understanding of code, and know how to program, but that is not their primary focus. Instead, developers design the programs that give us so much delight–and occasional frustration. Developers also test programs to forestall that frustration and, when glitches occur, work with programmers to fix the errant program.
Once you understand the nature of software development, you can see its attractions for students who might also consider law school. Software developers use their intellects, solve puzzles, and help people. They know more math than the typical lawyer, but their work focuses on logic and strategy rather than equations.
Add in these facts: It’s pretty cool to develop “apps,” many software companies are hip places to work, and you could become famous (and very rich) creating the next big program.
Medicine and law are highly regarded professions; talented students used to eagerly seek entry to both of these fields. But now applications to law schools are falling while those to medical schools are rising. What’s behind that phenomenon? Let’s take a look at employment trends in these two professions over the last forty years.
How much should a professional worker earn? The Department of Labor (DOL) recently decided that salaried professionals who work full-time should earn at least $47,476 per year. Under the Department’s new overtime rules, salaried workers earning less than that amount will be entitled to overtime pay for extra hours. A real professional, in DOL’s eyes, earns at least $913 per week–or $47,476 for a year of full-time work.
Hold your excitement: This salary test will not apply to lawyers, because DOL counts lawyers as professionals no matter how little they earn. See 29 C.F.R. 541.600(e); 29 C.F.R. 541.304. Employers are free to continue working their lawyers for long hours and low pay. It's worth considering, however, what this rule tells us about societal expectations for professional pay. If professionals earn at least $47,476 per year, how do lawyers stack up?
Have something you think our audience would like to hear about? Interested in writing one or more guest posts? Send an email to the cafe manager at firstname.lastname@example.org. We are interested in publishing posts from practitioners, students, faculty, and industry professionals.