Data on the Proposed Date Change

June 6th, 2013

Kyle wrote yesterday about a proposal to push back the date on which law schools calculate their employment outcomes. Schools currently measure those outcomes on February 15 of each year, nine months after graduation. The proposal would nudge that date to March 15, ten months after graduation. The proposal comes from the Data Policy and Collection Committee of the ABA’s Section of Legal Education and Admissions to the Bar. The Section’s Council will consider the recommendation tomorrow.

Kyle explained how the committee’s report overlooks the needs of prospective law students, focusing instead on accommodating the interests of law schools. I agree with that critique and return to it below. First, however, I want to focus on some mistakes in the committee’s interpretation of the data provided to them by committee member Jerry Organ. Professor Organ was kind enough to share his spreadsheets with me, so I did not have to duplicate his work. He did an excellent job generating raw data for the committee but, as I explain here, the numbers cited by the committee do not support its recommendation. Indeed, they provide some evidence to the contrary.

The Committee’s Rationale and Data

The committee bases its recommendation on two facts: that New York and California report bar results later than many other states, and that this delay hampers graduates seeking legal jobs in those markets. New York and California law schools, in turn, may have unduly depressed employment outcomes because their newly licensed graduates have less time to find jobs before February 15.

To substantiate this theory, the committee notes that “for graduates in the years 2011 and 2012, 18 of the bottom 37 schools in reported employment rates for the ‘Bar Passage Required, JD Advantage and Other Professional’ categories were located in New York and California.” This statistic is true for 2011, but not quite true for 2012: In 2012, the number is 15 out of 37 schools. But that’s a minor quibble. The bigger problem is that separating the results for California and New York creates a different picture.

California law schools are, in fact, disproportionately represented among the schools that perform worst on the employment metric cited by the committee. The committee examined 2011 employment statistics for 196 law schools and 2012 statistics for 198 schools. California accounted for twenty of the schools in 2011 (10.2%) and twenty-one of them in 2012 (10.6%). In contrast, the bottom 37 schools included 14 California schools in 2011 (37.8%) and 13 California schools in 2012 (35.1%). That’s a pretty large difference.

The New York law schools, on the other hand, are not disproportionately represented among the schools that performed worst on the committee’s reported metric. Fifteen of the examined schools (7.7% in 2011, 7.6% in 2012) are in New York state. The 37 schools that scored lowest on the employment metric, however, include only four New York schools in 2011 (10.8%) and two in 2012 (5.4%). One year is a little higher than we might predict based on the total number of NY schools; the other is a little lower.
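
For readers who want to check that arithmetic, here is a minimal sketch; the counts are the ones reported above, and the variable names are mine:

# Counts reported above: schools examined by the committee and schools
# falling in the bottom 37 on the employment metric, by state and year.
examined = {2011: 196, 2012: 198}
bottom = 37
counts = {
    "California": {"examined": {2011: 20, 2012: 21}, "bottom_37": {2011: 14, 2012: 13}},
    "New York":   {"examined": {2011: 15, 2012: 15}, "bottom_37": {2011: 4, 2012: 2}},
}
for state, c in counts.items():
    for year in (2011, 2012):
        overall = c["examined"][year] / examined[year]
        among_bottom = c["bottom_37"][year] / bottom
        print(f"{state} {year}: {overall:.1%} of examined schools, "
              f"{among_bottom:.1%} of the bottom {bottom}")

California appears in the bottom 37 at more than three times its overall share in both years; New York sits slightly above its share in one year and below it in the other.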

Using the committee’s rudimentary analysis, in other words, the data show that one late-reporting state (California) is disproportionately represented among the bottom 37 schools, but another late-reporting state (New York) is not. That evidence actually cuts against the committee’s conclusion: if the timing of bar results accounts for the poor showing among California schools, then we should see a similar effect for New York schools. Compounding the New York problem, the committee mistakenly names Cardozo and Brooklyn as law schools that fall among the 37 lowest-performing schools on the employment metric. Neither school falls in that category in either year.

It’s possible that a different measure would show a disproportionate impact in New York. I haven’t had time to conduct other analyses; I simply repeated the one that the committee cites. But even if other analyses could show a discrepancy in New York, the committee’s reported data don’t line up with its conclusion. That’s too sloppy a foundation for any action by the Section’s Council.

Better Analyses

If the committee (or Council) wants to explore the relationship between bar-result timing and employment outcomes, there are better ways to analyze the data provided by Professor Organ. This issue calls out for regression analysis: that technique could examine more closely the relationship between bar-result timing and employment outcomes, while controlling for factors like each school’s median LSAT, a measure of each school’s reputation, and the state’s bar passage rate. Regression is a routine tool used by many legal educators; it would be easy for the committee to supplement the dataset and conduct the analysis. That would be the best way to discern any relationship between the timing of bar results and employment outcomes.
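
For what it’s worth, here is a minimal sketch of the kind of regression I have in mind, assuming a school-level table built from Professor Organ’s data plus the additional variables mentioned above. The file and column names are hypothetical, not anything the committee has assembled:

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical school-level dataset: one row per school per year, with the
# February 15 employment rate and the candidate explanatory variables.
schools = pd.read_csv("school_outcomes.csv")

model = smf.ols(
    "employment_rate ~ days_to_bar_results + median_lsat"
    " + reputation_score + state_bar_pass_rate",
    data=schools,
).fit()

# The coefficient on days_to_bar_results estimates the timing effect
# after controlling for the other factors.
print(model.summary())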

But I have good news for the committee: There’s no need to improve the data analysis, because we already know enough to reject the proposed timing change.

What Really Matters?

Although the committee’s analysis is weak, I personally have no doubt that the timing of bar admission has some quantifiable relationship with employment outcomes. As the months roll on, more graduates find full-time, long-term professional employment (the outcome examined by the committee). In addition to the simple passage of time, we can all postulate that bar passage helps applicants secure jobs that require bar admission! So the question isn’t whether some relationship exists between the timing of bar admission and employment outcomes. Granting that it does, the questions for a policy-making committee are:

(a) How big is that effect compared to other effects?
(b) How much would a shift from February 15 to March 15 alter that effect?
(c) What negative impacts would that shift have?
(d) Do the costs outweigh the benefits?

Let’s take a look at each question.

How Big Is the Timing Effect?

We could answer this first question pretty precisely by doing the regression analysis outlined above. Without doing the additional data collection or math, I predict the following outcomes: First, median LSAT or law school reputation will show the greatest correlation with employment outcomes. In other words, each of those variables will correlate significantly with employment outcomes after controlling for other variables, and each of them will account for more variance in employment outcomes than any other variable in the equation. Second, bar passage rates will also have a significant impact on employment outcomes (again while controlling for other factors). Third, other factors (like measures of the strength of the entry-level legal market in each state) will also play a role in predicting employment outcomes. After controlling for factors like these, I predict that the timing of bar admission would show a statistically significant relationship with employment outcomes–but that it would be far from the weightiest factor.

I mentioned an important factor in that last paragraph, one that the committee report does not mention: bar passage rates. States have very different bar passage rates, ranging from 68.23% in Louisiana to 93.08% in South Dakota. (Both figures come from Robert Anderson’s excellent analysis of bar exam difficulty; for purposes of this discussion, look at the far right-hand column, which gives actual pass rates.) When talking about employment outcomes, I suspect that differences in bar passage rates are far more important than differences in the timing of bar results. Waiting for bar results can slow down job offers, but flunking the bar hurts a lot more. People who fail the bar, in fact, may lose jobs they had already lined up.

California has the second-lowest pass rate in the nation, behind only Louisiana (a state that is distinctive in many ways). Even graduates of ABA-accredited schools in California have a relatively low pass rate (76.9% for first-timers in July 2012) compared to exam-takers in other states. I suspect that much of the “California effect” detected by the ABA committee stems from the state’s high bar failure rate rather than its late reporting of bar results. Bar passage rates alone won’t fully explain differences in employment outcomes; I would perform a full regression analysis if I wanted to explore the factors related to those outcomes. But consider the relative impact of late results and poor results: Graduates who find out in November that they passed the bar may be a few weeks behind graduates in other states when seeking jobs. But graduates who find out in November that they failed the July bar have a whole different problem. Those graduates won’t be working on February 15, because they’ll be studying for the February bar.

California schools and graduates may face a bar timing problem, but they face a much larger bar passage problem. If we’re concerned with leveling the playing field for law schools, that’s a pretty rough terrain to tackle. As I suggest further below, moreover, the Data Committee shouldn’t worry about leveling the field for inter-school competition; after all, the ABA and its Section of Legal Education explicitly repudiate rankings. The committee should focus on the important task of gathering thoughtful data that informs accreditation and protects the public (including potential law students).

How Much Would the Date Shift Help?

Even if California (and maybe NY) schools have a problem related to the timing of bar results, how much would the proposed remedy help? Not very much. As Kyle pointed out yesterday, the date shift will give every school’s graduates an extra month to obtain full-time, long-term employment. Employment rates will go up for all schools, but will any difference between NY/California schools and other schools diminish? The committee actually could address that question with existing data, because there are several states that release bar results considerably earlier than other states. Do schools in those “early release” states have an employment advantage over other schools during October and November? If so, when does the advantage dissipate? A more refined regression analysis could suggest how much difference the proposed change would actually make.
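
As a rough first cut, before any fuller regression, one could simply compare schools grouped by when their state releases bar results, again using a hypothetical school-level table like the one sketched above. The October cutoff below is my own assumption about what counts as "early release":

import pandas as pd

schools = pd.read_csv("school_outcomes.csv")  # hypothetical file, as above

# Flag states that release July bar results by the end of October, then compare
# mean February 15 employment rates for early-release and late-release states.
schools["early_release"] = schools["bar_results_month"] <= 10
print(schools.groupby("early_release")["employment_rate"].mean())

Answering the within-year question (when any early-release advantage dissipates) would require employment measured at earlier dates than February 15, which this sketch does not assume.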

I am relatively confident, meanwhile, that shifting the employment measurement date to March 15 would not address the bar-passage discrepancy I discuss above. The February bar exam occurs during the last week of the month. If low employment rates for California schools stem partly from a disproportionate number of graduates taking the February exam, a March 15 employment date doesn’t help much. Two weeks, give or take a day or two, isn’t much time to recover from the exam, apply for jobs, persuade an employer that you probably passed the exam you just took, and start work.

Negatives

What about downsides to the committee’s proposal? Kyle ably articulated four substantial ones yesterday. First, prospective students will receive employment information a month later, and this is a month that matters. Many law schools require seat deposits by May 1, and admitted students are actively weighing offers throughout April. Providing employment data in late April, rather than by March 31 (the current standard), leaves students waiting too long for important information. We should be striving to give prospective students information earlier in the spring, not later.

In fact, the committee’s report contains a helpful suggestion on this score: It indicates that law schools could submit March 15 employment data by April 7. If that’s true, then schools should be able to submit February 15 data by March 7–allowing the ABA to publish employment information a full week earlier than it currently does. Again, that’s a key week for students considering law school acceptances.

Second, the nine-month measurement date is already three months later than the date that would make most sense to prospective students and graduates. The grace period for repayment of direct unsubsidized loans ends six months after graduation; deferral of repayment for PLUS loans ends at the same time. For prospective students, a very important question is: What are the chances that I’ll have a full-time job when I have to start repaying my loans? We don’t currently answer that question for students. Instead, we tell them how many graduates of each law school have full-time jobs (and other types of jobs) three months after they’ve had to start repaying loans. If we’re going to change the reporting date for employment outcomes, we should move to six months–not ten. Schools could complement the six-month information with nine-month, ten-month, one-year, two-year, or any other measures. Employment rates at six months, however, would be most meaningful to prospective law students.

Third, changing the measurement date impedes comparisons over time. Partly for that reason, I haven’t advocated for a change to the six-month measure–although if change is on the table, I will definitely advocate for the six-month frame. The employment rates collected by the ABA allow comparison over time, as well as among schools. If schools begin reporting ten-month employment rates for the class of 2013, that class’s employment rate almost certainly will be higher than the class of 2012’s nine-month rate. But will the increase be due to improvements in the job market or to the shift in measurement date? If we want to understand changes in the job market (an understanding as important for schools as it is for students and graduates), there’s a strong reason to keep the measurement date constant.

Finally, changing to a ten-month measurement date will make law schools–and their accrediting body–look bad. The committee’s report shows a great concern for “the particular hardship on law schools located in late bar results states,” the “current penalty on law schools who suffer from late bar results,” and the need for “a more level playing field” among those schools. There’s scant mention of the graduates who actually take these exams, wait for the results, search for jobs, remain unemployed nine months after graduation, and begin repaying loans before that date.

Prospective students, current students, graduates, and other law school critics will notice that focus–they really will. Why do law schools suddenly need to report employment outcomes after 10 months rather than nine? Is it because the information will be more timely for prospective students? Or because the information will be more accurate? No, it’s because some schools are suffering a hardship.

The Data Committee and Council need to pay more attention to the needs of students. From the student perspective, the “hardship” or “penalty” that some schools suffer is actually one that their graduates endure. If it takes longer to get a full-time lawyering job in NY or California than in North Carolina or New Mexico, that’s a distinction that matters to the graduates, not just the schools. It’s the graduates who will be carrying very large loans, with ever-accumulating interest, for that extra month or two.

Similarly, if the real “penalty” stems from bar passage rates, that’s a difference that matters a lot to prospective students. It’s harder to pass the bar exam in California than in forty-eight other states and the District of Columbia. If you can’t pass the bar on your first try, your chances of working as a lawyer nine months after graduation fall significantly. Those are facts that affect graduates in the first instance, not law schools. They’re facts that prospective students need to know, not ones that we should in any way smooth over by creating a “level playing field” in which all graduates eventually obtain jobs.

Striking the Balance

The committee’s case for the proposed change is weak: the cited data don’t support the recommendation, the method of analyzing the data is simplistic, and the report doesn’t discuss costs of the proposal. Worse, law students and graduates will read the report’s reasoning as focused on the reputation of law schools, rather than as concerned about providing helpful, timely information to the people who we hope will work in our legal system. The committee could improve its analyses and reasoning, but the better move would be to reject the proposal and focus on more important matters.
