Council Declines to Act

June 7th, 2013

The ABA Section of Legal Education’s Council voted unanimously today to defer any action on the Data Committee’s proposal to push back the date on which the ABA measures JD employment outcomes. We expressed our disapproval of this proposal over the last two days. Now others will have a chance to express their views to the Council before its August meeting. Measuring employment outcomes is important for schools, students, prospective students, graduates, and scholars who study the legal market. Any change from the current date requires careful evaluation–and, given the value of comparing outcomes over time, should have to overcome a strong presumption against change.


Data on the Proposed Date Change

June 6th, 2013

Kyle wrote yesterday about a proposal to push back the date on which law schools calculate their employment outcomes. Schools currently measure those outcomes on February 15 of each year, nine months after graduation. The proposal would nudge that date to March 15, ten months after graduation. The proposal comes from the Data Policy and Collection Committee of the ABA’s Section of Legal Education and Admissions to the Bar. The Section’s Council will consider the recommendation tomorrow.

Kyle explained how the committee’s report overlooks the needs of prospective law students, focusing instead on accommodating the interests of law schools. I agree with that critique and return to it below. First, however, I want to focus on some mistakes in the committee’s interpretation of the data provided to them by committee member Jerry Organ. Professor Organ was kind enough to share his spreadsheets with me, so I did not have to duplicate his work. He did an excellent job generating raw data for the committee but, as I explain here, the numbers cited by the committee do not support its recommendation. Indeed, they provide some evidence to the contrary.

The Committee’s Rationale and Data

The committee bases its recommendation on the facts that New York and California report bar results later than many other states, and that this hampers graduates seeking legal jobs in those markets. New York and California law schools, in turn, may have unduly depressed employment outcomes because their newly licensed graduates have less time to find jobs before February 15.

To substantiate this theory, the committee notes that “for graduates in the years 2011 and 2012, 18 of the bottom 37 schools in reported employment rates for the ‘Bar Passage Required, JD Advantage and Other Professional’ categories were located in New York and California.” This statistic is true for 2011, but not quite for 2012, when the number is 15 out of 37 schools. But that’s a minor quibble. The bigger problem is that separating the results for California and New York creates a different picture.

California law schools are, in fact, disproportionately represented among the schools that perform worst on the employment metric cited by the committee. The committee examined 2011 employment statistics for 196 law schools and 2012 statistics for 198 schools. California accounted for twenty of the schools in 2011 (10.2%) and twenty-one of them in 2012 (10.6%). In contrast, the bottom 37 schools included 14 California schools in 2011 (37.8%) and 13 California schools in 2012 (35.1%). That’s a pretty large difference.

The New York law schools, on the other hand, are not disproportionately represented among the schools that performed worst on the committee’s reported metric. Fifteen of the examined schools (7.7% in 2011, 7.6% in 2012) are in New York state. The 37 schools that scored lowest on the employment metric, however, include only four New York schools in 2011 (10.8%) and two in 2012 (5.4%). One year is a little higher than we might predict based on the total number of NY schools; the other is a little lower.
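To make that counting exercise concrete, here is a minimal sketch of how one might replicate it from school-level data. The file name and column names are my own assumptions, not the committee’s:

```python
import pandas as pd

# Hypothetical input: one row per school, with its state and the employment
# rate the committee used ("Bar Passage Required, JD Advantage and Other
# Professional" as a share of the class). File and column names are assumed.
schools = pd.read_csv("employment_2011.csv")  # columns: school, state, emp_rate

bottom = schools.nsmallest(37, "emp_rate")  # the 37 worst-performing schools

for st in ("CA", "NY"):
    share_all = (schools["state"] == st).mean()
    share_bottom = (bottom["state"] == st).mean()
    print(f"{st}: {share_all:.1%} of all schools, {share_bottom:.1%} of the bottom 37")
```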

Using the committee’s rudimentary analysis, in other words, the data show that one late-reporting state (California) is disproportionately represented among the bottom 37 schools, but another late-reporting state (New York) is not. That evidence actually cuts against the committee’s conclusion. If the timing of bar results accounts for the poor showing among California schools, then we should see a similar effect for New York schools. To compound this NY error, the committee mistakenly names Cardozo and Brooklyn as law schools that fall among the 37 lowest performing schools on the employment metric. Neither of those schools falls in that 37-school category in either year.

It’s possible that a different measure would show a disproportionate impact in New York. I haven’t had time to conduct other analyses; I simply repeated the one that the committee cites. Even if other analyses could show a discrepancy in New York, the committee’s reported data don’t line up with its conclusion. That’s a sloppy basis to support any action by the Section’s Council.

Better Analyses

If the committee (or Council) wants to explore the relationship between bar-result timing and employment outcomes, there are better ways to analyze the data provided by Professor Organ. This issue calls out for regression analysis: that technique could examine more closely the relationship between bar-result timing and employment outcomes, while controlling for factors like each school’s median LSAT, a measure of each school’s reputation, and the state’s bar passage rate. Regression is a routine tool used by many legal educators; it would be easy for the committee to supplement the dataset and conduct the analysis. That would be the best way to discern any relationship between the timing of bar results and employment outcomes.
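As a sketch of what that analysis could look like, the following uses the statsmodels formula API on a hypothetical school-level dataset. Every file and column name here is an assumption; the real work would lie in assembling the data:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical school-level dataset; every column name is an assumption.
#   emp_rate        share of grads in full-time, long-term professional jobs
#   median_lsat     the school's median LSAT
#   reputation      some reputation measure (e.g., a peer-assessment score)
#   state_pass_rate the state's overall bar passage rate
#   days_to_results days from the July exam to the state's release of results
df = pd.read_csv("school_outcomes.csv")

model = smf.ols(
    "emp_rate ~ median_lsat + reputation + state_pass_rate + days_to_results",
    data=df,
).fit()

# The days_to_results coefficient estimates the timing effect after
# controlling for the other factors.
print(model.summary())
```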

But I have good news for the committee: There’s no need to improve the data analysis, because we already know enough to reject the proposed timing change.

What Really Matters?

Although the committee’s analysis is weak, I personally have no doubt that the timing of bar admission has some quantifiable relationship with employment outcomes. As the months roll on, more graduates find full-time, long-term professional employment (the outcome examined by the committee). And beyond the simple passage of time, we can all postulate that bar passage helps applicants secure jobs that require bar admission! So the question isn’t whether there is some relationship between the timing of bar admission and employment outcomes; there almost certainly is. The questions for a policy-making committee are:

(a) How big is that effect compared to other effects?
(b) How much would a shift from February 15 to March 15 alter that effect?
(c) What negative impacts would that shift have?
(d) Do the costs outweigh the benefits?

Let’s take a look at each question.

How Big Is the Timing Effect?

We could answer this first question pretty precisely by doing the regression analysis outlined above. Without doing the additional data collection or math, I predict the following outcomes: First, median LSAT or law school reputation will show the greatest correlation with employment outcomes. In other words, each of those variables will correlate significantly with employment outcomes after controlling for other variables, and each of them will account for more variance in employment outcomes than any other variable in the equation. Second, bar passage rates will also have a significant impact on employment outcomes (again while controlling for other factors). Third, other factors (like measures of the strength of the entry-level legal market in each state) will also play a role in predicting employment outcomes. After controlling for factors like these, I predict that the timing of bar admission would show a statistically significant relationship with employment outcomes–but that it would be far from the weightiest factor.

I mentioned an important factor in that last paragraph, one that the committee report does not mention: bar passage rates. States have very different bar passage rates, ranging from 68.23% in Louisiana to 93.08% in South Dakota. (Both figures come from Robert Anderson’s excellent analysis of bar exam difficulty; for purposes of this discussion, look at the far right-hand column, which gives actual pass rates.) When talking about employment outcomes, I suspect that differences in bar passage rates are far more important than differences in the timing of bar results. Waiting for bar results can slow down job offers, but flunking the bar hurts a lot more. People who fail the bar, in fact, may lose jobs they had already lined up.

California has the second lowest pass rate in the nation, behind only Louisiana (a state that is distinctive in many ways). Even graduates of ABA-accredited schools in California have a relatively low pass rate (76.9% for first-timers in July 2012) compared to exam-takers in other states. I suspect that much of the “California effect” detected by the ABA committee stems from the state’s high bar failure rate rather than its late reporting of bar results. Bar passage rates alone won’t fully explain differences in employment outcomes; I would perform a full regression analysis if I wanted to explore the factors related to those outcomes. But consider the relative impact of late results and poor results: Graduates who find out in November that they passed the bar may be a few weeks behind graduates in other states when seeking jobs. But graduates who find out in November that they failed the July bar have a whole different problem. Those graduates won’t be working on February 15, because they’ll be studying for the February bar.

California schools and graduates may face a bar timing problem, but they face a much larger bar passage problem. If we’re concerned with leveling the playing field for law schools, that’s a pretty rough terrain to tackle. As I suggest further below, moreover, the Data Committee shouldn’t worry about leveling the field for inter-school competition; after all, the ABA and its Section of Legal Education explicitly repudiate rankings. The committee should focus on the important task of gathering thoughtful data that informs accreditation and protects the public (including potential law students).

How Much Would the Date Shift Help?

Even if California (and maybe NY) schools have a problem related to the timing of bar results, how much would the proposed remedy help? Not very much. As Kyle pointed out yesterday, the date shift will give every school’s graduates an extra month to obtain full-time, long-term employment. Employment rates will go up for all schools, but will any difference between NY/California schools and other schools diminish? The committee actually could address that question with existing data, because there are several states that release bar results considerably earlier than other states. Do schools in those “early release” states have an employment advantage over other schools during October and November? If so, when does the advantage dissipate? A more refined regression analysis could suggest how much difference the proposed change would actually make.
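If schools could supply employment rates at several checkpoints after graduation, a sketch like the following could trace whether any early-release advantage dissipates over time. The panel data and all column names here are hypothetical; the ABA currently collects only a single February snapshot:

```python
import pandas as pd

# Hypothetical panel: one row per school per measurement month. Data like
# this would have to come from schools' own rolling records; every file
# and column name is an assumption.
panel = pd.read_csv("employment_panel.csv")
# columns: school, early_release (bool), month (e.g., "Oct", "Nov", ...), emp_rate

gap = (
    panel.groupby(["month", "early_release"])["emp_rate"]
    .mean()
    .unstack("early_release")
)
gap["advantage"] = gap[True] - gap[False]
print(gap)  # does any early-release advantage shrink as the months pass?
```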

I am relatively confident, meanwhile, that shifting the employment measurement date to March 15 would not address the bar-passage discrepancy I discuss above. The February bar exam occurs during the last week of the month. If low employment rates for California schools stem partly from a disproportionate number of graduates taking the February exam, a March 15 employment date doesn’t help much. Two weeks, give or take a day or two, isn’t much time to recover from the exam, apply for jobs, persuade an employer that you probably passed the exam you just took, and start work.

Negatives

What about downsides to the committee’s proposal? Kyle ably articulated four substantial ones yesterday. First, prospective students will receive employment information a month later, and this is a month that matters. Many law schools require seat deposits by May 1, and admitted students are actively weighing offers throughout April. Providing employment data in late April, rather than by March 31 (the current standard), leaves students waiting too long for important information. We should be striving to give prospective students information earlier in the spring, not later.

In fact, the committee’s report contains a helpful suggestion on this score: It indicates that law schools could submit March 15 employment data by April 7. If that’s true, then schools should be able to submit February 15 data by March 7–allowing the ABA to publish employment information a full week earlier than it currently does. Again, that’s a key week for students considering law school acceptances.

Second, the nine-month measurement date is already three months later than the date that would make most sense to prospective students and graduates. The grace period for repayment of direct unsubsidized loans ends six months after graduation; deferral of repayment for PLUS loans ends at the same time. For prospective students, a very important question is: What are the chances that I’ll have a full-time job when I have to start repaying my loans? We don’t currently answer that question for students. Instead, we tell them how many graduates of each law school have full-time jobs (and other types of jobs) three months after they’ve had to start repaying loans. If we’re going to change the reporting date for employment outcomes, we should move to six months–not ten. Schools could complement the six-month information with nine-month, ten-month, one-year, two-year, or any other measures. Employment rates at six months, however, would be most meaningful to prospective law students.

Third, changing the measurement date impedes comparisons over time. Partly for that reason, I haven’t advocated for a change to the six-month measure–although if change is on the table, I will definitely advocate for the six-month frame. The employment rates collected by the ABA allow comparison over time, as well as among schools. If schools begin reporting 10-month employment rates for the class of 2013, that class’s employment rate almost certainly will be higher than the class of 2012’s nine-month rate. But will the increase be due to improvements in the job market or to the shift in measurement date? If we want to comprehend changes in the job market, and that understanding is as important for schools as it is for students and graduates, there’s a strong reason to keep the measurement constant.

Finally, changing to a ten-month measurement date will make law schools–and their accrediting body–look bad. The committee’s report shows a great concern for “the particular hardship on law schools located in late bar results states,” the “current penalty on law schools who suffer from late bar results,” and the need for “a more level playing field” among those schools. There’s scant mention of the graduates who actually take these exams, wait for the results, search for jobs, remain unemployed nine months after graduation, and begin repaying loans before that date.

Prospective students, current students, graduates, and other law school critics will notice that focus–they really will. Why do law schools suddenly need to report employment outcomes after 10 months rather than nine? Is it because the information will be more timely for prospective students? Or because the information will be more accurate? No, it’s because some schools are suffering a hardship.

The Data Committee and Council need to pay more attention to the needs of students. From the student perspective, the “hardship” or “penalty” that some schools suffer is actually one that their graduates endure. If it takes longer to get a full-time lawyering job in NY or California than in North Carolina or New Mexico, that’s a distinction that matters to the graduates, not just the schools. It’s the graduates who will be carrying very large loans, with ever-accumulating interest, for that extra month or two.

Similarly, if the real “penalty” stems from bar passage rates, that’s a difference that matters a lot to prospective students. It’s harder to pass the bar exam in California than in forty-eight other states and the District of Columbia. If you can’t pass the bar on your first try, your chances of working as a lawyer nine months after graduation fall significantly. Those are facts that affect graduates in the first instance, not law schools. They’re facts that prospective students need to know, not ones that we should in any way smooth over by creating a “level playing field” in which all graduates eventually obtain jobs.

Striking the Balance

The committee’s case for the proposed change is weak: the cited data don’t support the recommendation, the method of analyzing the data is simplistic, and the report doesn’t discuss costs of the proposal. Worse, law students and graduates will read the report’s reasoning as focused on the reputation of law schools, rather than as concerned about providing helpful, timely information to the people who we hope will work in our legal system. The committee could improve its analyses and reasoning, but the better move would be to reject the proposal and focus on more important matters.


Proposed Employment Data Change

June 5th, 2013

On Friday, the ABA Section of Legal Education considers a recommendation from the section’s data policy committee about when schools collect graduate employment data. Instead of collecting data nine months after graduation, schools would collect data ten months after graduation.

The change looks minor, but it’s misguided. Though the council should dismiss the recommendation outright for the reasons outlined below, at a minimum it should decline to act on the recommendation this week.

The Committee’s Justification

The committee’s reasoning is straightforward: some graduates don’t obtain jobs by the nine-month mark because some state bars have a slow licensing process. As committee chair Len Strickman puts it in the committee’s recommendation memo, the data policy change would have “the benefit of a more level playing field.”

Several New York and California deans have lobbied for the policy change because those jurisdictions release July bar results so late. Last year, California provided results on November 16th, with swearing-in ceremonies in the following weeks. New York provided results earlier, on November 1st, but many graduates then waited months to be sworn in.

A variety of employers, such as small firms and the state government, tend to hire licensed graduates. Compared to schools in states with a quicker credentialing process, New York and California schools are disadvantaged on current employment metrics. Changing the measurement date to mid-March instead of mid-February would allegedly take some of the bite out of that disadvantage.

To check for a quantifiable advantage, the data policy committee considered two sets of data. First, the committee sorted schools by the percentage of 2012 graduates working professional jobs (lawyers or otherwise) as of February 15, 2013. Second, the committee sorted schools by the percentage of 2012 graduates who were unemployed or had an unknown employment status. For both measures, the committee determined that New York and California schools were disproportionately represented on the bad end of the curve.

Poorly Supported Justification

Professor Strickman notes in his committee memo that many of the poorly-performing schools “are broadly considered to be highly competitive schools nationally.” I’m not sure exactly what this means, but it sounds a lot like confirmation bias. Is he suggesting that the employment outcomes don’t match U.S. News rankings? The committee’s collective impression of how well the schools should perform relative to one another? Faculty reputation? It’s a mystery, and without further support it’s not at all compelling.

Professor Strickman acknowledges that other factors may explain the relative placement. He does not name or address them. Here are some factors that may explain the so-called disadvantage:

(1) Graduate surplus (not just 2012, but for years);
(2) Attractiveness of certain states to graduates from out-of-state schools;
(3) Overall health of local legal markets;
(4) Graduate desirability;
(5) Ability of schools to fund post-graduation jobs.

Nor do we even know whether the rule revision would level the playing field. In other words, one extra month may not capture more professional job outcomes for graduates of New York and California schools than for graduates of other schools. More time, after all, ought to produce better results for all schools with high under- and unemployment.

In sum, the committee should have declined to recommend the ten-month proposal until its proponents met their burden of persuasion. The problem has not been well articulated, and the data do not support the conclusion.

The Accreditor’s Role

Worse than recommending an unsupported policy change, the committee ignores the group for whom law schools produce job statistics: prospective students. Prospective students, students, and a society that depends on lawyers are the Section of Legal Education’s constituents. Calling the uneven playing field a “disadvantage,” “penalty,” and “hardship” for law schools shows where the committee’s perspective comes from.

(1) Is there a normative problem with an uneven playing field?

It’s not apparent that there’s an issue to resolve. Grant the committee its premise that state credentialing timelines affect performance on employment metrics. Is it the ABA’s job to ensure that schools compete with each other on a level playing field?

In one sense, yes, of course. When a school lies, cheats, or deceives, it gains an undeserved advantage, and ABA Standard 509 prohibits this behavior. But the standard does not prohibit that behavior because of how it affects school-on-school competition. Its prohibitions are a consequence of the ABA’s role in protecting consumers and the public.

The ABA was ahead of the curve when it adopted Standard 509 in the 1990s. The organization interpreted its accreditation role to include communicating non-educational value to these constituents through employment information.

Here, the ABA failed to adequately consider the prospective students who want to make informed decisions, and the public, which subsidizes legal education.

Prospective students received only a passing mention in Professor Strickman’s memo. In describing why the committee rejected several schools’ request to move the measurement back to one year, Professor Strickman explains:

The Data Policy and Collection Committee decided to reject this request because that length of delay would undermine the currency of data available to prospective law students.

As it happens, the committee’s chosen proposal also has a currency problem. And the committee failed to convey whether, or how, it considered the change’s impact on the value of the consumer information.

(2) Does the new policy impede a prospective student’s ability to make informed decisions?

One of the ABA’s recent accomplishments was accelerating the publication of employment data. Previously, the ABA published new employment data 16 months after schools measured employment outcomes. In 2013, the ABA took only six weeks.

But if the Section of Legal Education adopts the ten-month proposal, it pushes data publication to the end of April—after many deposit deadlines and on the eve of others. While applicants should not overrate the importance of year-to-year differences, they should have the opportunity to evaluate the changes.

The new policy also makes the information less useful.

At one time, schools reported graduate employment outcomes as of six months after graduation. In 1996, NALP began measuring outcomes at nine months instead. The ABA, which at that time only asked schools to report their NALP employment rate, followed.

The six-month measurement makes far more sense than the nine-month date. Six months after graduating, interest accumulated during school capitalizes and the first loan payment is due. Ideally that six-month period would be used to pay down the accumulated interest so that less interest is paid later. The credentialing process makes this a rarity. Adding another month to the measurement makes the figure even less valuable.
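For a back-of-the-envelope sense of what capitalization costs, here is a small worked example. Every figure in it is an illustrative assumption, not data from the post:

```python
# Every number here is an illustrative assumption, not data from the post.
principal = 100_000    # assumed total borrowed across three years of school
annual_rate = 0.068    # assumed unsubsidized Direct Loan interest rate

# Interest accrues without compounding while in school and during the
# six-month grace period. As a rough assumption, treat the average
# borrowed dollar as outstanding for two years by the end of the grace period.
accrued = principal * annual_rate * 2.0

# Paid off before the grace period ends, the accrued interest is a one-time
# cost. Left unpaid, it capitalizes: it is added to principal, and the
# borrower pays interest on interest for the life of the loan.
capitalized_principal = principal + accrued
extra_per_year = accrued * annual_rate

print(f"Interest accrued by end of grace period: ${accrued:,.0f}")
print(f"Principal after capitalization:          ${capitalized_principal:,.0f}")
print(f"Added interest cost per year thereafter: ${extra_per_year:,.0f}")
```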

Reducing comparability also dilutes the value of recent employment information. Students should not consider one year of data in isolation, but should analyze changes and the reasons for those changes. It’s for this reason that the ABA requires schools to publish three years of employment data as of last August.

Conclusion: Dismiss or Wait

The council needs to add additional viewpoints to the data policy committee. Right now, the committee is dominated by law school faculty and administrators. All twelve members are current faculty, deans, or other administrators. The name change from the “Questionnaire Committee” to the “Data Policy and Collection Committee” envisions a policy role for the group.

Just as the council, standards committee, and accreditation committee need a diversity of viewpoints, so too does the data policy committee. Perhaps if this diversity had existed on the committee to begin with, the new measurement date would not have been recommended prematurely, or at all.

As the council considers whose interests it serves and whether the data policy recommendation is ripe for adoption, I hope its members also consider the drivers of the policy beyond a law school lobby promoting its own interests.

The policy presupposes a reality where there are enough graduates who cannot derive economic value from their law degrees nine months after graduating that the ABA needs to modify its collection policy in order to count them.

Let me repeat that. It takes so long to become a lawyer that almost a year can pass after graduation, and it is still reasonable to think that many people are not yet using a credential they invested over three years of time, money, and effort to receive. A career is (hopefully) decades long, but the brutal reality of credentialing is that its costs go beyond what any fair system would contemplate. A change to the data policy as a solution would be funny were the economics of legal education not so tragic.


About Law School Cafe

Cafe Manager & Co-Moderator
Deborah J. Merritt

Cafe Designer & Co-Moderator
Kyle McEntee

ABA Journal Blawg 100 Honoree

Law School Cafe is a resource for anyone interested in changes in legal education and the legal profession.


Participate

Have something you think our audience would like to hear about? Interested in writing one or more guest posts? Send an email to the cafe manager at merritt52@gmail.com. We are interested in publishing posts from practitioners, students, faculty, and industry professionals.
