Arizona Summit Does Still Have Conditional Scholarships

January 22nd, 2016

On December 16th, I wrote a column for Above the Law on the ABA’s annual data dump. In it, I highlighted nine schools that “reportedly” eliminated conditional scholarship programs. I used the quoted caveat in my column because I was skeptical that a few of these schools had actually eliminated their programs.

One school I contacted was Arizona Summit. The school previously operated a very large conditional scholarship program, and a substantial percentage of its students lost these scholarships after the first year. Eliminating the program would have been a significant budgetary hit for Arizona Summit in particular. However, the school’s 509 report indicated that it had done just that.


Peking University

January 18th, 2016

In August 2012, the ABA’s Council of Legal Education and Admissions to the Bar decided not to accredit any law schools located outside the United States. Many observers assumed that action would put an end to Peking University’s upstart enterprise, a School of Transnational Law. Instead, the school, popularly known as “STL,” is thriving.

Philip McConnaughay and Colleen Toomey, STL’s Dean and Associate Dean, explain the school’s success in a recent paper. Their insights are important for anyone seeking to understand the globalization of law practice and legal education. The story of Peking University and STL also offers a cautionary tale about American protectionism.



The ABA Council and LSAT Scores

January 10th, 2016

As Law School Transparency documented last fall, LSAT scores have plunged at numerous law schools. The low scores, combined with previous research, suggest that some schools are admitting students at high risk of failing the bar exam. If true, the schools are violating ABA Standard 501(b).

Two leaders of the ABA’s Section of Legal Education and Admissions to the Bar recently offered thoughts on this issue. Justice Rebecca White Berch, Chair of the Section’s Council, and Barry Currier, Managing Director of Accreditation and Legal Education, each addressed the topic in the Section’s winter newsletter.

Taking Accreditation Seriously

Berch and Currier both affirm the importance of enforcing the Council’s standards; they also indicate that the Council is already considering school admissions practices. Justice Berch reminds readers that the Council enforces its standards largely through review of each school’s responses to the annual questionnaire. This year, more than half of approved schools are replying to inquiries based on their questionnaire responses–although Berch does not indicate how many of those inquiries relate to admissions standards.

Currier, similarly, endorses the Council’s process and promises that: “If the evidence shows that a law school’s admissions process is being driven by the need to fill seats and generate revenue without taking appropriate steps to determine that students who enroll have a reasonable chance to succeed in school and on the bar examination, as ABA Standard 501(b) requires, then that school should be, and I am confident will be, held accountable.”

It is good news that the Council is investigating this troubling issue. If we want to maintain legal education’s status, we have to be serious about our accreditation standards. But two points in the columns by Justice Berch and Managing Director Currier trouble me.

The Significance of LSAT Scores

Both Justice Berch and Currier stress that LSAT scores reveal only a small part of an individual’s potential for law study or practice. As Justice Berch notes, “an LSAT score does not purport to tell the whole story of a person.” This is undoubtedly true. Many law schools place far too much emphasis on LSAT scores when admitting students and awarding financial aid. Applicants’ work history, writing ability, prior educational achievements, and leadership experience should play a far greater role in admissions and scholarships. Rather than targeting high LSAT scores for admission and scholarships, schools should be more aggressive in rewarding other indicia of promise.

At the other end of the scale, I don’t think anyone would endorse an absolute LSAT threshold that every law school applicant must meet for admission–although we do, of course, require all applicants to take the test. There are too many variables that affect an admissions decision: a particular applicant with a very low LSAT may have other characteristics signaling a special potential for success.

LSAT scores, however, possess a different meaning when reported for a group, like a law school’s entering class. A law school may find one or two applicants with very low LSAT scores who display other indicia of success. That type of individualized decisionmaking, however, should have little impact on a school’s median or 25th percentile scores. In a class of 200 students, for example, the 25th percentile falls at the 50th score from the bottom; one or two unusually low scores barely move it.

When a law school’s 25th percentile score plunges 10 points to reach a low of 138, that drop belies the type of individualized decisionmaking that responsible educators pursue. This is particularly true when the drop occurs during a period of diminished applications and financial stress.

The Charlotte School of Law displayed just that decline in entering credentials between 2010 and 2014. Nor was Charlotte alone. The Ave Maria School of Law dropped its 25th percentile LSAT score from 147 to 139. Arizona Summit fell from 148 to 140. You can see these and other drops in the detailed database compiled by Law School Transparency here.

We shouldn’t confuse the meaning of LSAT scores for an individual with the significance of those scores for a group. As I have suggested before, the score drops at some law schools are red flags that demand immediate attention.

Limited Resources

Justice Berch reminds readers that the Council’s accreditation process is “volunteer-driven” and that those volunteers already “give thousands of hours of their time each year.” More, she suggests, “should not be asked of them.” Even making the best use of those volunteers’ hours, she warns, careful review of the LSAT issue will take time.

This caution sounds the wrong tone. As professionals, we owe duties to both our students and their future clients. If law schools are violating the professional commitments they made through the accreditation process, then our accrediting body should act promptly to investigate, remedy, and–if necessary–sanction the violations.

Of course schools deserve “an opportunity to justify the admissions choices they have made before sanctions may be imposed.” But students also deserve fair treatment. If schools are admitting students who cannot pass the bar exam, that conduct should stop now–not a year or two from now, after more students have been placed into the same precarious position.

The LSAT drops cited above occurred between 2010 and 2014. More than a year has passed since schools reported those 2014 LSAT scores to the ABA. Isn’t that enough time to investigate schools’ admissions processes? What has the Council done during the last year, while more students were admitted with weak scores–and more graduates failed the bar?

Accreditation signals to students that schools and their accrediting body are watching out for their interests. If schools need to contribute more money or volunteer time to provide prompt review of red flags like these LSAT scores, we should ante up. Maintaining an accreditation process that fails to act promptly smacks of protectionism rather than professional responsibility.


10-9 for Nine to Ten

August 10th, 2013

By a narrow vote of 10-9, the ABA’s Legal Education Council has approved a proposal to move back the reporting date for new-graduate employment–from nine months after graduation to ten months after earning a degree. Kyle and I have each written about this proposal, and we each submitted comments opposing the change. The decision, I think, tells prospective students and the public two things.

First, the date change loudly signals that the entry-level job market remains very difficult for recent graduates, and that law schools anticipate those challenges continuing for the foreseeable future. This was the rationale for the proposal: that large firms are hiring “far fewer entry level graduates,” that “there is a distinct tendency of judges” to seek experienced clerks, and that other employers are reluctant to hire graduates until they have been admitted to the bar.

The schools saw these forces as ones that were unfairly, and perhaps unevenly, affecting their employment rates; they wanted to make clear that their educational programs were as sound as ever. From a prospective student’s viewpoint, however, the source of job-market changes doesn’t matter. An expensive degree that leads to heavy debt, ten months of unemployment, and the need to purchase still more tutoring for the bar is not an attractive degree. Students know that the long-term pay-off, in job satisfaction or compensation, may be high for some graduates. But this is an uncertain time in both the general economy and the regulation of law practice; early-career prospects matter to prospective students with choices.

Second, and more disappointing to me, the Council’s vote suggests a concern with the comparative status of law schools, rather than with the very real changes occurring in the profession. The ABA’s Task Force on the Future of Legal Education has just issued a working paper that calls upon law faculty to “reduce the role given to status as a measure of personal and institutional success.” That’s a hard goal to reach without leadership from the top.

Given widespread acknowledgement that the proposal to shift the reporting date stemmed from changes in the US News methodology, we aren’t getting that leadership. Nor are we getting leadership on giving students the information they need, when they need it. This is another black eye for legal education.


Proposed Employment Data Change

June 5th, 2013

On Friday, the ABA Section of Legal Education considers a recommendation from the section’s data policy committee about when schools collect graduate employment data. Instead of collecting data nine months after graduation, schools would collect data ten months after graduation.

The change looks minor, but it’s misguided. The council should dismiss the recommendation outright for the reasons outlined below; at a minimum, it should decline to act on the recommendation this week.

The Committee’s Justification

The committee’s reasoning is straightforward: some graduates don’t obtain jobs by the nine-month mark because some state bars have a slow licensing process. As committee chair Len Strickman puts it in the committee’s recommendation memo, the data policy change would have “the benefit of a more level playing field.”

Several New York and California deans have lobbied for the policy change because those jurisdictions release July bar results so late. Last year, California provided results on November 16th, with swearing-in ceremonies in the following weeks. New York provided results earlier, on November 1st, but many graduates waited months to be sworn in.

A variety of employers, such as small firms and state government, tend to hire licensed graduates. Compared to schools in states with a quicker credentialing process, New York and California schools are disadvantaged on current employment metrics. Changing the measurement date to mid-March instead of mid-February would allegedly take some bite out of the advantage.

To check for a quantifiable advantage, the data policy committee considered two sets of data. First, the committee sorted schools by the percentage of 2012 graduates working in professional jobs (as lawyers or otherwise) as of February 15, 2013. Second, the committee sorted schools by the percentage of 2012 graduates who were unemployed or had an unknown employment status. For both measures, the committee determined that New York and California schools were disproportionately represented on the bad end of the curve.
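That kind of check is easy to replicate in principle. Here is a minimal sketch in Python of the committee’s sorting exercise, using hypothetical data and invented column names (the memo does not publish the committee’s dataset or cutoffs):

    # Sketch of the "bad end of the curve" check, using hypothetical data.
    # Column names and the bottom-quartile cutoff are assumptions, not the committee's.
    import pandas as pd

    schools = pd.DataFrame({
        "school": ["A", "B", "C", "D", "E", "F", "G", "H"],
        "state":  ["NY", "CA", "TX", "NY", "OH", "CA", "FL", "IL"],
        "pct_professional_jobs": [0.48, 0.52, 0.71, 0.55, 0.68, 0.50, 0.66, 0.70],
    })

    # Sort schools from worst to best on the employment measure.
    ranked = schools.sort_values("pct_professional_jobs")

    # Take the bottom quartile ("the bad end of the curve").
    bottom = ranked.head(len(ranked) // 4)

    # Compare NY/CA representation in the bottom quartile to their overall share.
    overall_share = schools["state"].isin(["NY", "CA"]).mean()
    bottom_share = bottom["state"].isin(["NY", "CA"]).mean()
    print(f"NY/CA overall: {overall_share:.0%}, in bottom quartile: {bottom_share:.0%}")

If New York and California schools account for a much larger share of the bottom quartile than of the full sample, that is the disproportion the committee observed.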

Poorly Supported Justification

Professor Strickman notes in his committee memo that many of the poorly-performing schools are “broadly considered to be highly competitive schools nationally.” I’m not sure exactly what this means, but it sounds a lot like confirmation bias. Is he suggesting that the employment outcomes don’t match U.S. News rankings? The committee’s collective impression of how well the schools should perform relative to others? Faculty reputation? It’s a mystery, and without further support, not at all compelling.

Professor Strickman acknowledges that other factors may explain the relative placement. He does not name or address them. Here are some factors that may explain the so-called disadvantage:

(1) Graduate surplus (not just 2012, but for years);
(2) Attractiveness of certain states to graduates from out-of-state schools;
(3) Overall health of local legal markets;
(4) Graduate desirability;
(5) Ability of schools to fund post-graduation jobs.

Nor do we even know whether the rule revision would level the playing field. In other words, one extra month may not capture more professional job outcomes for graduates of New York and California schools than for graduates of other schools. More time, after all, ought to produce better results for all schools with high under- and unemployment.

In sum, the committee should have declined to recommend the ten-month proposal until its proponents met their burden of persuasion. The problem has not been well articulated, and the data do not support the conclusion.

The Accreditor’s Role

Worse than recommending an unsupported policy change, the committee ignores the group for whom law schools produce job statistics: prospective students. Prospective students, current students, and a society that depends on lawyers are the Section of Legal Education’s constituents. Calling the uneven playing field a “disadvantage,” a “penalty,” and a “hardship” for law schools shows whose perspective the committee adopted.

(1) Is there a normative problem with an uneven playing field?

It’s not apparent that there’s an issue to resolve. Grant the committee its premise that state credentialing timelines affect performance on employment metrics. Is it the ABA’s job to ensure that schools compete with each other on a level playing field?

In one sense, yes, of course. When a school lies, cheats, or deceives, it gains an undeserved advantage, and ABA Standard 509 prohibits this behavior. But the standard does not prohibit that behavior because of how it affects school-on-school competition. The prohibition is a consequence of the ABA’s role in protecting consumers and the public.

The ABA was ahead of the curve when it adopted Standard 509 in the 1990s. The organization interpreted its accreditation role to include communicating non-educational value to these constituents through employment information.

Here, the ABA failed to adequately consider the prospective students who want to make informed decisions and the public that subsidizes legal education.

Prospective students received only a passing mention in Professor Strickman’s memo. In describing why the committee rejected several schools’ request to move the measurement date back to a full year after graduation, Professor Strickman explains:

The Data Policy and Collection Committee decided to reject this request because that length of delay would undermine the currency of data available to prospective law students.

As it happens, the committee’s chosen proposal also has a currency problem. Yet the committee failed to convey whether, or how, it considered the change’s impact on the value of the consumer information.

(2) Does the new policy impede a prospective student’s ability to make informed decisions?

One of the ABA’s recent accomplishments was accelerating the publication of employment data. Previously, the ABA published new employment data 16 months after schools measured employment outcomes. In 2013, the ABA took only six weeks.

But if the Section of Legal Education adopts the ten-month proposal, it pushes data publication to the end of April—after many deposit deadlines and on the eve of others. While applicants should not overrate the importance of year-to-year differences, they should have the opportunity to evaluate the changes.

The new policy also makes the information less useful.

At one time, schools reported graduate employment outcomes as of six months after graduation. In 1996, NALP began measuring outcomes at nine months instead. The ABA, which at that time only asked schools to report their NALP employment rate, followed.

The six-month measurement makes far more sense than the nine-month date. Six months after graduating, interest accumulated during school capitalizes and the first loan payment is due. Ideally that six-month period would be used to pay down the accumulated interest so that less interest is paid later. The credentialing process makes this a rarity. Adding another month to the measurement makes the figure even less valuable.
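To see why capitalization matters, consider a stylized example with hypothetical figures (none of these numbers appear in the committee’s memo): a graduate with $100,000 in loans, $8,000 of interest accrued during school, and a 6% rate.

    $100,000 × 6% = $6,000 of interest per year if the accrued interest is paid off before it capitalizes
    ($100,000 + $8,000) × 6% = $6,480 of interest per year if it capitalizes instead

Every jobless month during the grace period makes that pay-down less likely, and the extra $480 per year is interest-on-interest that persists until the capitalized amount is repaid.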

Reducing comparability also dilutes the value of recent employment information. Students should not consider one year of data in isolation, but should analyze changes and the reasons for those changes. It’s for this reason that the ABA requires schools to publish three years of employment data as of last August.

Conclusion: Dismiss or Wait

The council needs to add additional viewpoints to the data policy committee. Right now, the committee is dominated by law school insiders: all twelve members are current faculty, deans, or other administrators. The name change from the “Questionnaire Committee” to the “Data Policy and Collection Committee” envisions a policy role for the group.

Just as the council, standards committee, and accreditation committee need a diversity of viewpoints, so too does the data policy committee. Perhaps if this diversity had existed on the committee to begin with, the new measurement date would not have been recommended prematurely, or at all.

As the council considers whose interests it serves and whether the data policy recommendation is ripe for adoption, I hope its members also ask what is driving the policy beyond a law school lobby promoting its own interests.

The policy presupposes a reality in which so many graduates cannot derive economic value from their law degrees nine months after graduating that the ABA must modify its collection policy in order to count them.

Let me repeat that. It takes so long to become a lawyer that almost a year can pass and it’s reasonable to think many people are not yet using a credential they invested over three years of time, money, and effort to receive. A career is (hopefully) decades long, but the brutal reality of credentialing is that its costs go beyond what any fair system would contemplate. A change to the data policy as a solution would be funny were the economics of legal education not so tragic.


2012 Employment Outcomes

April 2nd, 2013

The ABA has posted employment data for the Class of 2012. The figures are grim by any measure. The downturn in entry-level employment, which schools dismissed as temporary in 2009 and 2010, has persisted for four years–with a fifth year about to graduate. Only 56.2% of 2012 graduates had found full-time, long-term jobs requiring bar admission by nine months after graduation. More than a tenth of the class–10.6%–was still unemployed and actively seeking work at the nine-month mark. Those are shocking numbers for graduates with a professional degree.

The national unemployment rate was just 7.7% in February; the rate for law graduates was almost three points higher. Law schools, moreover, reported that another 2.2% of their graduates were “unemployed but not seeking work,” while still another 2.6% had an employment status that could not be confirmed. The graduates in those categories may belong with the plain old “unemployed”; lower-ranked law schools have a suspiciously high number of graduates who either are not seeking work or refuse to disclose their job status.

All told, therefore, the unemployment rate for graduates of ABA-accredited law schools could be as high as 15.4%–more than one in every seven graduates.

Nor does the bad news stop there. Only 56.2% of graduates found full-time, long-term work that required a bar license. Another 9.5% reported full-time, long-term work for which the JD was an “advantage.” That’s a loosely defined category that includes paralegals and other positions that do not need graduate training. But even if we generously count all of those jobs as worthwhile outcomes for law graduates, less than two-thirds of all graduates (65.7%) secured a full-time, long-term job using their degree. And that’s nine months after law school graduation; more than six months after taking the bar.
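The arithmetic behind these figures, using only the percentages reported above, is simple:

    56.2% (bar admission required) + 9.5% (JD advantage) = 65.7% in full-time, long-term, degree-related work
    10.6% (seeking work) + 2.2% (not seeking work) + 2.6% (status unknown) = 15.4% potentially unemployed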

Will the class of 2013 fare better? That seems unlikely. The class is larger than the class of 2012; it’s the largest class ever to move through ABA-accredited schools. There has been no noticeable upsurge in hiring at private firms, and government budgets are tighter than ever. My admittedly anecdotal sense is that law schools called in all of their remaining favors for the class of 2012. Alumni have already stretched to hire one more graduate; schools are running through funds for short-term jobs. When the class of 2013 joins their still under-employed peers from the classes of 2009 through 2012, the results won’t be pretty.


Transparency Today

March 4th, 2013

ABA Standard 509 governs the consumer information that accredited law schools provide to prospective students. The ABA Section of Legal Education and Admissions to the Bar approved changes to that standard in June 2012, and the revised standard took effect on August 6.

The revised standard was widely publicized; indeed, it followed more than a year of lively discussion about misleading practices in the way some schools reported scholarship retention and employment rates. In response to those concerns, the revised standard includes a requirement that schools publish simple tables disclosing specified information about scholarships and jobs. The ABA provides the tables through downloadable worksheets; law schools have the applicable data readily at hand.

Given the widespread attention to Standard 509, the clear obligation of law schools to provide accurate information to potential students, and the specific worksheets offered by the ABA, quick compliance with Standard 509 should have been a breeze. By December 2012, surely every accredited law school in the country would have published the two mandatory tables.

Sadly, no. In late December and early January, two members of Law School Transparency (LST) visited the website of every ABA-accredited school, searching for the tables mandated by Standard 509. Almost two-thirds of the schools still had not posted one or both of those tables. These schools were actively–even passionately–recruiting students for the fall of 2013. Yet they had allowed an entire semester to pass without posting the basic information about scholarship retention and employment rates that these prospective students deserve to know.

Kyle McEntee and Derek Tokaz, the Executive Director and Research Director respectively of LST, detail these disappointing results in a new paper. At the same time, they have published their findings on LST’s updated Transparency Index.

Before publishing, LST sent each law school the results of its website study. More than 100 law schools contacted LST and, over the next three weeks, Kyle and Derek counseled them on how to improve their compliance with Standard 509. As a result of these efforts, the percentage of schools failing to publish one or both of the mandatory charts has fallen from two-thirds to one-third. The online index reveals each school’s compliance status during the initial LST search (click “Winter 2013 Version”) and the school’s current status (click “Live Index”).

It’s hard to find any cheer in these numbers–other than to applaud LST for their tireless and unpaid work. Schools should have complied with the basics of Standard 509 by October 2012 at the latest. Two months is more than enough time to put readily available information into a spreadsheet and post the information on the web. How many times did non-compliant law schools update their websites between August and January? How much upbeat information did they add to attract applicants? What possibly excuses the failure to post information mandated for the benefit of those applicants? Facts about scholarship retention and employment matter to prospective students; that’s why the ABA requires their disclosure.

The missing 509 charts are just the beginning of the transparency problems that LST identified in its latest sweep of law school websites. The online index reveals still more sobering information. This report raises a serious question for law schools: If we want to provide “complete, accurate and nonmisleading” information to prospective students, and I think that most of us do, then what institutional mechanisms can we adopt to achieve that goal? Our current methods are not working well.

