Transparency Review in Advance of New Law School Jobs Data

April 4th, 2014

Since 1974, the National Association for Law Placement (NALP) has surveyed graduates of ABA-approved law schools with the help of roughly 200 schools and a nod from the ABA. NALP’s annual survey asks graduates to describe their jobs, their employers, how and when they obtained the positions, and their starting salaries.

NALP checks the data for discrepancies and produces statistical reports of post-graduation employment outcomes for each law school. NALP must keep these “NALP reports” confidential, but individual schools may publish their reports.

NALP Reports

Before the law school transparency movement, law schools did not publish NALP reports online for prospective students and others to see. Instead, these detailed, immensely useful reports occupied dusty filing cabinets. I recall that when my organization first requested these reports from law schools, several career services deans told me they did not know where the reports were.

Though publishing a NALP report carries zero cost, skeptics doubted we’d succeed: “However worthy the effort, I doubt that this group will have much success ….” We obtained just 34 NALP reports from the initial request, but that number grew to 54 a few months later after a handful of LST initiatives.

For the class of 2011, 68 schools had published a NALP report before our annual Transparency Index grew the number of participating schools to 85. Prospective students and interested readers were even more fortunate for the class of 2012. To date, 108 schools—that’s 55% of possible schools—have made their NALP reports public.

If you’re interested in viewing the data we gathered from these NALP reports, please head over to the LST Score Reports. We indicate on school profiles whether a school has decided to withhold information from the public. You can also view a list of the schools publishing NALP reports for the classes of 2010, 2011, and 2012 in our NALP Report Database. Note that we now have access to 60 reports for the class of 2010, 94 for 2011, and 108 for 2012.

Actual Law School Transparency

Law schools deserve a lot of credit for increasingly living up to proclamations in favor of transparency. So too do prospective students, current students, and alumni for demanding information. We accomplished actual transparency without formal legal requests, though we also believe it’s time that the non-participating schools subject to open record laws be ushered into the era of transparency.

Law school opacity harms not only the reputation of the schools that do not participate, but also that of the legal education system at large. Law schools are tasked with training the legal professionals of the future. They hold students to honor codes, require them to attend a class on professional responsibility and ethics, and send them into a profession whose values they must uphold on a daily basis. However, when it comes to their own conduct, too many schools take the position that the minimal level of integrity required to maintain ABA accreditation is good enough. Our hope is that schools that value their academic and social leadership roles will go beyond the bare minimum—and do so without sticks and carrots from LST.

Class of 2013 Employment Data and NALP Reports

Next week, the ABA will publish much of the class of 2013 employment data it collected from law schools in accordance with recently refined accreditation requirements. Many law schools are already publishing information above and beyond the ABA requirements, and we hope these schools continue this positive practice later this summer when they receive their class of 2013 reports from NALP.

If your school does not yet publish what it has at its fingertips, ask why, and explain how inaction is unprincipled, prevents informed decision-making by applicants, and harms the reputations of the school and the profession. Our profession needs affordable, transparent, and fair entry. It starts with something as simple as law schools doing the obvious.


Assuring Transparency

August 3rd, 2013

Law School Transparency (LST) made news last week when several blogs reported that the organization had designed a certification program for law schools. For an annual fee, LST is offering to vet a school’s website and marketing materials for consistency with ABA standards and other best practices; create user-friendly graphics that would inform potential applicants; and certify the school’s transparency to those applicants. The proposal evoked charges that LST was operating a Mafia-like protection scheme, and even violating the Hobbs Act.

Really? Let’s revisit the history behind LST’s proposal.

ABA Requirements

In August 2012, the ABA’s Section of Legal Education and Admissions to the Bar adopted new standards governing law school disclosure of employment outcomes and scholarship retention rates. The Section explained these requirements in a memo distributed to all schools, and directed schools to comply with the mandates by October 5, 2012. Schools already possessed the information required by the new standard; they needed only to publish the data. To make that task as easy as possible, the ABA gave schools two simple templates for displaying data.

In late December of 2012 and early January of 2013, LST’s executive director (Kyle McEntee) and research director (Derek Tokaz) checked compliance with these requirements and issued a report. Despite the ABA’s clear mandate–and the ease of complying with those requirements–LST found that only one-third of accredited law schools had complied. Three months after the mandate took effect, 65.3% of schools had failed to publish at least one of the required tables. One in five schools (20.6%) had not published either chart.

The required charts were not mindless boilerplate. The ABA designed them to offer prospective students (1) key information about the percentage of students retaining conditional scholarships, and (2) basic employment outcomes for recent graduates. The information was essential to balance claims schools were making about scholarships and employment outcomes. Despite widespread recognition of the need for increased transparency, two-thirds of law schools failed to meet the ABA’s minimum standards.

After gathering this disheartening information, McEntee sent customized information to the dean, career services office, and admissions office of each accredited school. Those memos indicated whether the school had posted the ABA-required charts, whether other potentially misleading information appeared on the school’s site, and whether the school “went above and beyond the minimum regulatory standards” by publishing additional accurate, useful data for prospective students. After receiving this information, individuals from 102 different law schools communicated with McEntee, requesting more information about their school’s compliance or counseling on how to improve transparency. [You can find all of these details in the LST report cited above.]

After LST’s feedback, the percentage of schools complying with the ABA requirement doubled. Ninety percent (90.5%) of schools published at least one of the charts required by the ABA, while two-thirds (65.3%) provided both. Numerous schools improved other aspects of their communications with potential students, adopting some or all of the best practices suggested by LST.

In sum:

1. Despite frequent protestations of their improved transparency in communicating with potential applicants, two-thirds of accredited law schools had not complied with the ABA’s minimal disclosure requirements by early January of 2013.

2. Intervention by LST substantially improved compliance.

3. Even after that intervention, one-third of schools still failed to provide basic, required consumer information to law students.

How Do We Secure More Compliance?

As a legal educator, I find that lack of compliance astounding. How could so many law schools fail to comply with the ABA’s minimum transparency standards? These issues aren’t new. The press began spotlighting disclosure gaps in spring 2011, more than a year before the ABA issued its simple requirements. Law deans had vowed to embrace a new era of transparency, suggesting that compliance with the new standards would be quick.

Some schools matched deeds to these words, but a majority did not. The foot dragging hurts the reputation of all law schools, but it hurts compliant schools more than the careless ones. We can’t regain the public’s trust, or recruit students to our programs, if we don’t adhere to our own accreditation standards governing transparency. Rules and lip service aren’t enough; we need compliance.

Who is going to take responsibility for achieving compliance? Do we as faculty have to police law school websites, sending polite notes to deans, admissions directors, and career services directors about omissions? If our own schools are in compliance, will we hound colleagues and deans at other schools about their failures? As scholars, we care about data integrity; as legal educators, we care about the reputation of our community. But how much time are we going to spend vetting the communications of 200 law schools?

LST proposed a solution: It would check transparency on law school websites, assuring consistency with ABA requirements as well as best practices in presenting data. Schools that followed those practices would receive a certification, clearly identified to schools and the public, signaling their compliance with LST standards. LST would charge for its time doing this work. That’s not a surprise: Most of us charge for our time when we work. The only surprise was that LST performed this work for free over the last few years.

This solution also addressed a request that LST had received from several deans. After receiving a high rating on LST’s transparency index, some deans asked if LST would give them a letter attesting to their transparency. Others blogged, tweeted, or posted about their success (see footnote 23 of this review). Schools clearly wanted to demonstrate their commitment to transparency, a desire that LST could fulfill–as long as someone was willing to pay for its time.

LST’s certification program is designed to meet these needs. The price, $1,925 for the first year, would cover modest salaries for the individuals doing this work. For a price comparison, consider that the ABA is paying an advisory firm $75,000 simply to design a protocol for reviewing the integrity of data generated by law schools (a somewhat different need than the one LST proposes addressing). That $75,000 fee won’t cover any actual reviews; it will support only the design of a protocol. LST has already created its protocol–for free. With $75,000, it could apply the protocol to assure that thirty-nine different law schools are providing accurate, transparent data to prospective students.

What Now?

Will LST’s certification program go forward? Fortified by a few negative blog posts, law deans may decide to forgo certification and the best practices it requires. If they do, I hope that faculty at their schools will be willing to pick up the slack. Slipshod practices in reporting data are embarrassing to all of us. For years, I shook my head at the way schools reported salary information without noting response rates. We wouldn’t tolerate those practices in scholarly papers; they’re even less appropriate when urging potential students to attend our schools.

I hope some deans will embrace LST’s certification process. It’s a good way to move forward, demonstrate a real commitment to transparency, and give prospective students the information they need.

Personal Disclosure

LST doesn’t have investors; it’s a nonprofit without shares to sell. It does, however, have some donors and I am one. I gave the organization $500 in 2012 and $5,000 earlier this year; the latter is a bit less than the amount I have been giving each year to the law school where I teach (with that money going to summer fellowships for students). In addition to my financial gifts, I have served as an unpaid adviser to LST.

What do I get for my donations and free advice to LST? No football tickets, mugs, stickers, or expectations of profit. All I “get” is the satisfaction that potential law students will receive the information they need to make good decisions about their careers–and that law schools themselves, encouraged by LST, will volunteer that information more freely.

LST never solicited me for my donations. I was impressed with their work and offered the support I could afford. I am paid well for the work I do, and I think LST deserves to be paid for their work. They have done much to create needed transparency at law schools and to serve prospective law students. I wish other law professors would support LST, even at much lower levels than I have provided. With more donations, LST would not need to charge for the transparency work that it does.

I was sufficiently impressed with Kyle McEntee that I invited him to moderate this blog with me. I don’t agree with everything he writes (and he doesn’t agree with everything I write), but I thought it was important to include a recent graduate’s perspective in a blog about legal education. There are blogs written by professors, and blogs written by recent graduates, but I believe we are one of the few sites trying to combine those perspectives.

And, yes, this blog is “as purely non-profit as the driven snow.” It’s not just non-profit; it’s non-income. No advertising, no trinkets for sale, just ideas to discuss.


Transparency Today

March 4th, 2013

ABA Standard 509 governs the consumer information that accredited law schools provide to prospective students. The ABA Section of Legal Education and Admissions to the Bar approved changes to that standard in June 2012, and the revised standard took effect on August 6.

The revised standard was widely publicized; indeed, it followed more than a year of lively discussion about misleading practices in the way some schools reported scholarship retention and employment rates. In response to those concerns, the revised standard includes a requirement that schools publish simple tables disclosing specified information about scholarships and jobs. The ABA provides the tables through downloadable worksheets; law schools have the applicable data readily at hand.

Given the widespread attention to Standard 509, the clear obligation of law schools to provide accurate information to potential students, and the specific worksheets offered by the ABA, quick compliance with Standard 509 should have been a breeze. By December 2012, surely every accredited law school in the country would have published the two mandatory tables.

Sadly, no. In late December and early January, two members of Law School Transparency (LST) visited the website of every ABA-accredited school, searching for the tables mandated by Standard 509. Almost two-thirds of law schools still had not posted one or both of the tables mandated by Standard 509. These schools were actively–even passionately–recruiting students for the fall of 2013. Yet they had allowed an entire semester to pass without posting the basic information about scholarship retention and employment rates that these prospective students deserve to know.

Kyle McEntee and Derek Tokaz, the Executive Director and Research Director respectively of LST, detail these disappointing results in a new paper. At the same time, they have published their findings on LST’s updated Transparency Index.

Before publishing, LST sent each law school the results of their website study. More than 100 law schools contacted LST and, over the next three weeks, Kyle and Derek counseled them on how to improve their compliance with Standard 509. As a result of these efforts, the percentage of schools failing to publish one or both of the mandatory charts has fallen from two-thirds to one-third. The online index reveals each school’s compliance status during the initial LST search (click “Winter 2013 Version”) and the school’s current status (click “Live Index”).

It’s hard to find any cheer in these numbers–other than to applaud LST for their tireless and unpaid work. Schools should have complied with the basics of Standard 509 by October 2012 at the latest. Two months is more than enough time to put readily available information into a spreadsheet and post the information on the web. How many times did non-compliant law schools update their websites between August and January? How much upbeat information did they add to attract applicants? What possibly excuses the failure to post information mandated for the benefit of those applicants? Facts about scholarship retention and employment matter to prospective students; that’s why the ABA requires their disclosure.

The missing 509 charts are just the beginning of the transparency problems that LST identified in its latest sweep of law school websites. The online index reveals still more sobering information. This report raises a serious question for law schools: If we want to provide “complete, accurate and nonmisleading” information to prospective students, and I think that most of us do, then what institutional mechanisms can we adopt to achieve that goal? Our current methods are not working well.


Take This Job and Count It

January 19th, 2013

In an article in the Journal of Legal Metrics, two Law School Transparency team members outline LST’s methodology for the LST Score Reports, an online tool designed to improve decisions by prospective law students. LST uses employment outcomes, projected costs, and admissions stats to help prospective students navigate their law school options.

Kyle McEntee and Derek Tokaz, the authors of both this paper and the online tool, resist the urge to rank schools on a national scale. Instead, they sort schools by where their graduates work post-graduation, allowing applicants to consider schools by geographic profile. The reports then use reader-friendly terms, like the percentage of graduates who secured full-time legal jobs, to help prospective students make educated decisions about which schools, if any, can meet their needs.

McEntee and Tokaz designed the reports to help prospective law students, but this article has important information for legal educators as well. The U.S. News rankings won’t disappear any time soon, but I think prospective students will begin looking at LST’s Score Reports in addition to the rankings. The reports contain more nuanced information, which prospective applicants will value; they also try to direct applicants into deeper exploration of their law school options.

As McEntee and Tokaz show, employment scores correlate imperfectly with U.S. News rank. As applicants begin to consider these scores, together with more transparent employment information on the schools’ websites, some schools will benefit while others suffer. Schools that under-perform their U.S. News score in job placement may want to explore why. Prospective students certainly will.

The other lesson for educators is that the vast majority of legal hiring is local. Students tend to stay in the city, state, and general region where they earned their law degree. As employers increasingly demand internships and unpaid apprenticeships, this trend may become even more dominant. It is hard to work part-time for a firm in one city while attending class in another. It’s far from impossible these days, with internet commuting, but students who lack face-time with prospective employers will be at a disadvantage. It’s also daunting to relocate after law school without a job in hand.

Law schools may find this information discouraging; most schools cherish their “national reputation” and want to extend it. It’s important to recognize, however, that the best job opportunities for graduates may be local ones. Time that a school spends promoting its national brand may deliver less return for graduates than time spent at local bar meetings.

On the bright side, schools should understand that a “national reputation” can co-exist with primarily local placement rates. That, in fact, is the reality for a vast number of law schools today. People around the country have heard about many law schools, even when those schools place most of their graduates locally. National reputation takes many forms and can pay off in many ways–even for graduates in later years. One lesson that I take from McEntee and Tokaz’s paper, however, is that schools should focus more diligently on their local, state, and regional reputations. That’s where the majority of job opportunities for graduates will lie.


About Law School Cafe

Cafe Manager & Co-Moderator
Deborah J. Merritt

Cafe Designer & Co-Moderator
Kyle McEntee

ABA Journal Blawg 100 Honoree

Law School Cafe is a resource for anyone interested in changes in legal education and the legal profession.

Have something you think our audience would like to hear about? Interested in writing one or more guest posts? Send an email to the cafe manager. We are interested in publishing posts from practitioners, students, faculty, and industry professionals.
