Law School Rankings Metrics? It's All About Jobs

What factors should be taken into account when ranking law schools?

Before taking on the massive commitment and expense of a law school education, prospective students need to do some serious homework. But let’s face it: not everyone will. The task of analyzing the available data is daunting enough that many won’t bother.

In spite of concerns that rankings “facilitate laziness” or “pervert incentives,” we can agree that rankings aren’t going to disappear any time soon. People will still demand guidance, preferably in the form of easy-to-understand lists. For our part, ATL will continue to produce our own version of law school rankings. (We are releasing the 2014 rankings next Tuesday. You can register to see a live broadcast here.)

Last week we surveyed our readers for their views on what would be the most relevant elements of a law school rankings methodology. What did the readers have to say?

We were pleased to see readers confirm the primacy of employment outcomes in evaluating school choices. A relief, because it was too late to change our ranking formula, which focuses on such outputs rather than “inputs” like LSAT scores, GPA, and “resources” (i.e., spending).

Also, it must be noted that the ATL Law School Rankings are the only rankings of which we’re aware to incorporate the latest ABA employment data concerning the class of 2013. (For example, this well-known law school ranking, though labeled “2015,” relies on class of 2012 employment stats.)

Expectations about transparency and law graduate employment data have changed so fundamentally and quickly that it seems strange that, until the class of 2011, the ABA required law schools to report only how many students were employed nine months after graduation, without regard for the type of job. Those were the days of widespread 90%+ “employment rates.” But when the ABA began requiring schools to provide more detailed information, we all learned that barely a majority of graduates across all law schools had secured full-time jobs requiring bar passage even nine months after graduation.


So in this legal job market, it makes sense that people want to know which grads are getting what jobs. When we asked these questions last year, our findings were similar, though in 2014 this emphasis on outcomes is even more pronounced. For example, this year only 18% of respondents tell us that they believe LSAT scores are “highly relevant” in assessing a law school. Last year it was 27%. Here’s a visual representation of the survey results:

Employment data was the one factor that not a single respondent deemed “irrelevant”; “library resources” was considered least relevant. Factors cited under the “Other” category revolved around diversity, starting salary, and debt load. Some representative survey comments:

“Not clerkships again. No one cares except Lat.”

“Faculty scholarly productivity and federal clerkship placement are both totally worthless measurements for different reasons. Faculty publication in a non-scientific field like law is simply irrelevant to the real world. It exists as a (flawed) tool for the employer (the university) to evaluate the performance of an employee (the professor).”

“Rankings should account for inclusiveness and diversity. Focusing only on LSAT score and GPA may discount schools that actively include students from cultural or socio economic circumstances that face greater challenges with respect to standardized tests. Diversity of the backgrounds of students should be a positive factor because it brings richness to classroom discussion etc.”

“I’m not sure why LSAT and GPA are factors at all. What do undergrads know about the reputations of practicing alums? It should all be about, in no particular order, Salary-debt ratio, % of legal employment, quality of employment, cost. Everything else is icing.”

“Employment matters above all else. Law schools are trade schools. If a school’s students cannot profitably practice the trade upon graduation, then the school should be shuttered. Period. All other factors are niceties.”

After considering all the feedback and criticism we received last year, we’ve decided to maintain our original methodology, which is as follows:


  • 30% Employment Score (full-time, bar passage required, no school-funded positions)
  • 30% “Quality Jobs” Score (members of the class of 2013 securing positions with the largest U.S. law firms or federal judicial clerkships)
  • 15% Cost
  • 10% Alumni Rating
  • 7.5% Supreme Court Clerkships
  • 7.5% Active Federal Judgeships

Some have complained that the latter two metrics are of little relevance to the vast majority of schools and students. They certainly are — and that is sort of the point. As Karen Sloan noted just yesterday in the National Law Journal, there are two distinct legal education worlds. Real legal job outcomes matter across all law schools, but certain indicia of “prestige” help us make small distinctions among the schools in the highest tier, where small distinctions are the only kind.
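For readers who like to see the arithmetic, here is a minimal sketch (in Python, not ATL’s actual code) of how weights like these could combine into a single composite score. The component names and example numbers are hypothetical, and we assume each component has already been normalized to a common 0–100 scale.

```python
# Hypothetical sketch of a weighted composite score using ATL's published weights.
# Each component is assumed to be pre-normalized to a 0-100 scale.

WEIGHTS = {
    "employment_score": 0.30,    # full-time, bar passage required, no school-funded jobs
    "quality_jobs_score": 0.30,  # largest U.S. firms or federal clerkships
    "cost": 0.15,
    "alumni_rating": 0.10,
    "scotus_clerkships": 0.075,
    "federal_judgeships": 0.075,
}

def composite_score(components):
    """Weighted sum of normalized component scores (0-100 each)."""
    return sum(WEIGHTS[name] * components[name] for name in WEIGHTS)

# Example with made-up numbers for an imaginary school:
example = {
    "employment_score": 82.0,
    "quality_jobs_score": 45.0,
    "cost": 60.0,           # higher = more affordable, under this assumption
    "alumni_rating": 70.0,
    "scotus_clerkships": 5.0,
    "federal_judgeships": 12.0,
}
print(round(composite_score(example), 1))  # prints 55.4 for this example
```

The point of the sketch is simply that the two “prestige” components carry only 15% of the total weight, so they shift a school’s score meaningfully only when everything else is close.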
