What Would a More Relevant Law School Ranking Look Like? You Told Us

What does the ATL readership believe should matter when it comes to ranking law schools?

Last week, we asked for your thoughts on what an improved, more relevant approach to law school rankings would look like. The request was prompted by U.S. News's revisions to its rankings methodology, which now applies different weights to different employment outcomes, giving full credit only to full-time jobs where "bar passage is required or a J.D. gives them an advantage." U.S. News is, of course, bowing to the realities of the horrific legal job market and the spreading realization that, for many if not most students, pursuing a J.D. makes little economic sense.

Yet U.S. News’s revamped methodology feels like a half-measure at best, as employment outcomes make up less than 20% of the rankings formula. Compare this to the 40% of the score based on “quality assessment” surveys of practicing lawyers, judges, and law school faculty and administrators. Shouldn’t those numbers be reversed?

In any event, last week about 500 of you weighed in with your opinions on which criteria should matter and which should not when it comes to ranking law schools. The results are after the jump….

The subject of law school rankings sets off the ATL audience like few others. Our survey elicited a high volume of impassioned and thoughtful commentary. Here is a small representative sample:

Faculty scholarship is less than useless and should not be a factor in anything, anywhere, at all. Most of those hacks have never actually worked for a living — who cares what bullsh*t they push in hopes of establishing a "space law" concentration that might sucker a few kids into coming to their school?

Law school is a trade school. What matters most are: (a) where people work after they leave; (b) what caliber of students attend the school, as measured by LSAT and undergrad GPA (which is a back-door way of figuring out which schools are most in demand by students and law firms); and (c) what other practitioners think of the school.

Library resources, particularly library size, are a wholly useless metric in the Internet Age. Reputation is equally irrelevant when the sampling is taken from judges (who tend to be curmudgeonly old law hounds with reputational perceptions that have been set in stone since the Cold War), and legal academics (law hounds of varying ages inordinately drawn from the elite schools, and therefore reflecting strictly elite perceptions of reputation). What on earth does the average legal academic know about which schools turn out good litigators?

[Clinics are] the only thing that counts at all in law school. How many more articles can I read about how law school teaches students to "think like a lawyer"? Okay, so do that for six months or a year, not three. It is important for people to learn how to practice law; it's a shame how many law students and recent graduates haven't the slightest idea how to draft a motion or file a complaint.

There are two categories of metrics that can be used in devising a rankings formula: "inputs" (e.g., LSAT scores, undergraduate GPAs) and "outputs" (e.g., employment data). The results of last week's survey show a resounding preference on your part for the latter. Below are the rankings criteria, ordered from least to most relevant:


The top three most relevant factors are employment outcomes. When it came to “other” or write-in factors, these three were by far the most common: bar passage rate, average student debt, and clinical programs.

Six-tenths of one percent of you (i.e., three people) suggested that "suicide rate" would be a handy data point to consider in any rankings scheme. Today, that suggestion is merely tasteless; five years ago it would have been a bit puzzling as well.
