Law School Rankings

Since we released the ATL Top 50 Law Schools last week, we’ve received a fair amount of feedback and criticism regarding our approach to ranking schools. As noted (again and again), our methodology considers “outcomes” only — the idea being that, in this dismal legal job market, that’s all that truly matters. Our rankings formula weighs six outcomes; these three below were the most disputed:

Supreme Court Clerks. This is simply the number of SCOTUS clerks produced by the school over the last five years, adjusted for the school’s size. By far, this is the most heavily criticized aspect of our methodology. “Preposterous!” “Irrelevant!” “Reflective of some weird fetish on the part of one of your editors!” And so on. To which we say, sure, SCOTUS clerkships are irrelevant in assessing the vast majority of schools. Properly considered, this component is a sort of “extra credit question” that helps make fine distinctions among a few top schools.

Federal Judgeships. The number of sitting Article III judges who are alumni of the school, adjusted for size. Some complain that this is a lagging indicator that tells us something about graduates from 25 years ago but little about today’s students’ prospects. Besides, aren’t these appointments just a function of the appointees’ connections? True enough, but this is certainly an indicator of the enduring strength and scope of a school’s graduate network — surely a worthwhile consideration. Connections matter.

Quality Jobs Score. The percentage of students securing jobs at the nation’s largest law firms, combined with those landing federal clerkships. The principal criticism of this metric is that it fails to include some categories of desirable job outcomes, including so-called “JD Advantage” jobs and certain public interest/government positions. However, parsing out the “good” jobs from the rest is precisely the problem. Wherever we could, we used the most straightforward, obtainable, and well-defined data points, with the goal of making the “quality jobs score” a reasonable proxy for quality jobs generally.
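The arithmetic behind a metric like this is simple. Here is a minimal sketch of how a “quality jobs score” of the kind described above might be computed; the function name and the figures in the example are hypothetical, not ATL’s actual formula or data.

```python
def quality_jobs_score(biglaw_jobs, federal_clerkships, graduating_class):
    """Hypothetical sketch: percentage of a graduating class landing
    large-firm jobs or federal clerkships."""
    return 100 * (biglaw_jobs + federal_clerkships) / graduating_class

# Illustrative numbers only: 120 large-firm placements and
# 30 federal clerkships out of a class of 300 graduates.
score = quality_jobs_score(120, 30, 300)
print(round(score, 1))  # 50.0
```

The same per-capita logic underlies the clerkship and judgeship components: a raw count divided by school size, so that large schools don’t win simply by being large.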

Read on for a look at which schools rated best in each of the above categories, as well as on Employment Score and Lowest Cost. We’ll also look at some of the biggest gainers and losers in the ATL 50, plus significant differences between our rankings and U.S. News….

Continue reading “Again With The Law School Rankings: Winners, Losers, Critics, And U.S. News”

As all sentient beings are aware, we have a terrible, horrible, no good, very bad legal job market. According to NALP data, the industry is down 50,000 jobs since 2008, and there is no reason to believe those jobs will ever reappear. If you exclude school-funded positions (5% of the total number of jobs), this market is worse than its previous low point of 1993–1994. In light of these grim economic realities, we feel that potential law students should prioritize their future job prospects over other factors in deciding whether to attend law school. To put it mildly, rankings schemes based on inputs (LSATs, GPAs, per-capita spending, etc.) and reputational surveys have proved unsatisfactory. Hence our release last week of the ATL Top 50 Law Schools, which is based on nothing but outcomes.

(Although he probably disapproves of all rankings, it must be said that the legal world owes a great debt to Kyle McEntee and his colleagues at Law School Transparency. LST has forced us all to look at the publicly available employment data, submitted by the schools to the ABA, in a more meaningful way. Like all good ideas, it seems obvious in retrospect.)

We received a ton of feedback and comments regarding our rankings and our methodology, much of it thoughtful and substantive. (Our very own Elie Mystal weighed in with this takedown the day after we published.) Quite a few recurrent criticisms emerged from the comments. Of course there’s no perfect dataset or methodology. At best, rankings are a useful simulacrum and just one of many tools available to 0Ls for researching and comparing schools.

What follows are the most common criticisms of the ATL Top 50 Law Schools rankings….

Continue reading “The ATL Top 50 Law Schools: A Roundup of Criticism”

Yesterday, we released the inaugural ATL Top 50 Law School rankings. A lot of us here worked really hard on it. I’d be lying if I said I wasn’t proud of the effort.

But I haven’t made my career on liking things. I hate things. If anybody else released new law school rankings, I’d be critical of them. There’s no reason I should give ATL special treatment.

No rankings are perfect — ours certainly aren’t — so we should talk about the problems. And I mean the real problems, not the stupid interview answer of, “I think my biggest weakness is that sometimes I try too damn hard.”

Let’s douse these new rankings in a cold shower of haterade….

Continue reading “Everything That Is Wrong With The Above the Law Law School Rankings”

We present the inaugural ATL Top 50 Law School Rankings. Our rankings methodology is based purely on outcomes, especially on the schools’ success in placing their graduates into quality, real attorney jobs.
