ATL Law School Rankings

Recently, a solo practitioner somewhere in the Midwest posted on Facebook about her “incredible” annoyance at the fact that the ATL Law School Rankings do not count solos (and therefore her) as part of a school’s “employment score.”1

That’s unremarkable, of course. We don’t expect or intend that our approach will please everybody. Anyway, the resultant comment thread was, for the most part, a thoughtful discussion of the pros and cons of excluding solo practitioners in evaluating a particular law school class’s employment outcomes. Again, all of this is unremarkable, and — especially considering the ATL rankings were published back in April — hardly worth noting now. But one particular commenter really, seriously disliked the ATL rankings methodology. Before you say “so what?” (or “me too”), consider that the commenter is indisputably one of the most influential law school deans in the country. Not only that, this dean made a “suggestion” in the course of the discussion that, if adopted, would be a game changer for how law schools share employment data….


1 It must be noted that the solo either did not read or did not understand our methodology in the first place. Our employment scores measure outcomes for the most recent graduating class ten months after graduation. She only recently began her practice; prior to that, she worked for a couple of years as a public defender, a job that would have been counted under our formula.

Continue reading “ATL vs. Law Dean vs. Common Sense”

Since we released the ATL Top 50 Law Schools last week, we’ve received a fair amount of feedback and criticism regarding our approach to ranking schools. As noted (again and again), our methodology considers “outcomes” only — the idea being that, in this dismal legal job market, that’s all that truly matters. Our rankings formula weighs six outcomes; the three below were the most disputed:

Supreme Court Clerks. This is simply the number of SCOTUS clerks produced by the school over the last five years, adjusted for the school’s size. By far, this is the most heavily criticized aspect of our methodology. “Preposterous!” “Irrelevant!” “Reflective of some weird fetish on the part of one of your editors!” And so on. To which we say, sure, SCOTUS clerkships are irrelevant in assessing the vast majority of schools. Properly considered, this component is a sort of “extra credit question” that helps make fine distinctions among a few top schools.

Federal Judgeships. The number of sitting Article III judges who are alumni of the school, adjusted for size. Some complain that this is a lagging indicator that tells us something about graduates from 25 years ago but little about today’s students’ prospects. Besides, aren’t these appointments just a function of the appointees’ connections? True enough, but this is certainly an indicator of the enduring strength and scope of a school’s graduate network — surely a worthwhile consideration. Connections matter.

Quality Jobs Score. The percentage of students securing jobs at the nation’s largest law firms combined with those landing federal clerkships. The principal criticism of this metric is that it fails to include some categories of desirable job outcomes, including so-called “JD Advantage” jobs and certain public interest/government positions. However, parsing out the “good” jobs from the rest is precisely the problem. Whenever we could, we used the most straightforward, obtainable, and well-defined data points, with the goal of producing a “quality jobs score” that serves as a reasonable proxy for quality jobs generally.
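For the quantitatively inclined, here is a minimal sketch of how a size-adjusted composite of the three components above could be calculated. The weights, field names, and scaling choices are purely illustrative assumptions; this is not the actual ATL formula, which blends six outcomes.

```python
# A minimal, illustrative sketch of a size-adjusted composite score.
# Weights and field names are hypothetical assumptions, not ATL's formula.

from dataclasses import dataclass


@dataclass
class SchoolOutcomes:
    class_size: int            # most recent graduating class
    scotus_clerks_5yr: int     # SCOTUS clerks over the last five years
    article_iii_judges: int    # sitting Article III judges among alumni
    total_alumni: int          # rough alumni base used to scale judgeships
    biglaw_jobs: int           # graduates at the nation's largest firms
    federal_clerkships: int    # graduates in federal clerkships


def quality_jobs_score(s: SchoolOutcomes) -> float:
    """Share of the class in large-firm jobs or federal clerkships."""
    return (s.biglaw_jobs + s.federal_clerkships) / s.class_size


def composite_score(s: SchoolOutcomes,
                    w_quality: float = 0.6,
                    w_scotus: float = 0.2,
                    w_judges: float = 0.2) -> float:
    """Hypothetical weighted blend of the three disputed components,
    each adjusted for the school's size as described in the text."""
    scotus_per_grad = s.scotus_clerks_5yr / s.class_size
    judges_per_alum = s.article_iii_judges / s.total_alumni
    return (w_quality * quality_jobs_score(s)
            + w_scotus * scotus_per_grad
            + w_judges * judges_per_alum)


if __name__ == "__main__":
    example = SchoolOutcomes(class_size=300, scotus_clerks_5yr=4,
                             article_iii_judges=25, total_alumni=15000,
                             biglaw_jobs=120, federal_clerkships=30)
    print(f"Quality jobs score: {quality_jobs_score(example):.2%}")
    print(f"Composite (hypothetical weights): {composite_score(example):.4f}")
```

The point of the sketch is simply that each component is normalized for school size before any weighting, so a small school with a handful of SCOTUS clerks can outscore a large one on that “extra credit” item.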

Read on for a look at which schools rated best in each of the above categories, as well as on Employment Score and Lowest Cost. We’ll also look at some of the biggest gainers and losers in the ATL 50, plus significant differences between our rankings and U.S. News….

Continue reading “Again With The Law School Rankings: Winners, Losers, Critics, And U.S. News”