Just yesterday, the latest batch of starry-eyed dreamers sat for the LSAT (although the number of these hopeful 0Ls seems to be in freefall). As they wait for their scores to come in, these aspiring JDs will no doubt be doing their research and narrowing down where to apply. Law school applicants have no shortage of resources at their disposal to help them make their decisions and navigate the process: from U.S. News to Princeton Review, from Anna Ivey to Top Law Schools. But we all know that there is no decision-making tool as beloved as a ranked list. People love rankings — such time and energy savers! We suspect more application and matriculation decisions are made by perusing rankings than anyone will ever admit.

Regular readers of this site might recall that a little while back we published our inaugural ATL Top 50 Law Schools ranking. We are proud that, rather than burying our methodology in the footnotes or an obscure appendix, we prefaced our rankings release with a detailed discussion of the choices we made in devising it.

Whatever the subject matter, anyone looking to rate or rank anything has to choose among three basic methodological approaches:

1. Just make stuff up. This is the “trust me, I’m an expert” approach. Surprisingly widespread (e.g., Robert Parker, Mel Kiper). Clearly this would not do for a law school ranking.

2. Focus on a single data point. Most data is flawed in one way or another. Thus the advantage of multiple data points — you hope for some flattening out of the self-interest, human error, and other factors that affect the quality of your data. Obviously, all ranking schemes are essentially reductionist, but focusing on a single piece of data is even more so. If we are talking about PPP or “prestige,” those can serve as a fine organizing principle for comparing law firms. Yet choosing which law school to attend is a complex and highly personal process. Or, at least, it should be. No single data point can capture all the various considerations of a real-life potential law school applicant.

3. Apply a formula that blends various relevant metrics. This is the approach ATL used. In the apportionment of a methodology’s weights, the ranker’s agenda is revealed. As a wise man once said, the objective is subjective. In our case, we prioritized quality employment outcomes above all. But if we broke down the rankings by the individual data points (employment score, SCOTUS clerkships, cost, etc.), what would the top schools be in each category?
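In the abstract, a blended-metrics ranking is just a weighted sum of normalized scores. The sketch below illustrates the idea; the weights and metric values are entirely hypothetical and do not reflect ATL's actual methodology.

```python
# Hypothetical sketch of a blended rankings formula. The weights and the
# sample school's scores below are illustrative placeholders, NOT the
# actual numbers used in the ATL Top 50 methodology.

def composite_score(metrics, weights):
    """Weighted sum of metric scores (each assumed normalized to 0-100)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[name] * metrics[name] for name in weights)

# Illustrative weights emphasizing employment outcomes, per the article.
weights = {
    "employment_score": 0.40,
    "quality_jobs": 0.30,
    "cost": 0.15,
    "scotus_clerks": 0.075,
    "federal_judges": 0.075,
}

# A made-up school's normalized scores on each metric.
school = {
    "employment_score": 92.0,
    "quality_jobs": 85.0,
    "cost": 60.0,
    "scotus_clerks": 40.0,
    "federal_judges": 55.0,
}

print(composite_score(school, weights))
```

Shifting any weight — say, from employment outcomes to cost — reshuffles the resulting order, which is exactly how a ranker's priorities end up baked into the final list.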

Below are the top five law schools based on each of the individual data points that compose our rankings formula. For each of these metrics, we controlled for the law school’s enrollment:

Employment Score (full-time, long-term jobs requiring bar passage, excluding solos and school-funded positions):

1. Penn (overall ATL ranking: 5)
2. Stanford (2)
3. Chicago (4)
4. UC Berkeley (9)
5. Columbia (8)

Quality Jobs Score (percentage of 2012 graduates placed in federal clerkships and the largest 250 law firms):

1. Stanford (2)
2. Penn (5)
3. Chicago (4)
4. Harvard (3)
5. Yale (1)

Cost (cheapest, adjusted for regional cost of living):

1. BYU – J. Reuben Clark (28)
2. University of New Mexico (26)
3. University of Alabama Law (27)
4. Georgia State (42)
5. University of Georgia (19)

Adjusted Percentage of U.S. Supreme Court Clerks (not really relevant for most schools, yet useful in making distinctions among the elite schools):

1. Yale (1)
2. Harvard (3)
3. Stanford (2)
4. Virginia (7)
5. Chicago (4)

Adjusted Percentage of Sitting Federal Judges (ditto above):

1. Yale (1)
2. Harvard (3)
3. Stanford (2)
4. Texas (14)
5. Virginia (7)

Finally, for those of you who have yet to do so, please take a few minutes to complete our absolutely confidential ATL Insider Survey and tell us about your firm or law school. Thanks.

