As all sentient beings are aware, we have a terrible, horrible, no good, very bad legal job market. According to NALP data, the industry is down 50,000 jobs since 2008, and there is no reason to believe those jobs will ever reappear. If you ignore school-funded positions (5% of the total number of jobs), this market is worse than its previous low point of 1993-1994. In light of these grim economic realities, we feel that potential law students should prioritize their future job prospects over other factors in deciding whether to attend law school. To put it mildly, law school ranking schemes based on inputs (LSAT scores, GPAs, per capita spending, etc.) and reputational surveys have proved unsatisfactory. Hence our release last week of the ATL Top 50 Law Schools, which is based on nothing but outcomes.
(Although he probably disapproves of all rankings, it must be said that the legal world owes a great debt to Kyle McEntee and his colleagues at Law School Transparency. LST has forced us all to look at the publicly available employment data, submitted by the schools to the ABA, in a more meaningful way. Like all good ideas, it seems obvious in retrospect.)
We received a ton of feedback and comments regarding our rankings and our methodology, much of it thoughtful and substantive. (Our very own Elie Mystal weighed in with this takedown the day after we published.) Quite a few recurrent criticisms emerged from the comments. Of course there’s no perfect dataset or methodology. At best, rankings are a useful simulacrum and just one of many tools available to 0Ls for researching and comparing schools.
What follows are the most common criticisms of the ATL Top 50 Law Schools rankings…
1. New York City gets the shaft.
Please. Columbia and NYU have somehow gotten the shaft in your methodology – which means the methodology is flawed. Most “Biglaw” firms in their “Biglaw” nerve center (NYC) are fed by the “Biglaw” feeders – Harvard, Columbia, NYU. The rest have some good representation, but, again, please.
This is true. Columbia and NYU fare worse in our rankings than they do in the U.S. News rankings. This can be explained by the confluence of two factors:
a.) Of all legal markets, New York was hit the hardest by the recession. According to NALP data, back in 2007, New York accounted for more than 12% of all legal jobs in the country. That is now down to 8.44%.
b.) Of all sectors of the legal employment market, large firms have been the most volatile and have undergone the biggest contraction.
Thus, it follows that schools that have long been dependent on Biglaw hiring in New York are at a disadvantage if we're strictly looking at employment outcomes.
2. The “Quality Jobs” category is too limited.
How about consulting firms? A number of graduates in my class (at a top school) went to McKinsey, Bain, or BCG. I’d argue that these jobs are as hard to get as top firm jobs and require an MBA, JD or MD, so should also be treated the same as the “quality jobs.”
Yes, the proverbial high-paid consulting gig (usually contrasted to a mere barista in order to make a point about the range of non-legal careers out there). The ATL Top 50 excluded so-called “JD Advantage” jobs. Dan Rodriguez of Northwestern makes many fair points in criticizing this omission. No doubt there are many great jobs in the JD Advantage category, but parsing out the “good” non-legal jobs from the rest is the problem. Whenever we could, we used the most straightforward, well-defined data points, such as large firm and federal clerk placement statistics. Even if the DOJ Honors Program, Skadden Fellows, or many other great jobs aren’t included, we still feel that, based on the currently available data, our “quality jobs” score is a reasonable proxy for quality jobs generally.
3. Cost: What about tuition assistance? Scholarships? Buying power index?
Did the ATL cost/affordability component account for scholarships/grants and post-graduation loan repayment assistance, or is it only tuition cost?
We just looked at sticker price, adjusted for local cost of living. Of course, it would be preferable to be able to incorporate the “average amount of non-dischargeable debt” into our methodology, but that data is not forthcoming from the schools.
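To illustrate what "sticker price, adjusted for local cost of living" might look like in practice, here is a minimal sketch. The function, its inputs, and all the figures below are hypothetical: ATL has not published its adjustment formula, and the numbers are invented purely for illustration.

```python
def adjusted_cost(sticker_tuition: float,
                  base_living_expenses: float,
                  col_index: float) -> float:
    """Hypothetical cost figure: sticker tuition plus living expenses
    scaled by a local cost-of-living index (1.0 = national average).

    This is one plausible way to adjust for local cost of living; it is
    NOT the actual ATL methodology, which has not been published in this
    form.
    """
    return sticker_tuition + base_living_expenses * col_index


# Invented example: same sticker tuition, but an expensive market
# (index 1.4, e.g. a large coastal city) vs. an average one (index 1.0).
expensive_market = adjusted_cost(50_000, 20_000, 1.4)  # 78000.0
average_market = adjusted_cost(50_000, 20_000, 1.0)    # 70000.0
```

The point of any such adjustment is that two schools with identical sticker prices can impose very different total costs depending on where a student has to live for three years.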
4. Including SCOTUS and federal clerks as well as federal judgeships doesn’t make sense. Those positions are obtained by people with connections and they are rare — to give them so much weight is elitist.
It’s true that these positions are rare, but we felt it made sense to include them because they helped differentiate among the top schools. Farther down the list, they just aren’t much of a factor. As for the elitism charge? Guilty. It’s simply the nature of the profession. Our rankings aren’t for everyone.
5. The top schools are the same as USN, just shuffled.
Not quite. In the U.S. News top 10, you see, at most, a couple of schools switching positions each year. So if you compare the two lists from the perspective of the schools that have been holding steady at a certain spot for a while, the ATL top 10 actually feels like a bit of a shake-up by comparison (e.g., Columbia and NYU). Of course, the top three are the same — is anyone seriously arguing that those aren’t the three top schools? Any rankings approach that found otherwise could not be taken seriously. And outside the top tier, the changes are more significant. Some schools that fly a bit under the national radar but do well in their local job markets benefited from our rankings, including New Mexico (+38 places compared to U.S. News), SMU (+26), and Seton Hall (+28).
Finally, if you haven’t already, please be sure to take our (brief and confidential) ATL Insider Survey.