
As all sentient beings are aware, we have a terrible, horrible, no good, very bad legal job market. According to NALP data, the industry is down 50,000 jobs since 2008, and there is no reason to believe those jobs will ever reappear. If you exclude school-funded positions (5% of the total number of jobs), this market is worse than its previous low point of 1993-1994. In light of these grim economic realities, we feel that potential law students should prioritize their future job prospects over other factors in deciding whether to attend law school. To put it mildly, law school rankings schemes based on inputs (LSAT scores, GPAs, per capita spending, etc.) and reputational surveys have proved unsatisfactory. Hence our release last week of the ATL Top 50 Law Schools, which is based on nothing but outcomes.

(Although he probably disapproves of all rankings, it must be said that the legal world owes a great debt to Kyle McEntee and his colleagues at Law School Transparency. LST has forced us all to look at the publicly available employment data, submitted by the schools to the ABA, in a more meaningful way. Like all good ideas, it seems obvious in retrospect.)

We received a ton of feedback and comments regarding our rankings and our methodology, much of it thoughtful and substantive. (Our very own Elie Mystal weighed in with this takedown the day after we published.) Quite a few recurrent criticisms emerged from the comments. Of course there’s no perfect dataset or methodology. At best, rankings are a useful simulacrum and just one of many tools available to 0Ls for researching and comparing schools.

What follows are the most common criticisms of the ATL Top 50 Law Schools rankings….

Continue reading “The ATL Top 50 Law Schools: A Roundup of Criticism”

As some of you may have heard, U.S. News & World Report, which used to be a magazine found in dentists’ offices, released its annual law school rankings last week. This event sparked even more than the usual amount of angst and hysteria among law deans and students. Then again, this is already the 9th post on ATL concerning this set of rankings, so maybe we’re not helping. Some deans’ heads have rolled already, and angry student petitions are calling for more blood. (Do these reactions among law students run only one way, though? The anger sparked by a drop in the rankings isn’t necessarily matched by a corresponding spike in happiness when a school climbs, as this great pairing of gifs from someone at Chicago Law illustrates.)

Anyway, much of the heightened attention is due to the revisions U.S. News made to its rankings methodology, which now applies different weights to different employment outcomes, giving full weight only to full-time jobs where “bar passage is required or a J.D. gives them an advantage.” Whatever that last bit means. And U.S. News won’t tell us exactly how “part-time” and other categories of employment outcomes factor in. But it is at least an acknowledgement of the perception that, as Staci said yesterday, “all anyone cares about are employment statistics.” (We’ll get back to whether that’s strictly true.) Then again, if employment outcomes make up only 14% of your ranking formula for a professional school, you’re doing it wrong. What would a better, more relevant rankings methodology even look like?

Continue reading “What Would a More Relevant Law School Ranking Look Like? You Tell Us”