Talent Assessment at Law Firms

Noah Messing explores how law firms might reassess the way they select junior associates, and why law students should pay attention to the skills that firms value most, such as legal writing.

Companies like Google, Microsoft, and Facebook have hired thousands of employees over the last decade by relying on brain teasers such as “Why are manhole covers round?” and “How would you weigh your head?” One psychology professor concluded last year that this sort of “puzzle interview is being used with greater frequency by employers in a variety of industries.” Earlier this week, however, a top Human Resources executive at Google reported that his company had scrapped the practice, offering the following admission: “brainteasers are a complete waste of time.” Google realized that its tests failed to identify the traits that correlate with success. For instance, Google now seeks managers who are “consistent and fair” even if they aren’t good at estimating how many golf balls can fit inside a school bus.

Law firms are overdue for a similar reassessment of how they select junior associates. And as a corollary, law students should pay attention to the skills that law firms ask them about.

Let’s start with the employers. Several years ago, I organized a focus group of partners from top-10 Vault firms. I wanted to learn which skills Yale Law should emphasize as we continue to modernize the way that we train our students. The partners (including two corporate attorneys) all said that legal writing was the most important skill for junior associates.

The simplest way to know how candidates write, of course, is to evaluate their writing. Yet many firms don’t even ask for a writing sample. Earlier this week, I asked roughly fifteen students whether firms demanded a writing sample and whether interviewers asked about those writing samples during callbacks. On the first question, one student reported that none of the eight New York firms to which he applied asked for a writing sample. Numerous other students told me that fewer than half of firms asked for one. On the second question—how frequently interviewers inquired about the writing samples during callbacks—the most common answer, by far, was never.

Perhaps law firms study each applicant’s writing sample without asking about it. Admittedly, asking a law student about a writing sample might be akin to asking computer programmers how they would weigh their heads—irritating and unrevealing. As one student wrote to me, the one firm that probed him about his writing sample “took a more aggressive approach than I was expecting,” which “gave me some sense of the firm’s personality,” making him “much more reluctant to accept an offer.” But even if firms privately scrutinize writing samples before extending any offers, writing samples are, even under the best of circumstances, flawed diagnostic tools. Employers have no way of knowing whether a more-experienced lawyer edited the document, whether the applicant prepared the project efficiently, whether the assignment’s “prompt” gave numerous hints to make the challenge easier for students, whether a student omitted an argument for strategic reasons or because of an oversight, or—perhaps most importantly—how well other applicants could have handled the same problem. And most writing samples reveal nothing about an applicant’s ability to draft contractual provisions, write a letter, memorialize negotiations with opposing counsel, or complete the countless other tasks that require lawyers to write capably. In short, writing samples, even when read, provide little insight into an applicant’s ability to perform a typical lawyer’s most important challenges.

Legal employers should have a better way to assess a recruit’s ability to write well. To be sure, talent assessment is always difficult, and I don’t mean to suggest that any simple screen can consistently identify superstars. And other skills, unrelated to legal writing, are obviously critical, too. But all things being equal, better legal writers make better lawyers, and a recruit’s prowess as a writer is easy to assess—provided that every applicant completes the same problem in the same amount of time.

I realize, however, that no big law firm wants to be the first to demand that applicants prepare a comprehensive writing sample. That innovative firm might identify great recruits, but the yield would plummet. Why spend a Saturday afternoon taking the “Cravath Test” if other great firms will extend you an offer without even asking for a writing sample?


This question, however, has three answers.

First, firms should pay applicants for their time. It’s worth it. To everyone. As one possible illustration (which is just a sketch, not a master plan), firms could use traditional screening methods, such as grades, to assess which candidates preliminarily interest them. They could then pay applicants (probably around $300) to take a four- to six-hour performance test. Students would thus be paid generously for their time (but not enough to spawn professional test-takers). Many students who weren’t considering the firm seriously would drop out. NALP could even schedule the tests at the same time to prevent applicants from taking scores of tests.

Under this approach, each participating firm would have much more information about its recruits, would likely get a higher yield when it made offers (because only serious applicants would take the test), and could spend far fewer partner-hours on recruiting. Given the costs of hiring errors, avoiding a single bad recruit would amply cover the costs of the entire venture.

Firms that have made similarly bold moves have thrived. Wachtell, which pays far more than the market, is a juggernaut. Boies Schiller (my old firm) let associates take a stake in contingency-fee cases, attracting lawyers who were willing to bet on themselves. Susman Godfrey paid big bonuses and had a short partnership track. Quinn Emanuel became sexy largely because, in its early days, it gave summer associates a $10,000 bonus on top of their salaries. Innovative employment terms, accompanied by money, tend to attract top applicants. Investing a few hundred dollars per candidate is, by comparison, a bargain.

Second, firms could use some sort of writing competition only at schools where they don’t interview, democratizing the hiring process and identifying overlooked stars. Some small firms already use this approach. For instance, a top civil-rights boutique in Atlanta, Barrett & Farahany, requires applicants (including those from elite schools) to interview a potential client and to write a memo advising the firm whether to take the client’s case—among numerous other steps in the hiring process. Each applicant’s memo is evaluated alongside those of the other candidates, so apples are compared to apples.


Third, the test could be optional, permitting applicants to choose for themselves whether to participate. Eventually, competitive pressures would nudge most students into taking it, just as nearly all of the top college-football prospects attend the diagnostic “Combine” at which the NFL measures strength, speed, agility, and other attributes that correlate with success as a football player. In short, a bit of cash and a sensible test could help law firms make better hiring decisions (including by finding undervalued students within top law schools, such as powerhouses who happen to interview poorly).

What does it say that big firms don’t seek meaningful information about applicants’ ability to write? Do they not care? Or do market pressures simply keep them from demanding relevant information? As counterintuitive as it seems, law students should want employers to assess the skills that they value even if the process becomes a tad more burdensome. If, during an interview, students meet a potential client and advise the firm whether to take that client’s case, there’s a good chance that they’ll be doing that sort of work as an attorney. Most students would want that job. Similarly, if an interviewer strives to learn how well a student analyzes complicated legal issues, the job is probably interesting and challenging. By contrast, if an interview fails to assess students’ skills, there’s a dark truth to the interview: the job may not require any. Applicants should brace themselves for document review or due diligence.

The LSAT, in its glorious imperfection, positively correlates with law school performance and bar passage. Some sort of comparable test (whether customized or standardized) to project how law students will fare as lawyers would help law firms make better hiring decisions and would have the ancillary benefit of encouraging students to develop their skills. Some law schools might train students to ace the various tests, but that’s the outcome for which law firms have been eternally hungering. A good diagnostic test could also assess an applicant’s aptitude at various types of skills, allowing law students to know which practice areas would suit them, either intellectually or temperamentally—just as top investment banks like Goldman Sachs do. Instead, law students often drift into practice groups based on either weak preferences or a firm’s ephemeral needs. I’m fully aware that no firm will develop an assessment test that perfectly forecasts employee outcomes. But the inability to achieve perfection is a straw man: my goal is to develop meaningful diagnostic tools, not infallible ones. In our profession, we still hire people based on questions about manholes. We can do better.

Noah Messing is Yale Law School’s Lecturer in the Practice of Law and Legal Writing. His first book, The Art of Advocacy: Briefs, Motions, and Writing Strategies of America’s Best Lawyers, was released earlier today. Examples from the book appear at www.noahmessing.com. On Twitter, you can follow him at @noahmessing.