When Bar Scores Plummet, Who Will Examine The Examiners?

What caused bar exam pass rates to drop across the country? We don't know -- and we can't know....

The Texas Board of Law Examiners released results from the July 2014 bar exam this week. July 2014 bar passage rates across the nation dropped from their July 2013 equivalents. Among the 29 jurisdictions that have so far released July 2014 scores, four have seen a decrease in pass rates of at least ten points. Thirteen have seen drops of five to nine points. Only eleven states’ averages remained unchanged or dropped by four points or fewer.

Nationwide, scores on the Multistate Bar Exam (MBE) sank to a mean scaled score of 141.7, nearly three points lower than the national mean for the July 2013 exam. This marks the single biggest year-to-year drop since the start of the MBE. July 2014’s score is the lowest national mean MBE score in ten years. Especially given that so many states use MBE results to scale the scoring of the state’s essay portion of the exam, it is little wonder that pass rates have plummeted. (For state-by-state statistics, look at Deceptively Blonde’s helpful collection of statistics here and here.) As more states announce results in the next few weeks, expect to see more depressed — and depressing — statistics.
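To make the scaling point concrete, here is a minimal sketch, using entirely made-up numbers, of the common practice of rescaling raw essay scores to match the mean and standard deviation of a jurisdiction's MBE scaled scores. The passing line, the weighting, and every score below are hypothetical; states vary in the details and do not publish their data. The point is purely mechanical: hold essay performance constant, lower the MBE mean by roughly three points, and combined scores (and pass rates) fall with it.

import statistics

def scale_essays_to_mbe(essay_raw, mbe_scaled):
    """Rescale raw essay scores so they share the MBE scaled scores'
    mean and standard deviation (a standard linear transformation)."""
    e_mean, e_sd = statistics.mean(essay_raw), statistics.pstdev(essay_raw)
    m_mean, m_sd = statistics.mean(mbe_scaled), statistics.pstdev(mbe_scaled)
    return [m_mean + (x - e_mean) / e_sd * m_sd for x in essay_raw]

# Two hypothetical six-person cohorts with identical essay performance; the
# 2014 group's MBE scaled scores run three points lower across the board.
essay_raw = [58, 62, 65, 70, 74, 78]
mbe_2013 = [135, 136, 142, 146, 150, 155]     # hypothetical, mean 144.0
mbe_2014 = [m - 3 for m in mbe_2013]          # hypothetical, mean 141.0

for label, mbe in (("July 2013", mbe_2013), ("July 2014", mbe_2014)):
    essays_scaled = scale_essays_to_mbe(essay_raw, mbe)
    # Hypothetical combined score: equal weight to MBE and scaled essays,
    # measured against a fixed (also hypothetical) passing line of 135.
    combined = [(m + e) / 2 for m, e in zip(mbe, essays_scaled)]
    passed = sum(c >= 135 for c in combined)
    print(f"{label}: scaled essay mean {statistics.mean(essays_scaled):.1f}, "
          f"{passed} of {len(combined)} pass")

In this toy example, identical essay answers earn lower scaled scores in 2014 simply because they are scaled against a weaker MBE distribution, so the same borderline candidate who passed in 2013 fails in 2014 even though no cut score and no essay changed.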

Mere chance is unlikely to be the culprit when the decreases are so dramatic and so widespread. So, what happened? The National Conference of Bar Examiners blames students . . . .

NCBE president Erica Moeser sent an October 23 memo to law school deans. The subject line reads, “RE: Two Matters.” The memo begins with a clunky “I have been intending to write to you as a reminder that Civil Procedure will appear as the seventh content area on the Multistate Bar Examination beginning in February 2015 . . . .” Though it’s unclear what Moeser has been up to that has kept her from shooting off this friendly note to law schools — she has been intending to do it for a while, you see — law schools hardly need to be reminded that Civ Pro will appear on future bar exams. That’s a bit like sending a memo to Republicans in the U.S. Congress last week saying, “Oh, by the way, I have been intending to write to you as a reminder that the midterm elections will take place next week.”

After a quick statement of the obvious about Civ Pro, Moeser moves into the meat of the memo: the nationwide drop in MBE scores. She writes, “While we always take quality control of MBE scoring very seriously, we redoubled our efforts to satisfy ourselves that no error occurred in scoring the examination or in equating the test with its predecessors. The results are correct.” She continues:

“Beyond checking and rechecking our equating, we have looked at other indicators to challenge the results. All point to the fact that the group that sat in July 2014 was less able than the group that sat in July 2013. In July 2013 we marked the highest number of MBE test-takers. This year the number of MBE test-takers fell by five percent. This was not unanticipated: figures from the American Bar Association indicate that first-year law school enrollment fell 7% between Fall 2010 (the 2013 graduating class) and Fall 2011 (the 2014 class). We have been expecting a dip in bar examination numbers as declining law school applications and enrollments worked their way up to the law school graduation stage, but the question of the performance of the 2014 graduates was of course an unknown.”

Moeser ends her memo by reassuring deans that “had we discovered an error in MBE scoring, we would have acknowledged and corrected it.” How comforting.

The NCBE suggests that the decline in law school enrollment accounts for the score drop. Although Moeser leaves a few links in the chain of reasoning unspoken, the argument goes like this: Fewer people applied to law school in 2011 than in 2010. Fewer total applicants meant fewer well-qualified applicants. Fewer well-qualified applicants meant that schools lowered their admission standards in order to fill seats in their incoming classes. Graduating classes of students admitted on lowered standards meant less-qualified bar-takers in 2014. Less-qualified bar-takers meant lower MBE scores.

We need to look more closely at the class of 2014 in order to accept the NCBE’s hypothesis. First, notice that a drop in enrollment is not the same as a drop in admissions standards. Enrollment might drop in part because schools don’t open their doors wider, even when faced with a smaller pool of applicants. After all, wouldn’t the problem be greater if schools kept enrolling the same number of students, even though the number of well-qualified applicants fell? That would suggest that schools were lowering their standards simply to keep enrollment numbers up. A drop in enrollment suggests, if anything, that schools were not willing to admit some people, even if that meant that they would have 7% fewer tuition checks coming their way. Law schools have no doubt adjusted their admissions standards in recent years, but the 2011 enrollment numbers don’t immediately reflect that.

Were the incoming metrics for that class lower than for previous classes? National LSAT scores for the class in question were not significantly lower overall. Derek Muller, who has done a yeoman’s job of covering the MBE score drop on Excess of Democracy, has a helpful graph depicting the relationship between mean MBE scores and the size of entering 1L classes.

Muller also considers the hypothesis that ExamSoft frustrations affected MBE scores. As he points out, however, jurisdictions that do not use ExamSoft do not seem to buck the national trend. The passage rates in Arizona, Virginia, and the District of Columbia dropped by 7, 7, and 8 points, respectively. Moreover, not all states that used ExamSoft experienced the dreaded upload delays.

So, what caused the drop? Did the test change in unannounced ways? Did the exam include an unusual number of questions that performed statistically in unexpected ways? Did the equating process used to convert raw scores into scaled scores differ this year?

We — law schools, law students, members of the profession — don’t know the answers to those questions. And we can’t know them. Why? Because we can’t know what the NCBE won’t tell us. The NCBE does not tell us much, unfortunately. It doesn’t have to. The NCBE keeps many of the MBE’s details, including the supremely important equating process, confidential.

The NCBE has, perhaps, earned the right to be opaque and unapologetic. State bars, law schools, and the legal profession as a whole have deferred to the NCBE’s judgment in part because MBE scores have appeared to be a reliable measure of competency. Bar passage rates historically correlate with several other measures of knowledge and skill, which is exactly what we would hope for. The NCBE may not tell us how exactly it arrives at the final scaled scores it delivers twice a year, but so long as those scores continue to appear valid and reliable as measures of what law graduates know and can do, the profession keeps taking the NCBE at its word.

Maybe we should stop taking the NCBE at its word.

Law students deserve greater transparency from the NCBE. They don’t necessarily deserve an easier bar exam. They — and the schools that teach them — deserve to know more about how the NCBE designs and analyzes test items and how, ultimately, the NCBE equates raw scores into scaled scores. When an anomaly like the most dramatic score drop in a test’s history occurs, exam-takers and future exam-takers deserve a better explanation than the NCBE satisfying itself that it didn’t screw up.


Tamara Tabo is a summa cum laude graduate of the Thurgood Marshall School of Law at Texas Southern University, where she served as Editor-in-Chief of the school’s law review. After graduation, she clerked on the U.S. Court of Appeals for the Fifth Circuit. She currently heads the Center for Legal Pedagogy at Texas Southern University, an institute applying cognitive science to improvements in legal education. You can reach her at tabo.atl@gmail.com.