ExamSoft Tells Senators That Facial Recognition Problems Are Everyone's Fault But Theirs

Sure, one-third of California's applicants were flagged for cheating, but...


Late last year, a number of senators wrote ExamSoft demanding some clarity on the various issues that bubbled up surrounding the online bar exam struggles we reported on all year. Notably, the letter sought answers to allegations that ExamSoft’s facial recognition system repeatedly flagged people for cheating based on race — something that was entirely predictable given the systemic issues in facial recognition technology.

Honestly, we’d missed the company’s response. The holidays crept up and then there was a coup and we moved the entire 401k plan to Gamestop… it’s been busy. And, perhaps, that was ExamSoft’s hope. Thankfully an eagle-eyed observer found the response and posted a link. As you might expect, the answers leave a lot to be desired.

After clarifying that ExamSoft doesn’t own the facial recognition technology it uses for its exam proctoring:

We are committed to ensuring that ExamID is accurate for all exam-takers, regardless of their physical appearance or dress. Though ExamSoft does not directly operate the facial recognition technology ExamID leverages, ExamSoft works to evaluate the technology’s accuracy and effectiveness by performing regular tests and working with institutions to assess the facial recognition software.

That doesn’t really say anything. “We buy the tech from a third party and we ‘work to evaluate’ that it’s not screwing up.” Except we heard a deluge of complaints that it didn’t work, primarily from people of color who were consistently misidentified by the system and forced to take the test knowing they’d already been flagged. Throughout the letter, ExamSoft oscillates between laying blame on the third-party technology and blaming examinees for not using the tech correctly. The only consistent theme is that it’s not ExamSoft’s fault.

But ExamSoft’s main takeaway is that even though they don’t bother to keep track of the complaints they’ve received, everything is fine and all of the primarily Black and Brown faces getting flagged are clearly the examinees’ fault.


ExamSoft does not currently maintain statistics on user inquiries related to the effectiveness or accuracy of facial recognition. But we investigate all user complaints and attempt to resolve them for both the individual user and current or future users who may have the same experience. We have investigated whether there have been any systemic issues with ExamID due to individual exam-takers’ race or gender identification. It would be a key priority to resolve any such issues, but we have found none. Rather, our investigation reveals that the most common reasons exam-takers experience difficulty with ExamID are as follows: (1) internet connectivity (whether due to the server, the computer, or the internet service itself); (2) poor picture quality (often due to camera issues such as out-of-date drivers, a dirty lens, or the exam-taker being out of focus); and (3) lighting issues (such as lights shining at the camera or significant under-lighting).

I dunno, that seems like a bold conclusion. In one of my prior posts, I even included a screenshot of a woman who was told the recognition software failed her, and it certainly doesn’t look like a blurry picture or a lighting screw-up. Given that the feedback we received throughout the examination involved ExamSoft’s support people consistently saying they’d never heard of any racial discrimination issues with the technology, even though multiple sources had flagged the problem before the test started and we know that several examinees called in with this exact complaint, the primary takeaway from this answer is that ExamSoft should be tracking inquiries, because then it might actually see the trends plaguing its system.

The fallback defense for ExamSoft is that, even if the facial recognition software screws up repeatedly, a human proctor looks at everything after the exam so no one is put out by a failure during the test.

We also designed ExamID to minimize any disruption to exam-takers if the authentication software does not recognize them. As described above, the Deferred ID feature—which most institutions already choose to use and ExamSoft has required the majority of institutions to use in 2021—allows the exam-taker to take an exam without delay or interruption while the software analyzes their photo for later review and determination by an individual at the institution.

Putting aside that the system told people up front that they weren’t recognized, forcing the examinee to endure the whole session under a cloud, which is far from ideal, this places a great deal of faith in examiners that, empirically, isn’t warranted. Recall that California, after seeing the volume of flagged exams, including those flagged by this “Deferred ID” system, just went ahead and told one-third of the total number of examinees to prove that they weren’t cheating. In other words, ExamSoft’s precious protocol only succeeds in minimizing the imposition on the examinee when the state bothers to perform the necessary legwork to dispense with the system’s hyperactive flagging. This is like the tobacco industry saying, “everything’s safe assuming our customers adhere to a strict limit of one cigarette a week” — if the safeguards are unrealistic, then they’re not safeguards.


All in all, the letter exudes a self-assuredness that belies what took place in the fall. If ExamSoft had taken a different tack and said, “look, there are obviously problems with facial recognition technology that we couldn’t proactively address, but it was a public health emergency and we did everything we could to allow jurisdictions to administer the test they were intent on giving in the safest manner possible,” I’d have a lot more sympathy. Because that’s also probably more accurate! ExamSoft isn’t a mustachioed villain, they just tried to meet a need that state bar exams demanded — it’s not necessarily their fault that the parameters of this request rendered complete success impossible. For obvious reasons, they don’t want to blow up their customers, but it’s the state bar examiners who put ExamSoft in this situation, having to answer questions about how their technology failed to live up to unrealistic expectations.

There’s a lot going on in government right now, but let’s hope the senators who drafted this inquiry don’t let it go. It doesn’t take much to see that the problems of the remote bar exam debacle don’t amount to a hill of beans in this crazy world, but that doesn’t mean it should be swept under the rug. Especially with the adoption of facial recognition technology spreading across the country. This may be a “bar exam problem” for some people but the uncritical acceptance of a dodgy technology can have tremendous impacts elsewhere and this provides a sizable case study to dig into this issue.

(Check out the full letter on the next page.)


Joe Patrice is a senior editor at Above the Law and co-host of Thinking Like A Lawyer. Feel free to email any tips, questions, or comments. Follow him on Twitter if you’re interested in law, politics, and a healthy dose of college sports news. Joe also serves as a Managing Director at RPN Executive Search.