Thursday, December 11, 2014

Is the Bar Exam Broken? Or Are Law Students Dumber?

Law schools and the bar exam's creators agree: The plunge in test scores that hit several states this year is alarming, and it's probably the other side's fault.  
“This test is mysterious, unpredictable, and unfair,” says Nick Allard, dean of Brooklyn Law School. “It is depriving highly qualified, motivated, well-prepared people from earning their license.”
Allard was one of 79 law school deans who signed a letter last month demanding that the National Conference of Bar Examiners, a nonprofit that creates the multiple-choice portion of the test used in many state exams, “facilitate a thorough investigation of the administration and scoring of the July 2014 bar exam.”
The letter came after scores on the Multistate Bar Exam, the part of the test administered by the NCBE, fell to their lowest level in a decade. Erica Moeser, the nonprofit’s president, wrote a memo to deans in October that blamed the poor showing on a “less able” group of students. Moeser says she regrets that wording now, which she says elicited “screeches” from deans, but maintains her organization did not score the test wrong, as some have suggested. 
“It was graded correctly,” she says. “It’s certainly no insult to the group. It’s simply an objective fact.”
Still, some deans and legal experts are floating tentative theories about the historically bad results. Derek Muller, a law professor at Pepperdine University, says he has tested and rejected every explanation not tied to the test itself. Muller compared LSAT scores with bar exam scores and found that this year’s law grads should have done only slightly worse than last year’s. He also rejected the idea that a glitch in Examsoft, the software used to upload the July test, could have made the difference, because states that did not use Examsoft, such as Arizona and Virginia, still saw their pass rates dip. “By process of elimination, I’m running out of alternative explanations and looking more to the NCBE as a possibility,” he says.
Muller says the NCBE could have bungled scoring by grading the tests more harshly than in past years. The scoring specifics are complex, but basically the NCBE grades each test-taker on a scale that's determined by comparing their performance on certain questions with that of past test-takers. Muller readily admits he doesn't have data to test the theory, but he says it’s possible the NCBE used particularly tough questions to scale this year’s results.
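To make the idea concrete, here is a deliberately simplified sketch of score scaling via shared "anchor" questions. The numbers and the function are hypothetical, and this toy z-score mapping is not the NCBE's actual equating procedure; it only illustrates the general mechanism the dispute is about: how each cohort's scores get placed on a common scale by comparing performance on a shared set of questions.

```python
def scale_score(raw_score, anchor_mean_new, anchor_sd_new,
                anchor_mean_ref, anchor_sd_ref):
    """Toy illustration: map a raw score from this year's exam onto the
    reference (past) cohort's scale, using summary statistics from a set
    of anchor questions both cohorts answered. Purely hypothetical."""
    # Standardize against this year's anchor-question statistics...
    z = (raw_score - anchor_mean_new) / anchor_sd_new
    # ...then express that standing on the reference cohort's scale.
    return anchor_mean_ref + z * anchor_sd_ref

# Hypothetical example: a raw 70 when this year's anchor mean/SD
# were 60/10, rescaled against a reference anchor mean/SD of 65/12.
print(scale_score(70, 60, 10, 65, 12))  # 77.0
```

The key point for Muller's theory is visible in the inputs: the difficulty of the anchor questions feeds directly into the statistics used for scaling, so unusually hard anchor items would shift every scaled score.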
For her part, Moeser says the questions used to scale the test and every other item on it are reviewed extensively by academic experts before and after people take the exam. She says law schools should look inward instead of blaming the bar for the new crop of failed lawyers. 
Median scores on the LSAT—the law school entrance exam—have dipped since 2010, partly, experts say, because with fewer people applying, many law schools are filling their classes with a larger share of students with low LSAT scores. Research shows that LSAT scores align pretty neatly with bar exam scores.
“A law school should only be accepting people who can complete the program and enter the profession,” Moeser says. “If you as a law school are sitting there and admitting a student for whom you cannot say that, I think you have an obligation to do something about it.”
Some say the low scores—whether caused by weaker test-takers or a flawed exam—are proof the test should be abolished altogether. 
“Far too many people are not passing,” Allard says, which makes him wonder “whether this additional mysterious gantlet of passing the bar is necessary.” In Wisconsin, where the NCBE is located, Allard is quick to point out, graduates of some local law schools don’t even need to take the bar exam, let alone pass it, to practice law.
