California’s troubled bar exam faced fresh scrutiny after it was revealed that AI and nonlawyers helped draft some of its questions. The exam, developed under an $8.25 million contract with Kaplan, has been criticized for transparency problems and potential conflicts of interest. With calls to lower the passing score and a growing number of candidate withdrawals, concerns about the exam’s quality continue to mount.
In a curious twist amid the troubles surrounding California’s new bar exam, it has recently come to light that some of its questions were drafted by nonlawyers with the aid of artificial intelligence. Despite a hefty $8.25 million contract with Kaplan to design the exam, the details that have emerged are raising eyebrows not just among candidates but also among law educators across the state.
While Kaplan was entrusted with most of the exam’s development, a handful of questions were drawn from the so-called “baby bar” exam, which is geared toward first-year law students. Moreover, the state bar engaged ACS Ventures, an independent psychometrics firm, to use AI in drafting multiple-choice questions. Those questions were later reviewed by panels of experts, yet concerns about transparency loomed large.
Last summer, the state bar’s financial struggles led to a shift toward a hybrid bar exam. The change, however, resulted in technical glitches and organizational hiccups when the exam debuted in February. Law faculty, notably at the University of California, Irvine, expressed outrage over the state bar’s decision to include AI-assisted questions authored by nonlawyers, calling the move both irresponsible and reckless.
Mary Basick, assistant dean at UC Irvine School of Law, was particularly vocal about what she saw as a breach of trust, emphasizing that using these AI-generated questions without the California Supreme Court’s approval was questionable at best. Similarly, Martin Pritikin of Purdue Global Law School pointed to the lack of prior disclosure about AI’s involvement in drafting the test and to the failure to define boundaries for its use.
Kaplan’s contract included terms barring it from bar-prep activities in the state, apparently to mitigate conflicts of interest. Pritikin argued, however, that allowing ACS Ventures to create questions might violate the same principle, since the firm also evaluates the exam’s validity.
The timeline for creating the new exam also raised alarms, with legal educators questioning a rushed process for work that normally takes years to refine. Basick noted an abundance of errors in the practice materials provided to students, deepening suspicions about the exam’s overall quality. An offer of collaboration with law faculty was extended but quickly withdrawn, with the state bar citing a conflict of interest arising from the faculty’s ties to the National Conference of Bar Examiners.
As discussions continue about remedies for February’s exam takers, a proposal to lower the passing score from 560 to 534 awaits Supreme Court approval. Notably, the court was not informed of AI’s role in drafting the questions, prompting further inquiries into how the technology was employed.
Of the more than 5,600 candidates who registered for the new exam, roughly 1,300 withdrew amid the chaos leading up to it. Faculty concerns, including Pritikin’s warning that candidates were being treated as “unwitting guinea pigs” for untested AI methods, have added to the unease surrounding the exam’s integrity as urgent discussions loom over alternative pathways to licensure for those affected.
In conclusion, the involvement of AI and nonlawyers in drafting parts of California’s bar exam has ignited a tempest among legal educators and candidates alike. With glaring issues of transparency and possible conflicts of interest at play, the exam’s legitimacy hangs in the balance. As remedies for those affected are deliberated, questions about the exam’s overall reliability and quality remain critical as California navigates these troubled waters.
Original Source: www.abajournal.com