Can’t See Straight: The Foolproof Method to Combat AI Hallucinations

All of the hand-wringing over artificial intelligence tools hallucinating caselaw frankly takes away from more important AI governance issues that law firms should be focusing their efforts on. Once lawyers realize that hallucinations can happen in AI software – which most should be aware of by now – there are simple ways to prevent those hallucinations from having a negative effect on your law practice.

In the first place, there is a simple way to determine whether a cited source is correct: double-check its validity through a second, non-AI source. Then, make sure that your staff does the same thing, and that the work product is vetted by a supervisor before it leaves the firm.

That’s it. That’s how you combat hallucinations. Of course, this shouldn’t be all that different from how your law firm was operating pre-AI. And this tracks with the nature of AI software – that it is more like an emp...