Dec 18, 2024
3 min

AI Hallucinations and False Information in Legal AI at Biglaw

Note: This article is just one of 60+ sections from our full report titled: The 2024 Legal AI Retrospective - Key Lessons from the Past Year. Please download the full report to check any citations.

Challenge: AI Hallucinations and False Information

AI models can sometimes generate false or nonsensical information, known as "hallucinations."

In 2024, the National Center for State Courts recommended that judicial officers and lawyers be trained to critically evaluate AI-generated content rather than blindly trust its outputs.[129]

In a 2024 LexisNexis survey, concerns over AI hallucinations were cited by 55% of respondents as the biggest hurdle to generative AI adoption in law firms.[130]

Alex Denne
Advisor
Alex Denne, Head of Growth (Open Source Law) at Genie AI, is a legal tech leader and serial founder with over a decade of experience driving innovation and making legal services more accessible. Since joining in 2021, he has scaled the platform from 200 to over 120,000 users, combining deep contract law expertise with a data-driven, open-source approach. He is passionate about democratizing legal knowledge through AI, backed by strong academic credentials and experience leading major product and innovation initiatives.

