What's Happening?
Danielle Malaty, a lawyer formerly with Goldberg Segalla, has been sanctioned for improperly using artificial intelligence in legal filings. While representing the Chicago Housing Authority (CHA) in a lead paint poisoning case, Malaty cited fictitious court cases generated by ChatGPT. It was not her first misuse of AI: she had previously included 12 fabricated case citations in another matter. Cook County Circuit Judge William Sullivan fined Malaty $10 and ordered her to pay $1,000 to the plaintiff's counsel for time spent addressing the issue. Malaty has since been terminated from the firm and has started her own practice.
Why It's Important?
The sanctioning of Danielle Malaty highlights growing concern over the use of artificial intelligence in the legal profession. As AI tools become more prevalent, the potential for misuse carries significant ethical implications. The case underscores the need for clear guidelines and training for legal professionals who use AI, so that accuracy and integrity in legal proceedings are preserved. It also raises questions about law firms' responsibility to monitor AI use and to implement preventive measures against similar failures.
What's Next?
Goldberg Segalla has implemented firm-wide measures to re-educate its attorneys on its AI use policies. Responses from Malaty and the firm to a motion for sanctions in the CHA case are due August 21, with a hearing scheduled for August 22. The outcome of those proceedings could shape future policies on AI use in legal settings and may prompt more stringent regulation.