Generic AI tools — ChatGPT, Microsoft Copilot, Google Gemini — are excellent at what they were built for: summarising articles, drafting emails, writing code. They were not built for legal practice. They do not understand the difference between persuasive and binding authority. They hallucinate citations that do not exist. They read your documents but cannot tell you what they mean for your client's case.
And yet, attorneys are using them. Some have paid for it in front of judges.
This is not a theoretical risk. It is a pattern that is already playing out in courtrooms across the world — and the structural reasons it will keep happening are worth understanding before it costs your firm.
When Generic AI Goes Wrong in Legal Practice
The Mata v. Avianca Citation Scandal
In 2023, New York attorney Steven Schwartz submitted a legal brief that cited six cases. None of them existed. Schwartz had used ChatGPT to research precedent and trusted the output without checking. When the opposing party could not locate the cases, the court ordered the attorneys to explain themselves.
Judge P. Kevin Castel imposed sanctions on Schwartz, his supervising partner Peter LoDuca, and their firm, finding that they had engaged in "acts of conscious avoidance and false and misleading statements to the Court." The court imposed a USD 5,000 penalty and ordered the attorneys to notify each judge who had been falsely identified as the author of one of the fabricated opinions.
ChatGPT does not retrieve live case law. It generates text that looks like case citations based on patterns in its training data. It has no mechanism to verify whether a case exists, what it says, or whether it is still good law.
The Michael Cohen Incident
Michael Cohen, former personal attorney to Donald Trump, acknowledged in 2023 that he had given his own attorney documents containing fake case citations generated by Google Bard. Cohen said he was not aware Bard was capable of fabricating legal authorities. His attorney submitted them to federal court without verification.
The court was not sympathetic.
South Africa Is Not Immune
These incidents are not confined to US jurisdictions. The underlying technology is identical. An attorney in Johannesburg using ChatGPT to research South African case law faces exactly the same hallucination risk — with the additional problem that South African case law, legislation, and LPC practice rules are underrepresented in ChatGPT's training data. The less a model knows about a jurisdiction, the more it fills gaps by guessing.
There is no publicly reported South African case of an attorney being sanctioned for AI-generated citations yet. That is a matter of time, not a matter of legal culture.
Why Consumer AI Cannot Be Fixed With a Warning Label
The standard response from AI vendors is: "these tools are not legal advice." This is true and also beside the point.
The problem is not that attorneys treat ChatGPT as a lawyer. The problem is that they use it as a research assistant, a document reader, and a drafter — roles where accuracy, jurisdiction-awareness, and verifiability are critical — and the tools are structurally unequipped for those roles.
Hallucination is not a bug to be patched. It is a feature of how large language models work. They predict the most statistically likely next word. When asked about a niche area of South African insolvency law, a general-purpose model has very little training signal to draw from, so it generates plausible-sounding text. It does not know it is wrong. It cannot flag uncertainty reliably. The more confidently wrong it is, the more dangerous it is.
This is not a criticism of ChatGPT. It is an excellent product for its intended purpose. Its intended purpose is not legal practice.
The Copilot Problem: Surface-Level Document Reading
Microsoft Copilot has become popular in law firms because it integrates directly into Microsoft 365 — the environment most South African firms already use. Attorneys can ask Copilot questions about documents stored in SharePoint or OneDrive and get instant answers. On the surface, this looks like exactly what legal AI should do.
In practice, there is a significant gap between what Copilot does and what legal document work requires.
What Copilot Does Well
Copilot is good at retrieving and summarising. Ask it "what documents do we have about the Smith matter?" and it will locate files. Ask it "give me a summary of this contract" and it will produce a competent paragraph. For a general office assistant, this is genuinely useful.
Where Copilot Falls Short for Legal Work
It reads your documents, but it cannot interrogate them.
When you are preparing for a hearing, you do not need a summary. You need to ask: What is the opposing party's strongest argument? Where does our expert's evidence contradict their expert? What are the dates the chronology turns on? These are not retrieval tasks. They require the AI to understand the legal structure of a dispute, the significance of specific document contents, and the relationship between pieces of evidence.
Copilot is designed for the Microsoft 365 productivity surface — email, calendar, documents. It is not designed to hold a multi-turn conversation about a legal matter, surface high-risk clauses, map timelines across multiple uploads, or explain why a particular clause in a lease creates a specific litigation risk.
It does not distinguish between legal and non-legal relevance.
Copilot treats a memo about the office Christmas party and a privileged legal opinion with the same retrieval logic. There is no concept of legal significance baked in. A document that is formally irrelevant to a dispute but happens to contain a keyword will surface alongside genuinely relevant material.
Data residency is an open compliance question.
Microsoft Copilot for M365 operates across Microsoft's global infrastructure. Depending on your tenant configuration, prompts and document contents processed through Copilot may route through data centres outside South Africa. This creates POPIA exposure that many firms have not formally assessed. Microsoft provides data residency options, but they require deliberate configuration — they are not the default — and the position for South African tenants specifically warrants professional POPIA advice before deployment.
It is not built to be corrected or challenged.
When Copilot gives you a summary of a document and you ask "are you sure that's what it says?" it can generate a follow-up answer, but it does not have a reliable mechanism for self-correction on legal matters. Legal AI needs to be built with explicit uncertainty flagging — knowing what it does not know — and general-purpose AI is not designed that way.
What Purpose-Built Legal AI Actually Does Differently
The distinction between Copilot and a purpose-built legal AI tool is not marketing language. It is architectural.
EchoFelix was built specifically for South African legal practice. It is not a general-purpose assistant with a law firm disclaimer attached. The differences matter:
Deep Document Interrogation, Not Just Retrieval
When you upload a case file or contract to EchoFelix, you can ask it questions the way you would brief a junior associate: "What are the three weakest points in the plaintiff's claim?" "What damages are explicitly excluded under clause 14?" "Build me a chronology of all events mentioned in these documents."
This is qualitatively different from Copilot's retrieval model. EchoFelix is designed to hold the legal meaning of a document in context — not just its text — so the answers reflect legal structure, not just keyword matching.
Highlights and Risk Flagging
EchoFelix surfaces high-risk clauses, unusual terms, and potential dispute points automatically. A contract review does not just produce a summary; it produces a structured risk register: what is standard, what is unusual, what the attorney should look at before signing off.
Copilot will tell you what a document says. EchoFelix will tell you what a document means, and where the risk is.
Case-Centric Q&A Across Multiple Documents
Real matters involve dozens of documents: pleadings, expert reports, correspondence, financial records, prior orders. EchoFelix allows attorneys to ask questions across the full matter file — not just individual documents — and get answers that synthesise across sources. "Based on all the documents in this matter, what evidence supports a claim for consequential damages?" is a question Copilot cannot answer from a matter perspective because it is not designed around the concept of a matter.
Secure South African Hosting, POPIA by Design
All EchoFelix data is processed and stored on South African infrastructure. There are no routing decisions to make, no tenant configuration to audit, and no cross-border transfer exposure to assess after the fact. It was built that way from the start because it was built for South African attorneys, not adapted for them.
An Addition to Your Workflow, Not a Replacement for It
This is the point that gets lost in the AI-for-lawyers conversation: the goal is not to replace your case management system, your attorneys, or your professional judgement. EchoFelix works alongside the tools you already use — LegalSuite, GhostPractice, or whatever practice management platform your firm runs. It handles the document-heavy, time-intensive parts of legal work: transcription, analysis, research, contract review. Attorneys remain in control of every output.
Every transcript is a draft for attorney review. Every case analysis is a starting point, not a finding. The professional responsibility stays where it belongs.
The Commercial Question: What Does Generic AI Actually Cost You?
The risk calculation for using generic AI in legal practice is not just disciplinary. It is commercial.
Time lost to verification. If you use ChatGPT for research, you must verify every output against primary sources. For experienced attorneys, this may be faster than starting from scratch — but only if the AI-generated output is directionally correct. When it is not, you have wasted the verification time and risked introducing error into your work.
Client exposure. A matter that goes wrong because of an AI-hallucinated point of law exposes the client to harm and the attorney to a negligence claim. Professional indemnity insurance is unlikely to cover a known and preventable technology risk.
Regulatory exposure. The LPC has not yet issued formal guidance on AI use in legal practice. That guidance is coming. Firms that have already deployed consumer AI tools without formal data processing agreements, jurisdiction-appropriate legal research verification, and POPIA-compliant data handling will be behind when it does.
Reputational exposure. Attorneys caught submitting AI-hallucinated material to court are named in the judgments. Those judgments are public. They are searchable.
What to Ask Before Deploying Any AI in Your Firm
Before your firm deploys any AI tool — generic or purpose-built — these are the questions that matter:
- Where is client data processed? Is it leaving South Africa? Is there a data processing agreement in place?
- What is the hallucination risk? Is the tool designed for legal research with verified sources, or is it a general language model?
- What is the review obligation? Does the tool's output require attorney review before use? How is that workflow enforced?
- Is there legal-domain training? Has the model been trained on South African law specifically, or on general English text?
- What happens when it is wrong? Is there a mechanism for the tool to flag uncertainty? Can errors be traced?
EchoFelix was designed with a defensible answer to each of these questions. Generic AI tools were not.
The Right Tool for the Right Job
Microsoft Copilot is a good productivity tool for drafting routine emails, summarising meeting notes, and navigating large document libraries in the Microsoft 365 ecosystem. Use it for those things.
ChatGPT is good for brainstorming arguments, explaining unfamiliar legal concepts at a surface level, and drafting non-legal communication. Use it for those things.
Neither should be the tool your attorneys use when they open a client matter, read pleadings, draft submissions, or form legal opinions.
Legal work requires legal AI. Not a general assistant that has read some law.
Explore what EchoFelix does for South African law firms → | See how it handles case document analysis → | Talk to us about your firm →