Artificial intelligence is everywhere. From drafting emails to answering legal questions in seconds, tools like ChatGPT are increasingly being used by people going through divorce, child arrangement disputes and other family law issues.
It is easy to see why. When someone is dealing with separation, worries about children, or financial pressure, typing a question into an AI chatbot can feel faster, cheaper and less intimidating than speaking to a solicitor.
But UK courts are now issuing serious warnings about the dangers of relying on AI-generated legal advice.
In recent months, judges have criticised lawyers and those representing themselves who submitted fake legal authorities, incorrect case citations and entirely fabricated legal arguments generated by AI software. The message from the courts is becoming increasingly clear: AI can be a useful tool, but it should never be relied upon as a substitute for proper legal advice.
What’s wrong with asking AI for legal advice?
One of the biggest risks with AI systems is something known as a “hallucination”.
Despite sounding sophisticated, AI does not actually understand the law in the way a solicitor or barrister does. Instead, it predicts responses based on patterns in huge amounts of online text. Most of the time the output looks convincing. Occasionally, however, it produces information that is simply false.
The problem is that the false information often sounds completely genuine.
An AI-generated answer may confidently refer to:
• court decisions that do not exist,
• legal principles that are outdated,
• incorrect procedural rules, or
• quotations that no judge ever made.
For someone unfamiliar with the legal system, spotting those errors can be almost impossible.
That becomes particularly dangerous in family law, where decisions involving children, finances and personal safety can have life-changing consequences. It is also important to remember that no two cases are the same. That is why legal advice from a solicitor is bespoke: what works for you may not work for others.
Have there been issues with fake AI cases?
This is no longer a hypothetical issue.
Earlier this year, the High Court criticised lawyers involved in a judicial review case after written submissions contained fake legal authorities generated by AI. The judge described the conduct as “appalling professional misbehaviour” and referred the matter to regulators.
The court found that several cited cases simply did not exist.
In another recent tribunal case, solicitors faced scrutiny after false authorities generated by AI software appeared in legal documents submitted to the court. Judges stressed that lawyers remain personally responsible for checking every case and authority relied upon, regardless of whether AI was involved.
The issue has not been limited to lawyers.
In Manchester, a litigant in person reportedly used ChatGPT to research legal authorities for their own case. One authority turned out not to exist at all, while other quotations had been entirely fabricated.
Similarly, a tax tribunal recently dealt with a case involving multiple fictitious legal authorities generated by AI software. Although the errors were found to be accidental, the case highlighted how easily people can unknowingly place false information before the court.
Why is AI so risky in family law?
Family law is rarely straightforward.
Whether the issue involves divorce, child arrangements, domestic abuse allegations or financial settlements, outcomes depend heavily on the specific facts of each case. Small details can significantly affect the court’s decision.
That is where AI tools can become unreliable.
A chatbot cannot properly assess safeguarding concerns, evaluate evidence, judge credibility, or apply discretion in the way a family court judge can. It may also confuse English law with American law or provide answers based on outdated legal information.
For example, someone searching online might receive an AI-generated answer suggesting:
• they automatically have rights as a “common law spouse”,
• a child can decide which parent to live with at a particular age,
• they can relocate abroad with a child without court permission, or
• domestic abuse allegations have no impact on child contact.
In reality, the law is far more nuanced.
So should I not use AI at all?
Many clients use AI to summarise their thoughts or create a succinct list of their concerns and questions. Used carefully and responsibly, these tools can be helpful.
However, there is a significant difference between using AI as an administrative aid and relying on it for legal advice. Even the most advanced AI systems cannot replace professional judgment, legal training or detailed knowledge of current family law practice.
How can we help?
AI is likely to remain part of the future of legal services. It can improve accessibility and save time. But when it comes to your family law matter, whether it involves children, finances or personal safety, relying solely on ChatGPT or other AI platforms is risky.
Online answers may appear persuasive while being legally inaccurate, incomplete or entirely fabricated.
If you are facing issues involving divorce, child arrangements, financial disputes or domestic abuse, it is important to seek advice tailored to your individual circumstances from one of our qualified family law professionals. Use the links below to book an appointment today.
