The use of Artificial Intelligence and automation in the legal system is increasing, and the SRA has recently authorised the first AI-driven law firm. It is important for the insurance market and its insureds that proactive steps are taken to train law firm staff on the appropriate use of AI and that safeguarding measures are put in place to mitigate the risks arising from its improper use. The Court's reminder of the need for careful human input when citing cases is therefore timely.
In the case of Frederick Ayinde, R (on the application of) v The London Borough of Haringey, the Claimant brought Judicial Review proceedings against the Defendant which were settled before trial. The key point of interest for risk management concerns the Defendant's subsequent application for a wasted costs order against the Claimant's solicitors and barrister, which was based on their reliance upon five non-existent cases in the Claimant's Statement of Facts and Grounds for the Judicial Review.
The Court considered this to be extremely troubling and a "professional shame", and rejected the Claimant's barrister's explanation of how the non-existent cases had come to be cited. One possible explanation was that the barrister had used AI, but the Court could not make a finding on whether that was true because the barrister did not give witness evidence.
The Court made a wasted costs order against the Claimant's solicitors and barrister for what it considered to be "appalling professional misbehaviour". It was wholly improper to include false cases in a pleading, and producing submissions based on fake cases misled the Court. It was unreasonable, once the issue had been pointed out, to dismiss the cases as "minor citation errors" or "cosmetic errors". The Court held that it was the responsibility of the Claimant's legal team to ensure that the pleading was correct, and that they should have been shocked when told that the citations did not refer to real cases.
According to the Court, the barrister should have reported herself to the Bar Council, and the solicitors to their regulator, the Solicitors Regulation Authority, as their actions, in the Court's view, constituted professional misconduct. The barrister had intentionally put the cases into her Statement of Facts and Grounds without checking whether they existed.
The Court also considered that it would have been negligent for the barrister to have used AI and not to have checked the output for accuracy before including it in her pleading.
In Venkateshwarlu Bandla v Solicitors Regulation Authority, the Court refused to extend time for a former solicitor to appeal against the SRA’s decision in 2017 to strike him off the Roll of Solicitors. The Appellant had failed to provide sufficient evidence to support his assertion that his seven-year delay in appealing the decision was due to a mental health disorder.
The Court would, in any event, have struck out the Appellant's Grounds of Appeal as an abuse of the process of the Court, notwithstanding the delay, because he had cited 25 non-existent cases in support of the appeal in his Grounds of Appeal and Skeleton Argument.
When addressing this latter issue in relation to the first of the non-existent case references, the Appellant explained that he had not written the summary himself and had not read the judgments; he denied using AI and claimed simply to have used a Google search for "case law in support of mental health problems". The Appellant accepted that this case, and many of the other cases he had cited to the Court, did not in fact exist, and that he had not checked the citations.
The Appellant's answer as to why, in light of his citation of non-existent authorities, the Court should not strike out the Grounds of Appeal was that the substance of the points put forward in the Grounds was sound, even if the authorities cited in support of those points did not exist. The Judge was wholly unpersuaded and struck out the Grounds of Appeal as an abuse of process, stating:
"the Court needs to take decisive action to protect the integrity of its processes against any citation of fake authority. There have been multiple examples of fake authorities cited by the Appellant to the Court, in these proceedings. They are non-existent cases. Here, moreover, they have been put forward by someone who was previously a practising solicitor. The citations were included, and maintained, in formal documents before the Court."
Comment
These recent decisions reinforce the need for care and accuracy in submissions to the Court. All Court documents, no matter how they have been produced and whether or not AI has been used as a drafting tool, must be checked for accuracy. Failure to do so exposes the party filing the submissions to the risk that, if they are inaccurate, the Court will find that they have acted in a way that undermines the integrity of the profession, exposing the solicitors and barristers involved to negligence claims and possible regulatory action.
As the legal profession enters a new era of AI, it remains to be seen how technology will continue to shape legal work. However, these two recent cases underline that AI is not a substitute for proper legal work. It is important that law firms train their staff on the appropriate use of AI and ensure that safeguarding measures are put in place to avoid the serious consequences that can arise from its improper use. Finally, it is vital that lawyers check the accuracy of their work product, and take full ownership of it, so that they discharge their responsibility to ensure that the information presented to the Court is correct.