
To my knowledge, this is the first decision in which the judge explicitly addresses his use of AI in the preparation and drafting of a judgment.
Excerpt from VP Evans (as executrix of HB Evans, deceased) & Ors v The Commissioners for HMRC, §§ 42 et seq. (https://caselaw.nationalarchives.gov.uk/ukftt/tc/2025/1112#download-options):
« The use of AI
42. I have used AI in the production of this decision.
43. This application is well-suited to this approach. It is a discrete case-management matter, dealt with on the papers, and without a hearing. The parties’ respective positions on the issue which I must decide are contained entirely in their written submissions and the other materials placed before me. I have not heard any evidence; nor am I called upon to make any decision as to the honesty or credibility of any party.
44. In his Practice Direction on Reasons for Decisions, released on 4 June 2024, the Senior President of Tribunals wrote:
« Modern ways of working, facilitated by digital processes, will generally enable greater efficiencies in the work of the tribunals, including the logistics of decision-making. Full use should be made of any tools and techniques that are available to assist in the swift production of decisions. »
45. I regard AI as such a tool, and this is the first decision in which I have grasped the nettle of using it. Although judges are not generally obliged to describe the research or preparatory work which may have been done in order to produce a judgment, it seems to me appropriate, in this case, for me to say what I have done.
46. The Senior President’s guidance has recently been endorsed by the Upper Tribunal: see Medpro Healthcare v HMRC [2025] UKUT 255 (TCC) at [40] et seq (Marcus Smith J and UTJ Jonathan Cannan).
47. In April 2025, the senior Courts and Tribunals judiciary published « AI: Guidance for Judicial Office Holders ». It is available online. It updated and replaced a guidance document originally issued in December 2023. The stated aim of the guidance was to assist judicial office holders in relation to the use of AI. It emphasises that any use of AI by or on behalf of the judiciary must be consistent with the judiciary’s overarching obligation to protect the integrity of the administration of justice. The guidance mandated the use of a private AI tool, Microsoft’s ‘Copilot Chat’, available to judicial office holders through our platform, eJudiciary. As long as judicial office holders are logged into their eJudiciary accounts, the data they enter into Copilot remains secure and private. Unlike other large language models, it is not made public.
48. Principally, I have used AI to summarise the documents, but I have satisfied myself that the summaries – treated only as a first-draft – are accurate. I have not used the AI for legal research.
49. I am mindful that « the critical underlying principle is that it must be clear from a fair reading of the decision that the judge has brought their own independent judgment to bear in determining the issues before them »: see Medpro at [43]. This decision has my name at the end. I am the decision-maker, and I am responsible for this material. The judgment applied – in the sense of the evaluative faculty, weighing-up the arguments, and framing the terms of the order – has been entirely mine.»
That is clear, and rather well put by Judge Christopher McNall…
Me Philippe Ehrenström, attorney-at-law, LL.M., CAS in Law and Artificial Intelligence