I’m all about working smarter, not harder. And if there’s one thing artificial intelligence and machine learning do particularly well, it’s produce smart output. Think about it: technology-assisted review platforms can parse millions of documents, saving enormous amounts of time and money in document review, rather than having lawyers billing by the hour put eyes on every document.
There are many other applications for machine learning. It is used to filter email and spam, to identify malware and phishing attempts, to categorize enterprise-wide content as part of an information governance initiative, and even to diagnose serious illness. The possibilities are, frankly, endless.
So, when I saw enterprising companies looking at predicting the outcomes of court proceedings, I perked up and paid attention.
Suppose you could collect and analyze the data from US judges’ court rulings (most of which are public) and then use that data to predict how the courts, or an individual judge, might rule on a given issue. Insight like this, even if it only revealed general tendencies, could shape the actions of litigants and their attorneys.
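To make the idea concrete, here is a minimal, purely illustrative sketch of what that kind of analysis might look like under the hood: a toy classifier trained on a handful of past rulings. The feature names and data points below are invented for illustration; real products would use far richer data and models.

```python
# Toy illustration of predictive analytics on judicial rulings.
# All feature names and data points below are hypothetical.
from sklearn.linear_model import LogisticRegression

# Hypothetical past rulings by one judge on motions to dismiss:
# features = [case_is_contract_dispute, plaintiff_is_corporation, years_on_bench]
# label    = 1 if the motion was granted, 0 if denied
X = [
    [1, 0, 5],
    [1, 1, 6],
    [0, 1, 7],
    [0, 0, 8],
    [1, 1, 9],
    [0, 0, 10],
]
y = [1, 1, 0, 0, 1, 0]

model = LogisticRegression()
model.fit(X, y)

# Estimate the probability that a new contract case brought by an
# individual plaintiff results in a granted motion to dismiss.
new_case = [[1, 0, 11]]
print(model.predict_proba(new_case)[0][1])
```

Even a simple model like this, fed enough public decisions, would surface a judge’s general tendencies, which is exactly the kind of analysis at issue here.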
Well, of course this technology exists, in the US and elsewhere, and people in the legal business have been using it to assess the likely outcome of legal proceedings. But in France, it recently became illegal.
The French Code of Administrative Justice was recently amended by Article 33 of the Justice Reform Act, which prohibits the use of a judge’s name “for the purpose or effect of evaluating, analyzing, comparing or predicting their actual or alleged professional practices.” A violation is punishable by up to five years in prison, along with other sanctions.
What?!
Court decisions in France are available to the public free of charge and in electronic format, with the parties’ names typically removed. So why would French lawmakers prohibit the analysis of judges’ decisions? Is this not public information?
Your guess is as good as mine, but it seems odd, to say the least. Sure, private parties have legitimate privacy interests under the GDPR. But it seems to me that judges, for all of their independence and fair-mindedness, are part of the government. And if your goal is transparency and greater access, you don’t prohibit the analysis of that information.
The real irony here is that this all comes amidst the French President’s push for greater use of AI. Last year, President Macron gave speeches and interviews proclaiming France would lead the charge on the use of artificial intelligence. He even pledged nearly $2 billion to do it.
Legal technology companies should be careful not to run afoul of the new French law. But if you’re not in France or otherwise subject to French jurisdiction, it seems unlikely the law could do much about it.
(This article originally appeared on Above the Law in a slightly altered format)