UK Court Divorce Software Made Obvious Math Errors, Not Noticed For Almost 20 Months, Till Nonlawyer Caught it

The Guardian has a story that should cause terror to those who design legal software without properly testing it, as well as to those who say we have to limit practice to fully trained lawyers.

As the Guardian reports, the online version of Form E has been in use for divorces in the UK since April 2014.

One particular paragraph, numbered 2.20, which is supposed to produce totals, fails to reflect the minus figure of final liabilities entered earlier on, producing a simple mathematical error. If a party had significant debts or liabilities, they were not recognised or recorded on the electronic form, potentially inflating their true worth. Distorted net figures of applicants' assets were therefore being produced.
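The reporting does not include the actual code, but the flaw it describes can be sketched in a few lines of Python. All function names and figures below are illustrative, not the form's real implementation:

```python
# Hypothetical sketch of the paragraph 2.20 flaw: a totals routine that
# accepts liabilities but never subtracts them, so net worth is
# overstated for anyone carrying debt.

def buggy_net_worth(assets, liabilities):
    """Mimics the reported bug: liabilities are entered but ignored."""
    return sum(assets)  # bug: sum(liabilities) is never deducted

def correct_net_worth(assets, liabilities):
    """What paragraph 2.20 should compute."""
    return sum(assets) - sum(liabilities)

assets = [250_000, 40_000]       # e.g. house equity, savings
liabilities = [180_000, 15_000]  # e.g. mortgage balance, loan

print(buggy_net_worth(assets, liabilities))    # inflated: 290000
print(correct_net_worth(assets, liabilities))  # true figure: 95000
```

The gap between the two numbers is exactly the party's total liabilities, which is why the error grows with the size of the debts involved.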

God only knows how many lawyers, supervised paralegals, judges and court staff have used the form in the last 20 months, without any of them noticing what should really be an obvious failure, like failing to deduct prior payments on a bill.

And, guess what, the error was not caught by a barrister, or a solicitor, or any of those listed above.  Rather it was caught by a McKenzie Friend, a lay expert who under the law in most of the Commonwealth is allowed to help litigants, even in court.  For more info on the concept (which has been influential in the US in reducing anxiety about nonlawyer practice), see here.

The first obvious point is that legal software should be tested and tested and tested. I remember back when we were doing some of the first online legal forms, it was really hard to get programmers to understand just how bad errors could be.  Maybe we need to put in place appropriate standards for automated forms that include rigorous testing and certification.
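What "tested and tested and tested" might mean in practice is worth making concrete. A minimal sketch, assuming a simple totals routine like the one Form E needed (the function and test names here are hypothetical):

```python
# Minimal automated checks of the kind that would have caught this bug:
# feed the totals routine cases with nonzero liabilities and assert the
# result actually reflects them.

def net_worth(assets, liabilities):
    """Assumed totals routine: assets minus liabilities."""
    return sum(assets) - sum(liabilities)

def test_liabilities_reduce_net_worth():
    # A 30,000 debt must lower a 100,000 estate to 70,000.
    assert net_worth([100_000], [30_000]) == 70_000

def test_debts_can_make_net_worth_negative():
    # Heavy debt should produce a negative figure, not be ignored.
    assert net_worth([10_000], [50_000]) == -40_000

test_liabilities_reduce_net_worth()
test_debts_can_make_net_worth_negative()
print("all form-total checks passed")
```

Even a handful of checks like these, run before each release, would have flagged the Form E error on day one rather than month twenty.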

The second is that we have to get over our idea that lawyers are best for everything.  As CJ Jonathan Lippman put it: “sometimes an expert non-lawyer is better than a lawyer non-expert.”  Although here it seems that an expert nonlawyer was better than an entire profession of expert lawyers.

I hope that this episode will not slow the adoption of these online tools, merely help make sure that they are done right.  The irony is that in the long term such forms and processes should reduce math errors, rather than increase them — but only if the programmers and testers know what they are doing.


About richardzorza

I am deeply involved in access to justice and the patient voice movement.
This entry was posted in Family Law, Forms, Non-Lawyer Practice, Simplification, Technology.

4 Responses to UK Court Divorce Software Made Obvious Math Errors, Not Noticed For Almost 20 Months, Till Nonlawyer Caught it

  1. johnpmayer says:

    No one can guarantee the correctness of software. All they can do is increase the amount of due diligence. The open source solution is that “many eyes make all bugs shallow”, but that implies that anyone can see the code, something that won’t happen with most proprietary software at all. Math errors are the most obvious errors, but not the most scary, IMHO. How do we know that the lawyers interpreted the law correctly when automating the process? Only by being able to review the underlying assumptions and by documenting the assumptions in a machine-readable and therefore automatically testable way.

  2. Jim Greiner says:

    Hi, Richard,

    I happen to have $20 in my pocket. If anyone has a single (1) dollar available, I will make the following bet: the number of math errors made in the set of calculations covered by the software during the 18 months in which this bug was in place was less than the number of math errors made in the same set of calculations in the 18 months before this software came into use. Any takers?

    Software is a terrible, awful way to make legal calculations. Unspeakably bad. In fact, it’s so horrible, it’s almost as bad as having human beings do it (or input the figures into calculators). That’s (almost) how horrible it is. And I know of no greater insult that one could hurl at it.

    Could I suggest the following: “The first obvious point is that [human administration of law, including calculation of legal quantities] should be tested and tested and tested. I remember back when we were doing some of the first [human lawyers], it was really hard to get [those lawyers] to understand just how bad errors could be. Maybe we need to put in place appropriate standards for [lawyers, including their calculation of legal quantities] that include rigorous testing and certification.”

    Happy Holidays!

    • richardzorza says:

      Touché, Jim.
      But, more seriously, it’s a bit like deaths due to nuclear or coal. It’s probably true that many more people are killed by coal generation of electricity, but when something goes wrong with a nuclear plant there is a real risk of massive disruption.

      • Jim Greiner says:

        Hi, Richard, I think your analogy is terrific, and it feels like it proves my point. What little I understand of the data suggests that regarding energy generation, nuclear power causes only a fraction of the more or less direct deaths that are caused by the oil and gas industries, even apart from the indirect deaths caused by things like nasty emissions and global warming. A fact less well known than it should be: at least 11 people died in the Deepwater Horizon explosion. And I guess that I would call Deepwater Horizon a “massive disruption.”

        It’s an odd thing. We’re perfectly delighted to shrug off wave upon wave of deaths so long as those deaths occur daily or otherwise appear “normal.” But something concentrated and unusual, even if exceedingly rare, causes us to freak out. This feels like the same thing. We’re perfectly delighted to shrug off mistaken calculations made by human beings, particularly legal staff. After all, everyone knows that lawyers brag about how bad at math they are. It’s “normal.” But if there’s a bug in a computer program, lights out.

Comments are closed.