A Cautionary Tale — Cartoon Points Out the Downsides of Automated Courts

The LA Times has a great cartoon on court automation that might give folks a kick.

Accompanying a (presumably humorous) piece on the potential of court automation, the cartoon has four panels.  In the first, a person tries to fill in a form.  In the second, a "truth algorithm" is applied.  In the third, the person is told that gay marriage is OK, and the litigant replies that he had applied to subdivide a plot of land.  In the fourth, the litigant is told that there is no appeal.

It is well worth clicking through and seeing the whole thing.

Seriously, the points are valid and need to be internalized as we move forward with technology, however minor the real risks may be compared to what the cartoon ironically suggests:

  • Must be easy to use for all
  • Any algorithms must be transparent and legitimate — validated and known to be validated
  • There have to be systems to correct errors
  • For a long time, perhaps forever, there have to be human checks available
  • Circularity must be avoided: a pattern of results driven by an algorithm cannot by itself justify that algorithm

These topics are discussed in more detail, including ideas for moving forward as well as risk minimization, in a thought-exercise paper I wrote for the LSC Tech Summit.

Thanks to Bonnie Hough, who caught this and who is always on point in warning of the dangers as well as highlighting the potential benefits of technology.


About richardzorza

I am deeply involved in access to justice and the patient voice movement.
This entry was posted in Systematic Change, Technology, Transparency.

2 Responses to A Cautionary Tale — Cartoon Points Out the Downsides of Automated Courts

  1. Pingback: Hilarious (and Dark) Side of Automated Court Services: Replacing Courts with Websites | Oregon Legal Research Blog

  2. Claudia Johnson says:

    Thanks for sharing this, Richard. This highlights the need to have a diversity of experience and voices involved in creating the tools for access to justice, to allow for different experiences and factors to be included and lead to a better process. Regarding the algorithms and rules-based systems: they have to allow for outliers. There will be some litigants who, despite what the algorithm says, will be fully capable (or not) of carrying their own case. The algorithm and the reviewers will need to allow these outliers to follow a path not recommended by the algorithm. Because race, class, income, education, health, and ability to respond to stress are so intertwined in our society, the impact of any systemization will also need to be reviewed against civil rights. We need to be vigilant and not create rules or systems that systematically lead to disproportionate impact on protected groups and classes (gender, religion, national origin, race, ethnicity, and, in California, income source).

Comments are closed.