This is sort of fun.
We have been struggling for over two decades to find the most useful and access-expanding ways of explaining when an interaction is the giving of information, not generally subject to regulation as the practice of law, and when it is something more, which should be restricted, usually but not always at this point, to lawyers. As a general matter, the information/advice test is applied, although it is often recognized that the definitions shift over time, and that the core point is preserving neutrality when it is the court helping, and competence regardless of who is helping.
Most of you probably know about Google Home, which you can ask questions. Monty Python fans, for example, will be happy that if you ask, "Who won the Cup Final in 1949?" you will get the right answer. Perhaps more surprising, if you ask, "What is class struggle?" you also get an appropriate answer. But if you ask the more "authorized" Python version, as put to Karl Marx in the Communist Quiz sketch, "The struggle of class against class is a what struggle?", Google Home is as yet unable to give an answer.
So what? Well, look at this analysis of Google Home's capacity by John Brandon, under the heading "Google sticks to the facts, but needs a point of view," and see if it seems somehow familiar.
Using the Google Home on a daily basis makes you appreciate how helpful it is. You can ask for directions and find out about the weather. After a while, you realize the Assistant that answers questions can provide a wealth of information, but it’s essentially a duplicate of Google Search. Just imagine how much more valuable the device would become if it could also give advice. . . .
One reason is that the bot on Google Home is not that intelligent yet. It doesn’t really know me, and it doesn’t really know how to give advice. It can tell me to bring a jacket on the trip because of a weather report, but doesn’t go a level deeper and know that the trail I’ve picked is known for inclement weather and wind — especially 20 miles from my origin point. As a voice-enabled version of search, it is helpful. But a true bot needs to parse complex information and provide better advice. It needs to go a few steps further and understand what I’m trying to do, become more proactive, and engage in a discussion with me that is helpful in a way that goes beyond the facts. (Bold added)
And:
A true AI assistant would know about me and my tastes, and know how to match the data already out on the web with my individual preferences. It would know how to give advice by correlating various inputs — exactly like the human brain.
In other words: facts, yes; opinions and judgement, no.
Put that way, it is easy to see why we limit going beyond legal information right now. But it is also becoming easier to envision that changing over time. What would a "Google Court" look like?
In the short term, when analyzing whether something is legal information, it may actually be useful to ask what questions such an environment could answer, and in what way. I think these are OK:
When can I file for divorce? (some follow-up needed)
How much child support will I get? (if a formula can be applied)
But not:
Should I ask for custody of my children? (although a well-written response could provide a framework for the information seeker to think about the issue)
So, play around with Google Home or its equivalents, and see if it helps you think about these choices.
Thanks for sharing this, Richard. I'll check it out. Gut reaction: if it can cause harm and the assessment is done without a full picture of the material facts, then that is advice. Second gut reaction: advice is advice, no matter what you call it. You can call it services, information, referral, or something else. Once you go into the advice world, you really need to know what you are doing and what you are sharing with the person. Words matter.