Rethinking Automated Legal Guidance

Agencies should exercise caution when using chatbots, virtual assistants, and preset menus to share legal information.

Federal agencies fulfill many functions and responsibilities. One of these functions is to help members of the public understand and apply the law. Increasingly, agencies are providing this help through automated legal guidance tools, such as chatbots, virtual assistants, and other automated systems.

For example, when people have questions about their immigration status, they can turn to Emma, the computer-generated virtual assistant for U.S. Citizenship and Immigration Services. When they have questions about their student loans, they can ask Aidan, the U.S. Department of Education’s virtual assistant for federal student aid. And when they have questions about how their personal and business activities affect their U.S. federal tax liability, they can consult the Internal Revenue Service’s Interactive Tax Assistant.

There are a number of explanations for federal agencies’ increasing use of automated legal guidance tools. Under the Plain Writing Act of 2010, agencies are required to communicate complex legal rules and procedures to the public in “plain language.” Yet formal law, including statutes, regulations, and court decisions, is often so complex that it is difficult, if not impossible, for most members of the public to understand.

Agencies also often lack the resources to explain legal issues fully through human customer service representatives. In addition, agencies face pressure to provide service comparable to that of the private sector, where automated customer service tools have become commonplace. Automated tools appear to help agencies address a number of these challenges by translating complex legal material into more accessible terms.

As a result, several federal agencies now use automated legal guidance tools to respond to tens of millions of inquiries about the law each year. Other agencies are considering introducing these tools to supplement or replace human customer service representatives. Despite the prevalence of this shift, scholars who have studied technology and artificial intelligence in government agencies have not focused on how agencies use automation to explain the law.

To address the growing use of automated legal guidance tools by federal agencies, the Administrative Conference of the United States (ACUS) commissioned us to examine federal agencies’ use of automated legal guidance and to propose recommendations for reform.

In our study, we surveyed the use of automated legal guidance tools across all federal agencies and conducted in-depth research on the two main models of automated legal guidance. The first, a decision-tree “answer” model, requires users to click through preset online topics to find answers to their questions. In contrast, the second, a natural-language “sorting” model, allows users to type questions in natural language; it then uses artificial intelligence to sort the questions into categories and provide the corresponding information.
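To make the contrast concrete, the sketch below illustrates the two models. It is ours, not any agency’s actual implementation: the topics, canned answers, and keyword-based classifier are hypothetical stand-ins, and real natural-language tools use trained classifiers rather than keyword matching.

```python
# A minimal sketch of the two guidance models. All topics and answers are
# hypothetical placeholders, not real agency guidance.

# Decision-tree "answer" model: the user clicks through preset topics
# until reaching a canned answer at a leaf of the tree.
DECISION_TREE = {
    "question": "What do you need help with?",
    "options": {
        "Filing status": "Canned guidance on filing status ...",
        "Dependents": {
            "question": "Is the person related to you?",
            "options": {
                "Yes": "A relative may qualify as a dependent if ...",
                "No": "An unrelated person qualifies only if ...",
            },
        },
    },
}

def walk_tree(node, choices):
    """Follow the user's clicks down the tree until a leaf answer is reached."""
    for choice in choices:
        node = node["options"][choice]
        if isinstance(node, str):      # leaf reached: return the canned answer
            return node
    return node["question"]            # still mid-tree: ask the next question

# Natural-language "sorting" model: free-text questions are sorted into
# predefined categories, each mapped to canned guidance. Keyword matching
# stands in here for the trained classifier a real system would use.
CATEGORIES = {
    "dependent": "Canned guidance on claiming dependents ...",
    "refund": "Canned guidance on refund timing ...",
}

def sort_question(text):
    """Route a free-text question to the closest category's canned answer."""
    lowered = text.lower()
    for keyword, answer in CATEGORIES.items():
        if keyword in lowered:
            return answer
    return "Sorry, I couldn't match your question to a topic."

print(walk_tree(DECISION_TREE, ["Dependents", "Yes"]))
print(sort_question("Can I claim my nephew as a dependent?"))
```

Either way, the tool ultimately maps a user’s situation onto a finite set of pre-written answers, which is where the simplification the study documents can creep in.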

We explored how these different models of automated legal guidance provide answers that track or deviate from the underlying law. To learn how agency officials themselves view these tools, we also interviewed federal agency officials who have direct responsibility for the most well-developed automated legal guidance tools used by the federal government, or responsibility for guidance oversight at agencies that have developed such tools. In addition, we interviewed officials at the U.S. Government Accountability Office who work with agencies as they develop such tools.

We found that automated legal guidance tools provide many benefits to agencies and the public. They allow agencies to respond to public inquiries more efficiently than human customer service representatives can, help the public navigate complex legal regimes, and, for some inquiries, provide accurate answers based on the underlying law. Automated legal guidance tools also allow agencies to communicate their views of the law to the public in an easily accessible format.

But automated legal guidance tools also have drawbacks. In attempting to give the public simple explanations of complex law, automated legal guidance tools may offer guidance that deviates from the underlying law. This occurs in both the decision-tree “answer” model and the natural-language “sorting” model. For example, both models may present unsettled law as though it were clear, add administrative gloss to the law, and omit discussion of legal and regulatory exceptions and requirements. These deviations can mislead the public about how the law applies to their personal circumstances.

At present, agencies’ automated legal guidance tools also give users little information about the underlying law on which the guidance is based, and few or no warnings about the guidance’s limited legal authority or users’ inability to rely on it as a legal matter. We also found that no federal agency publishes records of changes to its automated legal guidance.

The potential for automated legal guidance to mislead members of the public, coupled with the public’s inability to rely on such guidance in any meaningful way, may exacerbate equity gaps between members of the public who have access to reliable advice through a lawyer and those who do not.

Our interviews with federal agency officials also revealed that they are insufficiently aware of some of the downsides of automated legal guidance. We heard few concerns from agency officials about reliance, as they took the position that members of the public do not, or should not, rely on automated legal guidance. Agency officials held this belief in part because they viewed automated legal guidance as merely providing “information,” not as a source of law. This reaction was common even though millions of people turn to automated legal guidance each year to get answers about the law from federal agencies.

Automated legal guidance has an important role to play in informing members of the public about the law and, in any event, will continue to be used by agencies to explain the law to the public. Agencies should be mindful of the potential downsides of such guidance, however, especially as its uses expand.

Based on our report, the ACUS Assembly earlier this year adopted 20 recommendations on agencies’ use of automated legal guidance in the following areas: design and management, accessibility, transparency, and reliance.

These recommendations include, among other things, a call for agencies to consider whether and when a user’s good-faith reliance on the guidance of automated legal guidance tools should serve as a defense against penalties for noncompliance. The recommendations also encourage agencies to allow users to obtain a written record of their communications with automated legal guidance tools, including timestamps.
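As a rough illustration of that last recommendation, the sketch below logs each exchange with a timestamp and exports a plain-text record a user could keep. It is a hypothetical design of ours, not drawn from the recommendations’ text or from any agency’s system.

```python
# A minimal sketch of a timestamped written record of a guidance session.
# The structure and field names are assumptions for illustration only.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Exchange:
    timestamp: str   # ISO 8601 UTC timestamp of the message
    speaker: str     # "user" or "tool"
    text: str        # the message as shown to the user

@dataclass
class Transcript:
    exchanges: list = field(default_factory=list)

    def record(self, speaker: str, text: str) -> None:
        """Append one message with a UTC timestamp."""
        stamp = datetime.now(timezone.utc).isoformat()
        self.exchanges.append(Exchange(stamp, speaker, text))

    def export(self) -> str:
        """Render the session as a plain-text record the user can download."""
        return "\n".join(
            f"[{e.timestamp}] {e.speaker}: {e.text}" for e in self.exchanges
        )

session = Transcript()
session.record("user", "Do I qualify for this deduction?")
session.record("tool", "Based on the topics you selected, you may qualify if ...")
print(session.export())
```

A record like this would at least give users contemporaneous evidence of what a tool told them, whatever weight agencies ultimately give such evidence.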

In addition, agencies should explain the limitations of the guidance users receive when the underlying law is unclear or unsettled. Where possible, agencies should provide access, through automated legal guidance tools, to the legal materials underlying the tools, including relevant statutes, regulations, and agency or court decisions. More generally, agencies should design and manage automated legal guidance tools in ways that promote fairness, accuracy, clarity, efficiency, accessibility, and transparency.

It is essential that agencies follow these best practices when implementing automated legal guidance tools. As our study revealed, automated legal guidance can enable agencies to convey complex law to the public effectively. But it can also lead the government to present the law as simpler and clearer than it is, a phenomenon that current agency practices threaten to worsen, including by making automated legal guidance appear more personalized than it is.

Ultimately, our report and ACUS’s recommendations give agency officials a guide to maximizing the benefits and minimizing the costs of introducing automated legal guidance to help members of the public know and comply with the law.

Joshua D. Blank is a professor and director of strategic initiatives at the University of California, Irvine School of Law.

Leigh Osofsky is a professor and associate dean for research at the University of North Carolina School of Law.

This essay is part of a three-part series on the Administrative Conference of the United States, titled Using Technology and Contractors in the Administrative State.
