“I am very excited that this research promises to provide a transformational overmatch in the ability of our computer systems to understand the underlying meaning of soldiers’ instructions, despite the differences in their dialects, accents and other types of noise that will certainly arise in military applications.”
– Dr. Claire Bonial
U.S. Army Combat Capability Development Command Public Affairs
ADELPHI, Maryland – Army researchers have developed breakthrough technology that will improve the way soldiers and robots communicate and perform tasks in tactical environments.
This research aims to develop a Natural Language Understanding, or NLU, pipeline for robots that would easily transfer to any computing system or agent and gradually tame the variation we see in natural language, said Army researcher Dr. Claire Bonial of the U.S. Army Combat Capabilities Development Command, known as DEVCOM, Army Research Laboratory.
This means that regardless of how a soldier chooses to express himself to the robot, the underlying intent of that language is understood and can be acted upon, given both the current conversational and environmental or situational context.
To do this, the NLU pipeline first automatically parses the input language into an abstract meaning representation, or AMR, which captures the basic meaning of the language’s content, Bonial said. It then converts and augments the AMR into Dialogue-AMR, which captures additional elements of meaning necessary for human-robot dialogue in particular, such as what the person is trying to do with the utterance in the conversational context — for example, giving an order, asking a question or stating a fact about the environment.
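The two-stage idea — parse to a content AMR, then wrap it in a speech-act frame — can be illustrated with a small, hypothetical sketch. The function names, rule lists and Dialogue-AMR frame layout below are illustrative assumptions for exposition, not the ARL pipeline itself; the real system uses trained parsers alongside handwritten rules.

```python
# Hypothetical sketch of the pipeline's second step: augmenting a standard
# AMR parse into a Dialogue-AMR by classifying the utterance's speech act
# with simple handwritten rules. All names here are illustrative.

def classify_speech_act(utterance: str) -> str:
    """Assign a speech act (command, question or assert) via surface rules."""
    text = utterance.strip().lower()
    first = text.split()[0]
    if text.endswith("?") or first in {"what", "where", "which", "can", "is", "do"}:
        return "question"
    if first in {"move", "go", "turn", "stop", "take", "pick"}:
        return "command"
    return "assert"

def to_dialogue_amr(amr: str, utterance: str,
                    speaker: str = "commander", addressee: str = "robot") -> str:
    """Wrap the content AMR in a speech-act frame, Dialogue-AMR style."""
    act = classify_speech_act(utterance)
    return (f"(s / {act}-SA\n"
            f"   :ARG0 (c / {speaker})\n"
            f"   :ARG2 (r2 / {addressee})\n"
            f"   :ARG1 {amr})")

# "Move to the doorway" parses (roughly) to a move-01 frame; the wrapper
# then records that it was issued as a command from commander to robot.
content_amr = "(m / move-01 :ARG1 (r / robot) :ARG2 (d / doorway))"
print(to_dialogue_amr(content_amr, "Move to the doorway"))
```

The same content AMR would be wrapped differently for “Did you move to the doorway?” — a question speech act — which is exactly the conversational-context information the plain AMR leaves out.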
This research was presented at the 14th International Conference on Computational Semantics, or IWCS 2021, where it received the Outstanding Paper Award.
The award citation noted that the “authors are not afraid to use old-fashioned handwritten rules where they do the job, something that is lacking in much current NLP work,” and that “anyone who wants to work on dialogue will want to see this first attempt at parsing in this new framework.”
“This award was incredibly gratifying for several reasons,” Bonial said. “First, this paper represents research efforts that were planted and have been growing since I was a doctoral student. I was part of the first group of researchers to establish what has become one of the most widely used semantic representations in natural language processing, AMR.”
Bonial started working with this group in 2010 and has since been actively involved in refining and extending the representation. Thus, this paper represents a body of research spanning more than a decade for Bonial.
This includes work to represent language expressed through semi-idiomatic constructions, for example “the higher you fly, the harder you will fall!”, as well as task-oriented, situated dialogue between people and robots, in an augmented version of the representation called Dialogue-AMR, she said.
Efforts to develop Dialogue-AMR began in 2018 with dialogue expert Dr. David Traum as part of the University Affiliated Research Center previously established between DEVCOM ARL and the Institute for Creative Technologies at the University of Southern California.
Dialogue-AMR builds on the ARL and ICT Bot Language research collaboration focused on robot dialogue systems, initiated by one of the IWCS paper’s co-authors, Dr. Clare Voss, in 2012.
Second, Bonial said, this paper reports on a critical step in the research trajectory: the researchers’ experiments to robustly and critically evaluate Dialogue-AMR as a computational semantic representation, as well as the NLU pipeline used to automatically obtain the correct Dialogue-AMR representation when given unconstrained natural language input.
These experiments were also carried out in collaboration with Traum of ICT, as well as an ARL intern and recent Georgetown University graduate, Mitchell Abrams. Abrams was recently selected as a Department of Defense Science, Mathematics and Research for Transformation scholar, and will return to the lab after completing his fully funded doctoral program.
This evaluation is important because the researchers assess performance in two distinct problem domains: first, the domain of human-robot dialogue for collaborative search-and-navigation tasks, and second, the domain of human-human dialogue in the Minecraft virtual game environment, where participants collaboratively build block structures.
The Minecraft dialogue data that made this comparative assessment possible was obtained from Dr. Martha Palmer of the University of Colorado, Boulder, and Dr. Julia Hockenmaier of the University of Illinois at Urbana-Champaign.
Evaluation in these two domains addresses a key question regarding the usefulness of such a representation and the NLU pipeline in general.
Bonial asked how efficiently and accurately the pipeline and representation can be applied in one problem domain, then transferred to and refined in a new problem domain. Or, in other words, how feasible is it to use this approach to communicate with a robot when that robot must tackle new problems and environments?
Additionally, their evaluation demonstrates that with only a small amount of additional training data — several hours of annotation work equating to approximately 200 additional training instances for the machine learning elements of the pipeline — the NLU pipeline achieves promising performance in the second domain of human-human dialogue in Minecraft, she said.
The researchers noted that although the performance is somewhat lower than in the original human-robot dialogue domain, it is comparable to or better than the performance of other automatic semantic parsers for other semantic representations and linguistic domains.
“These promising results demonstrate that this NLU pipeline leverages a valid approach to communicating with robots in collaborative tasks across multiple domains,” Bonial said. “I am very excited that this research promises to provide a transformational overmatch in the ability of our computer systems to understand the underlying meaning of soldiers’ instructions, despite the differences in their dialects, accents and other types of noise that will certainly arise in military applications.”
In the research team’s next steps, they are connecting the output semantic representation with a system that grounds the elements of the representation both to entities in the environment and to the robot’s executable behaviors, in joint work with Dr. Thomas Howard of the University of Rochester.
“We are optimistic that the deeper semantic representation will provide the structure needed for superior grounding of the language in the conversational and physical environment, so that robots can communicate and act more like teammates to soldiers, as opposed to tools,” Bonial said.