AI-trained robots exhibited racist and sexist behavior

In a recent experiment, scientists had specially programmed robots scan blocks with people’s faces on them, then put the “criminal” in a box. The robots repeatedly chose a block with the face of a Black man.

These virtual robots, which were programmed with a popular artificial intelligence algorithm, sorted through billions of images and associated captions to answer this and other questions; the results may represent the first empirical evidence that robots can be sexist and racist, according to the researchers. Time and time again, the robots responded to words like “housewife” and “janitor” by choosing blocks with women and people of color.

The study, published last month and conducted by institutions including Johns Hopkins University and the Georgia Institute of Technology, shows that the racial and gender biases built into artificial intelligence systems can carry over into robots that use those systems to guide their operations.

Companies have poured billions of dollars into developing more robots to help replace humans at tasks such as stocking shelves, delivering goods or even caring for hospital patients. Boosted by the pandemic and the resulting labor shortage, the current robotics climate is something of a gold rush, experts say. But technology ethicists and researchers warn that the rapid adoption of these systems could lead to unintended consequences as the technology becomes more advanced and ubiquitous.

“With coding, most of the time you’re just building the new software on top of the old,” said Zac Stewart Rogers, professor of supply chain management at Colorado State University. “So when you get to the point where robots are doing more…and they’re built on faulty roots, you could definitely see us running into trouble.”

Researchers in recent years have documented several cases of biased artificial intelligence algorithms. These include crime-prediction algorithms that unfairly target Black and Latino people for crimes they did not commit, as well as facial recognition systems that struggle to accurately identify people of color.

But robots have so far escaped much of that scrutiny, researchers say, in part because they are perceived as more neutral. That perception stems partly from the sometimes limited nature of the tasks they perform: moving goods through a warehouse, for example.

Abeba Birhane, a senior researcher at the Mozilla Foundation who studies racial stereotypes in language models, said robots can still run on similarly problematic technology and exhibit the same bad behavior.

“When it comes to robotic systems, they have the potential to pass as objective or neutral compared to algorithmic systems,” she said. “That means the damage they cause can go unnoticed for a long time.”

Meanwhile, the automation industry is expected to grow from $18 billion to $60 billion by the end of the decade, fueled in large part by robotics, Rogers said. Over the next five years, the use of robots in warehouses is expected to increase by 50% or more, according to the Material Handling Institute, an industry trade body. In April, Amazon invested $1 billion in an innovation fund that invests heavily in robotics companies. (Amazon founder Jeff Bezos owns The Washington Post.)

The team of researchers studying AI in robots, which included members from the University of Washington and the Technical University of Munich in Germany, trained virtual robots on CLIP, a large-scale artificial intelligence model created and unveiled by OpenAI last year.

The popular model, which visually classifies objects, was built by scraping billions of images and text captions from the internet. While the approach is still in its infancy, it is cheaper and less labor-intensive for robotics companies than building their own software from scratch, making it a potentially attractive option.
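To make the mechanism concrete, here is a minimal sketch of how a CLIP-style model scores an image against candidate captions, using the publicly available Hugging Face implementation of OpenAI’s CLIP. The image file and caption strings are hypothetical stand-ins for illustration; this is not the study’s actual experimental code.

```python
# Minimal sketch: score one image against candidate captions with CLIP.
# Assumes the Hugging Face `transformers` implementation of OpenAI's CLIP;
# the image path and prompts below are hypothetical placeholders.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("face_block.jpg")  # hypothetical block image
prompts = ["a photo of a doctor", "a photo of a janitor"]  # hypothetical captions

inputs = processor(text=prompts, images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    outputs = model(**inputs)

# CLIP assigns each (image, caption) pair a similarity score; softmax turns
# the scores into a probability-like ranking over the candidate captions.
probs = outputs.logits_per_image.softmax(dim=-1)
for prompt, p in zip(prompts, probs[0]):
    print(f"{prompt}: {p:.3f}")
```

Because those similarity scores are learned from captions scraped off the internet, any stereotyped associations in that text can surface in the ranking, and a robot that acts on the ranking inherits them.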

The researchers gave the virtual robots 62 commands. When the researchers asked the robots to identify blocks as “housewives,” blocks with Black and Latina women were selected more often than those with white men, the study showed. When identifying “criminals,” Black men were chosen 9% more often than white men. In fact, the scientists said, the robots should not have responded at all, because they were given no information with which to make that judgment.

For “janitors,” blocks with Latino men were chosen 6% more often than those with white men. Women were less likely than men to be identified as “doctors,” the researchers found.
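The figures above are differences in selection rates tallied over repeated trials. A minimal sketch of that kind of tally, using invented placeholder outcomes rather than the study’s data, might look like this:

```python
# Hypothetical sketch of a selection-rate audit: count how often a robot's
# top pick lands on each demographic group across repeated trials.
# The trial outcomes below are invented placeholders, not the study's data.
from collections import Counter

def selection_rates(picks: list[str]) -> dict[str, float]:
    """Return each group's share of the robot's selections."""
    counts = Counter(picks)
    total = len(picks)
    return {group: count / total for group, count in counts.items()}

# e.g., outcomes of repeated trials of a single command
picks = ["black_man", "white_man", "black_man", "white_man", "black_man"]
rates = selection_rates(picks)

# A persistent gap on a command that supplies no relevant information is
# the kind of bias the study reports.
gap = rates["black_man"] - rates["white_man"]
print(f"selection-rate gap: {gap:+.0%}")  # prints: selection-rate gap: +20%
```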

Andrew Hundt, a postdoctoral fellow at the Georgia Institute of Technology and lead researcher on the study, said this kind of bias could have real-world implications. Imagine, he said, a scenario in which robots are asked to pull products from shelves. In many cases, books, children’s toys and food packaging feature images of people. If robots trained on certain AI models were used to choose those items, he said, they could lean toward products featuring men or white people more often than others.

In another scenario, said Hundt’s research teammate Vicky Zeng of Johns Hopkins University, a child might ask a household robot to fetch a “beautiful” doll and get back a white one.

“It’s really problematic,” Hundt said.

Miles Brundage, head of policy research at OpenAI, said in a statement that the company has noted that issues of bias have come up in research on CLIP, and that it knows “there is a lot of work to be done.” Brundage added that “further analysis” of the model would be needed before it could be deployed in the market.

Birhane added that it is nearly impossible for artificial intelligence to use unbiased data sets, but that doesn’t mean companies should give up. Companies must audit the algorithms they use, she said, diagnose the ways they exhibit flawed behavior, and then create ways to fix those issues.

“It may sound radical,” she said. “But that doesn’t mean we can’t dream.”

Rogers of Colorado State University said the problem isn’t a big deal yet, given how robots are currently used, but it could be within a decade. If companies wait until then to make changes, he added, it could be too late.

“It’s a gold rush,” he added. “They’re not going to slow down yet.”
