We Can Now Hear an AI Robot’s Thought Process

Summary: Italian researchers have given one of SoftBank’s robots, called Pepper, the ability to share its internal monologue with humans while completing tasks.

Original author and publication date: Kristin Houser – April 25, 2021

Futurizonte Editor’s Note: How soon before AI allows us to hear other people’s thought processes?

From the article:

Trusting the bots: AI systems can do a lot — drive cars, cool cities, and even save lives — but the systems usually can’t explain exactly how they do those things. Sometimes, their developers don’t even know the answer.

This is known as AI’s black box problem, and it’s holding back the mainstream adoption of AI.

“Trust is a major issue with artificial intelligence because people are the end-users, and they can never have full trust in it if they do not know how AI processes information,” Sambit Bhattacharya, a professor of computer science at Fayetteville State University, told the Economic Times in 2019.

The idea: University of Palermo researchers thought it might be easier for people to trust an AI bot if they knew the robot’s thought process during interactions.

“The robots will be easier to understand for laypeople, and you don’t need to be a technician or engineer,” study co-author Antonio Chella said in a press release. “In a sense, we can communicate and collaborate with the robot better.”

For a study published in iScience, Chella and first author Arianna Pipitone trained a Pepper robot in table-setting etiquette. They then gave it the ability to say, in plain English, what it was “thinking” while executing a task.

Inside a robot’s head: In a recorded demo, when asked to hand over a napkin, the robot starts talking in what sounds like a stream of consciousness: “Where is the napkin? The object is in the box… I am using my right arm to get the object.”

This gave the person working with Pepper an understanding of the robot’s thought process.
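To make the idea concrete, here is a minimal sketch, in Python, of how a task executor could voice each reasoning step before acting, in the spirit of Pepper’s inner speech. It is not the researchers’ implementation; the Step structure, the plan contents, and the action names are all illustrative assumptions.

# Minimal, hypothetical sketch of "inner speech" for a task-executing robot.
# Not the study's code; every step and name here is an illustrative assumption.

from dataclasses import dataclass

@dataclass
class Step:
    thought: str   # what the robot "says" aloud about this step
    action: str    # the motion primitive it would then execute

def narrate_and_execute(task: str, plan: list[Step]) -> None:
    """Voice each reasoning step (here: print it), then carry out the action."""
    print(f"Task received: {task}")
    for step in plan:
        print(f"[inner speech] {step.thought}")   # verbalized reasoning
        print(f"[action]       {step.action}")    # simulated execution

if __name__ == "__main__":
    napkin_plan = [
        Step("Where is the napkin? The object is in the box.", "locate(napkin)"),
        Step("I am using my right arm to get the object.", "grasp(napkin, arm='right')"),
        Step("Handing the napkin to the person.", "handover(napkin)"),
    ]
    narrate_and_execute("hand over the napkin", napkin_plan)

Running the sketch prints the verbalized step before each simulated action, which is the basic pattern the demo illustrates: the human hears why the robot is doing what it is doing as it does it.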

“The robot is no longer a black box, but it is possible to look at what happens inside it and why some decisions are (made),” the researchers wrote in their study.

READ the full article here