
WATCH: Robot self-corrects when ‘scolded’ by human

/ 03:14 PM March 07, 2017

Robots are known for their precision, hence their proliferation on factory assembly lines. When something goes wrong, programmers are usually called in to fix a robot’s programming and prevent it from making the same mistake again.


But what if you could just tell the robot that it did a bad job? TechCrunch reports that this is exactly what researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) are working on.

In the demonstration video, a human observer wears an EEG cap to measure his brainwaves. The robot then monitors these brainwaves before making a decision regarding its given task.


In this case, the task is sorting paint cans and spools of wire into their respective containers. The robot watches for signs that the human doubts its choice, then adjusts accordingly.
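The article does not describe MIT’s actual algorithm, but the feedback loop it outlines can be sketched in a toy form: the robot makes a choice, a (here simulated) classifier checks the human’s EEG reading for an error signal, and the robot flips its decision if one is detected. All names, thresholds, and bin labels below are invented for illustration.

```python
# Toy sketch of the two-way loop described above; not MIT CSAIL's code.
ERROR_THRESHOLD = 0.5  # hypothetical cutoff for "the human doubts this choice"

def detect_error_signal(eeg_amplitude):
    """Stand-in classifier: treat a large EEG amplitude as an error signal."""
    return eeg_amplitude > ERROR_THRESHOLD

def sort_item(initial_choice, eeg_amplitude):
    """Pick a bin, then self-correct if the human's EEG flags an error."""
    choice = initial_choice
    if detect_error_signal(eeg_amplitude):
        # The human appears to doubt the choice: switch to the other bin.
        choice = "wire bin" if choice == "paint bin" else "paint bin"
    return choice

print(sort_item("paint bin", eeg_amplitude=0.8))  # error detected -> "wire bin"
print(sort_item("paint bin", eeg_amplitude=0.1))  # no error -> "paint bin"
```

In the real system the classifier would be trained on actual brainwave data rather than a fixed threshold, but the control structure, using the detection result as a correction signal, is the same idea Gil describes below.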

CSAIL research scientist Stephanie Gil says, “Being able to read the EEG signals of the human and using that as a control signal to the robot will have an effect on the robot’s choice.”

“Whether or not the robot makes the right choice will have an effect on the human’s reactions,” she adds. “That’s a natural two-way communication or a conversation between humans and robots.”

It’s still far from a human having a witty repartee with an android butler, but it looks like the science for that is heading in the right direction. Alfred Bayle/JB

TOPICS: MIT, robot research, robotics, Robots, self-correcting robot



© Copyright 1997-2020 INQUIRER.net | All Rights Reserved
