

You may not believe it, but this robotic hand can imagine its next move  

Robots are starting to think about themselves.

Rupendra Brahambhatt
July 26, 2022 @ 4:08 pm


A team of researchers from Columbia University has demonstrated a method that allows a robot to learn the model of its own body. This self-modeling process enabled the robot to decide the type of movements best suited under different circumstances and basically think about its next move. 

Robot WidowX successfully avoids the obstacle and touches the sphere. Image credits: Hod Lipson/YouTube

Every change in our body posture or position is commanded by our nervous system (motor cortex). The human brain knows how the different body parts can move and therefore, it can plan and coordinate our every action before it happens. This is possible because the brain has maps and models of our entire body.

These maps allow the brain to guide the movement of our different body parts, provide us with well-coordinated motion, and even save us from injuries when we face obstacles in our path. Could we do the same thing for robots? Boyuan Chen, the lead author of a new study and an assistant professor at Duke University, believes so.

“We humans clearly have a notion of self. Somewhere inside our brain, we have a notion of self, a self-model that informs us what volume of our immediate surroundings we occupy, and how that volume changes as we move.”

Similar to how human body movements are guided by multiple brain maps, Chen and his team have demonstrated that a robot can also develop a kinematic model of itself.

A kinematic model is a mathematical description of a robot’s dimensions, movement capabilities and limitations, depth of field, and the workspace it can cover at any given time. It is normally used by robot operators to control the actions of a machine. However, after self-modeling, a robot can control itself, because it becomes aware of how different motor commands trigger different body movements.
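To get a feel for what a kinematic model encodes, here is a minimal sketch of forward kinematics for a planar two-link arm. The two-dimensional simplification, the link lengths, and the function name are illustrative assumptions, not the actual parameters of the WidowX 200 or the learned model from the study:

```python
import math

def forward_kinematics(link_lengths, joint_angles):
    """Compute the (x, y) position of each joint of a planar arm.

    Illustrative only: a real kinematic model works in 3-D and
    includes the robot's actual link geometry.
    """
    x, y, theta = 0.0, 0.0, 0.0
    points = [(x, y)]
    for length, angle in zip(link_lengths, joint_angles):
        theta += angle                      # joint angles accumulate along the chain
        x += length * math.cos(theta)
        y += length * math.sin(theta)
        points.append((x, y))
    return points  # the last entry is the end effector

# Two unit-length links, both joints bent 90 degrees:
# the end effector folds back to roughly (-1.0, 1.0)
print(forward_kinematics([1.0, 1.0], [math.pi / 2, math.pi / 2]))
```

Given a model like this, an operator (or the robot itself) can predict where every part of the body will end up before sending a motor command.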

How did the scientists enable the robot to model itself?

There is no way scientists can see the brain maps formed inside a person’s mind or what a person thinks at any given point in time — at least, we don’t have the technology yet. Similarly, if a robot imagines something, a scientist can’t see it by simply peeking into the robot’s neural network. The researchers describe a robot’s brain as a “black box”, so in order to find out whether a robot can model itself, they performed an interesting experiment.

The different tests that confirmed the self-modeling ability of the robot. Image credits: Chen et al. 2022, Science Robotics

Describing the experiment in an interview with ZME Science, one of the authors of the study and the director of Columbia University’s Creative Machines Lab, Hod Lipson, explained:

“You can imagine yourself, every human can imagine where they are in space but we don’t know exactly how this works. Nobody can look into the brain even of a mouse and say here is how the mouse sees itself.” 

So during their study, the researchers surrounded a robot arm called WidowX 200 with five cameras in a room. The live feed from all the cameras was connected to the robot’s neural network so the robot could see itself through the cameras. As WidowX performed different kinds of body movements in front of the live streaming cameras, it started observing how its different body parts behaved in response to different motor commands. 

After three hours, the robot stopped moving. Its deep neural network had collected all the information required to model the robot’s entire body. The researchers then performed another experiment to test if the robot had successfully modeled itself. They assigned a complex task to the robot that involved touching a 3D red sphere while avoiding a large obstacle in its path. 

Moreover, the robot had to touch the sphere with a particular body part (the end effector). To complete the task successfully, WidowX needed to propose and follow a safe trajectory that allowed it to reach the sphere without a collision. Remarkably, the robot did it without any human help; for the first time, Chen and his team showed that a robot can learn to model itself.
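The core idea of the reach-and-avoid task can be sketched in a few lines: use a self-model to predict which points the body would occupy under a candidate motor command, discard commands that would hit the obstacle, and keep the one that brings the end effector closest to the target. Everything below is an assumption for illustration — a planar two-link arm stands in for the learned self-model, collision is checked only at the joint positions rather than over the full body volume, and the function names are invented, not from the paper:

```python
import math
import random

def arm_points(joint_angles, link_lengths=(1.0, 1.0)):
    # Stand-in for the learned self-model: maps motor commands
    # (joint angles) to the 2-D points the body would occupy.
    x, y, theta = 0.0, 0.0, 0.0
    pts = [(x, y)]
    for length, angle in zip(link_lengths, joint_angles):
        theta += angle
        x += length * math.cos(theta)
        y += length * math.sin(theta)
        pts.append((x, y))
    return pts

def collides(pts, obstacle_center, obstacle_radius):
    # A pose is unsafe if any predicted body point lies inside the obstacle.
    ox, oy = obstacle_center
    return any(math.hypot(px - ox, py - oy) < obstacle_radius for px, py in pts)

def plan_reach(target, obstacle, radius, trials=5000, seed=0):
    # Sample candidate poses, keep the collision-free ones, and return
    # the pose whose end effector lands closest to the target.
    rng = random.Random(seed)
    best_pose, best_dist = None, float("inf")
    for _ in range(trials):
        pose = [rng.uniform(-math.pi, math.pi) for _ in range(2)]
        pts = arm_points(pose)
        if collides(pts, obstacle, radius):
            continue
        d = math.hypot(pts[-1][0] - target[0], pts[-1][1] - target[1])
        if d < best_dist:
            best_pose, best_dist = pose, d
    return best_pose, best_dist

# Reach toward a target while steering clear of a circular obstacle
pose, dist = plan_reach(target=(1.2, 1.2), obstacle=(0.6, 0.6), radius=0.3)
```

The key point mirrors the study: once the robot can predict its own occupied volume from a motor command, planning reduces to searching over commands and vetoing the unsafe ones — no human in the loop.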

Self-modeling robots can advance the field of artificial intelligence

The WidowX robotic hand is not exactly an advanced machine; it can only perform a limited number of actions and movements. Yet we generally look forward to a future run by robots and machines far more complex than WidowX. When asked whether any robot could learn to model itself using the same approach, Professor Lipson told ZME Science:

“We did it with a very simple cheap robot (WidowX 200) that we can just buy on Amazon but this should work on other things. Now the question is how complex a robot can be and will this still work? This work for a six-degree robot, will this work for a driverless car? Will this work for 18 motors, a spider robot? And that’s what we gonna do next, we gonna try to push this to see how far it can go.”

Image credits: Possessed Photography/Unsplash

Many recent AI-based innovations such as drones, driverless cars, and humanoids like Sophia perform multiple functions at the same time. If these machines learn to imagine themselves and others, including humans, this could lead to a robot revolution. The researchers believe that the ability to model themselves and others would allow robots to program, repair, and function on their own without human supervision.

“We rely on factory robots, we rely on drones, we rely more and more on these robots, and we can’t babysit all these robots all the time. We can’t always model them or program them, it’s a lot of work. We want the robots to model themselves and we are also interested in working on how robots can model other robots. So they can help each other, keep taking care of themselves, adapt, and be much more resilient and I think it’s gonna be important,” said Professor Lipson.  

The study is published in the journal Science Robotics.
