

Robot workspace to get human touch remotely

Assemble, stitch, fasten, box, seal, ship. But what gets done when the assemblers, stitchers, and everyone else are staying at home?

Nancy Cohen
December 22, 2020 @ 12:21 am


It’s been fairly easy for some to adopt a remote working model during the pandemic, but manufacturing and warehouse workers have had it rougher — some tasks just need people to be physically present in the workplace.

But now, one team is working on a solution for the traditional factory floor that could allow more workers to carry out their labor from home.

The proposed human-in-the-loop assembly system. The robot workspace can be manipulated remotely. Image credits: Columbia Engineering.

Columbia Engineering announced that researchers have won a grant to develop the project titled “FMRG: Adaptable and Scalable Robot Teleoperation for Human-in-the-Loop Assembly.” The project’s raw ingredients include machine perception, human-computer interaction, human-robot interaction, and machine learning.

They have come up with a “physical-scene-understanding algorithm” that converts camera observations of a robot workspace into a virtual 3D scene representation.

Handling 3D models

The system analyzes the robot worksite and converts it into a virtual physical-scene representation. Each object is represented by a 3D model that mimics its shape, size, and physical attributes. A human operator specifies the assembly goal by manipulating these virtual 3D models.
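The article does not describe the project's data structures, but the idea of a manipulable virtual scene can be sketched roughly as follows. This is a minimal, hypothetical illustration: the `ObjectModel` and `VirtualScene` names, fields, and units are assumptions, not details from the Columbia system.

```python
from dataclasses import dataclass, field, replace

@dataclass(frozen=True)
class ObjectModel:
    """Virtual stand-in for one physical object in the robot workspace."""
    name: str
    pose: tuple[float, float, float]   # x, y, z position in metres (assumed)
    size: tuple[float, float, float]   # bounding-box dimensions (assumed)
    mass_kg: float                     # one physical attribute a planner might use

@dataclass
class VirtualScene:
    """3D scene representation built from camera observations."""
    objects: list[ObjectModel] = field(default_factory=list)

    def move(self, name: str, new_pose: tuple[float, float, float]) -> None:
        """Operator manipulates a virtual model to specify an assembly goal."""
        self.objects = [
            replace(o, pose=new_pose) if o.name == name else o
            for o in self.objects
        ]

# An operator drags the virtual bolt to its target location;
# the edited scene then serves as the task goal for the planner.
scene = VirtualScene([ObjectModel("bolt", (0.10, 0.20, 0.0), (0.01, 0.01, 0.04), 0.02)])
scene.move("bolt", (0.35, 0.20, 0.0))
```

The key point the sketch captures is that the operator never drives the robot joint by joint: they edit a virtual copy of the workspace, and the robot works out how to make reality match.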

A reinforcement learning algorithm infers a planning policy from the task goals and the robot configuration. The algorithm also estimates its own probability of success, and uses that estimate to decide when to request human assistance; otherwise, it carries out the work automatically.
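That confidence-gated handoff can be sketched in a few lines. Everything here is an illustrative assumption: the real system learns when to ask for help, whereas this sketch uses a fixed hypothetical threshold and takes the success estimate as a plain number.

```python
def next_action(success_probability: float, threshold: float = 0.8) -> str:
    """Gate autonomous execution on the policy's own success estimate.

    `success_probability` would come from the learned planning policy;
    the 0.8 threshold is an illustrative assumption, not a project detail.
    """
    if success_probability < threshold:
        return "request_human_assistance"   # hand control back to the operator
    return "execute_autonomously"           # carry out the plan without help

# Confident prediction: the robot acts on its own.
print(next_action(0.95))
# Uncertain prediction: the robot asks the remote human operator for help.
print(next_action(0.40))
```

The design intent, as described in the article, is that human attention is spent only on the cases the robot knows it is likely to get wrong.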

The project is led by Shuran Song, an assistant professor of computer science at Columbia University. She said the system they envision will allow workers who are not trained roboticists to operate the robots, a prospect she finds exciting.

“I am excited to see how this research could eventually provide greater job access to workers regardless of their geographical location or physical ability.”

Automation for the future

The team received $3.7 million in funding from the National Science Foundation (NSF). According to the NSF, the award period runs from January 1 to an estimated end date of December 31, 2025. The NSF award abstract highlights the positive impact such an effort could have on business and workers:

“The research will benefit both the manufacturing industry and the workforce by increasing access to manufacturing employment and improving working conditions and safety. By combining human-in-the-loop design with machine learning, this research can broaden the adoption of automation in manufacturing to new tasks. Beyond manufacturing, the research will also lower the entry barrier to using robotic systems for a wide range of real-world applications, such as assistive and service robots.”

The abstract notes that the team is collaborating with NYDesigns and LaGuardia Community College “to translate research results to industrial partners and develop training programs to educate and prepare the future manufacturing workforce.”

Song is directing the vision-based perception and machine-learning algorithm design for the physical-scene-understanding work. Steven Feiner, professor of computer science at Columbia University, is working on the 3D and VR user interface. Matei Ciocarlie, associate professor of mechanical engineering at Columbia University, is building the robot learning and control algorithms. Before joining the faculty, Ciocarlie was a scientist at Willow Garage and later at Google, and contributed to the development of the open-source Robot Operating System (ROS).

A takeaway: news about robots often draws anxious remarks about a tradeoff that costs humans their jobs. Here is a project that, once complete, has the potential to use robotics to complement human capabilities instead.

Nancy Cohen is a contributing author.
