Robots that have been taught to feel layers of fabric could help with laundry

Humans use their senses of sight and touch to grab a cup or pick up a piece of clothing, but for robots these tasks are very challenging. Tactile data is hard to quantify, and the sense of touch has previously been difficult to replicate in robots.

New research from Carnegie Mellon University’s Robotics Institute could help robots sense layers of fabric, which could one day allow robots to help people with household tasks such as folding clothes. Image Credit: Carnegie Mellon University Robotics Institute

Humans look at something, reach for it, then use touch to make sure we’re in the right position to grab it. Much of our sense of touch comes naturally to us. We don’t think much about it, so we don’t realize how important it is.

David Held, Head of the Robots Perceiving and Doing (R-PAD) Lab, Robotics Institute, Carnegie Mellon University

Held is also an assistant professor in the School of Computer Science at Carnegie Mellon University.

To fold clothes, for example, a robot needs a sensor that can mimic how a human finger feels the top layer of a shirt or towel while gripping the layers underneath. Scientists can program a robot to sense and grab the top layer of fabric, but without sensing the other layers, the robot would only ever grab the top layer and never fold the cloth completely.

“How can we solve this problem?” Held asked. “Well, maybe what we need is touch sensing.”

ReSkin, created by scientists at Carnegie Mellon and Meta AI, was the perfect solution. The open-source touch-sensing “skin” consists of a thin, flexible polymer embedded with magnetic particles that produce three-axis tactile signals. In the new paper, the researchers used ReSkin to help the robot feel layers of fabric rather than relying on vision sensors to perceive them.

“By reading changes in the magnetic field from depressions or movement of the skin, we can achieve tactile sensing,” said Thomas Weng, a Ph.D. student in the R-PAD Lab who worked on the project with Robotics Institute postdoctoral fellow Daniel Seita and graduate student Sashank Tirumala. “We can use this tactile sensing to determine how many layers of fabric we’ve picked up by pinching with the sensor.”
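The sensor itself does not report a layer count; that estimate has to be learned from the raw magnetometer readings. The sketch below is a minimal illustration of that idea, not the authors’ published code: it assumes a hypothetical window of three-axis flux readings from a handful of magnetometers and feeds it to a small PyTorch classifier that predicts whether zero, one, or two layers are pinched. The sensor interface, window size, and layer counts are all assumptions made for illustration.

```python
# Minimal sketch (not the published code): classifying how many cloth layers
# are grasped from a window of 3-axis magnetic-flux readings, as a
# ReSkin-style sensor might provide.
import torch
import torch.nn as nn

N_MAGNETOMETERS = 5   # number of magnetometers on the sensor board (assumed)
WINDOW = 20           # consecutive readings per classification (assumed)
N_CLASSES = 3         # predict 0, 1, or 2 grasped layers

class LayerClassifier(nn.Module):
    """Small MLP mapping a window of (x, y, z) flux readings to a layer count."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(N_MAGNETOMETERS * 3 * WINDOW, 128),
            nn.ReLU(),
            nn.Linear(128, N_CLASSES),
        )

    def forward(self, x):  # x: (batch, WINDOW, N_MAGNETOMETERS, 3)
        return self.net(x)

# Example: classify one (synthetic) window of sensor data.
model = LayerClassifier()
reading = torch.randn(1, WINDOW, N_MAGNETOMETERS, 3)  # stand-in for real flux data
n_layers = model(reading).argmax(dim=-1).item()
print(f"estimated grasped layers: {n_layers}")
```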

Other studies have used tactile sensing to grasp rigid, static objects, but cloth is deformable: it changes shape when touched, which makes the task harder. Adjusting the robot’s grip on the fabric changes both the fabric’s pose and the sensor readings.

The scientists did not direct the robot where or how to grasp the fabric. Instead, they trained it to sense how many layers of fabric it was holding using ReSkin’s sensors, then adjust its grip and try again. The team tested the robot by having it pick up one or two layers of fabric, using fabrics of different textures and colors to show that the approach generalizes beyond the training data.
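That grasp-and-retry behavior amounts to a simple feedback loop. The following sketch, with entirely hypothetical helpers (`estimate_layers`, `adjust_grasp`) and a toy insertion-depth model, shows one way such a loop could be structured; it is an illustration under stated assumptions, not the method used in the paper.

```python
# Minimal sketch of a grasp-adjustment feedback loop (assumed, not the
# published controller): grasp, estimate the layer count from tactile
# feedback, and re-grasp with a small adjustment until the target is reached.
import random

TARGET_LAYERS = 1   # e.g., grasp exactly the top layer
MAX_ATTEMPTS = 10

def estimate_layers(depth_mm: float) -> int:
    """Placeholder for the tactile classifier; a toy model of depth vs. layers."""
    return min(2, max(0, int(depth_mm // 2) + random.choice([-1, 0, 0])))

def adjust_grasp(depth_mm: float, estimated: int) -> float:
    """Insert deeper if too few layers were grasped, shallower if too many."""
    if estimated < TARGET_LAYERS:
        return depth_mm + 1.0
    if estimated > TARGET_LAYERS:
        return depth_mm - 1.0
    return depth_mm

depth = 1.0  # initial insertion depth of the sensorized gripper (mm, assumed)
for attempt in range(MAX_ATTEMPTS):
    grasped = estimate_layers(depth)
    print(f"attempt {attempt}: depth={depth:.1f} mm, grasped {grasped} layer(s)")
    if grasped == TARGET_LAYERS:
        break
    depth = adjust_grasp(depth, grasped)
```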

The flexibility and thinness of the ReSkin sensor made it possible to program robots to grip something as fragile as layers of fabric.

The profile of this sensor is so small that we were able to do this very precise task of inserting it between layers of fabric, which we cannot do with other sensors, especially optical sensors. We were able to use it to do tasks that were not achievable before.

Thomas Weng, Ph.D. Student, Robots Perceiving and Doing (R-PAD) Lab, Robotics Institute, Carnegie Mellon University

There is still more research to be done before the robot can be used for commercial purposes. That starts with steps such as smoothing out wrinkled fabric, choosing the correct number of layers to fold, and then folding the fabric in the right direction.

It’s really an exploration of what we can do with this new sensor. We’re exploring how to get robots to feel soft objects with this magnetic skin, and exploring simple strategies for manipulating fabric that we’ll need so robots can eventually do our laundry.

Thomas Weng, Ph.D. Student, Robots Perceiving and Doing (R-PAD) Lab, Robotics Institute, Carnegie Mellon University

The team presented its research paper, “Learning to Singulate Layers of Cloth using Tactile Feedback,” at the 2022 International Conference on Intelligent Robots and Systems (IROS) in Kyoto, Japan, held October 23–27, 2022. The paper also received a Best Paper Award at the conference’s RoMaDO-SI 2022 workshop.

Journal reference

Tirumala, S., et al. (2022) Learning to Singulate Layers of Cloth using Tactile Feedback. arXiv. doi.org/10.48550/arXiv.2207.11196.

Source: https://www.cmu.edu
