Meta AI researchers give robots a sense of touch and we’re getting all the creepy feels

by Pelican Press


AI has given robots the ability to 'hear' and 'see' the world so they can understand human orders and carry out tasks better, but Meta's AI researchers are now testing ways to let robots mimic the sense of touch, too. The Fundamental AI Research (FAIR) division of Meta has just debuted a set of tools that could let robots detect, decipher, and react to what they touch. That could make even the most basic robot arm sensitive enough to handle delicate objects without damaging them, and useful in more settings.

Meta showcased a combination of new technologies and features that work together to give robots the ability to feel things. Touch-sensing tech Sparsh gives AI a way of identifying things like pressure, texture, and movement without needing a huge database. It’s like an AI version of how you can feel something in the dark and describe how it feels even if you don’t know what you’re touching.

To send information about what the robot is touching to the AI model, Meta teamed up with a company called GelSight to create Digit 360, essentially a robotic fingertip. The sensors in Digit 360 are sensitive enough that the AI can not only determine details about what the robot is touching but also apply the right amount of pressure for a task involving the object, like lifting or rotating it.

For the rest of the robotic hand (or equivalent device), Meta created a system called Plexus with Wonik Robotics to spread multiple touch sensors across the device. Meta claims Plexus can mimic the human sense of touch well enough to handle fragile or awkwardly shaped objects. You can see below how the three technologies work together in a robotic hand.

Meta AI Touch (Image credit: Meta)

Sensitive AI

“The human hand is marvelous at signaling to the brain touch information across the skin from fingertips to palm. This enables actuating the muscles in the hand when making decisions, for instance about how to type on a keyboard or interact with an object that’s too hot,” Meta explained in a blog post. “Achieving embodied AI requires similar coordination between the tactile sensing and motor actuation on a robot hand.”

There are many ways robot hands that can 'feel', linked to AI capable of interpreting those sensations, could be useful. Imagine robotic surgical assistants able to feel minute changes in the body and respond faster, with movements as exact yet gentle as a human's, or better. The same goes for manufacturing delicate devices without breaking them, and perhaps for coordinating multiple robotic hands the way humans do with their own pair. It could even make virtual experiences feel more real, with an understanding of what objects and environments should feel like informing their virtual counterparts.

Touch isn't the only human sense that AI is helping machines mimic. Researchers at Penn State recently showcased how AI models linked to an electronic tongue can simulate a sense of taste well enough to spot tiny differences in flavor. Meanwhile, a company called Osmo has taught AI models to emulate a sense of smell that's far better than a human's. The company demonstrated how its AI can analyze a scent precisely enough to recreate it from scratch, picking out and combining chemicals without human intervention.
