Prosthetic Hand By Georgia Tech Uses Ultrasound & Deep Learning

Using an open source hand design from Open Bionics, ultrasound imaging, and deep learning, researchers at Georgia Tech have developed a prosthetic control interface more intuitive than anything currently on the market.

Myoelectric prosthetics are robotic arms controlled by the electrical signals of the user's own muscles, much as a biological arm is. They represent a breakthrough for prosthetic users, allowing them to interact with the world with far more control. Unfortunately, because forearm muscle movements are too intricate for surface electromyogram (EMG) sensors to resolve, current designs rely on exaggerated muscle movements elsewhere in the body to communicate commands to the prosthetic. Users have to learn to wiggle their toes or clench their chest in specific patterns to produce hand movements. Naturally, users dream of an arm as easily controlled as Luke Skywalker's prosthetic hand in Star Wars, which responds the same way a regular arm does: to the forearm muscles.

Georgia Tech’s answer was simple: if EMG sensors cannot discern the movements of the forearm muscles, why not use another sensor? Pairing an open source hand design from Open Bionics with an ultrasound probe and deep learning, the researchers image the forearm muscles directly. Ultrasound yields a far more detailed and dynamic map of muscle activity than EMG, and a deep learning model correlates those patterns with the specific movements of the hand.
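The team's exact network isn't described in this article, but the core idea, regressing continuous finger positions from ultrasound frames, can be sketched in a few lines. The following minimal PyTorch sketch assumes grayscale B-mode frames in and five continuous flexion values out; the class name `UltrasoundToFingers`, the 128x128 input size, and the layer sizes are illustrative assumptions, not the researchers' published architecture.

```python
import torch
import torch.nn as nn

class UltrasoundToFingers(nn.Module):
    """Toy CNN regressor: one grayscale ultrasound frame in,
    five continuous finger flexion values (0..1) out."""
    def __init__(self, n_fingers: int = 5):
        super().__init__()
        # Small convolutional stack to summarize the muscle image.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Regression head: one continuous output per finger.
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64, n_fingers),
            nn.Sigmoid(),  # squash each finger to a 0..1 flexion fraction
        )

    def forward(self, x):
        return self.head(self.features(x))

model = UltrasoundToFingers()
frame = torch.randn(1, 1, 128, 128)  # stand-in for one ultrasound frame
print(model(frame))                  # five values, one per finger
```

Because the outputs are continuous rather than a fixed gesture vocabulary, a model of this shape can in principle drive each finger simultaneously and proportionally, which is what makes the control feel intuitive.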

Researchers at the Georgia Institute of Technology have created an ultrasonic sensor that allows amputees to control each of their prosthetic fingers individually. “It provides fine motor hand gestures that aren’t possible with current commercially available devices,” says Georgia Tech News. The prosthetic arm uses detailed maps of muscle movement in the forearm to produce natural hand movements, driven by the same muscles traditionally used for the hand and fingers.

“This allowed us, for the first time, to predict an amputee’s continuous and simultaneous finger gestures, which makes the control for an amputee completely intuitive. The user doesn’t need to learn a particular set of gestures. They can just move their muscles as they would regularly, and the prosthetic hand will move accordingly,” says Gil Weinberg, a Georgia Tech researcher.

The researchers designed the prosthetic for Jason Barnes, a musician who can now play the piano and drums the same way he was trained: with his forearm muscles. “Human muscles work very similarly across different subjects, so the system can work for anyone,” says Weinberg. “After 30-60 seconds of training for any particular user, the network can be fine-tuned for any minute individual idiosyncratic differences.”
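Weinberg's point about fast per-user calibration maps naturally onto transfer learning: keep the representation learned across subjects and fine-tune only a small output layer on a brief recording from the new user. The snippet below, which reuses the hypothetical `UltrasoundToFingers` model from the earlier sketch, shows one way that could look; the checkpoint, data shapes, and training schedule are assumptions for illustration, not the team's procedure.

```python
import torch
import torch.nn as nn

# Builds on the UltrasoundToFingers sketch above; a checkpoint trained
# across many subjects would normally be loaded here.
model = UltrasoundToFingers()

# Freeze the shared convolutional features; fine-tune only the small
# regression head on the new user's calibration data.
for p in model.features.parameters():
    p.requires_grad = False

optimizer = torch.optim.Adam(model.head.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Stand-in for ~60 s of calibration: ultrasound frames paired with the
# finger positions the user was prompted to hold while recording.
frames = torch.randn(600, 1, 128, 128)
targets = torch.rand(600, 5)

for epoch in range(5):
    pred = model(frames)
    loss = loss_fn(pred, targets)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

With only the head's few hundred parameters being updated, a handful of passes over a minute of data is cheap, which is consistent with the 30-60 second calibration Weinberg describes.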

“My motto has always been that if our robots satisfy musical demands, they will satisfy demands in pretty much any other scenario,” says Gil Weinberg.
