A Star Wars-inspired limb provides control of each finger.
A galaxy far, far away is a little closer with the invention of a robotic arm inspired by Luke Skywalker’s bionic hand.
And while this arm may not wield a lightsaber, it has a greater power for jazz musician Jason Barnes — it lets him play the piano for the first time in five years.
Barnes, who lost much of his right arm in a work accident, is back at the keys with an AI prosthesis created by researchers at the Georgia Institute of Technology. Unlike most prosthetics, it gives the 28-year-old the ability to control each finger individually.
With it, Barnes can play Beethoven. He also plays the “Star Wars” theme song. (You can watch him in the video below.)
“It’s completely mind-blowing,” Barnes said. “If it can play piano, it can do almost anything.”
Individual Finger Control Enables Great Dexterity
With individual finger control, Barnes and other amputees could use an AI prosthesis for daily activities like holding a fork, a washcloth or a comb. That sort of dexterity comes from a combination of GPU-accelerated deep learning and an ultrasound machine.
Barnes’ everyday prosthesis, like most, relies on electromyogram (EMG) sensors to detect electrical impulses in his muscles. Although these recognize muscle movement, EMG signals are too noisy to determine which finger the wearer is trying to move.
“It’s like putting a microphone next to a concert hall,” said Gil Weinberg, the Georgia Tech professor who leads the research. “We needed to be inside the concert hall.”
Ultrasound Strikes a Chord
Weinberg was in a colleague’s lab trying to improve EMG when he noticed an ultrasound machine next to where he was working. The same device that doctors use to see babies in the womb let him see muscle contractions, as well as the speed and direction of muscle movements.
“It was a big eureka,” he said. “With the ultrasound, there was a distinct correlation between what finger moved and what was on the machine.”
By attaching an ultrasound probe to the arm, Weinberg trained a deep learning network to analyze and detect muscle movements. Using our GeForce GTX TITAN X GPU with the cuDNN-accelerated TensorFlow deep learning framework, the team created an algorithm that predicts what finger the musician is trying to use.
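The article doesn’t publish the team’s model, but the core idea — map an ultrasound frame to a predicted finger — can be illustrated with a toy classifier. This sketch is purely hypothetical: it substitutes synthetic data and a single softmax layer trained in NumPy for the real cuDNN-accelerated TensorFlow network, and the frame size, class count, and training setup are all assumptions made for the example.

```python
import numpy as np

# Hypothetical sketch of "ultrasound frame in, finger label out."
# The real system trains a deep network in TensorFlow; here a single
# softmax layer on synthetic data shows the same classification idea.

rng = np.random.default_rng(0)
N_FINGERS = 5
FRAME_SIZE = 64  # flattened ultrasound patch (assumed size)

# Synthetic data: each finger movement yields a characteristic pattern.
patterns = rng.normal(size=(N_FINGERS, FRAME_SIZE))
labels = rng.integers(0, N_FINGERS, size=500)
frames = patterns[labels] + 0.3 * rng.normal(size=(500, FRAME_SIZE))

# Single-layer softmax classifier, trained by plain gradient descent.
W = np.zeros((FRAME_SIZE, N_FINGERS))
b = np.zeros(N_FINGERS)

def predict_proba(x):
    logits = x @ W + b
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(logits)
    return e / e.sum(axis=1, keepdims=True)

for _ in range(200):
    grad = predict_proba(frames)
    grad[np.arange(len(labels)), labels] -= 1.0  # dLoss/dlogits
    W -= 0.01 * frames.T @ grad / len(labels)
    b -= 0.01 * grad.mean(axis=0)

accuracy = (predict_proba(frames).argmax(axis=1) == labels).mean()
print(f"training accuracy: {accuracy:.2f}")
```

In the real prosthesis the input would be a stream of probe frames and the network far deeper, but the output is the same kind of per-finger prediction that drives each motorized digit.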
A Different Drummer
The “Star Wars” arm is Barnes’ second AI prosthesis. The Atlanta music teacher is a drummer at heart. Determined to keep playing after the accident, he rigged up a homemade prosthesis. It let him use a drumstick, but he couldn’t control the speed or bounce of the stick.
That posed the perfect challenge for Weinberg, founding director of Georgia Tech’s center for music technology, who wants to change how we think about music by creating AI technologies that compose and perform songs.
When Barnes approached him, Weinberg had already built a robotic percussionist and marimba player that use deep learning to improvise with human musicians. Like Barnes, he’s a jazz musician (piano), and the idea of using AI to help Barnes get his groove back intrigued him.
AI Music on Tour
But Weinberg did more than let Barnes beat the drum again. He built the deep learning prosthesis with not one, but two drumsticks. Barnes controls one, and the other improvises tunes based on the music in the room. Besides composing music, the robotic arm plays faster than any drummer in the world, according to Weinberg.
“The idea is to bring you back to how you used to be — or better,” Weinberg said. “We can push the limits of what’s humanly possible with deep learning.”
Although the second drumstick was intimidating at first, Barnes mastered it well enough to go on tour. The journey took him and Weinberg to four continents and included a stop at the Kennedy Center in Washington.
“I went from a horrible accident to playing around the world,” Barnes said.