Controlling Bionic Limbs with Thought

Prosthetic devices are getting smarter all the time, enabling people with serious injuries or amputations to live more independently. IBM researchers have recently demonstrated a brain-machine interface that uses AI to enhance prosthetic limbs: deep-learning algorithms decode a user's movement intentions from EEG data, and a robotic arm executes the intended task.
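IBM has not published the details of its decoder, so the sketch below is only an illustration of the general idea: a small convolutional network, in the style of shallow EEG decoders like EEGNet, that turns a short multi-channel EEG window into a movement-intention label. The channel count, sampling rate, and class set here are all assumptions.

```python
import torch
import torch.nn as nn

class EEGIntentDecoder(nn.Module):
    """Illustrative classifier: maps short EEG windows to movement
    intentions (e.g., grasp, release, rest). Not IBM's actual model."""

    def __init__(self, n_channels: int = 32, n_samples: int = 256, n_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            # Temporal convolution: learns frequency-like filters along time.
            nn.Conv2d(1, 16, kernel_size=(1, 25), padding=(0, 12)),
            # Spatial convolution: mixes information across all electrodes.
            nn.Conv2d(16, 32, kernel_size=(n_channels, 1)),
            nn.BatchNorm2d(32),
            nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 4)),
            nn.Dropout(0.5),
        )
        self.classifier = nn.Linear(32 * (n_samples // 4), n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, n_channels, n_samples) -- one EEG window per example
        return self.classifier(self.features(x).flatten(start_dim=1))

# One 1-second window from a hypothetical 32-channel headset sampled at 256 Hz.
model = EEGIntentDecoder().eval()
with torch.no_grad():
    window = torch.randn(1, 1, 32, 256)
    intent = model(window).argmax(dim=1)  # index of the predicted intention
```

In a real system, the predicted label would then be translated into a command for the robotic arm's controller.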

The robotic arm was linked to a camera and a deep-learning grasp-detection model called GraspNet. As IBM explains, the system:

"outperforms state of the art deep learning models in terms of grasp accuracy with fewer parameters, a memory footprint of only 7.2MB and real time inference speed on an Nvidia Jetson TX1 processor."
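IBM's quoted numbers describe the model's efficiency, not its interface, so the following is only a hedged sketch of how a grasp-detection network typically sits between a camera and an arm controller. The `read_frame` and `execute_grasp` callables and the (x, y, angle, width) rectangle output are assumptions for illustration, not GraspNet's actual API.

```python
from typing import Callable
import torch

def grasp_loop(
    grasp_model: torch.nn.Module,
    read_frame: Callable[[], torch.Tensor],
    execute_grasp: Callable[[float, float, float, float], None],
    n_frames: int = 1,
) -> None:
    """Hypothetical camera-to-arm loop: predict a grasp rectangle from
    each frame and hand it to the arm controller."""
    grasp_model.eval()
    with torch.no_grad():
        for _ in range(n_frames):
            frame = read_frame()  # assumed: (3, H, W) float image tensor
            # Grasp-detection nets commonly predict a rectangle:
            # center (x, y), gripper angle, and opening width.
            x, y, angle, width = grasp_model(frame.unsqueeze(0))[0].tolist()
            execute_grasp(x, y, angle, width)  # assumed arm-controller call
```

Keeping this loop lightweight is the point of IBM's efficiency claims: a 7.2MB model with real-time inference can run on an embedded board like the Jetson TX1 rather than a remote server.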

