Controlling Bionic Limbs with Thought

Prosthetic devices are getting smarter all the time, enabling people with serious injuries or amputations to live more independently. IBM researchers have recently developed a brain-machine interface that uses AI to enhance prosthetic limbs. Their system uses deep-learning algorithms to decode activity intentions from EEG data and has a robotic arm carry out the intended task.
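The article does not describe IBM's actual decoder, so the snippet below is only a minimal illustrative sketch of the general idea: a small 1-D convolutional network that maps a window of multi-channel EEG samples to one of a few intended actions, whose output would then be forwarded to the arm controller. All layer sizes, channel counts, and action labels are assumptions for illustration.

```python
# Illustrative sketch only: IBM's decoder architecture is not described in the article.
# A generic 1-D CNN mapping an EEG window to an intended action (e.g. grasp / release / rest).
import torch
import torch.nn as nn

class EEGIntentDecoder(nn.Module):
    def __init__(self, n_channels=32, n_samples=256, n_actions=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 16, kernel_size=7, padding=3),
            nn.BatchNorm1d(16),
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.BatchNorm1d(32),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(32, n_actions)

    def forward(self, x):
        # x: (batch, n_channels, n_samples) window of band-pass-filtered EEG
        h = self.features(x).squeeze(-1)
        return self.classifier(h)

decoder = EEGIntentDecoder()
window = torch.randn(1, 32, 256)        # one hypothetical 1-second window at 256 Hz
intent = decoder(window).argmax(dim=1)  # index of the predicted action, sent to the arm
```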


The robotic arm was linked to a camera and a deep-learning framework called GraspNet. As IBM explains, the system:

outperforms state of the art deep learning models in terms of grasp accuracy with fewer parameters, a memory footprint of only 7.2MB and real time inference speed on an Nvidia Jetson TX1 processor.
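Beyond the figures quoted above, the article gives no detail on how GraspNet is invoked, so the following is a hypothetical sketch of a camera-to-grasp inference loop of the kind such a system would run on an embedded board like the Jetson TX1. The model file, input size, output format, and arm interface are all placeholders, not IBM's actual code.

```python
# Hypothetical inference loop: the article only states that the grasp model runs in
# real time on a Jetson TX1 with a ~7.2 MB footprint. The model, its input shape,
# and the arm interface below are stand-in assumptions for illustration.
import cv2
import torch

model = torch.jit.load("graspnet_compact.pt")  # hypothetical exported, compact model
model.eval()

cap = cv2.VideoCapture(0)                      # camera mounted near the robotic arm
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Preprocess the RGB frame into the tensor shape the model expects (assumed 224x224).
        img = cv2.resize(frame, (224, 224))
        x = torch.from_numpy(img).permute(2, 0, 1).float().unsqueeze(0) / 255.0
        with torch.no_grad():
            grasp = model(x)                   # e.g. a grasp rectangle (x, y, angle, width)
        # send_to_arm(grasp)  # placeholder: forward the grasp pose to the arm controller
finally:
    cap.release()
```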

[HT]

 
