Mind-controlled prosthetic arm can now ‘feel’ objects


Researchers from Chalmers University of Technology in Sweden have published a study documenting a breakthrough in haptic feedback for a prosthetic arm. The study describes how three patients have lived with a mind-controlled prosthesis in their everyday lives for several years.
For the last few years, this has included the ability to ‘feel’ objects through a prosthetic hand.
This is a new concept for artificial limbs – referred to as neuromusculoskeletal prostheses as they are connected to the user’s nerves, muscles and skeleton. The ability to feel the sensation of touch is possible through stimulation of the nerves that used to be connected to a biological hand before amputation.
Force sensors located in the thumb of the prosthesis measure contact and pressure applied to an object while grasping. This information is then transmitted to the patients’ nerves leading to their brains.
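As a rough illustration of the feedback loop described above, a force reading could be mapped to a stimulation intensity along these lines. This is a minimal sketch in Python; the function name, force range and current limits are illustrative assumptions, not values from the Chalmers study:

```python
def force_to_stimulation(force_newtons, max_force=10.0, max_current_ma=2.0):
    """Map a thumb force-sensor reading to a nerve-stimulation
    current, linearly scaled and clamped to a safe range.
    (Illustrative values only -- not from the Chalmers study.)"""
    fraction = min(max(force_newtons / max_force, 0.0), 1.0)
    return fraction * max_current_ma

# A firmer grasp yields a stronger stimulation signal.
light_touch = force_to_stimulation(1.0)   # gentle contact
firm_grip = force_to_stimulation(8.0)     # strong grasp
```

Clamping matters here: whatever the sensor reports, the current delivered to the nerves must stay within a safe ceiling.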
“The most important contribution of this study was to demonstrate that this new type of prosthesis is a clinically viable replacement for a lost arm,” said Max Ortiz Catalan, who led the study.
“No matter how sophisticated a neural interface becomes, it can only deliver real benefit to patients if the connection between the patient and the prosthesis is safe and reliable in the long term.”
Muscle signals used to pilot a robot
Researchers from MIT’s Computer Science and AI Laboratory (CSAIL) have designed a system that allows human muscle signals from wearable sensors to pilot a robot’s movement. The system, called ‘Conduct-A-Bot’, combines electromyography and motion sensors worn on a human’s biceps, triceps and forearms.
These measure muscle signals and movement, with algorithms then processing the gestures in real time. Testing for Conduct-A-Bot was done using a Parrot Bebop 2 drone.
By detecting actions like rotational gestures, clenched fists, tensed arms and activated forearms, Conduct-A-Bot can move the drone left, right, up, down and forward, as well as allow it to rotate and stop.
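The pipeline described above – raw muscle signals thresholded into gestures, and gestures dispatched to drone commands – might be sketched roughly as follows. The threshold value and the gesture-to-command mapping are hypothetical stand-ins, not the actual Conduct-A-Bot design:

```python
def muscle_activated(emg_samples, threshold=0.3):
    """Return True if the mean rectified EMG amplitude crosses a
    threshold, i.e. the muscle is tensed. (Threshold is illustrative.)"""
    mean_rectified = sum(abs(s) for s in emg_samples) / len(emg_samples)
    return mean_rectified > threshold

# Hypothetical mapping from detected gestures to drone commands --
# the real Conduct-A-Bot gesture vocabulary is richer than this.
GESTURE_TO_COMMAND = {
    "clenched_fist": "stop",
    "tensed_arm": "move_up",
    "rotate_left": "yaw_left",
    "rotate_right": "yaw_right",
}

def command_for(gesture):
    """Dispatch a recognised gesture to a drone command; hover otherwise."""
    return GESTURE_TO_COMMAND.get(gesture, "hover")
```

Defaulting to "hover" when no gesture is recognised mirrors a common safety choice in gesture-controlled systems: an ambiguous signal should never translate into motion.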
“Understanding our gestures could help robots interpret more of the nonverbal cues that we naturally use in everyday life,” said Joseph DelPreto, lead author on a new paper about Conduct-A-Bot.
“This type of system could help make interacting with a robot more similar to interacting with another person, and make it easier for someone to start using robots without prior experience or external sensors.”
Revamped, ancient manufacturing process has bright future
Researchers from the University of Maryland have reinvented a 26,000-year-old manufacturing process to fabricate ceramic materials with promising applications for hydrogen fuel cells.
Ceramics are widely used in batteries, electronics and extreme environments. However, conventional ceramic sintering – an essential part of the firing process – often requires hours of processing time.
To overcome...