MIT Develops Technology to Control Robots With Brainwaves and Hand Gestures

Photo Credit: MIT

Highlights

  • CSAIL has released a demo video for the technology
  • It uses a combination of EEG and EMG
  • Aim is to help elderly and workers with limited mobility

Massachusetts Institute of Technology (MIT) announced on Wednesday that its Computer Science and Artificial Intelligence Laboratory (CSAIL) is developing a system that allows users to control robots using just hand gestures and brainwaves. Building on earlier research that handled only simple binary-choice activities, the new work lets the system take on multiple-choice tasks, opening up the possibility of managing teams of robots.

In a blog post, MIT's CSAIL division explained how the system monitors brain activity to detect whether the human supervisor has noticed an error in the robot's task. When such a signal is picked up, the robot stops and asks for help. The person can then scroll through the available options with hand gestures and tell the robot how to correct the specific mistake.
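To make that pause-and-correct loop concrete, here is a minimal, hypothetical Python sketch of how such a supervisory loop might be wired up. The threshold-based detectors, the gesture labels, and the robot, eeg_stream, and emg_stream interfaces are illustrative assumptions for this article, not CSAIL's actual code.

import numpy as np

def detect_error_potential(eeg_window, threshold=5.0):
    # Toy stand-in for an error-related-potential (ErrP) detector:
    # flag unusually large voltage swings in the EEG window.
    return float(np.max(np.abs(eeg_window))) > threshold

def classify_gesture(emg_window):
    # Toy stand-in for an EMG gesture classifier: compare the mean activity
    # of two forearm channels (columns) to pick a scroll direction.
    left, right = np.mean(np.abs(emg_window), axis=0)
    if left > 2 * right:
        return "scroll_left"
    if right > 2 * left:
        return "scroll_right"
    return "confirm"

def supervise(robot, eeg_stream, emg_stream, options):
    # Pause the robot when the human appears to notice an error,
    # then let hand gestures scroll through candidate corrections.
    selected = 0
    for eeg_window in eeg_stream:
        if detect_error_potential(eeg_window):
            robot.pause()  # stop and ask for help
            while True:
                gesture = classify_gesture(next(emg_stream))
                if gesture == "scroll_left":
                    selected = (selected - 1) % len(options)
                elif gesture == "scroll_right":
                    selected = (selected + 1) % len(options)
                else:  # "confirm"
                    robot.execute(options[selected])
                    break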

To build the system, CSAIL used electroencephalography (EEG) to record brain activity and electromyography (EMG) to record muscle activity, via a series of electrodes placed on the subject's scalp and forearm. The blog post claims that while each technique has its own shortcomings, combining the two allows for "robust bio-sensing" and makes the system easier for new users to operate.
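As a rough illustration of why fusing two noisy modalities can be more robust than relying on either alone, the sketch below (again a hypothetical example, not CSAIL's method) accepts a decision when the EEG-based and EMG-based classifiers agree, otherwise defers to whichever is more confident, and abstains if neither clears a confidence threshold.

def fuse_decisions(eeg_label, eeg_conf, emg_label, emg_conf, min_conf=0.6):
    # Agreement between the two modalities is the strongest evidence.
    if eeg_label == emg_label and max(eeg_conf, emg_conf) >= min_conf:
        return eeg_label
    # Otherwise defer to the more confident modality, or abstain (None).
    best_label, best_conf = max([(eeg_label, eeg_conf), (emg_label, emg_conf)],
                                key=lambda pair: pair[1])
    return best_label if best_conf >= min_conf else None

# Example: EEG weakly suggests "no_error" but EMG strongly signals "scroll_left".
print(fuse_decisions("no_error", 0.55, "scroll_left", 0.9))  # -> "scroll_left"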

"This work combining EEG and EMG feedback enables natural human-robot interactions for a broader set of applications than we've been able to do before using only EEG feedback. By including muscle feedback, we can use gestures to command the robot spatially, with much more nuance and specificity," said CSAIL Director Daniela Rus, who supervised the experiment.

Previously, the technology was reportedly limited to recognising brain signals that people had trained themselves to produce by thinking in a particular way. PhD candidate Joseph DelPreto, lead author of a paper on the project, said, "What's great about this approach is that there's no need to train users to think in a prescribed way. The machine adapts to you, and not the other way around."

As for its purpose, the folks over at MIT believe that this system could one day be useful for the elderly, or for workers with limited language fluency and mobility.
