

AI & ROBOTICS RESEARCH

RAFFLES UNIVERSITY'S LATEST RESEARCH:
Deep Learning for Brain-Machine Interaction

[26 SEPTEMBER 2022]

The Raffles University AI & Robotics Faculty proudly presents its latest research – a Deep Learning Architecture for Decoding EEG Brain Signals. The project is led by Associate Professor Dr Sasa ARSOVSKI, Programme Director of the AI & Robotics programmes at Raffles University.

A drone controlled by thought alone

As shown in the demonstration video below, Assoc. Prof. Dr Sasa controls the drone using only a portable headband, with no handheld controller. The same technology can be applied to other drones and machines.

Explanation of the scenario

The headband is "Muse", a portable electroencephalography (EEG) device used to record brain activity.
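The post does not specify which features are computed from the raw Muse signal. A common approach in EEG work is to summarise each short signal window by its power in the standard frequency bands (delta through gamma); the sketch below shows that approach with NumPy on a synthetic window. The sampling rate, band edges, and function names are illustrative assumptions, not details from the study.

```python
import numpy as np

FS = 256  # assumed EEG sampling rate in Hz (typical for consumer headbands)

def band_power(window, fs, band):
    """Mean spectral power of `window` within a (lo, hi) frequency band in Hz."""
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(window)) ** 2 / len(window)
    lo, hi = band
    return psd[(freqs >= lo) & (freqs < hi)].mean()

def extract_features(window, fs=FS):
    """One band-power feature per standard EEG band."""
    bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
             "beta": (13, 30), "gamma": (30, 44)}
    return np.array([band_power(window, fs, b) for b in bands.values()])

# Example: a 1-second synthetic window with a strong 10 Hz (alpha) component
rng = np.random.default_rng(0)
t = np.arange(FS) / FS
window = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(FS)
features = extract_features(window)
```

With a dominant 10 Hz component, the alpha-band feature comes out largest, which is the behaviour a classifier downstream would rely on.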

  1. The electroencephalography (EEG) neural signals corresponding to imagined directions (left, right, forward, backward) are decoded for each individual.
  2. A brain-wave classifier for left, right, up, down, forward, and backward movement imagery has been developed using "Muse", a recently launched portable EEG headband.
  3. Brain activity is recorded with the Muse EEG headband.
  4. An Artificial Neural Network is used to classify the extracted features.
  5. Real-time EEG data is collected and processed through a defined sequence of steps to send command blocks to the drone.
  6. The drone then moves according to the classified brain signal (i.e. when the wearer imagines the drone moving in a given direction).
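The steps above can be sketched end to end: features go into a small feedforward Artificial Neural Network, and the predicted class is mapped to a drone command block. This is a minimal illustration, not the team's actual model: the layer sizes, weights (random placeholders here, where the real system would use weights learned from labelled motor-imagery recordings), and the "hover" fallback are all assumptions.

```python
import numpy as np

# Imagined-direction classes listed in step 2 above
COMMANDS = ["left", "right", "up", "down", "forward", "backward"]

rng = np.random.default_rng(42)

class TinyEEGNet:
    """Minimal feedforward ANN: 5 band-power features -> 6 direction classes.
    Weights are random placeholders standing in for trained parameters."""
    def __init__(self, n_in=5, n_hidden=16, n_out=len(COMMANDS)):
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)

    def predict(self, features):
        """Forward pass returning a softmax probability over the classes."""
        h = np.tanh(features @ self.W1 + self.b1)
        logits = h @ self.W2 + self.b2
        probs = np.exp(logits - logits.max())
        return probs / probs.sum()

def to_command(probs, threshold=0.3):
    """Map classifier output to a drone command; hover when uncertain."""
    i = int(np.argmax(probs))
    return COMMANDS[i] if probs[i] >= threshold else "hover"

net = TinyEEGNet()
features = rng.random(5)          # stand-in for extracted EEG band powers
command = to_command(net.predict(features))
```

In the real-time loop (step 5), this prediction would run on each incoming EEG window and the resulting command string would be packaged into the command block sent to the drone.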

Future Implications

The working prototype of our solution can be commercialised and implemented across industries, from defence and manufacturing to medicine and healthcare. Since the technology can be applied to both drones and other machines, applying it to a robotic arm, for example, could be of great help in the medical industry.

Contact us at lynngoh@raffles-university.edu.my for any further demonstration or collaboration.