Abstract:
Traditional electroencephalography (EEG)-based brain-computer interface (BCI) systems for three-dimensional (3-D) movement control have used the motor imagery paradigm, in which participants must be trained to imagine certain combinations of movements of body parts, such as the hands, feet, and tongue, to control movement along separate dimensions. In the present work, we propose a new form of mental imagery, flying imagery, in which participants imagine flying in certain directions in the 3-D space surrounding them. As an empirical study, the present work used machine learning methods to classify flying imagery across two stages (preparation and execution) and six directions (forward, backward, left, right, up, and down), along with a control state in which no movement was imagined. We also performed classification-based time-frequency analyses to identify the frequency bands, time windows, and EEG features associated with flying imagery that differ between classes and contribute to the classification. We obtained classification accuracies significantly above chance level, showing that the direction of flying imagery can be decoded from EEG signals. Our results also suggest that the spatial information of flying imagery might be encoded mainly in alpha-band activity over the parietal lobe, likely originating from the posterior parietal cortex (PPC).
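The decoding pipeline summarized above (band-power features followed by a classifier evaluated against chance level) can be illustrated with a minimal sketch. This is not the paper's actual method; it is a toy example on synthetic data, assuming hypothetical parameters (250 Hz sampling, 2-second epochs, 8 channels, a two-class contrast differing in 8-12 Hz alpha power) and a simple nearest-centroid classifier in place of whatever machine learning methods the study employed.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 250                         # sampling rate in Hz (illustrative assumption)
n_epochs, n_ch, n_samp = 120, 8, fs * 2   # 2-second epochs, 8 channels

def make_epoch(alpha_gain):
    """Synthetic EEG epoch: white noise plus a 10 Hz alpha component."""
    t = np.arange(n_samp) / fs
    noise = rng.standard_normal((n_ch, n_samp))
    phase = rng.uniform(0, 2 * np.pi, (n_ch, 1))
    return noise + alpha_gain * np.sin(2 * np.pi * 10 * t + phase)

# Two hypothetical classes differing only in alpha-band amplitude
y = np.arange(n_epochs) % 2
X = np.stack([make_epoch(0.5 if label == 0 else 1.5) for label in y])

def alpha_log_power(epochs):
    """Log power in the 8-12 Hz band, per channel (FFT-based band power)."""
    spec = np.abs(np.fft.rfft(epochs, axis=-1)) ** 2
    freqs = np.fft.rfftfreq(n_samp, 1 / fs)
    band = (freqs >= 8) & (freqs <= 12)
    return np.log(spec[..., band].mean(axis=-1))

feats = alpha_log_power(X)       # shape: (n_epochs, n_ch)

# Hold out the last third of epochs, fit class centroids on the rest
train = np.arange(n_epochs) < 80
test = ~train
centroids = np.stack([feats[train & (y == c)].mean(axis=0) for c in (0, 1)])
dists = ((feats[test, None, :] - centroids) ** 2).sum(axis=-1)
acc = (np.argmin(dists, axis=1) == y[test]).mean()
print(f"held-out accuracy: {acc:.2f}  (chance = 0.50)")
```

Because the synthetic alpha contrast is strong, accuracy lands well above the 0.50 chance level; in real EEG the margin over chance is what permutation or binomial tests assess.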