Rachneet Kaur,
Maxim Korolkov,
Manuel Hernandez, Richard Sowers
University of Illinois at Urbana-Champaign
IEEE EMBC 2020
Abstract
Electroencephalography (EEG) is a commonly used method for monitoring brain activity. Automating the EEG signal-processing pipeline is imperative for exploring real-time brain-computer interface (BCI) applications. EEG analysis demands substantial training and time to remove the distinct unwanted independent components (ICs), generated via independent component analysis, that correspond to artifacts. The considerable subject-wise variation across these components motivates a procedural way to identify and eliminate them. We propose DeepIC-virtual, a convolutional neural network (CNN) deep learning classifier that automatically identifies brain components among the ICs extracted from a subject's EEG data gathered while the subject is immersed in a virtual reality (VR) environment. This work examined the feasibility of deep learning techniques for automated IC classification on noisy, visually engaging upright-stance EEG data. We collected EEG data from six subjects standing upright in a VR testing setup that simulated pseudo-randomized variations in height and depth conditions and induced perturbations. An extensive data set of 1,432 IC representation images was generated and manually labelled by an expert as brain components or one of six distinct removable artifacts. A supervised CNN architecture was used to categorize good brain ICs and bad artifactual ICs from generated images of topographical maps. Our model categorizing good versus bad IC topographical maps achieved a binary classification accuracy of 89.20% and an area under the curve of 0.93. Despite significant class imbalance, only 1 of the 57 brain ICs present in the withheld testing set was misclassified as an artifact. Given the viability of automatic classification of artifactual ICs, these results will hopefully encourage clinicians to integrate BCI methods and neurofeedback to control anxiety and provide treatment for acrophobia.
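The abstract reports accuracy and area under the ROC curve (AUC) for the binary brain-versus-artifact decision. As a minimal sketch (not the paper's evaluation code), the two metrics can be computed from labels and classifier scores with NumPy alone; the toy labels and scores below are illustrative, not data from the study:

```python
import numpy as np

def accuracy(y_true, y_score, threshold=0.5):
    """Fraction of predictions matching labels at a decision threshold."""
    y_pred = (np.asarray(y_score) >= threshold).astype(int)
    return float(np.mean(y_pred == np.asarray(y_true)))

def auc(y_true, y_score):
    """Area under the ROC curve via the rank (Mann-Whitney U) formulation."""
    y_true = np.asarray(y_true)
    y_score = np.asarray(y_score)
    pos = y_score[y_true == 1]   # scores for brain ICs
    neg = y_score[y_true == 0]   # scores for artifactual ICs
    # Count pairs where a positive outscores a negative; ties count 0.5.
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return float((greater + 0.5 * ties) / (len(pos) * len(neg)))

# Toy example: 1 = brain IC, 0 = artifact, scores from a classifier.
y_true = [1, 1, 0, 0, 1, 0]
y_score = [0.9, 0.7, 0.4, 0.2, 0.6, 0.8]
print(accuracy(y_true, y_score))  # 5 of 6 correct at threshold 0.5
print(auc(y_true, y_score))       # 7 of 9 positive-negative pairs ranked correctly
```

The pairwise-ranking form of AUC makes it insensitive to the choice of threshold, which matters here because the classes are heavily imbalanced.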
Figure: EEG data collection setup: The training and testing data for this study were accumulated from a designed system with a real-time EEG recording setup while a subject stands quietly during pseudo-randomized height and depth alterations and induced perturbations.
Figure: Artifactual IC rejection. Left: An accepted independent component. The IC representation consists of a scalp topography image (top left), an event-related potential (ERP) image (top right), a time series showing the component's activity (middle right, below the ERP diagram), and an activity power spectrum plot (bottom). In the scalp projection diagram, red, blue, and white denote positive, negative, and null contributions, respectively. The heat map shows contributions from multiple electrodes. Right: A rejected independent component representing a muscle artifact from the side of the head. The power spectrum plot shows no apparent peaks in the alpha band.
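The caption notes that the rejected muscle component lacks an alpha-band peak in its power spectrum, one of the cues an expert uses when labelling ICs. As an illustrative sketch (not the pipeline used in the study), the relative alpha-band power of a component's time series can be estimated with a simple NumPy periodogram; the 250 Hz sampling rate and the synthetic signals below are assumptions for the example:

```python
import numpy as np

def alpha_band_ratio(signal, fs, band=(8.0, 13.0)):
    """Fraction of total periodogram power falling in the alpha band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return float(psd[in_band].sum() / psd.sum())

fs = 250.0                            # assumed sampling rate in Hz
t = np.arange(0, 4.0, 1.0 / fs)       # 4 seconds of samples
rng = np.random.default_rng(0)
# Brain-like component: 10 Hz oscillation plus noise (clear alpha peak).
brain_like = np.sin(2 * np.pi * 10.0 * t) + 0.3 * rng.standard_normal(t.size)
# Muscle-like component: broadband noise, no alpha peak.
muscle_like = rng.standard_normal(t.size)
print(alpha_band_ratio(brain_like, fs) > alpha_band_ratio(muscle_like, fs))  # True
```

A band-power ratio like this is only one heuristic feature; the paper's approach instead learns such cues directly from topographical map images with a CNN.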
|
Explanatory video
References
Rachneet Kaur, Maxim Korolkov, Manuel E. Hernandez, Richard Sowers. Automatic Identification of Brain Independent Components in Electroencephalography Data Collected while Standing in a Virtually Immersive Environment - A Deep Learning-Based Approach. In IEEE EMBC 2020. [Bibtex]
Acknowledgements
We would like to thank all of the subjects who participated in this study. This work would not have been possible without the financial support of JUMP ARCHES. We would also like to thank members of the Mobility and Fall Prevention Lab for their contributions to the study, and the Campus Research Board for financial support of this work. RK is thankful to William Chittenden for the William A. Chittenden Award.
Website adapted from Jingxiang, Richard and Deepak. |