TY  - JOUR
TI  - Virtual and Actual Humanoid Robot Control with Four-Class Motor-Imagery-Based Optical Brain-Computer Interface
AU  - Batula, Alyssa M.
AU  - Kim, Youngmoo E.
AU  - Ayaz, Hasan
T2  - BioMed Research International
AB  - Motor-imagery tasks are a popular input method for controlling brain-computer interfaces (BCIs), partially due to their similarities to naturally produced motor signals. The use of functional near-infrared spectroscopy (fNIRS) in BCIs is still emerging and has shown potential as a supplement or replacement for electroencephalography. However, studies often use only two or three motor-imagery tasks, limiting the number of available commands. In this work, we present the results of the first four-class motor-imagery-based online fNIRS-BCI for robot control. Thirteen participants utilized upper- and lower-limb motor-imagery tasks (left hand, right hand, left foot, and right foot) that were mapped to four high-level commands (turn left, turn right, move forward, and move backward) to control the navigation of a simulated or real robot. A significant improvement in classification accuracy was found between the virtual-robot-based BCI (control of a virtual robot) and the physical-robot BCI (control of the DARwIn-OP humanoid robot). Differences were also found in the oxygenated hemoglobin activation patterns of the four tasks between the first and second BCI. These results corroborate previous findings that motor imagery can be improved with feedback and imply that a four-class motor-imagery-based fNIRS-BCI could be feasible with sufficient subject training.
DA  - 2017///
PY  - 2017
DO  - 10.1155/2017/1463512
DP  - DOI.org (Crossref)
VL  - 2017
SP  - 1
EP  - 13
J2  - BioMed Research International
LA  - en
SN  - 2314-6133, 2314-6141
UR  - https://www.hindawi.com/journals/bmri/2017/1463512/
Y2  - 2021/06/02/14:52:26
ER  -