ИСТИНА ПсковГУ
Could humans become more effective and efficient in their collaboration with computers when solving complicated mental problems if the interaction between brain and computer were possible without any mediation by muscular activity? This question can be answered only experimentally. However, invasive brain-computer interfaces (BCIs) are too risky to be used in extensive experimentation, and non-invasive BCIs are still too imprecise.

In our lab at the NRC Kurchatov Institute (Moscow, Russia), we focus on developing hybrid Eye-Brain-Computer Interfaces (EBCIs) that combine gaze interaction with on-the-fly detection of the intention to act. The presence of intention is recognized by applying machine learning to a marker of feedback expectation that we described for intentional gaze dwells (Shishkin et al. 2016, Front Neurosci, https://doi.org/10.3389/fnins.2016.00528). The marker was first found in the EEG, but we have now also started exploring the possibility of enhancing its detection using MEG+EEG+gaze co-registration, in collaboration with the MEG Center Moscow. New opportunities for EBCI development are also being explored in interaction with moving objects, and deep learning combined with transfer learning is being adapted for fast recognition of EEG/MEG patterns during gaze dwells.

Finally, the bottleneck of conscious control in the use of BCI technology and gaze interaction is addressed theoretically and, in certain aspects, experimentally, to approach the problem of balancing the fluency of interface operation with the ability to control actions to a sufficient extent. By combining all these approaches, we aim to build a human-machine interface that will be extraordinarily responsive to intentions and will make possible an experimental assessment of the capacities of extremely fluent human-machine interaction.
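The hybrid EBCI logic described above — a gaze dwell is detected first, and only then is the dwell-locked EEG epoch tested for the feedback-expectation marker — can be sketched as follows. This is a minimal illustrative sketch, not the lab's actual pipeline: the dwell radius, dwell duration, the `expectation_marker_score` stub, and all function names are assumptions introduced here; a real system would use calibrated thresholds and a trained EEG/MEG classifier.

```python
# Hypothetical sketch of a hybrid EBCI decision loop: gaze dwell
# detection gates the classification of the concurrent EEG epoch.
# All thresholds and the "classifier" below are illustrative only.
from dataclasses import dataclass
from typing import List, Optional, Tuple

DWELL_RADIUS_PX = 40.0   # assumed fixation tolerance (pixels)
DWELL_DURATION_S = 0.5   # assumed dwell threshold (500 ms)


@dataclass
class GazeSample:
    t: float  # timestamp, seconds
    x: float  # horizontal gaze position, pixels
    y: float  # vertical gaze position, pixels


def detect_dwell(samples: List[GazeSample]) -> Optional[Tuple[float, float]]:
    """Return (onset, end) of the first gaze dwell: the gaze staying
    within DWELL_RADIUS_PX of a starting point for DWELL_DURATION_S."""
    if not samples:
        return None
    anchor = samples[0]
    for s in samples[1:]:
        dist = ((s.x - anchor.x) ** 2 + (s.y - anchor.y) ** 2) ** 0.5
        if dist > DWELL_RADIUS_PX:
            anchor = s  # gaze moved away: restart the candidate dwell
            continue
        if s.t - anchor.t >= DWELL_DURATION_S:
            return (anchor.t, s.t)
    return None


def expectation_marker_score(eeg_epoch: List[float]) -> float:
    """Placeholder for the marker classifier: here simply the mean
    amplitude of the dwell-locked epoch. A real EBCI would apply a
    model trained to detect the feedback-expectation marker."""
    return sum(eeg_epoch) / len(eeg_epoch)


def is_intentional(samples: List[GazeSample],
                   eeg_epoch: List[float],
                   threshold: float = 0.0) -> bool:
    """An action is triggered only if a dwell occurred AND the EEG
    epoch crosses the (assumed) marker threshold."""
    dwell = detect_dwell(samples)
    return dwell is not None and expectation_marker_score(eeg_epoch) > threshold
```

The point of the two-stage gate is responsiveness without the "Midas touch" of pure gaze control: the dwell alone does not trigger an action unless the brain signal also indicates that the user expects feedback from it.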