Abstract

This research investigated a brain–computer interface (BCI) for classifying vigilance states. A BCI approach based on event-related potential (ERP) and spectral features of electroencephalographic (EEG) data derived from an auditory oddball paradigm was employed. A 128-channel EEG system recorded potentials while participants simultaneously completed a prolonged visual match-to-sample task. It was hypothesized that better classification rates would be found for a group of “bored” participants, as compared to the not-bored participants. The best classification rates were found by extracting EEG features from gamma frequencies at middle and late latencies (for the standard tones). Critically, the BCI paradigm was most effective for the bored participants (mean misclassification rate = 13%), as compared to the not-bored participants (mean misclassification rate = 21%). These findings extend our knowledge of reliable latencies and spectral features of ERP data most salient to the classification of vigilance states.
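To make the feature-extraction step concrete, the sketch below illustrates one way such a pipeline could look: gamma-band power computed over middle- and late-latency windows of each epoch and fed to a linear classifier, with the misclassification rate estimated by cross-validation. This is not the authors' implementation; the sampling rate, gamma band limits, window boundaries, classifier choice, and the synthetic epochs and labels are all assumptions for illustration.

```python
# Minimal sketch of gamma-band ERP feature extraction and vigilance-state
# classification. NOT the authors' pipeline: sampling rate, gamma band
# (30-45 Hz), latency windows, and the LDA classifier are assumptions, and
# the epochs/labels are synthetic placeholders.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

fs = 250                                                  # sampling rate (Hz), assumed
n_epochs, n_channels, n_times = 200, 128, int(0.8 * fs)   # 800 ms epochs, 128 channels

# Placeholder data: epochs x channels x samples, plus binary vigilance labels.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((n_epochs, n_channels, n_times))
labels = rng.integers(0, 2, n_epochs)                     # e.g. bored vs. not bored

# Band-pass filter each epoch to the gamma range.
b, a = butter(4, [30, 45], btype="bandpass", fs=fs)
gamma = filtfilt(b, a, epochs, axis=-1)

# Mean gamma power in "middle" and "late" latency windows (assumed bounds).
windows = {"middle": (int(0.2 * fs), int(0.4 * fs)),
           "late":   (int(0.4 * fs), int(0.7 * fs))}
features = np.hstack([
    (gamma[:, :, lo:hi] ** 2).mean(axis=-1) for lo, hi in windows.values()
])                                                        # (n_epochs, 2 * n_channels)

# Cross-validated accuracy; misclassification rate = 1 - accuracy.
clf = LinearDiscriminantAnalysis()
acc = cross_val_score(clf, features, labels, cv=5).mean()
print(f"mean misclassification rate: {1 - acc:.2f}")
```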