Capturing Human Perceptual and Cognitive Activities via Event-Related Potentials Measured with Candle-Like Dry Microneedle Electrodes.
Yuri Yoshida, Takumi Kawana, Eiichi Hoshino, Yasuyo Minagawa, Norihisa Miki. Published in: Micromachines (2020)
We demonstrate the capture of event-related potentials (ERPs) using candle-like dry microneedle electrodes (CMEs). Unlike conventional wet electrodes, CMEs can record an electroencephalogram (EEG) even from hairy areas without any skin preparation. In our previous research, we experimentally verified that CMEs can measure spontaneous EEG from the hairy occipital region without preparation, with a signal-to-noise ratio as good as that of conventional wet electrodes, which do require skin preparation. However, those results were based on frequency-domain signals, which are relatively robust to noise contamination, and it remained unclear whether CMEs are sensitive enough to capture finer signals. Here, we first experimentally verified that CMEs can extract ERPs as well as conventional wet electrodes do, without skin preparation. In auditory oddball tasks using pure tones, the P300, a representative ERP component, was extracted with a signal-to-noise ratio as good as that of conventional wet electrodes; CMEs thus successfully captured perceptual activities. We then attempted to investigate cerebral cognitive activity using ERPs. In the processing of the vowel and prosody of auditory stimuli such as /itta/, /itte/, and /itta?/, laterality originating from the cortical locations responsible for this processing has been observed in near-infrared spectroscopy (NIRS) and magnetoencephalography experiments. We therefore simultaneously measured ERPs with CMEs and NIRS in oddball tasks using these three words. Laterality appeared in NIRS for six of the 10 participants but was not clearly shown in the ERP results, suggesting that EEG is limited by its poor spatial resolution. On the other hand, the successful capture of mismatch negativity (MMN) and P300 using CMEs, which require no skin preparation, suggests that CMEs may be readily applicable to real-time monitoring of human perceptual activities.
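For illustration, the sketch below shows how P300-like ERP components are typically extracted from oddball-task recordings by epoching and averaging stimulus-locked segments. This is a minimal, generic example; the sampling rate, epoch window, and event handling are assumptions for demonstration and are not taken from the paper's analysis pipeline.

```python
import numpy as np

# Hypothetical parameters (not from the paper): 500 Hz sampling,
# epochs from -100 ms to +600 ms around each stimulus onset.
FS = 500             # sampling rate in Hz (assumed)
PRE, POST = 0.1, 0.6  # epoch window in seconds before/after stimulus

def extract_erp(eeg, event_samples, fs=FS, pre=PRE, post=POST):
    """Average stimulus-locked epochs of a single-channel EEG trace.

    eeg           : 1-D array of EEG samples (one channel)
    event_samples : sample indices of stimulus onsets (e.g., oddball tones)
    Returns the baseline-corrected grand-average epoch (the ERP).
    """
    n_pre, n_post = int(pre * fs), int(post * fs)
    epochs = []
    for s in event_samples:
        if s - n_pre < 0 or s + n_post > len(eeg):
            continue                       # skip events too close to the record edges
        epoch = eeg[s - n_pre : s + n_post].astype(float)
        epoch -= epoch[:n_pre].mean()      # baseline correction on the pre-stimulus interval
        epochs.append(epoch)
    return np.mean(epochs, axis=0)         # averaging suppresses non-phase-locked noise

# Usage sketch: comparing responses to rare (target) and frequent (standard) tones;
# the P300 appears as a positive deflection around 300 ms after rare targets.
# erp_target   = extract_erp(eeg_channel, target_onsets)
# erp_standard = extract_erp(eeg_channel, standard_onsets)
```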