Automated Analysis Pipeline for Extracting Saccade, Pupil, and Blink Parameters Using Video-Based Eye Tracking.
Brian C. Coe, Jeff Huang, Donald C. Brien, Brian J. White, Rachel Yep, Douglas P. Munoz
Published in: Vision (Basel, Switzerland) (2024)
The tremendous increase in the use of video-based eye tracking has made it possible to collect eye tracking data from thousands of participants. The traditional procedures for the manual detection and classification of saccades and for trial categorization (e.g., correct vs. incorrect) are not viable for the large datasets being collected. Additionally, video-based eye trackers allow for the analysis of pupil responses and blink behaviors. Here, we present a detailed description of our pipeline for collecting, storing, and cleaning data, as well as for organizing participant codes, which are fairly lab-specific but are nonetheless important precursory steps in establishing standardized pipelines. More importantly, we also include descriptions of the automated detection and classification of saccades, blinks, "blincades" (blinks occurring during saccades), and boomerang saccades (two nearly simultaneous saccades in opposite directions that speed-based algorithms fail to split). This detection and classification is almost entirely task-agnostic and can be used on a wide variety of data. We additionally describe novel findings regarding post-saccadic oscillations and provide a method to achieve more accurate estimates of saccade end points. Lastly, we describe the automated behavior classification for the interleaved pro/anti-saccade task (IPAST), a task that probes voluntary and inhibitory control. This pipeline was evaluated using data collected from 592 human participants between 5 and 93 years of age, making it robust enough to handle large clinical patient datasets. In summary, this pipeline has been optimized to consistently handle large datasets obtained from diverse study cohorts (i.e., developmental, aging, clinical) and collected across multiple laboratory sites.
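The abstract refers to speed-based (velocity-threshold) algorithms as the starting point for saccade detection. The sketch below is not the authors' pipeline; it is a minimal, generic illustration of a velocity-threshold detector, assuming gaze samples in degrees of visual angle with timestamps in seconds. The function name, the 30 deg/s threshold, and the minimum-duration criterion are illustrative assumptions only.

```python
import numpy as np

def detect_saccades(x, y, t, speed_thresh=30.0, min_dur=0.01):
    """Flag candidate saccades where gaze speed exceeds a threshold.

    x, y : gaze position per sample, in degrees of visual angle
    t    : sample timestamps, in seconds
    speed_thresh : speed threshold in deg/s (30 deg/s is a common but assumed default)
    min_dur      : minimum saccade duration in seconds, to reject noise bursts
    Returns a list of (start_index, end_index) sample pairs.
    """
    dt = np.diff(t)
    speed = np.hypot(np.diff(x), np.diff(y)) / dt   # instantaneous gaze speed (deg/s)
    fast = speed > speed_thresh                     # samples above the speed threshold

    saccades = []
    start = None
    for i, is_fast in enumerate(fast):
        if is_fast and start is None:
            start = i                               # candidate saccade onset
        elif not is_fast and start is not None:
            if t[i] - t[start] >= min_dur:          # keep only sufficiently long events
                saccades.append((start, i))
            start = None
    if start is not None and t[-1] - t[start] >= min_dur:
        saccades.append((start, len(fast)))         # event still open at end of recording
    return saccades
```

As the abstract notes, a detector like this fails on "boomerang" saccades (two nearly simultaneous saccades in opposite directions), since the speed trace never drops below threshold between them; the paper's pipeline handles such cases with additional classification steps beyond this simple thresholding.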