Evaluating the effects of a programming error on a virtual environment measure of spatial navigation behavior.
Steven M. Weisberg, Victor R. Schinazi, Andrea Ferrario, Nora S. Newcombe. Published in: Journal of Experimental Psychology: Learning, Memory, and Cognition (2022)
Relying on shared tasks and stimuli to conduct research can enhance the replicability of findings and allow a community of researchers to collect large data sets across multiple experiments. This approach is particularly relevant for experiments in spatial navigation, which often require the development of unfamiliar large-scale virtual environments to test participants. One challenge with shared platforms is that undetected technical errors, rather than being restricted to individual studies, propagate across many studies. Here, we discuss the discovery of a software bug in a virtual environment platform used to investigate individual differences in spatial navigation: Virtual Silcton. The bug, which was difficult to detect for several reasons, stored the absolute value of the direction in a pointing task rather than the signed direction, rendering the original sign unrecoverable. To assess the impact of the bug on published findings, we collected a new data set for comparison. Results revealed that although the bug suppressed pointing errors and affected people differently (less accurate navigators showed more suppression), its effect on published data was small, which partially explains why it went undetected. We also used the new data set to develop a tool that allows researchers who have previously used Virtual Silcton to evaluate the impact of the bug on their findings. We summarize the ways that shared open materials, shared data, and collaboration can pave the way for better science and help prevent such errors in the future.
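To make the suppression mechanism concrete, below is a minimal simulation sketch in Python (NumPy). It assumes signed pointing bearings in [-180, 180) degrees and, as a simplifying assumption not specified in the abstract, that the sign is discarded for both the response and the target bearing before the absolute angular error is computed; the actual Virtual Silcton pipeline may differ. The names `ang_diff` and `simulate` are illustrative and do not come from the paper or the platform's code.

```python
import numpy as np

rng = np.random.default_rng(0)

def ang_diff(a, b):
    # Smallest absolute angular difference between bearings a and b, in degrees.
    return np.abs((a - b + 180) % 360 - 180)

def simulate(noise_sd, n_trials=100_000):
    # Hypothetical setup: target bearings uniform in [-180, 180); responses are
    # the target bearing plus Gaussian pointing noise, wrapped to [-180, 180).
    target = rng.uniform(-180, 180, n_trials)
    response = (target + rng.normal(0, noise_sd, n_trials) + 180) % 360 - 180

    intended = ang_diff(response, target)               # signed bearings kept
    buggy = ang_diff(np.abs(response), np.abs(target))  # signs discarded (assumed bug)

    return intended.mean(), buggy.mean()

for sd in (20, 45, 90):  # larger noise ~ a less accurate navigator
    intended_m, buggy_m = simulate(sd)
    print(f"noise sd {sd:3d}: intended error {intended_m:6.2f}, "
          f"with bug {buggy_m:6.2f}, suppression {intended_m - buggy_m:5.2f}")
```

Under these assumptions the discarded sign can only shrink the computed error, since ||r| - |t|| never exceeds the true angular difference between r and t, and the shrinkage grows with pointing noise because noisier responses flip sign more often. That pattern is consistent with the abstract's report of suppressed pointing errors that were larger for less accurate navigators.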