Quantifying the movement, behaviour and environmental context of group-living animals using drones and computer vision.
Benjamin Koger, Adwait Deshpande, Jeffrey T. Kerby, Jacob M. Graving, Blair R. Costelloe, Iain D. Couzin
Published in: The Journal of Animal Ecology (2023)
Methods for collecting animal behaviour data in natural environments, such as direct observation and biologging, are typically limited in spatiotemporal resolution, the number of animals that can be observed and information about animals' social and physical environments. Video imagery can capture rich information about animals and their environments, but image-based approaches are often impractical due to the challenges of processing large and complex multi-image datasets and transforming resulting data, such as animals' locations, into geographical coordinates. We demonstrate a new system for studying behaviour in the wild that uses drone-recorded videos and computer vision approaches to automatically track the location and body posture of free-roaming animals in georeferenced coordinates with high spatiotemporal resolution embedded in contemporaneous 3D landscape models of the surrounding area. We provide two worked examples in which we apply this approach to videos of gelada monkeys and multiple species of group-living African ungulates. We demonstrate how to track multiple animals simultaneously, classify individuals by species and age-sex class, estimate individuals' body postures (poses) and extract environmental features, including topography of the landscape and animal trails. By quantifying animal movement and posture while reconstructing a detailed 3D model of the landscape, our approach opens the door to studying the sensory ecology and decision-making of animals within their natural physical and social environments.
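A core step the abstract describes is transforming animals' image locations into georeferenced coordinates. As a minimal illustration of the idea (not the authors' implementation, which uses contemporaneous 3D landscape models), a planar homography can map pixel detections to ground coordinates; the matrix values below are purely hypothetical, standing in for a transform that would in practice be estimated from drone pose and scene reconstruction:

```python
import numpy as np

# Hypothetical 3x3 homography mapping image pixels to ground
# (easting, northing) coordinates. In a real pipeline this would be
# estimated from the drone's pose and a reconstructed landscape model.
H = np.array([
    [0.05,  0.0,  500000.0],   # ~5 cm ground sampling per pixel in easting
    [0.0,  -0.05, 9500000.0],  # image y grows downward; northing grows upward
    [0.0,   0.0,  1.0],
])

def pixel_to_geo(xy, H):
    """Project a pixel coordinate to georeferenced ground coordinates."""
    x, y = xy
    p = H @ np.array([x, y, 1.0])  # homogeneous coordinates
    return p[:2] / p[2]            # perspective divide

# Project one animal detection (e.g. bottom-right of a 1920x1080 frame).
easting, northing = pixel_to_geo((1920, 1080), H)
```

A flat-ground homography is only a rough approximation in rugged terrain, which is one motivation for the paper's use of full 3D landscape models when georeferencing animal positions.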