Mixed Reality Interfaces for Achieving Desired Views with Robotic X-ray Systems.

Benjamin D. Killeen, Jonas Winter, Wenhao Gu, Alejandro Martin-Gomez, Russell H. Taylor, Greg Osgood, Mathias Unberath
Published in: Computer methods in biomechanics and biomedical engineering. Imaging & visualization (2022)
Robotic X-ray C-arm imaging systems can precisely achieve any position and orientation relative to the patient. However, informing the system which pose corresponds to a desired view is challenging. Currently, these systems are operated by the surgeon using joysticks, but this interaction paradigm is not necessarily effective because users may be unable to efficiently actuate more than a single axis of the system simultaneously. Moreover, novel robotic imaging systems, such as the Brainlab Loop-X, allow for independent source and detector movements, adding even more complexity. To address this challenge, we consider complementary interfaces for the surgeon to command robotic X-ray systems effectively. Specifically, we consider three interaction paradigms: (1) the use of a pointer to specify the principal ray of the desired view relative to the anatomy, (2) the same pointer, but combined with a mixed reality environment to synchronously render digitally reconstructed radiographs from the tool's pose, and (3) the same mixed reality environment but with a virtual X-ray source instead of the pointer. Initial human-in-the-loop evaluation with an attending trauma surgeon indicates that mixed reality interfaces for robotic X-ray system control are promising and may substantially reduce the number of X-ray images acquired solely during "fluoro hunting" for the desired view or standard plane.
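The pointer-based paradigm (1) can be illustrated with a minimal geometric sketch: given a tracked pointer's tip and axis in patient coordinates, the desired view is the one whose principal ray coincides with the pointer axis, with the X-ray source placed at the source-to-isocenter distance behind the tip. The function name, parameters, and the default 700 mm distance below are illustrative assumptions, not the paper's actual implementation.

```python
import math

def view_from_pointer(tip, direction, source_to_iso=700.0):
    """Hypothetical sketch: map a tracked pointer pose to a C-arm view.

    tip: pointer tip position in patient coordinates (mm).
    direction: pointer axis, taken as the desired principal ray.
    source_to_iso: assumed source-to-isocenter distance (mm).
    Returns (source_position, unit_principal_ray).
    """
    # Normalize the pointer axis to get the unit principal ray.
    norm = math.sqrt(sum(d * d for d in direction))
    ray = [d / norm for d in direction]
    # Place the source behind the tip, along the negative ray direction.
    source = [t - source_to_iso * r for t, r in zip(tip, ray)]
    return source, ray
```

Commanding the robot then reduces to moving the source to `source` and orienting the imaging axis along `ray`, rather than iteratively jogging individual joints with joysticks.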
Keyphrases
  • robot assisted
  • minimally invasive
  • virtual reality
  • trauma patients