Validation of Novel Relative Orientation and Inertial Sensor-to-Segment Alignment Algorithms for Estimating 3D Hip Joint Angles.

Lukas Adamowicz, Reed D. Gurchiek, Jonathan Ferri, Anna T. Ursiny, Niccolo Fiorentino, Ryan S. McGinnis
Published in: Sensors (Basel, Switzerland) (2019)
Wearable sensor-based algorithms for estimating joint angles have seen great improvements in recent years. While the knee joint has garnered most of the attention in this area, algorithms for estimating hip joint angles are less available. Herein, we propose and validate a novel algorithm for this purpose with innovations in sensor-to-sensor orientation and sensor-to-segment alignment. The proposed approach is robust to sensor placement and does not require specific calibration motions. The accuracy of the proposed approach is established relative to optical motion capture and compared to existing methods for estimating relative orientation, hip joint angles, and range of motion (ROM) during a task designed to exercise the full hip ROM and during fast walking, using root mean square error (RMSE) and regression analysis. The RMSE of the proposed approach was less than that for existing methods when estimating sensor orientation (12.32° and 11.82° vs. 24.61° and 23.76°) and flexion/extension joint angles (7.88° and 8.62° vs. 14.14° and 15.64°). Also, ROM estimation error was less than 2.2° during the walking trial using the proposed method. These results suggest the proposed approach improves upon existing methods and provides a promising technique for remote monitoring of hip joint angles.
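For readers unfamiliar with the evaluation metric, the minimal sketch below shows how an RMSE between a sensor-derived joint-angle trace and a synchronized optical motion capture reference is typically computed. The signals, function name, and noise model here are hypothetical illustrations, not the paper's pipeline; consult the paper for the actual angle definitions, time synchronization, and coordinate conventions.

```python
import numpy as np

def angle_rmse(estimated_deg, reference_deg):
    """Root mean square error between two joint-angle time series (degrees)."""
    estimated_deg = np.asarray(estimated_deg, dtype=float)
    reference_deg = np.asarray(reference_deg, dtype=float)
    return float(np.sqrt(np.mean((estimated_deg - reference_deg) ** 2)))

# Hypothetical example: compare a noisy IMU-derived hip flexion/extension
# estimate against a stand-in motion capture reference.
t = np.linspace(0.0, 2.0, 200)                                  # two gait cycles
mocap_flexion = 30.0 * np.sin(2.0 * np.pi * t)                  # reference signal (deg)
imu_flexion = mocap_flexion + np.random.normal(0.0, 5.0, t.size)  # noisy estimate (deg)

print(f"Flexion/extension RMSE: {angle_rmse(imu_flexion, mocap_flexion):.2f} deg")
```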