Evaluating Scoliosis Severity Based on Posturographic X-ray Images Using a Contrastive Language-Image Pretraining Model.
Artur Fabijan, Robert Fabijan, Agnieszka Zawadzka-Fabijan, Emilia Nowosławska, Krzysztof Zakrzewski, Bartosz Polis. Published in: Diagnostics (Basel, Switzerland) (2023)
Assessing severe scoliosis requires the analysis of posturographic X-ray images. One way to analyse these images may involve the use of open-source artificial intelligence models (OSAIMs), such as the contrastive language-image pretraining (CLIP) system, which was designed to match images with text. This study aims to determine whether the CLIP model can recognise visible severe scoliosis in posturographic X-ray images. Twenty-three posturographic images of patients diagnosed with severe scoliosis were evaluated by two independent neurosurgery specialists. The X-ray images were then input into the CLIP system and paired with a series of text prompts of varying difficulty and specificity. The predictions produced by the CLIP model, expressed as probabilities ranging from 0 to 1, were compared with the actual diagnoses. To evaluate the quality of image recognition, true positives, false negatives, and sensitivity were determined. The results show that the CLIP system can perform a basic assessment of X-ray images showing visible severe scoliosis with a high level of sensitivity. It can be assumed that, in the future, OSAIMs dedicated to image analysis may become commonly used to assess X-ray images, including those of scoliosis.
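The evaluation described above thresholds CLIP's 0-to-1 probabilities into binary predictions and, because every image in the cohort is a confirmed positive, reports only true positives, false negatives, and sensitivity. A minimal sketch of that metric computation (the threshold of 0.5 and the probability values below are illustrative assumptions, not figures from the paper):

```python
def sensitivity(y_true, probs, threshold=0.5):
    """Sensitivity (recall) from ground-truth labels and CLIP-style probabilities.

    y_true: 1 for images showing severe scoliosis, 0 otherwise.
    probs: model-assigned probabilities in [0, 1], one per image.
    threshold: illustrative cut-off for calling a prediction positive.
    """
    preds = [p >= threshold for p in probs]
    # True positive: positive image correctly flagged; false negative: missed.
    tp = sum(1 for t, p in zip(y_true, preds) if t and p)
    fn = sum(1 for t, p in zip(y_true, preds) if t and not p)
    return tp / (tp + fn) if (tp + fn) else 0.0


# Hypothetical example: three positive images, one missed by the model.
print(sensitivity([1, 1, 1], [0.91, 0.34, 0.78]))  # → 0.666...
```

Since all 23 study images were diagnosed as severe scoliosis, there are no negatives in the sample, which is why sensitivity, rather than specificity or accuracy, is the natural summary metric here.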
Keyphrases
- deep learning
- artificial intelligence
- convolutional neural network
- optical coherence tomography
- high resolution
- big data
- machine learning
- dual energy
- early onset
- end stage renal disease
- chronic kidney disease
- electron microscopy
- magnetic resonance
- electronic health record
- prognostic factors
- smoking cessation
- peritoneal dialysis
- quality improvement
- patient reported