Abstract
Diagnostic abdominal ultrasound screening and monitoring protocols are based around gathering a set of standard cross-sectional images that ensure coverage of the relevant anatomical structures during the collection procedure. This allows clinicians to make diagnostic decisions with the best picture available from that modality. Currently, very little assistance is provided to sonographers to ensure adherence to collection protocols, and previous studies suggest that traditional image-only machine learning classification can provide only limited support for this task; for example, it can be difficult post-collection to differentiate between multiple liver cross sections, or between those of the left and right kidney, from the image alone. In this proof of concept, positional tracking information was added to the image input of a neural network to provide the additional context required to recognize six otherwise difficult-to-identify edge cases. Optical and sensor-based infrared (IR) tracking was used to track the position of an ultrasound probe during the collection of clinical cross sections on an abdominal phantom. Convolutional neural networks were then trained on image-only data and on images with positional data, and the classification accuracies were compared. The addition of positional information significantly improved average classification results on common abdominal cross sections, from ~90% for image-only input to 95% for optical IR position tracking and 93% for sensor-based IR tracking. While further work remains, adding low-cost positional tracking to machine learning ultrasound classification allows significantly increased accuracy in identifying important diagnostic cross sections, with the potential not only to validate adherence to protocol but also to provide navigation prompts that could assist user training and the capture of cross sections in future.
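The fusion described in the abstract can be illustrated as concatenating a tracked probe-pose vector with the image features produced by a CNN before the final classification layers. The sketch below is not the authors' implementation; the dimensions and names (`FEATURE_DIM`, `POSE_DIM`, `NUM_CLASSES`, `fuse`) are assumptions chosen for illustration.

```python
# Illustrative sketch only: late fusion of CNN image features with an
# IR-tracked probe pose. All dimensions below are hypothetical.

FEATURE_DIM = 128   # assumed length of the CNN image-feature vector
POSE_DIM = 6        # e.g. x, y, z position plus roll, pitch, yaw from tracking
NUM_CLASSES = 6     # the six hard-to-distinguish cross sections

def fuse(image_features, pose):
    """Concatenate image features with the tracked probe pose.

    The fused vector would feed the network's final classification
    layers, giving them positional context the image alone lacks.
    """
    assert len(image_features) == FEATURE_DIM
    assert len(pose) == POSE_DIM
    return list(image_features) + list(pose)

fused = fuse([0.0] * FEATURE_DIM, [0.0] * POSE_DIM)
print(len(fused))  # FEATURE_DIM + POSE_DIM
```

In a real model the concatenation would typically happen inside the network (after a flatten or pooling layer), but the principle is the same: the classifier sees both appearance and probe position.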
Original language | English
---|---
Article number | 025002
Number of pages | 12
Journal | Machine Learning: Science and Technology
Volume | 5
Issue number | 2
Early online date | 2 Apr 2024
DOIs |
Publication status | Published - 30 Jun 2024
Funding
This work was supported by a UK Engineering and Physical Sciences Research Council (EPSRC) Future Ultrasonic Engineering Center for Doctoral Training (FUSE CDT) under Grants EP/S023879/1 and 2296317.
Keywords
- machine learning
- ultrasound
- classification
- infrared sensors
Projects
-
EPSRC Centre for Doctoral Training in Future Ultrasonic Engineering (FUSE) | Lawley, Alistair
Dobie, G. (Principal Investigator), Hampson, R. (Co-investigator) & Lawley, A. (Research Co-investigator)
EPSRC (Engineering and Physical Sciences Research Council)
1/10/19 → 27/03/24
Project: Research Studentship - Internally Allocated
Datasets
-
Data for: "Using Positional Tracking to Improve Abdominal Ultrasound Machine Learning Classification"
Lawley, A. (Creator) & Dobie, G. (Supervisor), University of Strathclyde, 20 Dec 2023
DOI: 10.15129/6151ae14-8acf-4bed-a1b6-d9367aeec218
Dataset