Letter to the Editor
Alexandros Psarris, Michael Syndos, George Daskalakis, Dimitrios Loutradis
1st Department of Obstetrics and Gynecology, “Alexandra” Maternity Hospital, National and Kapodistrian University of Athens, Greece
Correspondence: Psarris Alexandros MD MSc, 1st Department of Obstetrics and Gynecology, “Alexandra” General Hospital, National and Kapodistrian University of Athens, Vasilissis Sofia 80 and Lourou Street, 11528 Athens, Greece, Tel: 0030 6979232977, E-mail: Psarris.alexandros@gmail.com
Dear Editor,
Artificial intelligence (AI) is the ability of a machine (a computer, for instance) to think like a human. More specifically, the use of complex algorithms enables computers to reason, solve problems and make decisions. The core technologies of AI include machine learning, natural language processing, convolutional neural networks and computer vision. The use of AI technologies in medicine to assist physicians with diagnostic and therapeutic dilemmas is highly promising.
The introduction of AI in obstetrics and gynecology has shown great potential. The use of AI for cardiotocography (CTG) interpretation increased its sensitivity from 60% to 94%, with a specificity of 91%1. In breast imaging, AI is already part of everyday clinical practice. In screening mammography interpretation, AI reduced false positives by 5.7% and false negatives by 9.4%, outperforming all US board-certified radiologists who participated in the study by McKinney et al2. Hence, it is inevitable that AI will make its way into fetal imaging, and particularly fetal ultrasound.
Routine obstetric examination includes ultrasound evaluation of fetal viability and wellbeing, estimation of fetal weight and gestational age, and evaluation of fetal anatomy. In two-dimensional ultrasonography, this is achieved through the acquisition of standard planes for biometric measurements and the inspection of anatomical structures. This process is often arduous and time-consuming even for the most experienced sonographers, as fetal position and patient characteristics may hinder proper evaluation of anatomical structures.
The introduction of AI in fetal ultrasonography has aimed at the automatic acquisition of fetal standard planes during routine ultrasound, thus enabling faster and more accurate fetal evaluation. The most promising approach so far has been proposed by Baumgartner et al., who developed a system based on convolutional neural networks for the real-time, automated detection of 13 fetal standard planes3. These include brain views at the levels of the cerebellum and the posterior horn of the ventricle, a coronal view of the lips and nose, the standard abdominal view, an axial kidney view, sagittal and coronal spine views, the standard femur view, the four-chamber and three-vessel heart views, the right and left ventricular outflow tract views and the median facial profile view. Furthermore, fetal anatomical structures are automatically identified and localized via a bounding box3. Apart from real-time identification of standard planes, Baumgartner et al. demonstrated that ultrasound videos can be used for the same purpose, achieving an average accuracy of more than 90% and an average localization accuracy of 84.9%3. Similar results were achieved by Chen et al., demonstrating the reproducibility of the method4.
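To make the approach more concrete, the following is a minimal sketch, in Python/PyTorch, of a SonoNet-style fully convolutional classifier: per-class activation maps are averaged into plane probabilities, and the same maps can later be thresholded to approximate a bounding box around the detected structure. The layer sizes, class count and input resolution are illustrative assumptions and do not reproduce the actual architecture of Baumgartner et al.3.

# A minimal, illustrative sketch of a SonoNet-style standard-plane classifier.
# Not the authors' implementation; sizes and classes are assumptions.
import torch
import torch.nn as nn

N_PLANES = 13  # number of fetal standard planes; a 'background' class is added below


def conv_block(in_ch, out_ch):
    """Two 3x3 convolutions with batch norm and ReLU, followed by 2x2 max pooling."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
        nn.MaxPool2d(2),
    )


class PlaneClassifier(nn.Module):
    """Fully convolutional classifier: one spatial activation map per class is
    produced and averaged into plane probabilities; the maps themselves can be
    thresholded to localize the corresponding anatomical structure."""

    def __init__(self, n_classes=N_PLANES + 1):
        super().__init__()
        self.features = nn.Sequential(
            conv_block(1, 32),
            conv_block(32, 64),
            conv_block(64, 128),
        )
        # 1x1 convolution produces one activation map per class.
        self.class_maps = nn.Conv2d(128, n_classes, kernel_size=1)

    def forward(self, x):
        maps = self.class_maps(self.features(x))   # (B, n_classes, H', W')
        logits = maps.mean(dim=(2, 3))              # global average pooling
        return logits, maps


# Example: classify a single grayscale ultrasound frame (assumed 224x288 pixels).
model = PlaneClassifier()
frame = torch.randn(1, 1, 224, 288)
logits, maps = model(frame)
predicted_plane = logits.argmax(dim=1)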
Although the majority of researchers utilize neural networks to achieve optimal results, their use has limitations. Their “black box” nature, which hampers the interpretation of results, the risk of overtraining the algorithm (inability to generalize) and their proneness to overfitting (reduced performance on different datasets) do not allow for the elimination of human supervision5.
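As an illustration of the kind of supervision that remains necessary, the sketch below shows one common safeguard against overfitting: performance is monitored on a held-out validation set and training stops once it ceases to improve. The helpers train_one_epoch and evaluate, and the data loaders, are hypothetical placeholders, not part of the cited work.

# A minimal sketch of early stopping to limit overfitting (assumed helpers).
import copy

def fit_with_early_stopping(model, train_loader, val_loader,
                            train_one_epoch, evaluate,
                            max_epochs=100, patience=5):
    best_acc, best_state, epochs_without_gain = 0.0, None, 0
    for epoch in range(max_epochs):
        train_one_epoch(model, train_loader)      # updates model weights
        val_acc = evaluate(model, val_loader)     # accuracy on unseen data
        if val_acc > best_acc:
            best_acc = val_acc
            best_state = copy.deepcopy(model.state_dict())
            epochs_without_gain = 0
        else:
            epochs_without_gain += 1
            if epochs_without_gain >= patience:   # validation stopped improving
                break
    if best_state is not None:
        model.load_state_dict(best_state)         # keep the best checkpoint
    return model, best_acc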
Future research in the field of AI in fetal ultrasonography will most likely focus on the enhancement of machine learning algorithms. This will be achieved through the curation of datasets, the appropriate use of convolutional and recurrent neural networks, and the integration of multiple models into a single end-to-end model6. Finally, validation of the machine learning algorithms with large heterogeneous datasets will ensure that the results generalize to diverse populations6.
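A minimal sketch of that validation step might look as follows: a trained model is evaluated separately on held-out cohorts from different centres and populations, so that any drop in performance on unfamiliar data becomes visible. The site names and the evaluate helper are illustrative assumptions, not taken from ref. 6.

# A minimal sketch of external validation across heterogeneous datasets.
def external_validation(model, evaluate, site_datasets):
    """Return per-site accuracy so performance drops on new populations are visible."""
    return {site: evaluate(model, data) for site, data in site_datasets.items()}

# Example usage with hypothetical held-out cohorts:
# results = external_validation(model, evaluate, {
#     "hospital_A": loader_a,   # same population as the training data
#     "hospital_B": loader_b,   # different scanner vendor
#     "hospital_C": loader_c,   # different patient demographics
# })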
It seems as though AI in fetal ultrasonography is one step away from being incorporated into everyday practice. The prospect of automatic acquisition of standard planes is very appealing. Examination efficiency would be boosted through shorter examination times and increased diagnostic accuracy. Furthermore, automatically acquired images from examinations conducted by less experienced sonographers could easily be forwarded to maternal-fetal medicine specialists for consultation.
Inevitably, the application of AI in fetal ultrasound raises important questions. Does image interpretation need human supervision, or can we rely on the AI? In cases of a wrong or missed diagnosis, who is legally responsible: the sonographer performing the scan, the programmer of the AI software or the company that supplies the software? Are technical skill and experience going to fade as sonographers rely more and more on the automation provided by AI? Is AI going to ease the workload of medical professionals or replace them entirely?
It is evident that ultrasound technology is advancing by leaps and bounds, providing powerful tools for maternal-fetal medicine specialists. However, care should be exercised in incorporating AI into daily clinical practice, as many questions remain unanswered and unanticipated problems are likely to occur. The ethical dimensions of using artificial intelligence in health care are only starting to become apparent, and the scientific community is just beginning to tackle the many important issues that arise7.
References
1. Fergus P, Hussain A, Al-Jumeily D et al. Classification of caesarean section and normal vaginal deliveries using foetal heart rate signals and advanced machine learning algorithms. Biomed Eng Online 2017; 16.
2. McKinney SM, Sieniek M, Godbole V et al. International evaluation of an AI system for breast cancer screening. Nature 2020; 577: 89–94.
3. Baumgartner CF, Kamnitsas K, Matthew J et al. SonoNet: Real-Time Detection and Localisation of Fetal Standard Scan Planes in Freehand Ultrasound. IEEE Trans Med Imaging 2017; 36: 2204–2215.
4. Chen H, Wu L, Dou Q et al. Ultrasound Standard Plane Detection Using a Composite Neural Network Framework. IEEE Trans Cybern 2017; 47: 1576–1583.
5. Pergialiotis V, Pouliakis A, Parthenis C et al. The utility of artificial neural networks and classification and regression trees for the prediction of endometrial cancer in postmenopausal women. Public Health 2018; 164: 1–6.
6. Chen PHC, Liu Y, Peng L. How to develop machine learning models for healthcare. Nat Mater 2019; 18.
7. Rigby MJ. Ethical dimensions of using artificial intelligence in health care. AMA J Ethics 2019; 21.
Received 21-01-20
Revised 30-04-20
Accepted 30-04-20