Artificial Intelligence for Endoscopic Ultrasound: Multicenter Multidevice Detection and Differentiation of Subepithelial Lesions
Poster Abstract

Aims

Subepithelial lesions (SELs) are commonly identified during esophagogastroduodenoscopy. Nevertheless, their diagnostic workup is complex, and patient management depends on an appropriate anatomopathological diagnosis. In this context, endoscopic ultrasound (EUS) offers both lesion characterization and sampling, making it a first-line procedure in patients with suspected upper gastrointestinal tract SELs larger than 10 millimeters. However, EUS is operator dependent and has suboptimal diagnostic accuracy. This multicenter study aimed to develop a deep learning model for the detection of upper gastrointestinal tract SELs, with differentiation between leiomyomas and gastrointestinal stromal tumors (GISTs).

Methods

An artificial intelligence model based on a convolutional neural network (YOLO) was developed. This architecture included both detection and characterization modules, in which lesions were delimited with bounding boxes and classified as GIST or leiomyoma. The performance of the detection module was assessed through the mean average precision at an intersection-over-union threshold of 50% (mAP50), while the classification module was evaluated through its sensitivity, precision, accuracy, and area under the precision-recall curve (AUPRC).
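The evaluation metrics above can be illustrated with a minimal sketch: a detection counts as correct under mAP50 when its bounding box overlaps the ground truth with IoU ≥ 0.5, and the classification metrics follow from a binary confusion matrix (here GIST is the positive class). All counts and boxes below are hypothetical, not the study's data.

```python
def iou(box_a, box_b):
    """Intersection over union of two (x1, y1, x2, y2) boxes."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def classification_metrics(tp, fp, tn, fn):
    """Sensitivity, precision, and accuracy for the positive class."""
    sensitivity = tp / (tp + fn)           # recall
    precision = tp / (tp + fp)             # positive predictive value
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, precision, accuracy

# Hypothetical example: a predicted box overlapping the ground truth
overlap = iou((0, 0, 10, 10), (5, 0, 15, 10))
print(f"IoU = {overlap:.3f}, counts toward mAP50: {overlap >= 0.5}")

# Hypothetical frame-level confusion-matrix counts for the GIST class
sens, prec, acc = classification_metrics(tp=984, fp=17, tn=985, fn=16)
print(f"sensitivity={sens:.3f} precision={prec:.3f} accuracy={acc:.3f}")
```

In practice these quantities would be aggregated per frame or per lesion across the test set; the sketch only shows the arithmetic behind each reported figure.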

Results

A convolutional neural network was developed using a total of 67,452 images from 91 EUS procedures performed across 12 centers in 6 countries (Portugal, Spain, United Kingdom, United States of America, Brazil, and Argentina). Pleomorphic SELs were identified with a mAP50 of 98.6%. The AI system had an overall sensitivity of 98.4%, precision of 98.3%, and accuracy of 98.5%. The model had an AUPRC of 0.996 for GIST and 0.977 for leiomyoma.

Conclusions

This study represents one of the first multicenter efforts to detect and differentiate the most frequent subepithelial lesions on EUS with a single interoperable AI system. Leveraging data from five distinct EUS processors strengthens technical robustness, while integrating cases from six countries across three continents reduces demographic bias and supports global applicability. This AI model has the potential to redefine the evaluation of subepithelial lesions by improving diagnostic discrimination and supporting more targeted EUS-guided tissue sampling, ultimately elevating clinical confidence and decision-making.