
Deep-Learning-Based Automatic Report Generation System for 2-D Breast Ultrasound

Automatic Reporting System for 2-D Ultrasound Images Using Deep Learning

Advisor: 張瑞峰
The full text will be available for download from 2024/08/19.

Abstract


Breast cancer is the most common cancer among women, and early detection can reduce its mortality rate. Ultrasound is a common early-detection method: radiologists analyze the ultrasound images to write a report, and the report is used to decide whether the patient needs further examination. For radiologists, however, writing a report requires both basic knowledge of breast cancer and the ability to analyze ultrasound images, and it is tedious and time-consuming. This study proposes an automatic report generation system to help physicians analyze the images and complete the reports. First, the PSPNet segmentation method is used to extract the tumor region from the ultrasound image, with dense CRFs applied for post-processing. Next, a deep learning model, machine learning classifiers, and an ensemble learning model are used to predict the tumor characteristics. Finally, these predictions are used to generate the report. In the experiment, 318 tumors were used to test the proposed method. The results show that the model built with average ensemble learning achieves the best classification performance across the tumor characteristics, with accuracies of 85.85% (273/318) for shape, 83.02% (264/318) for orientation, 80.19% (255/318) for margin, 78.62% (250/318) for homogeneity, and 87.11% (277/318) for posterior features.

Abstract (English)


Breast cancer is the most common cancer in women, and early detection can reduce its mortality rate. Ultrasound is often used for early detection: radiologists analyze the ultrasound images to write medical reports, which are then used to evaluate whether further examinations are needed. However, writing reports requires domain knowledge of breast cancer and skill in ultrasound image analysis, and it is tedious and time-consuming. In this study, an automatic reporting system was proposed to assist radiologists in analyzing ultrasound images and writing the reports. First, the tumor region was extracted by the pyramid scene parsing network (PSPNet) segmentation model, with dense conditional random fields (CRFs) applied for post-processing. Second, a deep learning (DL) model and machine learning (ML) classifiers were applied to predict the lexicons of the tumor, and an ensemble method was used to combine the DL model and the ML classifiers. Finally, the predicted lexicons were used to generate the medical imaging reports. In the experiment, a total of 318 tumors with ultrasound lexicons were used to evaluate the proposed method. According to the experimental results, the ensemble method using the average strategy to combine the DL model and the ML classifiers achieved the highest lexicon prediction performance; the accuracies for shape, orientation, margin, heterogeneity, and posterior features were 85.85% (273/318), 83.02% (264/318), 80.19% (255/318), 78.62% (250/318), and 87.11% (277/318), respectively.
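The "average strategy" described above can be sketched as follows. This is an illustrative example only, not the thesis code: the class names, probability values, and function are hypothetical, and the assumption is that each model outputs a probability vector over the lexicon's classes, which the ensemble averages before taking the highest-scoring class.

```python
def average_ensemble(prob_vectors):
    """Average strategy: element-wise mean of per-model probability
    vectors, then the index of the highest averaged probability."""
    n = len(prob_vectors)
    avg = [sum(p[i] for p in prob_vectors) / n
           for i in range(len(prob_vectors[0]))]
    return avg, max(range(len(avg)), key=avg.__getitem__)

# Hypothetical probabilities for one tumor's "shape" lexicon over the
# classes (oval, round, irregular); the values are made up.
SHAPE_CLASSES = ["oval", "round", "irregular"]
dl_probs = [0.60, 0.10, 0.30]   # deep learning model output
ml_probs = [0.40, 0.20, 0.40]   # machine learning classifier output

avg, idx = average_ensemble([dl_probs, ml_probs])
print(SHAPE_CLASSES[idx])  # prints "oval" (averaged scores 0.50, 0.15, 0.35)
```

Averaging probabilities (soft voting) lets a confident model outweigh an uncertain one, which a simple majority vote over hard labels would not.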

