
3D Capsule Neural Network on Automated Breast Ultrasound Tumor Diagnosis

Advisor: 張瑞峰

Abstract


Breast cancer is one of the most common cancers in women, so early detection, diagnosis, and treatment are the best way to reduce patient mortality. Automated breast ultrasound (ABUS) is widely used for tumor detection because it provides physicians with complete three-dimensional breast image information; however, the large number of images means that physicians must spend considerable time reviewing them, locating tumors, and assessing whether each tumor is benign or malignant. Recently, computer-aided diagnosis systems built on convolutional neural networks (CNNs) have demonstrated that CNNs can automatically learn texture and shape features from images and improve physicians' diagnostic accuracy. However, CNNs handle object rotation poorly and struggle to learn the relative relationships between features. In 2017, a shallow capsule network (CapsNet) that represents features as vectors was proposed to address these problems, but its small number of layers makes it difficult to learn more complex features from images. Therefore, this study proposes a computer-aided diagnosis system composed of a 3-D U-Net and a modified 3-D capsule network: first, the tumor region is extracted from the ABUS image; next, a tumor mask is obtained with the 3-D U-Net; finally, the modified 3-D capsule network uses both texture and morphological features to diagnose the tumor. In this thesis, we introduce 3-D residual blocks to improve the accuracy of distinguishing benign from malignant tumors. In addition, because the ABUS image volume and the capsule network restrict the batch size during training, which degrades accuracy, we replace the original batch normalization with group normalization to resolve this problem and improve accuracy. The experiments used 446 breast tumor images produced by automated breast ultrasound, including 229 malignant tumors and 217 benign tumors. The results show that the proposed method achieves an accuracy of 85.20%, a sensitivity of 87.34%, a specificity of 82.95%, and an area under the ROC curve of 0.9134, demonstrating that the system is capable of diagnosing tumors in ABUS images.
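The thesis itself gives no source code; the following is a minimal PyTorch sketch of the kind of 3-D residual block with group normalization described above. All names, channel counts, kernel sizes, and the number of groups are illustrative assumptions, not values taken from the thesis.

import torch
import torch.nn as nn

class Residual3DBlock(nn.Module):
    """A 3-D residual block that uses GroupNorm so small batches stay stable."""
    def __init__(self, channels: int, groups: int = 8):
        super().__init__()
        self.conv1 = nn.Conv3d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.norm1 = nn.GroupNorm(groups, channels)  # normalizes within each sample, not across the batch
        self.conv2 = nn.Conv3d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.norm2 = nn.GroupNorm(groups, channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.relu(self.norm1(self.conv1(x)))
        out = self.norm2(self.conv2(out))
        return self.relu(out + x)  # residual (skip) connection

# A toy VOI batch: (batch, channels, depth, height, width). A batch of 2 is fine here
# because group normalization does not depend on batch statistics.
voi = torch.randn(2, 16, 32, 32, 32)
print(Residual3DBlock(channels=16)(voi).shape)  # torch.Size([2, 16, 32, 32, 32])

Because GroupNorm normalizes within each sample rather than across the batch, the block behaves the same whether the batch holds one VOI or many, which is the stated motivation for replacing batch normalization.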

Parallel Abstract (English)


Breast cancer is one of the most common cancers in women, and early detection, diagnosis, and treatment are the best way to reduce mortality. Automated breast ultrasound (ABUS) has been widely used for breast tumor detection because it provides a three-dimensional (3-D) volume of the breast for a physician to review. However, it is time-consuming for a physician to review the whole ABUS volume, find suspicious lesions, and determine whether each lesion is benign or malignant. Several computer-aided diagnosis (CADx) systems based on the convolutional neural network (CNN) have been proposed and have shown that the CNN is a useful architecture for automatically learning texture and shape features that help the physician make a diagnosis. However, the CNN handles object rotation poorly and struggles with the spatial relationships between different features. In 2017, a shallow network that represents features as vectors, the capsule network (CapsNet), was proposed to overcome these problems. Its shallow architecture, however, also makes it hard for the CapsNet to learn complex features from images. Therefore, in this study, a CADx system consisting of a 3-D U-Net and a modified 3-D CapsNet is proposed for tumor diagnosis in ABUS. First, the volume of interest (VOI) is cropped from the ABUS image. Then, the VOI is fed into the 3-D U-Net model to generate the tumor mask. Afterward, the VOI and mask are passed to the modified CapsNet to classify the tumor as malignant or benign. To overcome the drawback of the original CapsNet, the 3-D residual block is introduced into the CapsNet to learn high-level features from the tumor image. Furthermore, group normalization is substituted for batch normalization to address the batch-size limitation during training, because the 3-D input and the capsule layers are memory-demanding. In our experiments, there were 446 breast tumor images generated by the automated breast ultrasound system, including 229 malignant tumors and 217 benign tumors. The accuracy, sensitivity, specificity, and area under the ROC curve (AUC) were 85.20%, 87.34%, 82.95%, and 0.9134, respectively, outperforming the other CNN models.
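As background for the capsule idea referred to in both abstracts, the sketch below shows the "squash" non-linearity from the original CapsNet (Sabour et al., 2017), which bounds each capsule vector's length to [0, 1) so that the length can be read as the probability that an entity, here a tumor class, is present. The tensor layout and the epsilon term are assumptions for illustration and are not taken from the thesis.

import torch

def squash(s: torch.Tensor, dim: int = -1, eps: float = 1e-8) -> torch.Tensor:
    # v = (|s|^2 / (1 + |s|^2)) * (s / |s|), applied along `dim`
    sq_norm = (s ** 2).sum(dim=dim, keepdim=True)
    scale = sq_norm / (1.0 + sq_norm)
    return scale * s / torch.sqrt(sq_norm + eps)

# Example: 2 samples, 2 output capsules (benign / malignant), 16-dimensional capsule vectors.
caps = squash(torch.randn(2, 2, 16))
print(caps.norm(dim=-1))  # every capsule length now lies in [0, 1)

Reading class probability off the capsule length, instead of a scalar activation, is what lets the network keep the pose-like information of a feature in the remaining vector components.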

