
Application of Deep Learning Technology in Asparagus Growth Stage and Disease Identification

Advisors: 周呈霙, 江昭皚, 王人正, 林達德

Abstract

Asparagus is a seasonal crop with high economic value and one of the most nutritious vegetables. In Taiwan, however, a fungal disease known as stem blight often infects asparagus stems during warm, humid weather: the fungal spores infect and destroy the stem tissue, and if growers do not treat the disease early, the infected plants wither and die. In addition, asparagus passes through several growth stages, such as shoot, spear, and mother stalk. Identifying these three growth stages with deep learning technology helps growers know exactly the growth stage and distribution of their asparagus plants.

In this study, I applied various classifiers to identify the severity of stem blight infection within the planting area, and various object detection models to locate stem blight lesions, the different growth states of asparagus, and the distribution of asparagus spears. For the object detection models, I used advanced bounding-box regression losses, namely GIoU, DIoU, and CIoU, to optimize the localization term of the loss function, and I applied SSD, Faster R-CNN, YOLOv3, YOLOv5, and an improved YOLOv5 C3-DenseNet model to the detection tasks. The proposed YOLOv5 C3-DenseNet model uses a densely connected convolutional network (DenseNet) architecture in the feature-extraction backbone; in the neck, it replaces the original double convolution layers with a residual network structure; and in the prediction layer, it adds a 160×160-pixel prediction feature map with three additional anchor boxes to increase the capacity and accuracy of the model.

In the test results, for the early lesion detection task of stem blight, the YOLOv5s and YOLOv5s C3-DenseNet models achieved mean average precision (mAP) values of 0.7370 and 0.7960, respectively, at an IoU threshold of 0.5. For the task of detecting the different growth states and distribution positions of asparagus, the YOLOv3, YOLOv5l, and YOLOv5l C3-DenseNet models achieved mAP values of 0.7700, 0.7689, and 0.8101 at the same IoU threshold. By detecting infection lesions early, growers can treat infected asparagus promptly and precisely, thereby reducing economic losses. Furthermore, accurately identifying the growth state and distribution of asparagus helps distinguish and locate the small asparagus spears, ultimately helping farmers harvest the spears and increase their profits.
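
For reference, the bounding-box regression losses named above are usually written as follows. This is a minimal sketch of the standard GIoU, DIoU, and CIoU formulations from the literature, not a restatement of the exact loss configuration used in this thesis. B and B^gt denote the predicted and ground-truth boxes, C their smallest enclosing box, b and b^gt the box centers, rho the Euclidean distance, c the diagonal length of C, and (w, h) the box width and height.

    \mathcal{L}_{\mathrm{GIoU}} = 1 - \mathrm{IoU} + \frac{|C \setminus (B \cup B^{gt})|}{|C|}

    \mathcal{L}_{\mathrm{DIoU}} = 1 - \mathrm{IoU} + \frac{\rho^{2}(\mathbf{b}, \mathbf{b}^{gt})}{c^{2}}

    \mathcal{L}_{\mathrm{CIoU}} = 1 - \mathrm{IoU} + \frac{\rho^{2}(\mathbf{b}, \mathbf{b}^{gt})}{c^{2}} + \alpha v,
    \quad v = \frac{4}{\pi^{2}} \left( \arctan\frac{w^{gt}}{h^{gt}} - \arctan\frac{w}{h} \right)^{2},
    \quad \alpha = \frac{v}{(1 - \mathrm{IoU}) + v}

Each loss keeps the IoU term and adds a penalty: GIoU penalizes empty space in the enclosing box, DIoU additionally penalizes the normalized center distance, and CIoU further penalizes aspect-ratio mismatch.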

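The abstract describes the C3-DenseNet backbone only at a high level, so the following is a minimal PyTorch sketch of dense connectivity in the DenseNet style; the class names, growth rate, layer count, and SiLU activation are illustrative assumptions, not the thesis's actual C3-DenseNet implementation. Assuming the common 640×640 YOLOv5 input resolution, the added 160×160 prediction feature map corresponds to a stride of 4, which is what makes the extra head useful for small objects such as thin spears and early lesions.

    import torch
    import torch.nn as nn

    class DenseLayer(nn.Module):
        """One dense layer: BN -> SiLU -> 1x1 conv -> BN -> SiLU -> 3x3 conv."""
        def __init__(self, in_channels: int, growth_rate: int):
            super().__init__()
            self.block = nn.Sequential(
                nn.BatchNorm2d(in_channels),
                nn.SiLU(),
                nn.Conv2d(in_channels, 4 * growth_rate, kernel_size=1, bias=False),
                nn.BatchNorm2d(4 * growth_rate),
                nn.SiLU(),
                nn.Conv2d(4 * growth_rate, growth_rate, kernel_size=3, padding=1, bias=False),
            )

        def forward(self, x):
            # Dense connectivity: concatenate new features with all earlier ones.
            return torch.cat([x, self.block(x)], dim=1)

    class DenseBlock(nn.Module):
        """Stack of dense layers; channels grow by growth_rate per layer."""
        def __init__(self, in_channels: int, growth_rate: int = 32, num_layers: int = 4):
            super().__init__()
            layers, channels = [], in_channels
            for _ in range(num_layers):
                layers.append(DenseLayer(channels, growth_rate))
                channels += growth_rate
            self.layers = nn.Sequential(*layers)
            self.out_channels = channels  # in_channels + num_layers * growth_rate

        def forward(self, x):
            return self.layers(x)

    # Example: a 64-channel, 160x160 feature map grows to 64 + 4 * 32 = 192 channels.
    x = torch.randn(1, 64, 160, 160)
    print(DenseBlock(64)(x).shape)  # torch.Size([1, 192, 160, 160])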
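
As a small aid to reading the reported mAP figures, the sketch below shows the plain IoU computation that underlies the mAP@0.5 criterion: a detection counts as a true positive only if its IoU with a matching ground-truth box of the same class is at least 0.5. The helper name box_iou is illustrative.

    def box_iou(a, b):
        """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
        ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
        ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
        inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
        area_a = (a[2] - a[0]) * (a[3] - a[1])
        area_b = (b[2] - b[0]) * (b[3] - b[1])
        return inter / (area_a + area_b - inter)

    # Boxes overlapping on half their width: IoU = 50 / 150 = 1/3 < 0.5,
    # so this detection would not count at the mAP@0.5 threshold.
    print(box_iou((0, 0, 10, 10), (5, 0, 15, 10)))  # 0.333...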
