
深度卷積神經網路於茶葉採摘點辨識之應用

Application of Deep Convolutional Neural Networks for the Identification of Tea Plucking Points

Advisor: 陳世芳
The full text will be available for download on 2028/08/20.

Abstract


Current tea harvesting methods fall into two main categories: hand plucking and machine plucking. Machine plucking improves efficiency by 12 to 15 times over hand plucking, but it inevitably collects broken and old leaves and cannot harvest at specific positions (e.g., one tip with two leaves, one tip with three leaves, or one tip with four leaves). The premium-tea market therefore still relies on labor-intensive hand plucking, yet growers face labor shortages during the harvest season. To balance harvesting efficiency with position-specific plucking, this study developed a method to identify tea plucking points. Deep learning was used to detect tender shoots and identify their plucking points: a Faster Region-based Convolutional Neural Network (Faster R-CNN) with the ZF model located the regions of tender shoots, and a Fully Convolutional Network (FCN) then identified the region to be plucked. Three FCN structures were tested (FCN-32s, FCN-16s, and FCN-8s), with FCN-16s performing best; finally, image processing was used to determine the two-dimensional plucking coordinates. Taiwan Tea Experiment Station no. 8 and no. 18 were used as training samples. Faster R-CNN achieved a testing average precision (AP) of 86.34%, while the FCN achieved an average accuracy of 84.91% and an average intersection-over-union (IoU) of 70.72%. Testing further showed that the method also applies to varieties not used in training, such as Chin Shin oolong and Taiwan Tea Experiment Station no. 12 and no. 13, and that the results were not affected by camera parameters. The trained models provide plucking-point identification for one tip with two leaves.
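The FCN evaluation metrics reported above (average pixel accuracy and intersection-over-union) follow standard definitions for semantic segmentation. A minimal NumPy sketch of both metrics on toy binary masks (the arrays here are illustrative, not the thesis data):

```python
import numpy as np

def pixel_accuracy(pred, gt):
    """Fraction of pixels whose predicted label matches the ground truth."""
    return np.mean(pred == gt)

def intersection_over_union(pred, gt, cls=1):
    """IoU for one class: |pred AND gt| / |pred OR gt| over that class's pixels."""
    p, g = (pred == cls), (gt == cls)
    inter = np.logical_and(p, g).sum()
    union = np.logical_or(p, g).sum()
    return inter / union if union else 1.0

# Toy 4x4 masks (1 = plucking region, 0 = background).
gt   = np.array([[0, 0, 1, 1],
                 [0, 0, 1, 1],
                 [0, 0, 0, 0],
                 [0, 0, 0, 0]])
pred = np.array([[0, 1, 1, 1],
                 [0, 0, 1, 1],
                 [0, 0, 0, 0],
                 [0, 0, 0, 0]])

print(pixel_accuracy(pred, gt))           # 0.9375 (15 of 16 pixels agree)
print(intersection_over_union(pred, gt))  # 0.8 (intersection 4, union 5)
```

In the thesis, these per-image scores would be averaged over the test set to give the reported 84.91% accuracy and 70.72% IoU.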

Parallel Abstract


Tea (Camellia sinensis) is harvested in two main ways: hand plucking and machine plucking. Although a mechanical tea harvester boosts harvesting efficiency by 12 to 15 times compared with hand plucking, it cannot avoid collecting broken or old leaves, nor can it pluck at specific points (e.g., one tip with two leaves, one tip with three leaves, or one tip with four leaves). High-value tea is therefore usually harvested by hand, which is labor intensive, and tea farmers face a growing labor shortage. To achieve both efficient harvesting and specific plucking points, this study developed an algorithm to identify the plucking points of tea shoots using deep learning. First, a faster region-based convolutional neural network (Faster R-CNN) with the ZF model was applied to identify the regions of tea shoots. Second, a fully convolutional network (FCN) was applied to identify the plucking region. After comparing the performance of three FCN structures (FCN-32s, FCN-16s, and FCN-8s), FCN-16s was selected. Finally, image processing was applied to obtain the two-dimensional plucking coordinates. Tea leaf images of Taiwan Tea Experiment Station no. 8 and no. 18 were acquired and used to develop the Faster R-CNN and FCN models. The Faster R-CNN model achieved a testing average precision (AP) of 86.34%, and the FCN achieved an average accuracy and average intersection-over-union of 84.91% and 70.72%, respectively. The testing results showed that these methods achieved the same performance when applied to varieties not used for training (e.g., Chin Shin oolong, Taiwan Tea Experiment Station no. 12 and no. 13). In addition, images acquired with a different camera were also successfully identified, indicating that the results were not sensitive to camera parameters. The developed model presents a promising result for providing the plucking position of the specified tea shoot.
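The final pipeline step converts the FCN's predicted plucking region into a single two-dimensional coordinate via image processing. The abstract does not specify the exact rule, so the sketch below uses a region centroid as an illustrative assumption of how a point can be extracted from a binary mask:

```python
import numpy as np

def plucking_point(mask):
    """Return an assumed (row, col) plucking coordinate from a binary mask.

    The centroid of the predicted plucking region is used here as an
    illustrative stand-in; the thesis's actual image-processing rule is
    not specified in the abstract.
    """
    rows, cols = np.nonzero(mask)
    if rows.size == 0:
        return None  # no plucking region detected in this image
    return (int(round(rows.mean())), int(round(cols.mean())))

# Toy 5x5 segmentation output (1 = plucking region).
mask = np.zeros((5, 5), dtype=np.uint8)
mask[1:4, 2:5] = 1

print(plucking_point(mask))  # (2, 3)
```

In a full system, this pixel coordinate would then be mapped into the harvester's workspace, which requires camera calibration and depth information beyond the scope of this two-dimensional result.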

Parallel Keywords

Faster R-CNN; FCN; one bud with two leaves

