
Applications of Cooperative Imaging System on Working Pattern Analyses in Agricultural Environment

Advisor: 林達德

Abstract


The cooperative imaging system is an approach to the problem of long-term observation. We built such a system to help managers and owners in agricultural environments better understand how their facilities operate, along with information that can only be obtained through long-term observation.

This study inherits and improves the architecture of the previous-generation system. The old system combined the individual strengths of a panoramic camera set and a Pan-Tilt-Zoom (PTZ) camera to obtain a wide-angle global view and local high-resolution images at the same time. Because of hardware performance limitations and the insufficient image quality of the panoramic camera, the new system instead pairs a wide-angle camera with a PTZ camera: the wide-angle camera covers up to 135 degrees and captures most events occurring at the site, while the optically zoomable PTZ camera acquires high-resolution images of local regions of interest.

On the software side, we changed the architecture from a master-slave imaging system to a cooperative imaging system, so that the cameras move from a purely master-slave relationship to an equal one in which they exchange information, helping each camera locate its observation target more easily the next time. To detect targets in all active regions, foreground extraction uses multi-resolution Gaussian Mixture Model background subtraction and static object detection, and a dynamic Euclidean distance automatically tunes the threshold parameters of the target-capturing algorithm. To let each camera operate independently while tracking targets, the system was restructured to run as multiple parallel threads, and frame-differential and centroid-estimation algorithms applied to the PTZ images allow the PTZ camera to track objects automatically, increasing the rate at which detailed target images are captured.

The data from the cameras are sent to the system back end for working-pattern analysis. Object classes are identified by a hybrid feature-extraction algorithm composed of a Gabor filter and a Bag-of-Words model, and the extracted features are classified with a Support Vector Machine (SVM). To make the system usable in different environments, we built a custom rule system that lets users configure a site to their needs using trajectory information and the classes produced by the SVM, and we provide a convenient user interface for inspecting the analysis results.

Five experiments were conducted to validate the system. The first, in the greenhouse of the NTU Department of Horticulture, verified and improved the object-tracking algorithm. The second, at Zhi Chen Farm, tested the system under simple agricultural conditions and was used to design and refine the custom tracking rules. The third, at the 3rd vegetable packaging factory, tested the working patterns of an agricultural collection area and confirmed the system's adaptability across environments. The fourth, at NTU's Tomatake Hall, strengthened the PTZ self-tracking architecture. The fifth, at the entrance of the Agricultural Machinery Building, validated the system once more and improved its data structures and multi-threaded architecture. The cooperative imaging system we developed ensures that the maximum amount of information is obtained when an event occurs, enabling precise observation and analysis.
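The frame-differential and centroid-estimation step used for PTZ self-tracking can be illustrated with a minimal numpy sketch (this is not the thesis code; the function name, threshold value, and synthetic frames are ours for illustration):

```python
import numpy as np

def estimate_centroid(prev_frame, curr_frame, threshold=25):
    """Frame differencing: pixels that changed by more than `threshold`
    between consecutive grayscale frames are treated as the moving
    object; the centroid of those pixels approximates its position,
    which a PTZ controller could then re-centre on."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    ys, xs = np.nonzero(diff > threshold)
    if xs.size == 0:
        return None  # no motion detected in this frame pair
    return float(xs.mean()), float(ys.mean())

# Synthetic example: a bright 4x4 block appears in an otherwise dark scene.
prev = np.zeros((60, 80), dtype=np.uint8)
curr = np.zeros((60, 80), dtype=np.uint8)
curr[20:24, 30:34] = 255
cx, cy = estimate_centroid(prev, curr)
# centroid lands at the block centre: (31.5, 21.5)
```

A real tracker would smooth successive centroids (e.g. with the centroid-estimation step mentioned above) before issuing pan/tilt commands, since raw frame differences are noisy.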

Abstract (English)


Surveillance systems address the difficulties of long-term observation. We designed a cooperative surveillance system that helps managers and owners understand their agricultural sites better by providing working patterns and other information obtainable only through long-term observation.

This research continues and improves the predecessor's system architecture. The old system used a combination of a panoramic camera set and a Pan-Tilt-Zoom (PTZ) camera, which had the advantage of monitoring both an object's surroundings and the object itself in high resolution at the same time. Due to hardware performance limitations and the low image quality of the camera set, the new system uses an ultra-wide field-of-view (FOV) camera and a PTZ camera. Ultra-wide FOV images from the static camera, covering up to 135 degrees, capture most possible events, while images from the PTZ camera supply the detail missing from the low-resolution ultra-wide FOV images.

On the software side, we changed the architecture from a master-slave system to a cooperative surveillance system, turning the relationship between the cameras from unequal to equal. The Network Control Center lets the cameras communicate with each other instead of working in isolation. To detect objects in the camera streams, we use a multi-resolution Gaussian Mixture Model, static object detection, and a dynamic Euclidean distance to adapt to video sequences from different environments. Further changes make the system more "cooperative": the system was restructured for parallel thread processing, and frame differencing with centroid estimation is applied to the PTZ images so that the PTZ camera can perform self-tracking, increasing the accuracy of capturing any tracked object. All information from the cameras is sent to the Network Control Center to analyze the working patterns of a site.
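The Gaussian Mixture Model background subtraction mentioned above can be sketched with a simplified per-pixel, single-Gaussian model (an illustrative stand-in for the thesis's multi-resolution GMM; the class name, learning rate, and thresholds are assumptions, not the actual implementation):

```python
import numpy as np

class GaussianBackground:
    """Per-pixel single-Gaussian background model: a pixel whose value
    deviates from its running mean by more than k standard deviations
    is flagged as foreground; matched pixels update the statistics."""

    def __init__(self, alpha=0.05, k=2.5, init_std=15.0):
        self.mean = None
        self.var = None
        self.alpha = alpha        # learning rate for background updates
        self.k = k                # foreground threshold in std-devs
        self.init_std = init_std  # initial per-pixel std-dev

    def apply(self, frame):
        frame = frame.astype(float)
        if self.mean is None:     # bootstrap the model from the first frame
            self.mean = frame.copy()
            self.var = np.full(frame.shape, self.init_std ** 2)
            return np.zeros(frame.shape, dtype=bool)
        d2 = (frame - self.mean) ** 2
        fg = d2 > (self.k ** 2) * self.var   # foreground mask
        bg = ~fg  # update statistics only where the background matched
        self.mean[bg] += self.alpha * (frame - self.mean)[bg]
        self.var[bg] += self.alpha * (d2 - self.var)[bg]
        return fg
```

Feeding several static frames and then one containing a bright object returns a mask that is True only over the object. A full GMM keeps several such Gaussians per pixel (weighted by how often each matches), which is what lets it absorb repetitive motion such as swaying foliage.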
A hybrid feature-extraction method consisting of a Gabor filter and a Bag-of-Words model processes the images of detected objects, and the extracted features are trained and predicted with an SVM. To fit different environments, a Custom Define Rules system lets users create their own working patterns from the existing features and trajectory information. An Object View Manager is also provided so users can look up the details of any detected object.

We designed five experiments to validate the system. The first, on the NTU farm, verified the object-tracking methods with the new camera. The second, at Zhi Chen Farm, enhanced and redesigned the Custom Define Rules system. The third, at the 3rd vegetable packaging factory, verified that the previous changes adapt to different environments. The fourth, outside Tomatake Hall, improved the PTZ self-tracking algorithm. The fifth, in the plaza in front of the Dept. of BIME, validated the above changes once more and finalized improvements to the system and its data structures. The cooperative system ensures maximum information during any event, allowing us to judge more precisely.
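The Bag-of-Words stage of the classification pipeline can be illustrated with a small numpy sketch: local descriptors (for example, Gabor filter response vectors) are vector-quantised against a learned codebook, and the resulting word-frequency histogram is what the SVM would classify. The function name, toy codebook, and descriptors below are illustrative, not taken from the thesis:

```python
import numpy as np

def bow_histogram(descriptors, codebook):
    """Assign each local descriptor to its nearest codeword (Euclidean
    distance) and return the normalised word-frequency histogram."""
    # pairwise distances: (n_descriptors, n_codewords)
    dist = np.linalg.norm(descriptors[:, None, :] - codebook[None, :, :], axis=2)
    words = dist.argmin(axis=1)          # nearest codeword per descriptor
    hist = np.bincount(words, minlength=len(codebook)).astype(float)
    return hist / hist.sum()

# Toy example: a 2-word codebook and four 2-D local descriptors.
codebook = np.array([[0.0, 0.0], [10.0, 10.0]])
descriptors = np.array([[1.0, 0.0], [9.0, 10.0], [10.0, 9.0], [0.0, 1.0]])
hist = bow_histogram(descriptors, codebook)
# two descriptors fall in each cluster, so hist == [0.5, 0.5]
```

Normalising the histogram makes it invariant to the number of descriptors per image, so objects of different sizes produce comparable SVM inputs.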
