Localization of Biomimetic Autonomous Underwater Vehicle by Environmental Detection

Advisor: 郭振華

Abstract

This thesis presents an information fusion method based on the extended Kalman filter, installed on a biomimetic autonomous underwater vehicle (BAUV) for environmental detection and navigation. A sonar and cameras are mounted on the front of the BAUV to sense the unknown environment: the single-point sonar measures the distance between the vehicle and an unknown object, while stereo vision, combined with the vehicle state, computes the relative position between the target and the vehicle in image coordinates. The sonar and camera data are first processed by Kalman filters to estimate the positions of the vehicle and the target, and a federated filter then fuses the two sets of information to update the estimates, so that under sensor fusion the vehicle can approach the target point accurately. The proposed data fusion method was finally tested for feasibility with the BAUV in a water tank; the experimental results show that the method improves the accuracy of both the vehicle and the environment position estimates.
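
As a minimal sketch of the fusion step just described, using the standard federated-filter information form rather than the thesis's exact equations: let $\hat{x}_1, P_1$ denote the estimate and covariance from the sonar Kalman filter and $\hat{x}_2, P_2$ those from the stereo-vision filter. The master filter then combines them as

$$
P_g = \left(P_1^{-1} + P_2^{-1}\right)^{-1}, \qquad
\hat{x}_g = P_g\left(P_1^{-1}\hat{x}_1 + P_2^{-1}\hat{x}_2\right),
$$

so each local estimate is weighted by its information content (inverse covariance), which is why the fused estimate of the vehicle and target positions can be more accurate than either sensor alone.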

Parallel Abstract

A sensor fusion algorithm using the extended Kalman filter for a fishlike biomimetic autonomous underwater vehicle (BAUV) is presented, as a step toward the application of BAUVs in real missions. Six brushless DC servomotors are mounted inside the BAUV: four drive the side fins, and the other two drive the tail joint and the tail-fin joint. In addition, the BAUV's head moves in a cyclic fashion, so an echo sounder and two video cameras installed on the head can be used to detect the environment. The BAUV first uses the sonar to detect the target and roughly estimate its location; once the vehicle comes within visible range of the target, it switches to a visual guidance mode to reach the target point with good precision. An efficient federated Kalman filter is developed for use in this distributed multi-sensor system. By fusing the sensor data and updating the estimates, the BAUV can detect the target more accurately and control its orientation to approach targets in an unknown environment.
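
The following Python sketch illustrates the two ideas in the abstract: the sonar-to-vision guidance mode switch, and information-form fusion of two local EKF estimates in the spirit of a federated filter. All names and the switching threshold are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

VISIBLE_RANGE_M = 2.0  # assumed sonar-to-vision switching distance (illustrative)

def guidance_mode(sonar_range_m: float) -> str:
    """Use sonar guidance far from the target, visual guidance once the
    target is within visible range (threshold is an assumption)."""
    return "vision" if sonar_range_m < VISIBLE_RANGE_M else "sonar"

def fuse_estimates(x1, P1, x2, P2):
    """Information-form fusion of two local filter estimates: weight each
    estimate by its inverse covariance, as a federated master filter does.

    x1, P1 -- state estimate and covariance from the sonar EKF
    x2, P2 -- state estimate and covariance from the stereo-vision EKF
    """
    I1 = np.linalg.inv(P1)            # information matrix, sonar branch
    I2 = np.linalg.inv(P2)            # information matrix, vision branch
    P_g = np.linalg.inv(I1 + I2)      # fused (global) covariance
    x_g = P_g @ (I1 @ x1 + I2 @ x2)   # fused (global) state estimate
    return x_g, P_g

# Example: two noisy 2-D target-position estimates, vision more certain,
# so the fused estimate is pulled toward the vision measurement.
x_sonar, P_sonar = np.array([1.2, 0.9]), np.diag([0.25, 0.25])
x_vision, P_vision = np.array([1.0, 1.0]), np.diag([0.04, 0.04])
x_fused, P_fused = fuse_estimates(x_sonar, P_sonar, x_vision, P_vision)
print(guidance_mode(1.5), x_fused)
```

Keeping each sensor in its own local filter and fusing only the estimates is what lets the two branches run independently, consistent with the distributed multi-sensor structure the abstract describes.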

Parallel Keywords

BAUV; Federated Filter; Estimation
