

Localization and Navigation for Vision-based Autonomous Humanoid Robot

Advisor: Ching-Chang Wong (翁慶昌)
Co-advisor: Chi-Tai Cheng (鄭吉泰)

Abstract


For a vision-based autonomous small-sized humanoid robot, this thesis proposes the design and implementation of a localization and navigation method with three main parts: a vision system, a localization system, and a path planning system. In the vision system, color blocks of interest in the image are labeled as features, which serve as references for localization. In the localization system, a virtual map containing obstacle information is first built; Monte Carlo self-localization with a particle filter is then used to estimate the robot's position in the environment. In the path planning system, a similarity measure is first used to compute fitness values for obstacles so that the robot does not walk toward them, reducing the probability of collision; a modified A* algorithm then plans a global path with fewer turning points; finally, virtual path points provide local obstacle avoidance, so that the robot can avoid obstacles while still tracking the waypoints of the global plan. Experimental results show that the proposed method indeed gives the robot self-localization and navigation capabilities: it can safely avoid obstacles and successfully reach its destination.
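The Monte Carlo self-localization step described above can be sketched as a particle filter over a 2-D pose. The following is a minimal illustration, not the thesis' implementation: the landmark layout, the noise parameters, and the range-only sensor model are all assumptions made for the sketch.

```python
import math
import random

# Minimal Monte Carlo localization sketch on a 2-D map with point landmarks.
# All names, noise values, and the Gaussian range sensor model are
# illustrative assumptions, not the implementation from the thesis.

LANDMARKS = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)]  # known feature positions

def gauss(x, sigma):
    """Gaussian likelihood of an error x with standard deviation sigma."""
    return math.exp(-0.5 * (x / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def init_particles(n, x_max=5.0, y_max=5.0):
    """Spread n pose hypotheses (x, y, theta) uniformly over the map."""
    return [(random.uniform(0, x_max), random.uniform(0, y_max),
             random.uniform(-math.pi, math.pi)) for _ in range(n)]

def motion_update(particles, dist, dtheta, noise=0.05):
    """Move every particle by the commanded step, adding Gaussian noise."""
    out = []
    for x, y, th in particles:
        th = th + dtheta + random.gauss(0, noise)
        d = dist + random.gauss(0, noise)
        out.append((x + d * math.cos(th), y + d * math.sin(th), th))
    return out

def sensor_update(particles, ranges, sigma=0.3):
    """Weight particles by how well predicted landmark ranges match measurements."""
    weights = []
    for x, y, _ in particles:
        w = 1.0
        for (lx, ly), r in zip(LANDMARKS, ranges):
            predicted = math.hypot(lx - x, ly - y)
            w *= gauss(predicted - r, sigma)
        weights.append(w)
    total = sum(weights) or 1.0
    return [w / total for w in weights]

def resample(particles, weights):
    """Draw a new particle set with probability proportional to weight."""
    return random.choices(particles, weights=weights, k=len(particles))

def estimate(particles):
    """Mean (x, y) of the particle cloud as the pose estimate."""
    n = len(particles)
    return (sum(p[0] for p in particles) / n, sum(p[1] for p in particles) / n)
```

Repeating the predict / weight / resample cycle collapses the initially uniform particle cloud onto the pose whose predicted feature measurements best explain the observations.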

Parallel Abstract


In this thesis, a localization and navigation system is proposed and implemented on a vision-based autonomous small-sized humanoid robot. It consists of three main parts: a vision system, a localization system, and a path planning system. In the design and implementation of the vision system, color blocks of interest in the image are labeled as features. In the design and implementation of the localization system, a virtual map with obstacle information is first established. A method based on Monte Carlo self-localization with a particle filter is then proposed to estimate the position of the robot in the environment. In the design and implementation of the path planning system, the degree of similarity is first used to evaluate the fitness values of obstacles so that the robot does not walk toward them, reducing the probability of collision. A modified A* algorithm is then proposed to plan a global path with fewer turning points. Finally, virtual path points are used for local obstacle avoidance, so that the robot can avoid obstacles while following the waypoints assigned by the global path planner. The experimental results show that the proposed method indeed gives the robot the ability to localize itself and navigate; moreover, the robot can safely avoid obstacles and successfully reach its destination.
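The idea of planning a global path with fewer turning points can be illustrated by adding a turn penalty to a standard grid A* search, so that among equally short paths the straighter one wins. This sketch is only a stand-in for the thesis' modified A*: the grid, the `turn_penalty` value, and the state encoding are illustrative assumptions.

```python
import heapq

# Grid A* sketch with a small extra cost for changing direction, so the
# returned path has fewer turning points.  Grid cells containing 1 are
# obstacles.  The penalty value and state encoding are illustrative only.

MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def astar_few_turns(grid, start, goal, turn_penalty=0.4):
    """A* over (cell, incoming-direction) states; turns cost extra."""
    rows, cols = len(grid), len(grid[0])

    def h(cell):  # Manhattan distance: admissible for unit step costs
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    # Heap entries: (f, g, cell, incoming direction, path so far).
    open_heap = [(h(start), 0.0, start, None, [start])]
    best = {}  # best known g per (cell, incoming direction)
    while open_heap:
        f, g, cell, came, path = heapq.heappop(open_heap)
        if cell == goal:
            return path
        if best.get((cell, came), float("inf")) <= g:
            continue  # stale entry: a cheaper route to this state was found
        best[(cell, came)] = g
        for d in MOVES:
            r, c = cell[0] + d[0], cell[1] + d[1]
            if not (0 <= r < rows and 0 <= c < cols) or grid[r][c]:
                continue
            ng = g + 1 + (turn_penalty if came is not None and d != came else 0)
            heapq.heappush(open_heap, (ng + h((r, c)), ng, (r, c), d, path + [(r, c)]))
    return None  # no path exists
```

Because the heuristic never overestimates the true remaining cost, the first time the goal is popped the accumulated cost, including turn penalties, is minimal, which biases the planner toward long straight segments.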


Cited by


簡瑜萱 (2017). Image-based Localization and Navigation for a Humanoid Robot Based on ROS [Master's thesis, Tamkang University]. Airiti Library. https://doi.org/10.6846/TKU.2017.00248
