
Ego-motion Estimation Based on RGB-D Camera and Inertial Sensor

Advisor: 洪一平 (Yi-Ping Hung)

Abstract


Ego-motion estimation has a wide range of applications in robot control and automation. Accurate local ego-motion estimation helps a robot understand and perceive its surroundings and reconstruct the path it has traversed. In this thesis, we propose an ego-motion estimation system that combines key-frame-based visual odometry with inertial data. The system hardware consists of an RGB-D camera for capturing images and an inertial measurement unit (IMU) for acquiring inertial measurements.

The camera motion between two consecutive images is computed from correspondences of visual features. A rigidity constraint effectively removes outliers from the initial correspondences, and random sample consensus (RANSAC) is further applied during motion estimation to handle the influence of the remaining outliers. Together, these steps ensure that the correspondences used to estimate the camera motion are almost all correct matches.

We conduct a variety of experiments to demonstrate the robustness and accuracy of the algorithm and its ability to handle real-world scenes correctly.
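The rigidity constraint mentioned above can be illustrated with a short sketch: in a rigid scene, the 3D distance between any two feature points should be preserved between the two frames, so a correspondence that breaks this invariant with respect to most other correspondences can be discarded as an outlier. The code below is a minimal Python/NumPy illustration, not the implementation from the thesis; it assumes the matched features have already been back-projected to 3D using the depth image, and the function name filter_by_rigidity, the distance tolerance, and the support threshold are illustrative choices.

import numpy as np

def filter_by_rigidity(pts_a, pts_b, dist_tol=0.02, min_support=0.5):
    """Keep correspondences whose pairwise 3D distances are preserved across frames.

    pts_a, pts_b : (N, 3) arrays of matched points in the previous and current
                   frame, back-projected from the depth image (assumed given).
    dist_tol     : allowed change in a pairwise distance, in metres (illustrative).
    min_support  : fraction of other matches that must agree for a match to survive.
    Returns the indices of correspondences judged consistent with a rigid motion.
    """
    n = len(pts_a)
    # Pairwise distances within each frame.
    d_a = np.linalg.norm(pts_a[:, None, :] - pts_a[None, :, :], axis=-1)
    d_b = np.linalg.norm(pts_b[:, None, :] - pts_b[None, :, :], axis=-1)
    # Under a rigid camera motion, |p_i - p_j| should be (nearly) unchanged.
    consistent = np.abs(d_a - d_b) < dist_tol
    np.fill_diagonal(consistent, False)
    # Keep matches supported by a sufficient fraction of the other matches.
    support = consistent.sum(axis=1) / max(n - 1, 1)
    return np.flatnonzero(support >= min_support)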

Parallel Abstract


Ego-motion estimation has a wide variety of applications in robot control and automation. Accurate local estimation of ego-motion helps an autonomous robot recognize its surrounding environment and recover the trajectory it has traversed. In this thesis, we present a system that estimates ego-motion by fusing key-frame-based visual odometry with inertial measurements. The hardware of the system includes an RGB-D camera for capturing color and depth images and an Inertial Measurement Unit (IMU) for acquiring inertial measurements. The motion of the camera between two consecutive images is estimated by finding correspondences of visual features. Rigidity constraints are used to efficiently remove outliers from the set of initial correspondences. Moreover, we apply random sample consensus (RANSAC) to handle the effect of the remaining outliers in the motion estimation step. These strategies ensure that the correspondences involved in motion estimation are almost all inliers. Several experiments with different kinds of camera movements are performed to demonstrate the robustness and accuracy of the ego-motion estimation algorithm and the ability of our system to handle real-scene data correctly.
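The RANSAC step can be sketched as follows: repeatedly fit a rigid transform to a minimal sample of correspondences, count how many correspondences that hypothesis explains, and refit on the largest consensus set. The code below is a minimal Python/NumPy sketch under those assumptions; the SVD-based (Kabsch) alignment, the 3-point minimal sample, the inlier threshold, and the function names are illustrative and are not taken from the thesis.

import numpy as np

def rigid_transform_svd(src, dst):
    """Least-squares R, t with dst ~= R @ src + t (Kabsch / SVD alignment)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    h = (src - c_src).T @ (dst - c_dst)
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))   # guard against a reflection
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = c_dst - r @ c_src
    return r, t

def ransac_motion(src, dst, iters=200, inlier_thresh=0.03, seed=0):
    """Estimate camera motion from matched 3D points while tolerating outliers."""
    rng = np.random.default_rng(seed)
    best_inliers = np.array([], dtype=int)
    for _ in range(iters):
        # Fit a candidate motion to a minimal sample of 3 correspondences.
        sample = rng.choice(len(src), size=3, replace=False)
        r, t = rigid_transform_svd(src[sample], dst[sample])
        # Score the hypothesis by how many correspondences it explains.
        residuals = np.linalg.norm(src @ r.T + t - dst, axis=1)
        inliers = np.flatnonzero(residuals < inlier_thresh)
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    if len(best_inliers) < 3:
        raise ValueError("RANSAC failed to find a consistent motion")
    # Refit on all inliers of the best hypothesis for the final estimate.
    r, t = rigid_transform_svd(src[best_inliers], dst[best_inliers])
    return r, t, best_inliers

In a full pipeline, the resulting (R, t) between consecutive frames would then be chained across key frames and fused with the IMU measurements, as the abstract describes.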

