
Three-Dimensional Surface Reconstruction of Indoor Environment Based on Structure-Entropy Feature

Advisor: Feng-Li Lian (連豊力)

Abstract


With the development of low-cost RGB-D sensors and the success of two-dimensional simultaneous localization and mapping, three-dimensional surface reconstruction of environments has become a popular research topic. Most reconstruction methods focus on data registration: the relative poses between frames are computed from their overlapping regions, so that RGB-D datasets captured at different times and from different viewpoints can be merged into a single coordinate system. Three-dimensional maps can be applied to robot vision, virtual and augmented reality, and entertainment; because such maps provide complete color and geometric information, they greatly help humans or robots perceive the environment, for example in minimally invasive surgery. In general, three-dimensional environment reconstruction can be divided into three stages: feature estimation, outlier-correspondence rejection, and relative pose estimation. Feature estimation replaces the full point set with representative feature points found in the environment; incorrect feature correspondences are then filtered out; finally, the remaining correct feature points are fed into the iterative closest point (ICP) algorithm to compute the relative pose, which transforms data captured at different times and from different viewpoints into the same coordinate system.

In this thesis, the proposed method uses a structure-entropy feature to describe changes in the geometric information of the environment. Incorrect feature correspondences are then filtered out by entropy image matching, which searches for the maximum product of entropies over the common region of two images. This step not only removes incorrect correspondences but also provides a rough initial pose that serves as the starting point for ICP, which then computes a more precise relative pose. Finally, the point cloud captured at each pose is transformed into a common coordinate system and rendered in three-dimensional space to form a 3D map.

The experiments present two datasets: one from the Networked Control Systems Laboratory and one provided by the Autonomous Systems Lab. The results show that the proposed method is more accurate than the traditional ICP algorithm and, unlike color-based localization algorithms, is unaffected by color variation and works even in complete darkness.
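A structure-entropy feature of this kind can be illustrated with a minimal sketch: per-pixel Shannon entropy of quantized depth values in a local window, so that geometrically varied regions score high and flat regions score zero. The window size, bin count, and quantization here are illustrative assumptions, not the thesis's exact parameters.

```python
import numpy as np

def entropy_image(depth, win=7, bins=16):
    """Per-pixel Shannon entropy of quantized depth values in a
    win x win neighborhood; high entropy marks structural change.
    (Illustrative sketch; window size and binning are assumptions.)"""
    h, w = depth.shape
    # Quantize depth into discrete bins so histograms are well-defined.
    q = np.digitize(depth, np.linspace(depth.min(), depth.max(), bins))
    r = win // 2
    out = np.zeros((h, w))
    for y in range(r, h - r):
        for x in range(r, w - r):
            patch = q[y - r:y + r + 1, x - r:x + r + 1].ravel()
            counts = np.bincount(patch, minlength=bins + 2)
            p = counts[counts > 0] / patch.size
            out[y, x] = -np.sum(p * np.log2(p))  # Shannon entropy in bits
    return out
```

On a synthetic depth image with a single step edge, only windows straddling the edge receive nonzero entropy, which matches the idea of extracting regions of spatial structural change.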

Abstract (English)


With the development of low-cost RGB-D sensors, which capture high-resolution depth and visual information synchronously, and the success of two-dimensional simultaneous localization and mapping, three-dimensional surface reconstruction of environments has become a popular research topic. Most 3D environment reconstruction approaches rely on data registration: three-dimensional datasets scanned from different viewpoints are transformed into the same coordinate system by aligning the overlapping components of these sets. The constructed 3D map can be used in robot vision, virtual and augmented reality, and entertainment. By providing both color and spatial information, it helps humans or robots easily perceive their environments, for example in minimally invasive surgery. In general, the task of three-dimensional environment reconstruction can be divided into three stages: feature descriptor estimation, outlier rejection, and transformation estimation. First, feature descriptor estimation finds distinct features with special characteristics. Second, outlier removal discards incorrect corresponding pairs between two consecutive frames. Third, transformation estimation uses the correct corresponding pairs to find the transformation matrix that transfers frames from different viewpoints into the global coordinate system. In this thesis, the proposed method uses a structure-entropy-based feature to describe the energy in the environment; regions of spatial structural change can be extracted because of their structure-entropy energy. Then a new outlier-removal method, called entropy image matching, is presented. By finding the maximum entropy energy of the overlapping area, the relative pose between two consecutive frames can be estimated roughly, which serves as a good initial guess for transformation estimation.
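The entropy-image matching idea can be sketched as a toy search over integer pixel shifts, scoring each shift by the sum of entropy products over the overlapping area and keeping the maximum. The search space (pure 2D translation) and exhaustive scan here are simplifying assumptions for illustration; the resulting shift plays the role of the coarse initial guess handed to ICP.

```python
import numpy as np

def match_entropy_images(E1, E2, max_shift=10):
    """Exhaustive search over integer pixel shifts (toy illustration).
    A shift's score is the sum of products of the two entropy images
    over their overlap; the returned (dy, dx) is the shift for which
    E2[y - dy, x - dx] best matches E1[y, x]."""
    h, w = E1.shape
    best, best_shift = -np.inf, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # Overlapping window of the two images under this shift.
            y0, y1 = max(0, dy), min(h, h + dy)
            x0, x1 = max(0, dx), min(w, w + dx)
            a = E1[y0:y1, x0:x1]
            b = E2[y0 - dy:y1 - dy, x0 - dx:x1 - dx]
            score = np.sum(a * b)  # maximum entropy product over overlap
            if score > best:
                best, best_shift = score, (dy, dx)
    return best_shift
```

By the Cauchy-Schwarz inequality, the product score peaks when the shared structure in the two entropy images is brought into alignment, which is why the maximizing shift doubles as a rough relative-pose estimate.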
In the final step, transformation estimation applies the iterative closest point (ICP) algorithm to the remaining regions to determine a rigid transformation matrix. With that matrix, all frames can be transformed into the global coordinate system and plotted as a 3D virtual map in point cloud format. Experimental results demonstrate two datasets: the Networked Control Systems Laboratory (NCSLab) dataset and the Autonomous Systems Lab (ASL) dataset repository: Apartment. The results show that the accuracy of the proposed method is better than that of the traditional ICP algorithm, and that, compared to an RGB-based feature mapping method, the proposed method is unaffected by color variation and works even in complete darkness. A 3D mapping system that generates 3D maps of indoor environments from spatial-information features is presented.
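The rigid transform that each ICP iteration estimates has a closed-form least-squares solution (Kabsch/Umeyama); a minimal sketch of that alignment step and of mapping a cloud into the global frame is shown below. The correspondence search and iteration loop of full ICP are omitted, and the function names are illustrative, not the thesis's implementation.

```python
import numpy as np

def rigid_align(P, Q):
    """Least-squares rotation R and translation t with R @ p + t ~ q
    for corresponding rows of P and Q (Kabsch/Umeyama); this is the
    closed-form alignment step inside each ICP iteration."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)               # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

def transform_cloud(points, R, t):
    """Map an (N, 3) point cloud into the global coordinate system."""
    return points @ R.T + t
```

Chaining the pairwise transforms frame by frame expresses every captured cloud in the first frame's coordinates, which is what allows all frames to be plotted together as one point-cloud map.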
