
A Real-time Mobile Augmented-reality System for Street-area Exploration by Computer Vision Techniques Using Smart Glasses

Advisor: 蔡文祥 (Wen-Hsiang Tsai)

Abstract


When people travel outdoors, they often get lost or do not know how to reach a desired location or building. Navigation systems developed in recent years solve part of this problem, but they cannot be applied universally to all kinds of outdoor street scenes and are still not intuitive enough for users. To address this, this study proposes a real-time augmented reality (AR) based guidance system for outdoor tourist areas that uses computer vision techniques and smart glasses. The proposed system constructs a guidance map in advance to provide tourists with travel information, and offers two main functions, both built on AR techniques: tour guidance and building-information introduction. The system can guide a tourist to a pre-selected building and augment building information onto the display of the smart glasses worn by the user.

To accomplish these functions, the proposed system first constructs an environment map, which includes a top-view map of the tour-path area and information about the buildings along the path. For this purpose, an application for building the map database is proposed: images of the buildings along the entire path are captured with the camera of the smart glasses and their corresponding locations are recorded; all the data are then stored in a database for use during guidance.

Next, a user-localization method based on image matching using speeded-up robust features (SURFs) is proposed. The method first asks the user to capture an image of a nearby building; the image is then matched against the pre-constructed image database using SURFs to recognize the building. Three algorithms for filtering out poor matches are also proposed to improve the recognition rate. Finally, user localization is accomplished according to the perspective relationship between the user-captured image and the building-recognition result, yielding the user's position and orientation parameters.

In addition, three methods for speeding up the system are proposed to achieve real-time guidance. Finally, a method for AR-based guidance and building-information introduction is proposed. Based on the user's position and orientation parameters, the system plans a shortest path from the user's location to a pre-selected destination using the Dijkstra algorithm, and augments an AR arrow on the along-path scene image shown on the smart-glasses display to tell the user where to go. The name of the building in front of the user, together with related building information, is also augmented on the display for the user to view.

Good experimental results prove the feasibility of the proposed system.
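The shortest-path planning step can be sketched with a standard Dijkstra implementation. The street-corner graph below is purely illustrative (node names and walking distances are assumptions, not data from the thesis):

```python
import heapq

def dijkstra(graph, start, goal):
    """Plan a shortest path on a weighted graph given as an adjacency dict."""
    # Priority queue of (cost-so-far, node, path-so-far) tuples.
    queue = [(0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, weight in graph.get(node, {}).items():
            if neighbor not in visited:
                heapq.heappush(queue, (cost + weight, neighbor, path + [neighbor]))
    return float("inf"), []  # goal unreachable

# Hypothetical corners of a small street area; weights are walking distances in meters.
street_graph = {
    "gate":     {"corner_a": 40, "corner_b": 65},
    "corner_a": {"gate": 40, "library": 30},
    "corner_b": {"gate": 65, "library": 20},
    "library":  {"corner_a": 30, "corner_b": 20},
}

cost, path = dijkstra(street_graph, "gate", "library")
print(cost, path)  # → 70 ['gate', 'corner_a', 'library']
```

During guidance, the first edge of the returned path would determine which way the AR arrow points; replanning simply reruns the search from the latest localized position.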

Parallel Abstract (in English)


In this study, a real-time augmented reality (AR) based tour-guidance system for outdoor street-area exploration is proposed. The system is based on computer vision techniques for use while visiting outdoor street areas. It solves a tourist's problems of getting lost in unfamiliar streets, having no idea how to reach a desired location, or even being unable to comprehend a map of the area. The system has the following functions. Firstly, a simple way to learn the street-area environment before touring is provided. Secondly, accurate user positions and orientations are computed. Thirdly, the latest user position and orientation are updated in real time, so that the user always knows where he/she is. Fourthly, a proper shortest guidance path is planned to guide the user, and the augmented guidance arrow shown on the display of a pair of smart glasses is updated dynamically when the user walks in a wrong direction. Finally, information about the traversed or visited buildings is displayed on the device in an AR manner, from which the user can learn more about the buildings.

To implement such a system, an environment map is first generated in the learning phase; it includes a top-view map of the selected tour paths and information about the along-path buildings. In addition, images of the buildings taken by the camera of the smart glasses, together with the corresponding building locations along the entire tour path, are also learned. All the learned data are saved into a database for use in the guidance phase.

Next, a method for user localization is proposed, which is based on image matching using speeded-up robust features (SURFs). At first, the server side of the system receives the image taken by the camera of the smart glasses. Secondly, the image is matched against the pre-constructed image database by a SURF matching algorithm. Then, three methods for filtering out poor matches are applied to improve the recognition rate.
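One common way to filter out poor descriptor matches of this kind is Lowe's ratio test: keep a match only if its nearest database neighbour is clearly closer than the second-nearest. The sketch below uses synthetic 64-D vectors as stand-ins for real SURF descriptors; the thesis's own three filtering algorithms are not specified here, so this is an illustrative example only:

```python
import numpy as np

def match_with_ratio_test(desc_query, desc_db, ratio=0.7):
    """Match each row of desc_query to its nearest row in desc_db,
    keeping only matches that pass Lowe's ratio test."""
    matches = []
    for i, d in enumerate(desc_query):
        dists = np.linalg.norm(desc_db - d, axis=1)  # distance to every stored descriptor
        order = np.argsort(dists)
        best, second = order[0], order[1]
        # Accept only if the best match is decisively closer than the runner-up.
        if dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches

rng = np.random.default_rng(0)
db = rng.normal(size=(50, 64))                               # 50 stored 64-D descriptors
query = db[[3, 17]] + rng.normal(scale=0.01, size=(2, 64))   # noisy copies of two of them
print(match_with_ratio_test(query, db))  # → [(0, 3), (1, 17)]
```

Ambiguous descriptors (those nearly equidistant to two database entries) are discarded rather than matched, which is what raises the recognition rate.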
Finally, user localization is conducted by the system to obtain the user's position and orientation parameters according to the perspective relationship between the image taken by the user and the matching result. Furthermore, to realize real-time guidance, methods for speeding up image transmission and image processing are also proposed for use in both the learning phase and the guidance phase, so that the AR information and the guidance arrow can be updated and displayed on the screen of the smart device in real time.

Finally, a method for AR-based guidance and building-information introduction is proposed. Based on the user's position and orientation, a shortest path from the user's location to the pre-selected destination is planned using the Dijkstra algorithm. Accordingly, an AR arrow is rendered and augmented on the acquired along-path scene image on the screen to guide the user where to go. Information about the building identified by the user-localization result is also augmented on the screen. Good experimental results are presented, showing the feasibility of the proposed methods and of the system for real applications.
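A simplified reading of the "perspective relationship" used for orientation recovery, under a pinhole-camera assumption: the horizontal pixel offset of a recognized building from the image centre gives its angular offset from the optical axis, so if the building's map bearing from the user is known, the user's facing direction follows. All parameter values below are illustrative assumptions, not the thesis's actual camera calibration:

```python
import math

def viewing_direction(u, cx, focal_px, building_bearing_deg):
    """Estimate the user's facing direction (degrees, clockwise from north)
    from the horizontal pixel position u of a recognized building whose
    map bearing from the user is known.  Pinhole model: the building's
    angular offset from the optical axis is atan((u - cx) / f)."""
    offset_deg = math.degrees(math.atan2(u - cx, focal_px))
    # The optical axis points at (building bearing minus that offset).
    return (building_bearing_deg - offset_deg) % 360.0

# A building known to lie due east (bearing 90°) appears 200 px right of
# the image centre; with an 800 px focal length that is ~14° to the right,
# so the user faces roughly 76°.
print(round(viewing_direction(840, 640, 800, 90.0), 1))  # → 76.0
```

A full localization would combine several such bearings (or a PnP solve on matched feature points) to recover position as well as orientation; this sketch shows only the one-building orientation case.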

