Traditionally, image stitching lets users combine multiple regular-sized images into a single wide-angle picture, often called a panorama. To create such a panorama, a user typically has to take many photographs, upload them to a PC, and stitch them together. With an omnidirectional sensor, by contrast, a single shot captures a full 360-degree image of the environment, so a panoramic image can be obtained more easily and quickly than before. For this reason, a growing number of studies and applications are replacing conventional sensors with omnidirectional sensors. In recent years, mobile robotics has been an active research area, especially robotic vision systems. Just as eyes are said to be the windows of the soul, we believe vision is as important to a mobile robot as eyes are to a human. Through an omnidirectional sensor, a mobile robot receives more environmental information and therefore gains greater ability to move and localize. In this work, we describe how to unwrap an omnidirectional image into a geometrically correct panoramic image and how to apply edge detection to extract edge information for the whole environment. In addition, we implement the design in hardware to achieve better computational performance. The experimental results show panoramic video output produced from an omnidirectional RGB camera input.
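The unwrapping step described above is, at its core, a polar-to-rectangular resampling: each column of the panorama corresponds to a viewing angle around the mirror, and each row to a radius between the inner and outer rims of the donut-shaped omnidirectional image. The following is a minimal software sketch of that mapping plus a basic Sobel edge detector, assuming the image circle's center `(cx, cy)` and radii `(r_in, r_out)` are known from calibration; the function names, nearest-neighbour sampling, and threshold value are illustrative assumptions, not the paper's hardware design:

```python
import numpy as np

def unwrap_omni(img, cx, cy, r_in, r_out, out_w=360, out_h=None):
    """Unwrap a donut-shaped omnidirectional image into a panoramic strip.

    Each output column u maps to an angle theta, each output row v to a
    radius between r_in and r_out; the source pixel is sampled by nearest
    neighbour. (cx, cy, r_in, r_out) describe the mirror's image circle
    and are assumed to come from calibration.
    """
    if out_h is None:
        out_h = r_out - r_in
    theta = 2.0 * np.pi * np.arange(out_w) / out_w        # one angle per column
    r = r_in + (r_out - r_in) * np.arange(out_h) / out_h  # one radius per row
    r = r[::-1]  # outer rim of the mirror becomes the top row of the panorama
    xs = (cx + np.outer(r, np.cos(theta))).astype(int)
    ys = (cy + np.outer(r, np.sin(theta))).astype(int)
    xs = np.clip(xs, 0, img.shape[1] - 1)
    ys = np.clip(ys, 0, img.shape[0] - 1)
    return img[ys, xs]                                    # shape (out_h, out_w)

def sobel_edges(img, thresh=64.0):
    """Binary edge map via thresholded Sobel gradient magnitude."""
    f = img.astype(np.float64)
    # 3x3 Sobel kernels expressed as shifted-slice arithmetic.
    gx = (f[:-2, 2:] + 2 * f[1:-1, 2:] + f[2:, 2:]) \
       - (f[:-2, :-2] + 2 * f[1:-1, :-2] + f[2:, :-2])
    gy = (f[2:, :-2] + 2 * f[2:, 1:-1] + f[2:, 2:]) \
       - (f[:-2, :-2] + 2 * f[:-2, 1:-1] + f[:-2, 2:])
    return np.hypot(gx, gy) > thresh                      # shape (H-2, W-2)
```

A per-frame pipeline would then be `sobel_edges(unwrap_omni(frame, ...))`; because both steps reduce to fixed index arithmetic and small neighborhood sums, they map naturally onto a hardware implementation.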