In this thesis, a novel self-localization method is proposed for robots equipped with only an omnidirectional camera. The method first generates a virtual space containing a simulated omnidirectional camera, mimicking a robot carrying an omnidirectional camera in the real world. Numerous vertical and horizontal lines are fictitiously generated in the virtual space and projected onto the simulated omnidirectional image using the sphere camera model. The virtual omnidirectional image is then combined with the real one so that the generated virtual lines can be matched against the edges of the real omnidirectional image. The matched lines allow the pose, and hence the trajectory, of the robot in the real world to be estimated accurately via the standard EPnP method. Experimental results on a robot navigating a trajectory in an indoor room demonstrate the effectiveness of the proposed method.
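To illustrate the projection step named above, the following is a minimal sketch of the unified sphere camera model: a 3D point is first normalized onto the unit sphere and then perspectively projected from a point offset by the mirror parameter ξ. The function name, the intrinsic matrix values, ξ = 0.9, and the sampled vertical line are all illustrative assumptions, not values from the thesis.

```python
import numpy as np

def project_sphere_model(P, xi, K):
    """Project Nx3 world points to pixels via the unified sphere camera model.

    Step 1: normalize each point onto the unit sphere.
    Step 2: perspective projection from the point (0, 0, -xi).
    Step 3: apply the pinhole intrinsics K.
    """
    Ps = P / np.linalg.norm(P, axis=1, keepdims=True)   # onto unit sphere
    denom = Ps[:, 2] + xi                               # shifted projection centre
    m = Ps[:, :2] / denom[:, None]                      # normalized image coords
    uv = (K[:2, :2] @ m.T).T + K[:2, 2]                 # pixel coordinates
    return uv

# Illustrative intrinsics and mirror parameter (assumed, not from the thesis).
K = np.array([[300.0, 0.0, 320.0],
              [0.0, 300.0, 240.0],
              [0.0,   0.0,   1.0]])
xi = 0.9

# Sample a vertical line in the virtual space and project it, as the
# method does for its fictitious vertical/horizontal lines.
z = np.linspace(0.5, 3.0, 10)
line = np.stack([np.full_like(z, 1.0), np.full_like(z, 0.5), z], axis=1)
uv = project_sphere_model(line, xi, K)
```

In an omnidirectional image such a straight 3D line maps to a curved track of pixels; matching these projected curves against real image edges is what enables the subsequent EPnP pose estimation.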