The purpose of this study is to improve the way users monitor an environment through a remotely controlled robot: instead of operating the robot while watching a handheld monitor screen, the user controls it by head tracking. In this study we use the YCbCr color space for color image processing and compute the facial data in LabVIEW 2009; control instructions are then issued to the remote robot over a Wi-Fi wireless network. This paper briefly reviews the relevant visual theory, analyzes the location corresponding to the robot, and describes how the robot is controlled to move to that location.
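As context for the face-tracking step, the sketch below shows one common way to detect skin-colored pixels in the YCbCr space. This is only an illustrative assumption, not the paper's method: the actual implementation is in LabVIEW 2009, and the Cb/Cr thresholds here are typical literature values rather than the authors' parameters.

```python
def rgb_to_ycbcr(r, g, b):
    """Convert 8-bit RGB to YCbCr (ITU-R BT.601 coefficients)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def is_skin(r, g, b, cb_range=(77, 127), cr_range=(133, 173)):
    """Classify a pixel as skin if its Cb/Cr components fall inside
    commonly used skin-tone bounds (hypothetical thresholds)."""
    _, cb, cr = rgb_to_ycbcr(r, g, b)
    return cb_range[0] <= cb <= cb_range[1] and cr_range[0] <= cr <= cr_range[1]
```

Thresholding on Cb/Cr alone is attractive because it is largely insensitive to luminance (Y), which varies with lighting; the centroid of the resulting skin mask can then serve as the head position driving the robot's control commands.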