In this thesis, a non-verbal interaction system is developed that enables a robot to interact with humans. The research project is composed of four parts. First, we detect the human face at different distances using skin-color matching and ellipse template matching. Second, from the detected face we extract six facial features: the two pupil centers and the four extreme points of the two eyebrows. A fuzzy system is then designed to determine, from the relative positions of these six features, whether the operator is gazing at the robot. Third, if the operator turns his gaze to the robot, the system recognizes the gesture performed by the operator as a command. Finally, after obtaining the command from the gaze and gesture recognition, the robot executes it using a two-wheeled motion-control algorithm, completing the non-verbal interaction with the human.
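The skin-color matching step of the first part can be illustrated with a minimal sketch. The rule-based RGB thresholds below are a common baseline skin classifier and are assumptions for illustration only; the thesis's actual skin-color model and parameters may differ.

```python
import numpy as np

def skin_mask(rgb):
    """Return a boolean mask of likely skin pixels.

    rgb: H x W x 3 uint8 image array.
    Uses a simple rule-based RGB classifier (hypothetical thresholds):
    skin pixels are reddish, with R > G > B and sufficient channel spread.
    """
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (
        (r > 95) & (g > 40) & (b > 20)          # each channel bright enough
        & (r - np.minimum(g, b) > 15)            # red dominates the darkest channel
        & (np.abs(r - g) > 15)                   # red and green clearly separated
        & (r > g) & (r > b)                      # red is the strongest channel
    )

# Tiny example: one skin-like pixel and one blue pixel.
img = np.array([[[200, 120, 90], [30, 60, 200]]], dtype=np.uint8)
mask = skin_mask(img)  # mask[0, 0] is True, mask[0, 1] is False
```

In a full pipeline, connected regions of this mask would then be tested against an ellipse template to confirm a face candidate.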