
Universal Automatic Hand-Eye Calibration System for Robot Manipulators

Advisor: 翁慶昌 (Ching-Chang Wong)

Abstract


This thesis proposes a universal automatic hand-eye calibration system for robot manipulators, covering four main topics: (1) camera calibration, (2) an eye-in-hand calibration system, (3) an eye-to-hand calibration system, and (4) experimental verification.

In camera calibration, a camera under non-ideal conditions is characterized by intrinsic parameters, extrinsic parameters, and distortion coefficients; the main purpose of camera calibration is to obtain the intrinsic parameters and the distortion coefficients. The intrinsic parameters comprise the focal length expressed in pixels along the x and y axes of the image plane, the position of the principal point where the optical axis intersects the image plane, and the angle between the x and y axes of the image plane. The extrinsic parameters consist of the translation and rotation of the camera relative to the world coordinate origin, or, if the camera is fixed, the translation and rotation of the object relative to the world coordinate origin. The distortion coefficients depend on the camera's lens, for example how light bends as it passes through the lens and whether the lens is parallel to the image plane; if it is not parallel, the distortion coefficients must account for this.

In the eye-in-hand calibration system, the open-source software FlexBE is used for state-machine management and is combined with MoveIt! to plan the motion trajectories of the robot manipulator. The camera acquires information from a calibration board (an ArUco marker and a checkerboard); once the initial position of the robot manipulator is determined, multiple target points are generated for estimating the relationship between the camera and the end effector of the robot manipulator. The generated points are then imported into MoveIt! through FlexBE for motion planning, and the calibration board is photographed at each point to obtain the relationships among the transformation matrices of the robot manipulator, the camera, and the object. Once enough information has been collected, the transformation matrix between the camera and the end effector can be computed; substituting the object position obtained by the camera then yields the position of the object relative to the robot manipulator.

In the eye-to-hand calibration system, the camera is moved from the robot manipulator into the scene. The method is essentially the same as in the eye-in-hand case; the difference is that the quantity to be estimated is the relationship between the camera and the base of the robot manipulator. The calibration board is likewise photographed at each point to obtain the relationships among the transformation matrices of the robot manipulator, the camera, and the object. Once enough information has been collected, the transformation matrix between the camera and the manipulator base can be computed, and substituting the object position obtained by the camera yields the position of the object relative to the robot manipulator.

In the experimental verification, two types of robot manipulators and two types of cameras are used to verify that the proposed method is indeed universal.
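The role of the intrinsic parameters and distortion coefficients described above can be illustrated with a minimal pinhole-camera sketch in Python. This is not code from the thesis: the focal lengths, principal point, and distortion values are made-up example numbers, and the distortion model is the common Brown-Conrady radial model used by OpenCV-style calibration.

```python
import numpy as np

# Example intrinsic parameters (illustrative values only):
# fx, fy: focal length in pixels along the image x and y axes
# cx, cy: principal point, where the optical axis meets the image plane
fx, fy, cx, cy = 800.0, 800.0, 320.0, 240.0
K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])

# Example radial distortion coefficients (k1, k2).
k1, k2 = -0.10, 0.01

def project(point_cam):
    """Project a 3D point given in camera coordinates to pixel
    coordinates, applying radial distortion before the intrinsic
    mapping."""
    x, y, z = point_cam
    xn, yn = x / z, y / z              # normalized image coordinates
    r2 = xn**2 + yn**2
    d = 1.0 + k1 * r2 + k2 * r2**2     # radial distortion factor
    xd, yd = xn * d, yn * d
    u = fx * xd + cx                   # intrinsic mapping to pixels
    v = fy * yd + cy
    return u, v

print(tuple(round(c, 2) for c in project((0.1, -0.05, 1.0))))
```

Camera calibration works in the opposite direction: given many observed checkerboard corners, it solves for `fx`, `fy`, `cx`, `cy`, and the distortion coefficients that best explain the observations (for example via `cv2.calibrateCamera` in OpenCV).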

English Abstract


In this thesis, a universal automatic hand-eye calibration system applied to robot manipulators is proposed. There are four main topics in this thesis: (1) camera calibration, (2) an eye-in-hand calibration system, (3) an eye-to-hand calibration system, and (4) experimental verification.

In the camera calibration, a camera under non-ideal conditions is characterized by intrinsic parameters, extrinsic parameters, and distortion coefficients. The main purpose of camera calibration is to obtain the intrinsic parameters of the camera and the distortion coefficients. The intrinsic parameters are the focal length expressed in pixels along the x and y axes of the image plane, the position of the principal point where the optical axis intersects the image plane, and the angle between the x and y axes of the image plane. The extrinsic parameters consist of the translation and rotation of the camera relative to the world coordinate origin, or the translation and rotation of the object relative to the world coordinate origin if the camera does not move. The distortion coefficients are related to the camera's lens, such as the bending of light as it passes through the lens and whether the lens is parallel to the image plane. If it is not parallel, the distortion coefficients need to account for it.

In the eye-in-hand calibration system, the open-source software FlexBE is used for state-machine management and is combined with MoveIt! to plan the trajectory of the robot manipulator for motion control. The camera is used to obtain the information of the calibration board (an ArUco marker and a checkerboard), and after the initial position of the robot manipulator is determined, multiple target points can be generated to obtain the relationship between the camera coordinate and the end-effector coordinate of the robot manipulator. The generated points are then imported into MoveIt! through FlexBE for motion planning, and the calibration board is photographed at each point to obtain the relationship among the transformation matrices of the robot manipulator, the camera, and the object. After obtaining enough information, the transformation matrix between the camera coordinate and the end-effector coordinate of the robot manipulator can be calculated, and the position of an object obtained by the camera can then be substituted to obtain the position of the object relative to the robot manipulator.

In the eye-to-hand calibration system, the camera is moved from the robot manipulator into the scene. The adopted method is basically the same as that of the eye-in-hand calibration system; the difference is that the required information is the relationship between the camera coordinate and the base coordinate of the robot manipulator.

In the experimental verification, two types of robot manipulators and two types of cameras are used to verify that the method proposed in this thesis is indeed universal.
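The eye-in-hand relationship estimated above is the classical AX = XB problem: for each relative end-effector motion A_i and the corresponding camera motion B_i observed via the calibration board, the unknown end-effector-to-camera transform X satisfies A_i X = X B_i. A small NumPy sketch can verify this identity; all poses below are synthetic, made-up values, not data from the thesis.

```python
import numpy as np

def rot_z(theta):
    """Homogeneous transform: rotation about z by theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    return T

def trans(x, y, z):
    """Homogeneous transform: pure translation."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# Synthetic ground-truth hand-eye transform X (end effector -> camera).
X = trans(0.05, 0.0, 0.10) @ rot_z(0.3)

# Two synthetic relative end-effector motions A1, A2.
A1 = trans(0.2, 0.0, 0.0) @ rot_z(0.5)
A2 = trans(0.0, 0.1, 0.05) @ rot_z(-0.4)

# A camera rigidly mounted on the end effector sees the conjugated
# motions B_i = X^-1 A_i X, so A_i X = X B_i holds by construction.
X_inv = np.linalg.inv(X)
B1 = X_inv @ A1 @ X
B2 = X_inv @ A2 @ X

print(np.allclose(A1 @ X, X @ B1), np.allclose(A2 @ X, X @ B2))
```

In practice the (A_i, B_i) pairs come from the photographed calibration-board poses, and a solver such as Tsai-Lenz (available, for example, as `cv2.calibrateHandEye` in OpenCV) recovers X from many noisy pairs; the eye-to-hand case has the same algebraic structure with the roles of the fixed and moving frames exchanged.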

