
具機器視覺之智慧型機械手臂伺服控制

An Intelligent Visual Servo Control for a Robot System

Advisor: 邱國慶

Abstract


The main purpose of this thesis is to develop an image-tracking robot arm control system. The system uses image recognition to track a moving target and combines it with an adaptive neural sliding mode control (ANSMC) law to achieve real-time tracking and grasping by the robot arm. The overall architecture consists of four subsystems: a human-machine control interface, an embedded image processing system (CMUCam4), a microcontroller (Arduino Mega2560), and a six-axis robot arm. The human-machine interface is built with the LabVIEW software package, and wireless Bluetooth communication serves as the bridge between the embedded system and the remote monitoring computer. When the system tracks a moving target, the images captured by a CMOS camera mounted on the robot arm are sent directly to the embedded system. After the embedded image processing system analyzes each frame, the planar distance from the target's center to the center of the CMOS image, together with the change rate of the target's image area, is used to estimate the target's stereo (3-D) distance; the embedded system then transmits this distance error signal to the Arduino Mega2560 microcontroller. An ANSMC law is derived from the Lyapunov stability criterion and implemented on the microcontroller, which computes appropriate angle commands and drives the six servo motors so that the robot arm smoothly follows the moving target ahead of it and then grasps it. Experimental results show that the proposed control method enables the robot arm to grasp the moving target quickly and effectively when the target is stationary, while it is moving, under changing lighting, and in the presence of an identical interfering ball.

Keywords: robot arm, image processing, adaptive neural sliding mode control
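As a rough illustration of the distance-estimation step described above, the following minimal Python sketch (not the thesis's actual CMUCam4/Arduino implementation) shows how the planar offset of the target centroid from the image center and the change rate of the tracked blob's area could be combined into the error signals handed to the microcontroller. The frame size, gain, and function names are assumptions introduced here for illustration.

IMG_W, IMG_H = 160, 120        # assumed CMUCam4-style frame size (pixels)
K_AREA = 0.5                   # assumed gain mapping area-change rate to a depth error

def tracking_errors(cx, cy, area, prev_area, dt):
    """Return (ex, ey, ez): planar pixel offsets of the target centroid from the
    image center and a depth error estimated from the blob-area change rate."""
    ex = cx - IMG_W / 2.0                                   # horizontal offset (pixels)
    ey = cy - IMG_H / 2.0                                   # vertical offset (pixels)
    area_rate = (area - prev_area) / dt if dt > 0 else 0.0  # area change rate (pixels^2/s)
    ez = -K_AREA * area_rate                                # growing area => target approaching
    return ex, ey, ez

# Example: between two frames 50 ms apart the blob drifted right and grew slightly
print(tracking_errors(cx=95, cy=58, area=420, prev_area=400, dt=0.05))

In the actual system, error components of this kind would travel over the Bluetooth/serial link to the Arduino Mega2560, where the ANSMC law converts them into joint-angle commands.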

Parallel Abstract (English)


In this paper, a visual servo control system is developed for a robot arm. Based on image processing technology, the system utilizes an adaptive neural sliding mode control (ANSMC) algorithm to realize a vision-guided robot system with real-time object tracking control. The proposed system is composed of a human-machine control interface (LabVIEW), an embedded image processing system (CMUCam4), a microcontroller (Arduino Mega2560), and a six-axis robot arm. The human-machine control interface is developed on a personal computer with the LabVIEW tool and communicates with the embedded image processing system over a wireless Bluetooth link. While the system tracks the moving object, the CMOS camera feeds the environment information back to the embedded image processing system. After the object image is extracted, the stereo distance between the target object and the robot arm is calculated by the vision algorithm. Based on this distance information, the error defined in the image plane is obtained and sent to the microcontroller. Using these error signals, the ANSMC algorithm, derived from Lyapunov stability theory and executed on the microcontroller, determines the control parameters and sends the control commands (angle signals) to the six servo motors so that the object can be grasped at the center of the image plane.

Keywords: robot arm, image processing, adaptive neural sliding mode control
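For context, a generic textbook form of an adaptive neural sliding mode controller of the kind described above is sketched below for a single-axis tracking error e; the thesis's exact plant model, network structure, and adaptation gains are not given here, so this should be read as an illustrative form rather than the thesis's actual control law:

\begin{aligned}
s &= \dot{e} + \lambda e, \\
u &= \hat{W}^{\mathsf{T}}\phi(x) + k\,s + \eta\,\operatorname{sgn}(s), \\
\dot{\hat{W}} &= \gamma\, s\, \phi(x), \\
V &= \tfrac{1}{2}\,s^{2} + \tfrac{1}{2\gamma}\,\tilde{W}^{\mathsf{T}}\tilde{W},
\qquad \dot{V} \le -k\,s^{2}.
\end{aligned}

Here \hat{W}^{\mathsf{T}}\phi(x) is a neural approximation of the unknown arm dynamics and \tilde{W} = W^{*} - \hat{W} is the weight-approximation error. Choosing the update law \dot{\hat{W}} = \gamma\, s\, \phi(x) cancels the \tilde{W}-dependent term in \dot{V}, leaving \dot{V} \le -k\,s^{2} when \eta bounds the residual approximation error; this is the Lyapunov-based stability argument the abstract refers to.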


Cited by


楊品中 (2015). 智慧型自動物料搬運系統研製 (Development of an intelligent automated material handling system) [Master's thesis, 國立虎尾科技大學 (National Formosa University)]. Airiti Library. https://doi.org/10.6827/NFU.2015.00195
