
Self-Organizing Map Networks for Symbolic Data

Advisor: 楊敏生 (Miin-Shen Yang)

Abstract


Kohonen's self-organizing map (SOM) is a competitive learning network that uses a neighborhood lateral interaction function to uncover the topological structure hidden in the data. The SOM is an unsupervised learning network, and its neurons can only handle numeric data. However, besides numeric data there are many other data types, such as symbolic data, which the SOM algorithm cannot process directly. This thesis therefore modifies the SOM algorithm into a self-organizing map network suited to symbolic data; we call this new algorithm the symbolic SOM. For symbolic data, El-Sonbaty and Ismail [16] proposed a fuzzy c-means algorithm for symbolic data (FSCM), in which each cluster center is a structure that contains the events of the data and their associated memberships. We use this structured cluster center as a symbolic neuron, so that different associated memberships represent different neurons. For the distance between symbolic data we adopt the dissimilarity measures of Gowda and Diday [6] and of Yang et al. [9]. In neuron learning, lateral interaction is one of the key factors, and its degree of excitation is determined by the input data and the neurons. Fan et al. [5] proposed the suppressed fuzzy c-means algorithm, whose idea is to raise the largest membership and lower the remaining ones. We use this suppression idea to build the learning method of the neurons, that is, to expand the largest associated membership and suppress the others; this suppressed treatment of the associated memberships is the main spirit of symbolic neuron learning. Finally, we run simulation tests of the proposed symbolic SOM on symbolic data, and the results show that the symbolic SOM handles symbolic data effectively and reasonably.
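For reference, the suppression step mentioned above follows the suppressed fuzzy c-means of Fan et al. [5]; the sketch below uses that paper's notation, and the thesis adapts the same idea to the associated memberships of symbolic neurons, so the exact rule there may differ. For an input $x_k$ with memberships $u_{ik}$, winning index $p = \arg\max_i u_{ik}$, and suppression factor $\alpha \in [0,1]$:

$$u_{pk} \leftarrow 1 - \alpha \sum_{i \neq p} u_{ik} = 1 - \alpha + \alpha\, u_{pk}, \qquad u_{ik} \leftarrow \alpha\, u_{ik} \quad (i \neq p).$$

Setting $\alpha = 1$ recovers the ordinary fuzzy c-means memberships, while $\alpha = 0$ reduces the step to a hard, winner-take-all assignment.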

Abstract (English)


Kohonen's self-organizing map (SOM) is a competitive learning neural network that uses a neighborhood lateral interaction function to discover the topological structure hidden in a data set. It is an unsupervised approach. In general, the SOM neural network is constructed as a learning algorithm for numeric (vector) data. However, besides numeric data there are many other data types, such as symbolic data, which the SOM algorithm cannot treat. In this dissertation we consider a modified SOM for symbolic data; thus a new SOM algorithm, called the symbolic SOM (S-SOM), is proposed to deal with symbolic data. El-Sonbaty and Ismail [16] proposed a cluster center that is a structure containing events and associated memberships. We use this cluster center structure as a symbolic neuron, and different associated memberships display different symbolic neurons. The distance measures from Gowda and Diday's dissimilarity [6] and from Yang et al. [9] are used. Lateral interaction is an important factor in learning; it is excited by the input data and the neurons. Fan et al. [5] proposed the suppressed fuzzy c-means (S-FCM) algorithm, which expands the largest membership degree and suppresses the others. We use this suppression idea to create the learning rule for neurons, that is, to expand the largest associated membership and to suppress the others. This suppressed treatment of the associated memberships is the main spirit of the learning rule for symbolic neurons. Finally, we apply the symbolic SOM to real examples, and the results show the feasibility of our symbolic SOM in real applications.
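To make the structured neuron and the suppressed learning rule concrete, here is a minimal Python sketch, assuming a hypothetical symbolic neuron stored as a per-feature table of events with associated memberships. The function names, the toy dissimilarity, the learning rate, and the winner handling are illustrative assumptions, not the S-SOM algorithm defined in the thesis.

# Illustrative sketch only: a "symbolic neuron" kept as, for each feature,
# a table of observed events with associated memberships, plus a suppressed
# update in the spirit of Fan et al. [5]. Names and details are assumptions.

ALPHA = 0.5  # assumed suppression factor in [0, 1]


def make_neuron(features):
    """A neuron maps each feature to {event: associated membership}."""
    return {f: {} for f in features}


def dissimilarity(neuron, x):
    """Toy measure: 1 minus the mean membership the neuron gives to the
    events of object x (a placeholder, not the Gowda-Diday measure [6])."""
    scores = [neuron[f].get(e, 0.0) for f, e in x.items()]
    return 1.0 - sum(scores) / len(scores)


def update(neuron, x, lr=0.3):
    """Expand the membership of each observed event; suppress the others."""
    for f, e in x.items():
        memberships = neuron[f]
        memberships.setdefault(e, 0.0)
        for event in memberships:
            if event == e:
                # raise the associated membership of the observed event
                memberships[event] += lr * (1.0 - memberships[event])
            else:
                # suppress the remaining associated memberships
                memberships[event] *= ALPHA


if __name__ == "__main__":
    data = [
        {"color": "red", "shape": "round"},
        {"color": "red", "shape": "oval"},
        {"color": "blue", "shape": "square"},
    ]
    neuron = make_neuron(["color", "shape"])
    for x in data:
        update(neuron, x)
    print(neuron)
    print(dissimilarity(neuron, {"color": "red", "shape": "round"}))

In a full S-SOM, the winning neuron would instead be chosen with a symbolic dissimilarity such as Gowda and Diday's [6], and the update would be applied to the winner and its lattice neighbors through the neighborhood lateral interaction function.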

Keywords

symbolic data; SOM; FSCM

References


[3] F. Rosenblatt, The perceptron: A probabilistic model for information storage and organization in the brain, Psychological Review 65 (1958) 386-408.
[4] H. Ritter and K. Schulten, On the stationary state of Kohonen's self-organizing sensory mapping, Biological Cybernetics 54 (1986) 99-106.
[5] J.L. Fan, W.Z. Zhen, and W.X. Xie, Suppressed fuzzy c-means clustering algorithm, Pattern Recognition Letters 24 (2003) 1607-1612.
[6] K.C. Gowda and E. Diday, Symbolic clustering using a new dissimilarity measure, Pattern Recognition 24 (1991) 567-578.
[7] K.C. Gowda and T.V. Ravi, Divisive clustering of symbolic objects using the concepts of both similarity and dissimilarity, Pattern Recognition 28 (1995) 1277-1282.
