
Abstract


It is easy for a multi-layer perceptron (MLP) to form open, plane classification boundaries, and for a radial basis function network (RBFN) to form closed, circular or elliptic classification boundaries. Conversely, it is difficult for an MLP to form closed circular or elliptic boundaries, and for an RBFN to form open plane boundaries. Hence, MLP and RBFN each have their own advantages and disadvantages in dealing with various classification problems. To combine their advantages, this paper proposes a novel neural network, the Hybrid Transfer Function Network (HTFN), whose hidden layer contains both sigmoid and Gaussian units. Although HTFN contains two kinds of processing units, the principle of minimizing the error sum of squares is used to derive supervised learning rules for all network parameters. When the hidden layer of an HTFN contains only sigmoid units or only Gaussian units, the HTFN reduces to an MLP or an RBFN, respectively; hence, MLP and RBFN can each be regarded as special cases of HTFN. To verify that HTFN is superior to MLP and RBFN, this study tested the three networks on three artificial examples and 15 real examples. The results showed that HTFN is more accurate than MLP and RBFN, confirming that combining sigmoid and Gaussian units in the hidden layer combines the advantages of MLP and RBFN.
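To make the architecture concrete, below is a minimal NumPy sketch of an HTFN whose hidden layer mixes sigmoid units (as in an MLP) and Gaussian units (as in an RBFN), trained by gradient descent on the error sum of squares. All names and hyperparameters (the HTFN class layout, n_sig, n_rbf, the learning rate) are illustrative assumptions, not code from the paper, and for brevity the gradient step updates only the output weights, whereas the paper derives learning rules for all network parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class HTFN:
    """Hybrid hidden layer: n_sig sigmoid units + n_rbf Gaussian units."""

    def __init__(self, n_in, n_sig, n_rbf, n_out):
        self.W = rng.normal(scale=0.5, size=(n_sig, n_in))  # sigmoid weights
        self.b = np.zeros(n_sig)                            # sigmoid biases
        self.C = rng.normal(scale=0.5, size=(n_rbf, n_in))  # Gaussian centers
        self.s = np.ones(n_rbf)                             # Gaussian widths
        self.V = rng.normal(scale=0.5, size=(n_out, n_sig + n_rbf))  # output weights

    def hidden(self, x):
        h_sig = sigmoid(self.W @ x + self.b)        # open (plane) boundaries
        d2 = np.sum((self.C - x) ** 2, axis=1)
        h_rbf = np.exp(-d2 / (2.0 * self.s ** 2))   # closed (elliptic) boundaries
        return np.concatenate([h_sig, h_rbf])

    def forward(self, x):
        return self.V @ self.hidden(x)

    def train_step(self, x, t, lr=0.1):
        # One gradient step on E = 0.5 * ||t - y||^2 (error sum of squares),
        # applied only to the output weights V in this sketch.
        h = self.hidden(x)
        err = self.V @ h - t
        self.V -= lr * np.outer(err, h)
        return 0.5 * float(err @ err)

# Toy usage: fit a single 2-D input/target pair.
net = HTFN(n_in=2, n_sig=3, n_rbf=3, n_out=1)
x, t = np.array([0.2, -0.4]), np.array([1.0])
for _ in range(200):
    loss = net.train_step(x, t)
print(f"final loss: {loss:.6f}")
```

Setting n_rbf=0 leaves only sigmoid units (an MLP-style network), and setting n_sig=0 leaves only Gaussian units (an RBFN-style network), which mirrors the paper's observation that both networks are special cases of HTFN.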
