
A Dimensionality Reduction Layer by Projection in a Convolutional Neural Network

Advisor: 陳素雲

Abstract


This research proposes a dimensionality reduction method that replaces pooling in a convolutional neural network. A pooling layer follows a convolutional layer and performs dimensionality reduction; methods such as max pooling and average pooling are currently in wide use. Our method instead transforms the output of a convolutional layer into a smaller matrix using truncated orthogonal matrices. We treat these truncated orthogonal matrices as trainable parameters of the network and derive the derivatives that appear in the backpropagation algorithm. In addition, we implemented the method as a computer program to verify its feasibility, and compared it against the pooling methods under conditions kept as similar as possible. In the experiments, our method showed better performance than the pooling methods.
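The abstract states that the derivatives required for backpropagation are derived in the thesis but does not reproduce them here. Assuming the reduction takes the bilinear form Y = U X Vᵀ, which is consistent with the abstract's description (U ∈ ℝ^{h×H} and V ∈ ℝ^{w×W} are the truncated orthogonal matrices, and X ∈ ℝ^{H×W} is one output channel of the convolutional layer), a standard matrix-calculus computation gives the following gradients; this is a sketch under that assumption, not the thesis's own derivation:

```latex
Y = U X V^{\top}, \qquad G = \frac{\partial L}{\partial Y} \in \mathbb{R}^{h \times w},
\qquad
\frac{\partial L}{\partial X} = U^{\top} G V, \qquad
\frac{\partial L}{\partial U} = G V X^{\top}, \qquad
\frac{\partial L}{\partial V} = G^{\top} U X.
```

Each formula follows from differentiating $Y_{ij} = \sum_{a,b} U_{ia} X_{ab} V_{jb}$ with respect to the corresponding entry and summing against $G$.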

Parallel Abstract (English)


In this research, we propose a dimensionality reduction method that takes the place of pooling. A pooling layer is usually placed after a convolutional layer to summarize the images that the convolutional layer outputs. At present, max pooling and average pooling are widely used in CNNs. Our proposed method instead transforms an output image of a convolutional layer into a lower-dimensional image by multiplying it by truncated orthogonal matrices. We regard the truncated orthogonal matrices as parameters of the neural network and derive the derivatives that appear in the backpropagation algorithm. We also verified the feasibility of the proposed method by implementing it as a computer program, and compared its performance with the pooling methods under conditions kept as similar as possible. In our experiments, the proposed method achieved better performance than the pooling methods.
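The forward projection described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the thesis's implementation: the sizes (H, W, h, w), the QR-based initialization, and all variable names are assumptions, and the training of U and V by backpropagation is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def truncated_orthogonal(rows, cols, rng):
    """Return a (rows, cols) matrix with orthonormal rows, rows <= cols.

    QR decomposition of a square Gaussian matrix yields an orthogonal
    matrix; keeping its first `rows` rows gives a truncated orthogonal
    matrix, i.e. U @ U.T == I.
    """
    q, _ = np.linalg.qr(rng.standard_normal((cols, cols)))
    return q[:rows, :]

H, W = 8, 8   # spatial size of one convolutional output channel (assumed)
h, w = 4, 4   # reduced spatial size after projection (assumed)

U = truncated_orthogonal(h, H, rng)   # (h, H)
V = truncated_orthogonal(w, W, rng)   # (w, W)

X = rng.standard_normal((H, W))       # one output image from the conv layer
Y = U @ X @ V.T                       # (h, w) lower-dimensional image

print(Y.shape)  # (4, 4)
```

In a real network U and V would be registered as trainable parameters and updated by the backpropagation derivatives the thesis derives, with the orthogonality constraint maintained during training.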

