This study investigates whether replacing the loss function with Renyi entropy or the Gini index still yields good performance in deep learning. First, a simulated dataset was randomly generated from a uniform distribution and used to compare four information measures: Shannon entropy, Renyi entropy, cross entropy, and the Gini index; Renyi entropy and cross entropy performed better in these simulation tests. Next, Renyi entropy, which generalizes Shannon entropy, and the Gini index were adopted as loss functions and compared in deep learning against cross entropy, the loss function most commonly used there. Renyi entropy and cross entropy performed best.
The approach of this research is to replace the cross entropy loss function of deep learning with Renyi entropy and the Gini index, and to examine whether training and test performance remains good. First, a simulated dataset was used to compare four entropies: Shannon entropy, Renyi entropy, cross entropy, and the Gini index. Renyi entropy and cross entropy performed well on the simulated dataset. Next, Renyi entropy and the Gini index were taken as loss functions and compared with cross entropy, the default loss function in deep learning. The main contribution of this dissertation is that the proposed Renyi entropy loss is slightly better than cross entropy.
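The abstract does not give the exact functional forms of the compared losses, so the following is only a minimal sketch of plausible definitions, assuming probability targets (one-hot or soft) and predicted class probabilities; the Renyi-style and Gini-style forms here are illustrative assumptions, not necessarily the exact losses used in the dissertation:

```python
import numpy as np

def cross_entropy(y_true, y_pred, eps=1e-12):
    """Standard cross-entropy between target and predicted distributions."""
    y_pred = np.clip(y_pred, eps, 1.0)
    return float(np.mean(-np.sum(y_true * np.log(y_pred), axis=-1)))

def renyi_cross_entropy(y_true, y_pred, alpha=2.0, eps=1e-12):
    """One plausible Renyi-style loss of order alpha (assumed form):
    H_alpha(p; q) = log(sum_i p_i * q_i**(alpha - 1)) / (1 - alpha).
    For one-hot targets this reduces to ordinary cross-entropy;
    it differs for soft targets."""
    y_pred = np.clip(y_pred, eps, 1.0)
    inner = np.sum(y_true * y_pred ** (alpha - 1.0), axis=-1)
    return float(np.mean(np.log(inner) / (1.0 - alpha)))

def gini_loss(y_true, y_pred):
    """Gini-impurity-style loss (assumed form): 1 - sum_i p_i * q_i,
    i.e. one minus the expected predicted probability of the target."""
    return float(np.mean(1.0 - np.sum(y_true * y_pred, axis=-1)))
```

For a one-hot target `[1, 0]` and prediction `[0.8, 0.2]`, cross-entropy and the order-2 Renyi loss both give `-log 0.8`, while the Gini-style loss gives `0.2`; the losses diverge once the targets are soft distributions.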