  • Thesis

Exploiting Metadata for Music Emotion Classification

Advisor: 陳宏銘

Abstract


With the rapid growth of social tagging systems and online music services, large amounts of musical metadata have become readily obtainable. Because such metadata are closely related to human perception of music, they can be used to support content-based automatic music classification. Existing automatic music classification is often limited by the wide gap between audio signal features and human perception, so incorporating metadata into the system can be expected to improve classification performance. In this thesis, we first examine the correlation between emotion and musical metadata, and then exploit this correlation to aid music emotion classification. We propose to partition songs according to their metadata and to build metadata-specific classification models that concentrate on classifying the emotion of songs within each group. Since a song can be labeled with several types of musical metadata at the same time, we further propose a novel adaptive fusion scheme that combines all types of metadata. Whereas existing methods are often hampered by the noisiness and sparseness of metadata, the proposed approach overcomes these difficulties and substantially improves the accuracy of music emotion classification.
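As a rough illustration of the association analysis mentioned above, the following Python sketch runs a chi-square test of independence between emotion classes and one type of metadata tag (genre is used here purely as an example). The toy data, the tag names, and the choice of a chi-square test are assumptions for illustration; the thesis only states that a statistical association test is used.

```python
# Sketch of an association test between emotion labels and a metadata tag.
# Assumption: the emotion classes, the genre tags, and the chi-square test
# are illustrative; the thesis only mentions a "statistical association test".
from collections import Counter

import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical per-song annotations: (emotion class, genre tag).
songs = [
    ("happy", "pop"), ("happy", "dance"), ("angry", "metal"),
    ("sad", "blues"), ("sad", "jazz"), ("relaxed", "jazz"),
    ("angry", "rock"), ("happy", "pop"), ("relaxed", "ambient"),
]

emotions = sorted({e for e, _ in songs})
genres = sorted({g for _, g in songs})
counts = Counter(songs)

# Build the emotion-by-genre contingency table of song counts.
table = np.array([[counts[(e, g)] for g in genres] for e in emotions])

# Chi-square test of independence: a small p-value suggests that
# emotion and the metadata tag are statistically associated.
chi2, p_value, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
```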

Parallel Abstract (English)


Along with the explosive growth of social tagging systems and musical web services, abundant musical metadata are readily obtainable from the Internet. Since most metadata are related to the human perception of music, they can be utilized to bridge the so-called semantic gap between audio signals and high-level semantics for content-based music classification. In this thesis, we first examine the correlation between emotion and musical metadata by a statistical association test, and then exploit such correlation for emotion classification. We propose to divide songs according to the metadata and build a metadata-specific model to concentrate on the classification within each group. Since a song can be associated with different types of metadata, such as genre and style, we further propose a novel adaptive fusion scheme to utilize all types of metadata. While the existing methods of exploiting metadata are hampered by the noise and sparseness inherent to metadata, the proposed scheme overcomes these difficulties and significantly improves the accuracy of emotion classification.
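To make the metadata-specific modeling and adaptive fusion ideas more concrete, here is a minimal sketch, assuming SVM emotion classifiers, genre/style metadata, and per-type fusion weights; none of these specifics (the classifier, the weighting rule, the function names) come from the thesis itself.

```python
# Minimal sketch of metadata-specific emotion models with adaptive fusion.
# Assumptions: SVM classifiers, genre/style metadata, and the weighting rule
# are illustrative choices, not the thesis's exact method.
import numpy as np
from sklearn.svm import SVC


def train_metadata_specific_models(features, emotions, metadata):
    """Train one emotion classifier per metadata value (e.g., per genre).

    features: (n_songs, n_dims) NumPy array of audio features.
    emotions: (n_songs,) NumPy array of emotion labels.
    metadata: list of metadata values (one tag per song, same ordering).
    """
    models = {}
    for value in set(metadata):
        idx = [i for i, m in enumerate(metadata) if m == value]
        clf = SVC(probability=True)          # per-group emotion classifier
        clf.fit(features[idx], emotions[idx])
        models[value] = clf
    return models


def adaptive_fusion(song_feature, song_tags, models_by_type, weights_by_type, classes):
    """Fuse per-type predictions for a song tagged with several metadata types.

    Each metadata type (genre, style, ...) selects its metadata-specific model;
    the per-type class probabilities are combined with adaptive weights, e.g.
    estimated from validation accuracy or tag reliability.
    """
    fused = np.zeros(len(classes))
    total = 0.0
    for mtype, tag in song_tags.items():
        model = models_by_type[mtype].get(tag)
        if model is None:                    # no model for an unseen/sparse tag
            continue
        proba = model.predict_proba(song_feature.reshape(1, -1))[0]
        # Align the model's own class ordering with the global emotion classes.
        aligned = np.zeros(len(classes))
        for cls, p in zip(model.classes_, proba):
            aligned[classes.index(cls)] = p
        w = weights_by_type.get(mtype, 1.0)
        fused += w * aligned
        total += w
    return fused / total if total > 0 else None
```

The `weights_by_type` dictionary is where the "adaptive" part of the fusion would live; one plausible choice is to weight each metadata type by how reliably its tags predict emotion on validation data, but the actual weighting rule used in the thesis is not specified in the abstract.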

