
Aspect-Based Sentiment Analysis of User Reviews with PEFT for Efficient Training

ABSA: Opinion Tree Parsing with PEFT for Aspect-based Sentiment Analysis

Advisor: 廖世偉

Abstract


Analyzing social media user comments presents significant challenges, because identifying the relationships between the subjects of a comment and the sentiment expressed toward them is complex, particularly when comments vary greatly in length. This paper introduces a novel opinion tree parsing model that handles the intricate interactions among different aspects within a comment; during training, conjunctions and semantic modifiers are incorporated to improve parsing accuracy. As the model grows more complex, to improve training efficiency and manage computational demands, we apply a method that greatly reduces the number of trainable parameters while achieving comparable performance (PEFT). We evaluate the proposed model on the ACOS datasets. Given the limited availability of datasets describing user sentiment toward specific aspects, and the resource-intensive nature of adapting large pre-trained language models (LLMs) to downstream tasks, our approach modifies the computation of the OTP model. It changes the model's loss function and focuses training on strategically placed modules; with adapters added, it significantly reduces the GPU memory footprint and mitigates out-of-memory (OOM) issues without compromising the overall integrity of the pre-trained model. This approach not only improves training efficiency but also maintains performance close to that of the original LLM configuration.

Parallel Abstract


Analyzing social media user comments presents significant challenges due to the complexity of discerning relationships between opinions and aspects, particularly when comments vary greatly in length. This paper introduces a novel Opinion Tree Parser Model that navigates the intricate interplay between different aspects within comments, utilizing conjunctions and semantic modifiers to enhance parsing accuracy. To improve the efficiency of the training process and manage the computational demands, we have implemented Parameter-Efficient Fine-Tuning (PEFT) methods on the decoder side. We evaluated our proposed model on the ACOS datasets. Given the limited availability of datasets that describe user sentiments toward specific aspects, and the challenges of fine-tuning large pre-trained language models (LLMs) due to their resource intensity, our approach proposes an advanced context-free opinion grammar. This method integrates an adapter to focus training on strategically placed modules, significantly reducing the GPU memory footprint and mitigating out-of-memory (OOM) issues without compromising the overall integrity of the pre-trained model. This approach not only enhances training efficiency but also maintains performance levels close to those of the original LLM configurations.
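To make the memory-saving claim concrete, the sketch below estimates trainable-parameter counts for full fine-tuning versus adapter-style PEFT on a single feed-forward block. The dimensions (`d_model`, `d_ff`) and the bottleneck width `r` are illustrative assumptions, not values reported by the thesis:

```python
# Hypothetical layer sizes (assumed, roughly T5-large scale); not from the thesis.
d_model = 1024   # hidden size
d_ff = 4096      # feed-forward inner size
r = 16           # adapter bottleneck width (assumed)

# Full fine-tuning: both projections of the feed-forward block are trainable.
full_params = d_model * d_ff + d_ff * d_model

# Adapter-style PEFT: base weights stay frozen; only a small bottleneck
# (down-project d_model -> r, then up-project r -> d_model) is trained.
adapter_params = d_model * r + r * d_model

print(f"full: {full_params}, adapter: {adapter_params}, "
      f"ratio: {adapter_params / full_params:.4%}")
```

Because optimizer state (e.g. Adam moments) is kept only for trainable weights, shrinking the trainable set by this ratio is what drives the reduced GPU footprint and the mitigation of OOM errors described above.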

References


X. Bao, X. Jiang, Z. Wang, Y. Zhang, and G. Zhou. Opinion tree parsing for aspect-based sentiment analysis. In Proceedings of the 2023 Annual Conference of the Association for Computational Linguistics (ACL), pages 1–12. Association for Computational Linguistics, 2023.
X. Bao, Z. Wang, X. Jiang, R. Xiao, and S. Li. Aspect-based sentiment analysis with opinion tree generation. In Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence, IJCAI 2022, pages 4044–4050. ijcai.org, 2022.
H. Cai, R. Xia, and J. Yu. Aspect-category-opinion-sentiment quadruple extraction with implicit aspects and opinions. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 340–350. Association for Computational Linguistics, 2021.
C. Chen, Z. Teng, Z. Wang, and Y. Zhang. Discrete opinion tree induction for aspect-based sentiment analysis. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2051–2064. Association for Computational Linguistics, 2022.
L. Cui, S. Yang, and Y. Zhang. Investigating non-local features for neural constituency parsing. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2065–2075. Association for Computational Linguistics, 2022.
