
重整化群與機器學習

Renormalization Group and Machine Learning

Advisor: 高英哲

Abstract


Deep learning has a remarkable ability to uncover the features hidden in data. Although it has produced major practical breakthroughs, it is still poorly understood theoretically. Recent work has claimed a one-to-one correspondence between restricted Boltzmann machines and the variational renormalization group. This correspondence is controversial, however, and we wish to establish a more rigorous connection. In this thesis we use restricted Boltzmann machines to optimize the renormalization group. In principle, a renormalization-group description requires infinitely many coupling constants, so in practice variational parameters are introduced in their place. However, the usual choice of optimal variational parameters often breaks self-consistency, and an efficient projection operator is problem dependent. We therefore use a restricted Boltzmann machine to parameterize the projection operator, and take minimization of the relative entropy as the criterion for choosing the variational parameters. We believe this algorithm can serve as a general framework for optimizing the renormalization group, and that it offers an interpretation of the correspondence between the renormalization group and deep learning.
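
To make the criterion above concrete, the following is a minimal sketch written in the notation of Kadanoff's variational renormalization group [5] and the standard restricted-Boltzmann-machine energy. The symbols E_λ, T_λ, H[v], p_λ and the particular reading of the relative-entropy criterion are assumed notation for illustration, not quoted from the thesis body.

\begin{align*}
  E_\lambda(v,h) &= -\sum_i a_i v_i - \sum_j b_j h_j - \sum_{i,j} v_i W_{ij} h_j
    && \text{(RBM energy, } \lambda=\{a,b,W\}\text{)}\\
  T_\lambda(v,h) &= -E_\lambda(v,h) + H[v]
    && \text{(RBM-parameterized projection operator, as in [4])}\\
  e^{-H'_\lambda[h]} &= \operatorname{Tr}_v\, e^{\,T_\lambda(v,h) - H[v]}
    && \text{(renormalized Hamiltonian)}\\
  \lambda^\ast &= \arg\min_\lambda\,
    D_{\mathrm{KL}}\!\left(\tfrac{1}{Z}e^{-H[v]} \,\middle\|\, p_\lambda(v)\right),
  \quad p_\lambda(v) = \tfrac{1}{Z_\lambda}\sum_h e^{-E_\lambda(v,h)}
    && \text{(relative-entropy criterion)}
\end{align*}

With the choice T_λ = -E_λ + H used in [4], the renormalized Hamiltonian reduces to the RBM free energy of the hidden units, which is what allows the learned conditional p_λ(h|v) to act as the coarse-graining map.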

Abstract (English)


Deep learning has yielded impressive results on difficult machine learning tasks owing to its ability to learn relevant features from data. Despite this practical success, relatively little is understood theoretically. It has recently been argued that there is an exact mapping between the variational renormalization group and deep neural networks based on restricted Boltzmann machines. Since this mapping is not without controversy, it remains desirable to establish a more rigorous connection between the renormalization group and deep learning. In this work, we propose a general method for optimizing real-space renormalization-group transformations through divergence minimization. One of the main obstacles in real-space renormalization-group methods is that the renormalized Hamiltonian involves infinitely many coupling parameters. For this reason, there have long been attempts to improve the transformation by introducing variational parameters. However, the usual criteria for choosing the variational parameters can lead to inherent inconsistencies, and the form of the projection operator can be problem dependent. We therefore exploit the structure of the restricted Boltzmann machine to parameterize the projection operator, and adopt minimization of the Kullback-Leibler divergence between the distributions defined by the normalizing factor and by the Hamiltonian as the criterion for choosing the variational parameters. This approach may serve as a general method for optimizing real-space renormalization-group transformations and shed light on the connection between the renormalization group and deep learning.
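
To make the optimization loop concrete, the following is a minimal numpy sketch of the kind of procedure the abstract describes: an RBM whose hidden units play the role of coarse-grained block variables, trained by one-step contrastive divergence (CD-1), a standard stochastic approximation to descending the Kullback-Leibler divergence between the data distribution and the RBM marginal. The sample array ising_samples, the lattice sizes, and all hyperparameters are placeholders; the thesis itself does not specify this implementation.

import numpy as np

rng = np.random.default_rng(0)

# Placeholder "physical" configurations encoded as {0,1} visible units.
# In the setting described above these would be Monte Carlo samples
# drawn from the Boltzmann distribution exp(-H[v]) / Z.
n_visible, n_hidden, n_samples = 16, 4, 1000
ising_samples = (rng.random((n_samples, n_visible)) < 0.5).astype(float)

# RBM parameters (the variational parameters lambda): weights and biases.
W = 0.01 * rng.standard_normal((n_visible, n_hidden))
a = np.zeros(n_visible)   # visible biases
b = np.zeros(n_hidden)    # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr, epochs, batch = 0.05, 50, 100
for epoch in range(epochs):
    rng.shuffle(ising_samples)
    for start in range(0, n_samples, batch):
        v0 = ising_samples[start:start + batch]
        # Coarse-graining step: hidden units act as block variables h.
        ph0 = sigmoid(v0 @ W + b)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)
        # One Gibbs step back to the visible layer (CD-1 negative phase).
        pv1 = sigmoid(h0 @ W.T + a)
        v1 = (rng.random(pv1.shape) < pv1).astype(float)
        ph1 = sigmoid(v1 @ W + b)
        # CD-1 update: an approximation to descending the KL divergence
        # between the data distribution and the RBM marginal p_lambda(v).
        W += lr * (v0.T @ ph0 - v1.T @ ph1) / batch
        a += lr * (v0 - v1).mean(axis=0)
        b += lr * (ph0 - ph1).mean(axis=0)

# After training, the conditional p(h | v) defines a stochastic projection
# operator mapping a configuration v to coarse-grained variables h;
# iterating this map would give successive renormalization-group steps.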

References


[1] Maciej Koch-Janusz and Zohar Ringel. Mutual information, neural networks and the renormalization group. Nature Physics, 14(6):578, 2018.
[2] Giuseppe Carleo and Matthias Troyer. Solving the quantum many-body problem with artificial neural networks. Science, 355(6325):602–606, 2017.
[3] Elina Robeva and Anna Seigal. Duality of graphical models and tensor networks. arXiv preprint arXiv:1710.01437, 2017.
[4] Pankaj Mehta and David J Schwab. An exact mapping between the variational renormalization group and deep learning. arXiv preprint arXiv:1410.3831, 2014.
[5] Leo P Kadanoff. Variational principles and approximate renormalization group calculations. Physical Review Letters, 34(16):1005, 1975.
