
Batch and Online Algorithms for Maximum Likelihood Quantum State Tomography

Advisor: 李彥寰


Abstract


In this thesis, we propose a batch algorithm, Q-EM, and an online algorithm, Q-Soft-Bayes, for maximum likelihood quantum state tomography. Q-EM can be seen as a quantum extension of the expectation maximization (EM) algorithm for Poisson inverse problems. Q-Soft-Bayes can be viewed as a quantum extension of Soft-Bayes, a recent algorithm for online portfolio selection (Orseau et al. (2017). “Soft-Bayes: Prod for Mixtures of Experts with Log-Loss”. Int. Conf. Algorithmic Learn. Theory). Suppose that the quantum state to be estimated corresponds to a \(D\)-by-\(D\) density matrix and that there are \(N\) measurement outcomes. The per-iteration computational complexity of Q-EM is \(O(ND^2 + D^3)\), and its optimization error vanishes at an \(O(T^{-1}\log D)\) rate, where \(T\) denotes the number of iterations. The per-iteration computational complexity of Q-Soft-Bayes is \(O(D^3)\), independent of the number of measurements, and its optimization error vanishes at an \(O(\sqrt{\frac{D \log D}{T}} + \sqrt{\frac{(\log T + \log D)^2\log \frac{1}{\delta}}{T}})\) rate with probability at least \(1-\delta\). We further extend Q-Soft-Bayes to mini-batch Q-Soft-Bayes, which processes \(B\) data points in each iteration. The per-iteration computational complexity of mini-batch Q-Soft-Bayes is \(O(BD^2 + D^3)\), also independent of the number of measurements, and its optimization error vanishes at an \(O(\sqrt{\frac{D \log D}{T}} + \sqrt{\frac{(\log T + \log D)^2\log \frac{1}{\delta}}{TB}})\) rate with probability at least \(1-\delta\).
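The abstract gives no pseudocode, but the classical EM-type fixed-point scheme for maximum likelihood tomography that Q-EM is in the spirit of — the well-known \(R\rho R\) iteration, where \(R(\rho) = \sum_n (f_n / \operatorname{tr}(A_n\rho)) A_n\) — can be sketched as below. The measurement (Pauli-6 POVM on one qubit), the starting point, and the iteration count are all illustrative assumptions, not the thesis's exact Q-EM; the sketch does show how one step splits into \(O(ND^2)\) for the \(N\) traces plus \(O(D^3)\) for the matrix products, matching the per-iteration cost quoted above.

```python
import numpy as np

def em_step(rho, ops, freqs):
    """One R(rho) @ rho @ R(rho) fixed-point step for MLE tomography.

    ops:   (N, D, D) POVM elements (here: scaled Pauli projectors)
    freqs: (N,) empirical outcome frequencies, summing to 1
    Cost:  O(N D^2) for the N traces + O(D^3) for the matrix products.
    """
    probs = np.einsum('nij,ji->n', ops, rho).real       # tr(A_n rho), O(N D^2)
    R = np.einsum('n,nij->ij', freqs / probs, ops)      # R(rho), O(N D^2)
    rho_next = R @ rho @ R                              # O(D^3)
    return rho_next / np.trace(rho_next).real           # renormalize to trace 1

# Toy run on a single qubit (D = 2) with the six Pauli eigenstates,
# scaled by 1/3 so the operators form a POVM (they sum to the identity).
D = 2
s = 1 / np.sqrt(2)
vecs = [np.array(v, dtype=complex) for v in
        ([1, 0], [0, 1], [s, s], [s, -s], [s, 1j * s], [s, -1j * s])]
ops = np.stack([np.outer(v, v.conj()) / 3 for v in vecs])

true_rho = np.array([[0.8, 0.1], [0.1, 0.2]], dtype=complex)
freqs = np.einsum('nij,ji->n', ops, true_rho).real      # ideal (noiseless) data

rho = np.eye(D, dtype=complex) / D                      # maximally mixed start
for _ in range(200):
    rho = em_step(rho, ops, freqs)
print(np.round(rho.real, 3))
```

With noiseless, informationally complete data as above, the iterates converge to the true state; with finite counts they converge to the maximum likelihood estimate instead.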
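For context on the online side, the classical Soft-Bayes update of Orseau et al. that Q-Soft-Bayes extends acts on a probability vector \(w\) over \(D\) assets: given price relatives \(x_t\), it sets \(w_{t+1} = w_t \odot \left((1-\eta)\mathbf{1} + \eta\, x_t / \langle w_t, x_t\rangle\right)\), a mixture of keeping the old weights and the Bayesian (Cover) update, which keeps \(w\) on the simplex automatically. A minimal sketch follows; the learning rate and the synthetic market data are illustrative assumptions, and the quantum extension replaces the weight vector by a density matrix.

```python
import numpy as np

def soft_bayes_step(w, x, eta):
    """Classical Soft-Bayes update: mix the previous weights (weight 1 - eta)
    with the Bayesian/Cover update (weight eta).  Since the update factors
    average to 1 under w, the weights stay on the probability simplex."""
    return w * (1.0 - eta + eta * x / np.dot(w, x))

rng = np.random.default_rng(0)
D, T = 4, 1000
eta = np.sqrt(np.log(D) / (D * T))      # illustrative choice, not the thesis's tuning
w = np.full(D, 1.0 / D)                 # uniform initial portfolio
for _ in range(T):
    x = rng.uniform(0.5, 1.5, size=D)   # synthetic price relatives
    w = soft_bayes_step(w, x, eta)
print(w, w.sum())
```

Each step touches only the current data point, which is what makes the online algorithm's per-iteration cost independent of the total number of measurements.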
