
The Linear Quantization Strategy of Quadratic Hebbian-Type Associative Memories and Their Performance Analysis

Abstract


Quadratic Hebbian-type associative memories offer superior performance to first-order Hebbian-type associative memories, but their large interconnection values make them more difficult to implement in chips. To reduce the interconnection values of a network storing M patterns, the interconnection range [-M, M] is mapped linearly to [-H, H], where H is the quantization level. An equation for the probability of direct convergence of the quantized quadratic Hebbian-type associative memory is derived, and its performance is explored. Experiments demonstrate that the quantized network approaches the recall capacity of the original network even at small quantization levels. Because quadratic Hebbian-type associative memories typically store more patterns, the linear quantization strategy reduces the interconnection values more effectively.
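The quantization rule described above can be sketched as follows, assuming bipolar (±1) stored patterns and a third-order interconnection tensor whose entries lie in [-M, M]; the function names and the one-step recall rule here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def quadratic_hebbian_weights(patterns):
    """Second-order (quadratic) Hebbian interconnections.

    patterns: (M, N) array of bipolar (+1/-1) stored patterns.
    Returns an (N, N, N) tensor whose entries lie in [-M, M].
    """
    # T[i, j, k] = sum over stored patterns of x_i * x_j * x_k
    return np.einsum('mi,mj,mk->ijk', patterns, patterns, patterns)

def linear_quantize(T, M, H):
    """Map interconnection values linearly from [-M, M] to integer levels in [-H, H]."""
    return np.rint(T * (H / M)).astype(int)

def recall(T, probe):
    """One synchronous update of the quadratic network from a probe vector."""
    # net input to neuron i: sum_{j,k} T[i, j, k] * x_j * x_k
    net = np.einsum('ijk,j,k->i', T, probe, probe)
    return np.where(net >= 0, 1, -1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    N, M, H = 32, 10, 4                      # neurons, stored patterns, quantization level
    patterns = rng.choice([-1, 1], size=(M, N))
    T = quadratic_hebbian_weights(patterns)   # entries in [-M, M]
    Tq = linear_quantize(T, M, H)             # entries in [-H, H]
    # check direct (one-step) recall of a stored pattern with the quantized weights
    print(np.array_equal(recall(Tq, patterns[0]), patterns[0]))
```

With H much smaller than M, each interconnection needs fewer bits to store while, as the abstract reports, recall capacity remains close to that of the unquantized network.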
