In this paper, a cybernetic method is proposed for both batch and instantaneous updates of the weights of neural networks. Using the well-known Hamilton–Jacobi–Bellman (HJB) equation, a new optimal weight-update law is derived. The main contribution of this paper is that, for any neural network, the HJB framework yields a closed-form solution for the optimal cost and the corresponding weight updates. The approach is compared with several of the best-performing learning algorithms available, and the results show that it is superior in both computation time and effectiveness. Benchmark problems, such as 8-bit parity, breast cancer classification, credit approval, and 2D Gabor function approximation, are used to verify these claims, and the approach is introduced using a decoder as a worked example.
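For context, the HJB equation referred to above is, in its generic continuous-time form, a partial differential equation satisfied by the optimal value function \(V(x,t)\). The cost rate \(L\) and dynamics \(f\) below are placeholders for the paper's specific network cost and weight dynamics, not the authors' exact formulation:

```latex
% Generic Hamilton–Jacobi–Bellman equation:
% V(x,t)  — optimal cost-to-go from state x at time t
% L(x,u,t) — instantaneous cost rate under control (weight update) u
% f(x,u,t) — state (weight) dynamics
-\frac{\partial V}{\partial t}
  = \min_{u}\left[\, L(x,u,t)
    + \left(\frac{\partial V}{\partial x}\right)^{\!\top} f(x,u,t) \,\right]
```

When the minimization over \(u\) can be carried out analytically, it yields a closed-form optimal control law, which is the sense in which the paper obtains closed-form weight updates.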