Neural networks have recently received much attention due to their universal approximation capabilities; that is, they can identify and learn correlated patterns between sets of input data and the corresponding target values. Among the many existing artificial neural network paradigms, the back-propagation network (BPN) is the most widely used for a broad range of applications. In this paper, the BPN is applied to identifying and approximating several nonlinear functions. Finally, the effects of structural parameters of the multi-layer perceptron, such as the number of hidden layers and the number of neurons in each hidden layer, on the training performance of the back-propagation network are examined and discussed.
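To make the approach concrete, the following is a minimal sketch of a back-propagation network approximating a nonlinear function. It assumes a single hidden layer with tanh activation, a linear output unit, full-batch gradient descent on mean-squared error, and sin(x) as the target function; all hyper-parameters and names here are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Training data: sample the assumed nonlinear target f(x) = sin(x).
X = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(X)

n_hidden = 10                       # neurons in the single hidden layer (assumed)
W1 = rng.normal(0, 0.5, (1, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.5, (n_hidden, 1))
b2 = np.zeros(1)
lr = 0.1                            # learning rate (assumed)

def forward(X):
    h = np.tanh(X @ W1 + b1)        # hidden-layer activations
    return h, h @ W2 + b2           # linear output layer

for epoch in range(5000):
    h, y_hat = forward(X)
    err = y_hat - y                 # output-layer error
    # Back-propagate: gradients of the mean-squared error w.r.t. each weight.
    gW2 = h.T @ err / len(X)
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h**2)  # chain rule through tanh (derivative 1 - h^2)
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, y_hat = forward(X)
mse = float(np.mean((y_hat - y) ** 2))
```

Varying `n_hidden` (or adding a second hidden layer) in this sketch is one way to reproduce the kind of structural comparison the paper describes: too few neurons underfit the target, while more neurons lower the approximation error at the cost of longer training.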