Performance Metrics of Various Topologies of a Feed Forward Error-Back Propagation Neural Network
Keywords:
Neural Network Topology, Gradient Descent Optimization, Supervised Learning, Backpropagation, Numerical Optimization

Abstract
Artificial neural networks have become a robust tool of choice for information processing, particularly for modeling both real-valued and vector-valued functions over continuous and discrete-valued attributes, with the ability to tolerate noise in the training and validation data. This paper presents findings from the implementation of a fully connected feedforward error-backpropagation artificial neural network on a range of pre-normalized inputs and their corresponding output parameters, together with the deductions drawn from iterative tests performed with a varying learning rate, η. A significant speed-up in network convergence and an appreciable error tolerance were achieved when η was set to 0.4. The network's precision and F-score, computed from the confusion matrix of each iteration session, were also used to analyze the performance metrics of the investigated artificial neural network.
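As a minimal sketch of the evaluation described above, the following assumes a binary-classification confusion matrix (rows = actual class, columns = predicted class) and computes precision, recall, and F-score in the standard way; the function name and example counts are illustrative, not taken from the paper.

```python
import numpy as np

def precision_recall_f1(cm):
    """Compute precision, recall, and F1 from a 2x2 confusion matrix.

    cm layout (rows = actual, cols = predicted):
        [[TN, FP],
         [FN, TP]]
    """
    tp = cm[1, 1]
    fp = cm[0, 1]
    fn = cm[1, 0]
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Hypothetical counts from one iteration session of the trained network
cm = np.array([[50, 5],
               [10, 35]])
p, r, f1 = precision_recall_f1(cm)
print(f"precision={p:.3f} recall={r:.3f} F1={f1:.3f}")
```

Repeating this per iteration session, as the paper does, yields a per-session precision/F-score profile that can be compared across learning rates.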