Performance Metrics of Various Topologies of a Feed Forward Error-Back Propagation Neural Network

Authors

  • M.S. Osigbemeh, Dept. of Electrical and Electronics, Federal University Ndufu-Alike Ikwo.
  • C.C. Okezie, Dept. of Electronic and Computer Engineering, Nnamdi Azikiwe University, Awka.
  • H.C. Inyiama, Dept. of Electronic and Computer Engineering, Nnamdi Azikiwe University, Awka.

Keywords:

Neural Network Topology, Gradient Descent Optimization, Supervised Learning, Backpropagation, Numerical Optimization.

Abstract

Artificial neural networks have become a robust tool of choice for information processing, especially for modeling both real-valued and vector-valued functions over continuous and discrete-valued attributes, with the ability to absorb noise in the training and validation data. This paper presents findings from the implementation of a fully connected feedforward error-back-propagation artificial neural network on a range of pre-normalized inputs with their corresponding output parameters, together with the deductions obtained from iterative tests performed with a varying learning rate, η. A significant speed-up in network convergence and an appreciable error tolerance were achieved when η was set to 0.4. The network's precision and F-score, computed from the confusion matrix of all iteration sessions, were also used to analyze the performance metrics of the investigated artificial neural network.
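The training and evaluation procedure described in the abstract can be sketched as follows. This is a minimal illustrative example, not the authors' implementation: the 2-2-1 topology, sigmoid activations, toy OR dataset, and random initialization are all assumptions; only the learning rate η = 0.4 and the confusion-matrix-based precision/F-score metrics come from the abstract.

```python
import math
import random

random.seed(0)
ETA = 0.4  # learning rate reported in the abstract as giving fast convergence

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical fully connected topology: 2 inputs (+ bias), 2 hidden units
# (+ bias), 1 output. Each weight row includes a trailing bias weight.
w_hid = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
w_out = [random.uniform(-1, 1) for _ in range(3)]

def forward(x):
    xb = x + [1.0]  # append bias input
    h = [sigmoid(sum(w * v for w, v in zip(row, xb))) for row in w_hid]
    hb = h + [1.0]  # append bias to hidden activations
    o = sigmoid(sum(w * v for w, v in zip(w_out, hb)))
    return h, o

# Toy pre-normalized dataset (logical OR) -- purely for illustration.
data = [([0.0, 0.0], 0), ([0.0, 1.0], 1), ([1.0, 0.0], 1), ([1.0, 1.0], 1)]

# Error back-propagation with plain gradient descent.
for _ in range(5000):
    for x, t in data:
        h, o = forward(x)
        xb, hb = x + [1.0], h + [1.0]
        delta_o = (t - o) * o * (1 - o)  # output-layer error term
        for j in range(2):
            # Propagate error to hidden unit j using the *current* w_out[j].
            delta_h = delta_o * w_out[j] * h[j] * (1 - h[j])
            for i in range(3):
                w_hid[j][i] += ETA * delta_h * xb[i]
        for j in range(3):
            w_out[j] += ETA * delta_o * hb[j]

# Confusion-matrix-based metrics, as used in the paper's evaluation.
tp = fp = fn = tn = 0
for x, t in data:
    pred = 1 if forward(x)[1] >= 0.5 else 0
    if pred == 1 and t == 1:
        tp += 1
    elif pred == 1 and t == 0:
        fp += 1
    elif pred == 0 and t == 1:
        fn += 1
    else:
        tn += 1

precision = tp / (tp + fp) if tp + fp else 0.0
recall = tp / (tp + fn) if tp + fn else 0.0
f_score = (2 * precision * recall / (precision + recall)
           if precision + recall else 0.0)
print(precision, recall, f_score)
```

The F-score here is the harmonic mean of precision and recall (F1); varying η and repeating such iteration sessions is the kind of experiment the abstract reports.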

Published

2017-01-01