Error Rate Testing of Training Algorithm in Back Propagation Network
Hindayati Mustafidah1, Suwarsito2
1Hindayati Mustafidah, Department of Informatics Engineering, University Muhammadiyah of Purwokerto, Central Java, Indonesia.
2Suwarsito, Department of Geography Education, University Muhammadiyah of Purwokerto, Central Java, Indonesia.
Manuscript received on August 17, 2015. | Revised Manuscript received on August 29, 2015. | Manuscript published on September 05, 2015. | PP: 46-50 | Volume-5 Issue-4, September 2015. | Retrieval Number: D2688095415/2015©BEIESP
©The Authors. Published By: Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).

Abstract: Artificial Neural Networks (ANN), especially the back propagation method, have been widely applied to solve problems in many areas of life, e.g., forecasting, diagnostics, and pattern recognition. An important factor in determining the performance of an ANN is the training algorithm used. Because there are 12 training algorithms that can be used with the back propagation method, the most optimal algorithm needs to be selected in order to obtain the best results. The performance of a training algorithm is judged by the error it generates: the smaller the error, the more optimal the algorithm's performance. In this study, tests were conducted to identify the training algorithm with the smallest error rate among the 12 existing algorithms. Testing began with the preparation of computer program modules in the MATLAB programming language to obtain the network output error for each training algorithm. The program for each training algorithm was executed 20 times. The network output errors were then tested using analysis of variance at an alpha level of 5% to determine the training algorithm with the smallest error rate. The test results show that the training algorithm "trainlm" has the smallest error, with network parameters of target error = 0.001 (10-3), maximum epoch = 10000, learning rate (lr) = 0.01, 5 input neurons, and 1 output neuron.
Keywords: Error rate, training algorithm, back propagation, network parameters.
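The test procedure described in the abstract can be illustrated with a short MATLAB sketch. The script below is a minimal, hypothetical setup, not the authors' actual program: the data set, the hidden-layer size of 10 neurons, and the exact list of 12 Neural Network Toolbox training functions are assumptions; only the network parameters (target error 0.001, maximum epoch 10000, learning rate 0.01, 5 input neurons, 1 output neuron, 20 runs per algorithm, ANOVA at alpha = 5%) are taken from the abstract.

% Hypothetical sketch: compare back propagation training algorithms by
% network output error, then test the errors with one-way ANOVA (alpha = 5%).
algorithms = {'trainlm','trainbfg','trainrp','trainscg','traincgb', ...
              'traincgf','traincgp','trainoss','traingdx','traingdm', ...
              'traingd','traingda'};          % assumed list of 12 algorithms
inputs  = rand(5, 100);                       % placeholder data: 5 input neurons
targets = rand(1, 100);                       % placeholder data: 1 output neuron
nRuns   = 20;                                 % each program executed 20 times
errors  = zeros(nRuns, numel(algorithms));    % one column per algorithm

for k = 1:numel(algorithms)
    for r = 1:nRuns
        net = feedforwardnet(10, algorithms{k});   % hidden-layer size assumed
        net.trainParam.goal   = 1e-3;              % target error = 0.001
        net.trainParam.epochs = 10000;             % maximum epoch
        net.trainParam.showWindow = false;
        try
            net.trainParam.lr = 0.01;              % learning rate (gradient-descent variants)
        catch
            % algorithms such as trainlm have no lr parameter
        end
        net = train(net, inputs, targets);
        errors(r, k) = perform(net, targets, net(inputs));  % MSE of network output
    end
end

p = anova1(errors, algorithms);   % analysis of variance across the 12 algorithms
fprintf('ANOVA p-value = %.4f (compare against alpha = 0.05)\n', p);

In this sketch, each column of the errors matrix holds the 20 output errors of one training algorithm, so anova1 compares the algorithms' mean errors directly; the algorithm column with the smallest mean error corresponds to the most optimal training function under this criterion.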