Performance Evaluation of Two Distributed BackPropagation Implementations

Title: Performance Evaluation of Two Distributed BackPropagation Implementations
Publication Type: Conference Paper
Year of Publication: 2007
Authors: Babii, S., V. Cretu, and E. M. Petriu
Conference Name: Neural Networks, 2007. IJCNN 2007. International Joint Conference on
Pagination: 1578-1583
Date Published: Aug.
Keywords: artificial neural network, backpropagation, distributed backpropagation implementations, feed-forward neural network, feedforward neural nets, LAN, local area networks, neural net learning algorithm, parallel algorithms, parallelization strategy
Abstract

This article presents the results of experiments in parallelizing the training phase of a feed-forward artificial neural network. More specifically, we develop and analyze a parallelization strategy for the widely used neural net learning algorithm called back-propagation. We describe two strategies for parallelizing the back-propagation algorithm. We implemented these algorithms on several LANs, permitting us to evaluate and analyze their performance based on the results of actual runs. We were interested in the qualitative aspect of the analysis, in order to achieve a fair understanding of the factors that determine the behavior of these parallel algorithms. We were also interested in identifying and dealing with some of the specific issues that must be considered when a parallelized neural net learning algorithm is implemented on a set of workstations in a LAN. Part of our purpose was to investigate whether it is possible to exploit the computational resources of such a set of workstations.
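For illustration, the sketch below shows one common way the training phase of back-propagation is parallelized across workstations: the training set is partitioned into shards, each worker runs back-propagation on its own shard, and the partial gradients are averaged before a synchronous weight update. This is an assumed, generic training-set-partitioning scheme written for clarity; it is not necessarily either of the two strategies evaluated in the paper, and the network exchange that would occur on a real LAN is simulated by an in-process average.

# Hypothetical sketch: data-parallel back-propagation via training-set
# partitioning. Each "worker" computes gradients on its shard; gradients are
# averaged before a synchronous update (stand-in for a LAN message exchange).
import numpy as np

rng = np.random.default_rng(0)

def init_net(n_in, n_hidden, n_out):
    """Random weights for a single-hidden-layer feed-forward network."""
    return {
        "W1": rng.standard_normal((n_in, n_hidden)) * 0.1,
        "W2": rng.standard_normal((n_hidden, n_out)) * 0.1,
    }

def forward(net, X):
    h = np.tanh(X @ net["W1"])   # hidden-layer activations
    y = h @ net["W2"]            # linear output layer
    return h, y

def local_gradients(net, X, t):
    """Back-propagation on one worker's shard: returns gradients and loss."""
    h, y = forward(net, X)
    err = y - t                                 # dL/dy for squared-error loss
    gW2 = h.T @ err / len(X)
    dh = (err @ net["W2"].T) * (1.0 - h ** 2)   # back-prop through tanh
    gW1 = X.T @ dh / len(X)
    loss = 0.5 * np.mean(err ** 2)
    return {"W1": gW1, "W2": gW2}, loss

def parallel_epoch(net, shards, lr=0.1):
    """One synchronous epoch: average the per-shard gradients, then update."""
    grads, losses = zip(*(local_gradients(net, X, t) for X, t in shards))
    for key in net:
        net[key] -= lr * np.mean([g[key] for g in grads], axis=0)
    return float(np.mean(losses))

if __name__ == "__main__":
    # Toy problem: learn y = sin(x) with 4 simulated workers.
    X = rng.uniform(-3, 3, size=(400, 1))
    t = np.sin(X)
    shards = list(zip(np.array_split(X, 4), np.array_split(t, 4)))
    net = init_net(1, 16, 1)
    for epoch in range(200):
        loss = parallel_epoch(net, shards)
    print(f"final training loss: {loss:.4f}")

In a real LAN deployment, the gradient-averaging step would be replaced by message passing between workstations, and its communication cost is exactly the kind of factor the paper's qualitative analysis is concerned with.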

URL: http://dx.doi.org/10.1109/IJCNN.2007.4371193
DOI: 10.1109/IJCNN.2007.4371193
