
A comparison between modular and non-modular neural network structures for classification problems

Rekker, J. (2000) A comparison between modular and non-modular neural network structures for classification problems. Master's Thesis / Essay, Computing Science.

Infor_Ma_2000_JRekker.CV.pdf - Published Version



Using a multi-layer perceptron as an implementation of a classifier can introduce some difficulties in the design process. When many classes need to be identified, traditional neural networks tend to become very large due to their monolithic design. Because of the high internal connectivity of such a network, changing some weights during learning may negatively affect other weights. Training such a network can take a lot of time and may not lead to optimal performance. It is also quite possible that only a couple of classes produce a relatively high error, which decreases the total performance. It would be useful to have the ability to retrain only the part of the network that is responsible for the high errors, especially when working with large networks.

To solve several of the aforementioned issues, one can build a neural classifier from a modular neural network instead of a monolithic one. Modular neural networks consist of several (independent) modules that can be arranged in several different structures. The easiest structure to design places several modules in parallel, where each module is responsible for just one class. Each of these modules consists of a small neural network with a single output. This is the structure used in the experiments described here.

Our main goal is to determine the learning robustness of a modular neural network and its modules, and to compare the performance of the resulting modular network with that of a non-modular network. First we look at the learning behavior of a single module. For that purpose, several neural modules were trained using different values for certain design parameters. The choice of values for these parameters appears not to be very critical when kept within reasonable ranges.
The overall learning process of a module is robust.

To test the performance of a modular neural network, several of these networks were trained on different kinds of classification problems. For each problem, a non-modular network was also trained as a reference. The first problem consists of identifying three non-overlapping classes in a synthesized dataset. As the classes do not overlap, classifiers for this problem should show relatively high performance. Both the modular and the non-modular network indeed perform very well, and the learning process of both structures appears to be robust. Two other datasets, part of the ELENA benchmarks, were also used to train a modular and a non-modular network. Both networks perform according to the benchmarks and show robust learning behavior. The last problem consists of identifying 22 classes in a very large dataset. For this problem, 22 separate modules were successfully trained, and the resulting modular network showed very high performance. The non-modular reference network showed similar results.

Looking at the results of the different experiments, it is clear that the performance of a modular neural network is about the same as that of a non-modular network. Training modules for a modular network appears to be a robust process. However, when the classification problem at hand has many different classes, many modules must be trained, which increases the total training time. So when designing a classifier for a large problem, one can use the simple design concept of a modular network, but this leads to longer total training times.
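As a rough illustration of the parallel structure described in the abstract — independent one-output modules, one per class, combined by taking the largest output — here is a minimal sketch. All names, layer sizes, and hyperparameters are illustrative assumptions, not taken from the thesis:

```python
import numpy as np

class Module:
    """A small one-hidden-layer network with a single sigmoid output.

    Illustrative stand-in for one module of the parallel structure;
    hidden size and learning settings are assumptions, not the thesis design.
    """
    def __init__(self, n_in, n_hidden=4, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, 1))
        self.b2 = np.zeros(1)

    def forward(self, X):
        # tanh hidden layer, sigmoid output in [0, 1]
        self.h = np.tanh(X @ self.W1 + self.b1)
        return 1.0 / (1.0 + np.exp(-(self.h @ self.W2 + self.b2)))

    def train(self, X, t, lr=0.5, epochs=300):
        # One-vs-rest target t: 1 for "this class", 0 otherwise.
        # Full-batch gradient descent on cross-entropy loss.
        for _ in range(epochs):
            y = self.forward(X)
            err = y - t                      # gradient of CE w.r.t. output logit
            gW2 = self.h.T @ err / len(X)
            gb2 = err.mean(axis=0)
            dh = (err @ self.W2.T) * (1.0 - self.h ** 2)
            gW1 = X.T @ dh / len(X)
            gb1 = dh.mean(axis=0)
            self.W2 -= lr * gW2; self.b2 -= lr * gb2
            self.W1 -= lr * gW1; self.b1 -= lr * gb1

class ModularClassifier:
    """Parallel one-output modules, one per class; the highest output wins."""
    def __init__(self, n_in, n_classes):
        self.modules = [Module(n_in, seed=k) for k in range(n_classes)]

    def fit(self, X, labels):
        # Each module trains independently on its own one-vs-rest problem,
        # so a single badly-performing module could be retrained in isolation.
        for k, m in enumerate(self.modules):
            m.train(X, (labels == k).astype(float).reshape(-1, 1))

    def predict(self, X):
        outputs = np.hstack([m.forward(X) for m in self.modules])
        return outputs.argmax(axis=1)
```

On a toy problem in the spirit of the first experiment — three well-separated Gaussian blobs — such a classifier reaches high accuracy, while each one-output module stays small and can be trained (or retrained) on its own.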

Item Type: Thesis (Master's Thesis / Essay)
Degree programme: Computing Science
Thesis type: Master's Thesis / Essay
Language: English
Date Deposited: 15 Feb 2018 07:29
Last Modified: 15 Feb 2018 07:29
