Thesis

Neural networks for supervised classification

With the advent of microarray technology, the amount of gene expression data available to scientists has greatly increased. This has, in turn, fostered a need for more effective and efficient supervised classification systems. One of the more commonly used classification systems, the Self-Organizing Map (SOM), has been applied to both supervised and unsupervised classification of gene expression data with great success. Though effective, the SOM requires training time proportional to the square of the number of classes present in the data set. Classification times can also become quite long when a large number of classes are present.
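
To make the training-cost remark concrete, the following is a minimal sketch of a standard SOM training loop (not the implementation used in this thesis): every input sample is compared against every node in the map, and the entire neighborhood of the winner is updated, so the per-sample cost grows with the size of the node grid. The grid shape, learning-rate schedule, and Gaussian neighborhood below are illustrative assumptions.

    import numpy as np

    def train_som(X, grid_shape=(4, 4), epochs=50, lr0=0.5, sigma0=2.0, seed=0):
        # Illustrative SOM sketch; hyperparameters are assumptions, not thesis values.
        rng = np.random.default_rng(seed)
        rows, cols = grid_shape
        n_nodes, n_features = rows * cols, X.shape[1]
        weights = rng.normal(size=(n_nodes, n_features))
        # Grid coordinates of each node, used by the neighborhood kernel.
        coords = np.array([(r, c) for r in range(rows) for c in range(cols)], dtype=float)

        for epoch in range(epochs):
            frac = epoch / epochs
            lr = lr0 * (1.0 - frac)               # decaying learning rate
            sigma = sigma0 * (1.0 - frac) + 1e-3  # shrinking neighborhood radius
            for x in X:
                # Best-matching unit: one distance computation per map node.
                bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
                # Gaussian neighborhood centered on the winner's grid position.
                d2 = np.sum((coords - coords[bmu]) ** 2, axis=1)
                h = np.exp(-d2 / (2.0 * sigma ** 2))
                # Pull every node toward the sample, weighted by the neighborhood.
                weights += lr * h[:, None] * (x - weights)
        return weights

Because every node is visited for both the winner search and the neighborhood update, enlarging the map to accommodate more classes makes each training step correspondingly more expensive.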
In this work we present an alternative method for supervised classification known as the Adaptive Subspace Self-Organizing Map (ASSOM). The ASSOM has been used for a variety of classification and recognition purposes, including image segmentation and handwritten digit recognition, with a high degree of accuracy. It also has the advantage of a training time proportional to the number of classes present in the data set. We show that the ASSOM performs at least as well as the SOM on a number of classification tasks while offering a significant training-time advantage. The use of the multilayer perceptron (MLP) for gene expression and handwritten digit classification is also explored, and the MLP is shown to be inferior to both the SOM and the ASSOM for these purposes.
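
For contrast, a simplified ASSOM-style sketch is given below. It is not the thesis implementation: it assumes one subspace module per class, single-sample episodes, and a basic rotation-plus-reorthonormalization update. It is included only to illustrate why training cost can scale with the number of classes (each sample updates only the module of its labeled class) and how classification by projection energy works.

    import numpy as np

    def train_assom(X, y, n_classes, subspace_dim=2, epochs=30, lr0=0.1, seed=0):
        # Simplified ASSOM-style sketch; module layout and update rule are assumptions.
        rng = np.random.default_rng(seed)
        n_features = X.shape[1]
        # One orthonormal basis (columns span the class subspace) per class module.
        bases = [np.linalg.qr(rng.normal(size=(n_features, subspace_dim)))[0]
                 for _ in range(n_classes)]
        for epoch in range(epochs):
            lr = lr0 * (1.0 - epoch / epochs)  # decaying rotation step size
            for x, label in zip(X, y):
                # Rotate the labeled class's subspace toward the sample ...
                rotation = np.eye(n_features) + lr * np.outer(x, x) / (x @ x + 1e-12)
                # ... then re-orthonormalize the basis with a QR decomposition.
                bases[label] = np.linalg.qr(rotation @ bases[label])[0]
        return bases

    def classify_assom(bases, X):
        # Assign each sample to the class whose subspace captures the most
        # projection energy (largest squared norm of the projected sample).
        energies = np.stack([np.sum((X @ B) ** 2, axis=1) for B in bases], axis=1)
        return np.argmax(energies, axis=1)

In the full ASSOM, episodes of several related samples and a neighborhood function over modules replace the single-sample, single-winner simplification used here; the sketch keeps only the subspace representation and projection-based decision.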

