Thesis

Implementation testing of several models of plasticity in long short-term memory

Artificial Neural Networks (ANNs) are a family of models in Artificial Intelligence that attempt to replicate the function of the human brain. An ANN consists of three major components: weights, nodes, and layers. The network learns by adjusting the weighted values between the nodes. However, although the weights are important, the number of nodes and the topology of the network also contribute to the network's ability to learn. Smaller ANNs can lack the number of weights required to represent the desired range of outputs, while a network that is too large requires large training sets to learn what could be a rather simple set of input-output pairs. To address these problems, this project examines the creation of growing and shrinking neural networks of both the feedforward and recurrent variety. The results, however, are relatively similar across the topologies and growth functions tested, with little effect on overall overfitting.
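
A minimal sketch of the general idea of a growing and shrinking network is given below. This is not the thesis implementation: the single-hidden-layer class, the fixed growth step, and the smallest-outgoing-weight pruning rule are assumptions made purely for illustration.

    # Sketch only: one way a feedforward network's hidden layer could be
    # grown or shrunk. The pruning criterion (smallest outgoing weight norm)
    # is an assumption for illustration, not the thesis method.
    import numpy as np

    rng = np.random.default_rng(0)

    class GrowingMLP:
        """Single-hidden-layer network whose hidden width can change."""

        def __init__(self, n_in, n_hidden, n_out):
            self.W1 = rng.normal(0, 0.5, (n_hidden, n_in))   # input -> hidden
            self.W2 = rng.normal(0, 0.5, (n_out, n_hidden))  # hidden -> output

        def forward(self, x):
            h = np.tanh(self.W1 @ x)          # hidden activations
            return self.W2 @ h                # linear output

        def grow(self, k=1):
            """Add k hidden nodes with small random weights."""
            self.W1 = np.vstack([self.W1, rng.normal(0, 0.1, (k, self.W1.shape[1]))])
            self.W2 = np.hstack([self.W2, rng.normal(0, 0.1, (self.W2.shape[0], k))])

        def shrink(self, k=1):
            """Remove the k hidden nodes with the smallest outgoing weight norm."""
            norms = np.linalg.norm(self.W2, axis=0)
            keep = np.argsort(norms)[k:]      # indices of nodes to keep
            self.W1 = self.W1[keep]
            self.W2 = self.W2[:, keep]

    net = GrowingMLP(n_in=3, n_hidden=4, n_out=2)
    x = rng.normal(size=3)
    print(net.forward(x).shape)   # (2,)
    net.grow(2)                   # hidden width 4 -> 6
    net.shrink(3)                 # hidden width 6 -> 3
    print(net.forward(x).shape)   # still (2,)

In this sketch the output dimension is preserved while only the hidden width changes, which is what allows the topology to grow or shrink mid-training without altering the input-output interface of the network.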
