NIELS K. PETERSEN

THRESHOLD VARIATION IN NEURAL NETWORKS

Abstract

A characteristic feature of the neuron is its excitability, which is an all-or-none phenomenon: either the neuron fires, or it does not. Firing occurs when the integrated stimuli exceed a certain threshold. This threshold is not static but varies in a way that depends on the history of the neuron.
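To illustrate the idea of a history-dependent threshold, the following minimal sketch (not taken from the thesis; all names and constants are illustrative assumptions) shows an integrate-and-fire style unit whose threshold rises after each spike and relaxes back toward a resting value, so identical stimuli need not produce identical responses.

# Minimal sketch, assuming a simple additive threshold bump and exponential
# relaxation; this is an illustration of history-dependent firing, not the
# model used in the thesis.

def simulate(inputs, base_threshold=1.0, bump=0.5, decay=0.5):
    """Return a list of 0/1 firing events for a sequence of summed inputs."""
    threshold = base_threshold
    spikes = []
    for stimulus in inputs:
        if stimulus >= threshold:      # all-or-none firing
            spikes.append(1)
            threshold += bump          # firing raises the threshold
        else:
            spikes.append(0)
        # the threshold relaxes back toward its resting value
        threshold = base_threshold + (threshold - base_threshold) * decay
    return spikes

if __name__ == "__main__":
    stimuli = [1.2, 1.2, 1.2, 0.8, 1.2, 1.2]
    print(simulate(stimuli))  # e.g. [1, 0, 1, 0, 1, 0]: the response depends on recent firing history

In this sketch a constant stimulus train produces an alternating firing pattern, because each spike temporarily raises the threshold above the stimulus level.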

Artificial neural networks typically use model neurons that include only a static or stochastic threshold. Some work with an explicit threshold variation has, however, been done, e.g. on pattern segmentation in Hopfield networks.

In this thesis we propose and study an artificial neural network, the Adaptive Threshold Network (ATN), which is a modified version of the so-called Adaptive Performance Networks. It incorporates a simple model of the neuron's threshold variation proposed by M. Colding-Jørgensen. The ATN is tested on some elementary tasks, such as path formation, and its behaviour in a non-stationary environment is studied.

The ATN shows good ability for parallel processing and search, but some restrictions on which tasks it is suited for are found. When compared with a pseudo-ATN, i.e. an ATN with a constant threshold, the pseudo-ATN can perform all the tasks studied, but not as well on the tasks that include simple outputs.

Threshold Variation in Neural Networks. Cand. scient. thesis by Niels Kristian Petersen, December 2, 1996, Center for Chaos and Turbulence Studies, The Niels Bohr Institute. The thesis is not available online.

www.nielskpetersen.dk