Abstract

Proceedings Abstracts of the Twenty-Third International Joint Conference on Artificial Intelligence

Self-Organized Neural Learning of Statistical Inference from High-Dimensional Data
Johannes Bauer, Stefan Wermter

With information about the world implicitly embedded in complex, high-dimensional neural population responses, the brain must perform some sort of statistical inference on a large scale to form hypotheses about the state of the environment. This ability is, in part, acquired after birth, often with very little feedback to guide learning. This is a very difficult learning problem given how little information about the meaning of neural responses is available at birth. In this paper, we address the question of how the brain might solve this problem: we present an unsupervised artificial neural network algorithm that takes from the self-organizing map (SOM) algorithm the ability to learn a latent variable model from its input. We extend the SOM algorithm so that it learns about the distribution of noise in the input and computes probability density functions over the latent variables. The algorithm represents these probability density functions using population codes and does so with very few assumptions about the distribution of the noise. Our simulations indicate that the algorithm can learn to perform comparably to a maximum likelihood estimator, with the added benefits of requiring no a priori knowledge about the input and of computing not only best hypotheses but also probabilities for alternatives.
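
The sketch below is an illustrative rendering, not the authors' exact algorithm: it trains a plain 1-D SOM on noisy high-dimensional inputs and then converts map activity into a normalized probability density over latent map positions, i.e. a simple population code. The noise model here is an assumed fixed isotropic Gaussian (`noise_sigma`), whereas the paper's extension learns the noise distribution from the input with very few assumptions; all function names and parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_som(data, n_units=20, epochs=50, lr0=0.5, sigma0=5.0):
    """Train a 1-D SOM; returns one prototype vector per map unit."""
    dim = data.shape[1]
    weights = rng.normal(size=(n_units, dim))
    positions = np.arange(n_units)
    for epoch in range(epochs):
        lr = lr0 * (1.0 - epoch / epochs)                # decaying learning rate
        sigma = sigma0 * (1.0 - epoch / epochs) + 1e-3   # shrinking neighborhood
        for x in rng.permutation(data):
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
            h = np.exp(-((positions - bmu) ** 2) / (2 * sigma ** 2))
            weights += lr * h[:, None] * (x - weights)
    return weights

def density_over_map(x, weights, noise_sigma=1.0):
    """Population-coded density over latent map positions for input x.

    Assumes Gaussian observation noise with a fixed noise_sigma; the paper's
    algorithm instead learns the noise structure during training.
    """
    sq_dist = np.sum((weights - x) ** 2, axis=1)
    log_lik = -sq_dist / (2 * noise_sigma ** 2)
    lik = np.exp(log_lik - log_lik.max())   # subtract max for numerical stability
    return lik / lik.sum()                  # normalize to a probability density

# Toy usage: a 1-D latent variable embedded in noisy 10-D observations.
latent = rng.uniform(0, 1, size=500)
proj = rng.normal(size=(1, 10))
data = latent[:, None] * proj + 0.3 * rng.normal(size=(500, 10))

W = train_som(data)
p = density_over_map(data[0], W, noise_sigma=0.3)
print("Most likely map unit:", p.argmax(), "with density mass:", p.max())
```

In this toy setup, `p` plays the role of the population-coded density over the latent variable: its peak gives a maximum-likelihood-style best hypothesis, while the remaining mass quantifies the plausibility of alternatives.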