## RichardNeuralNetworks

**Neural Network Classifiers Estimate Bayesian A Posteriori Probabilities**, M. D. Richard and R. P. Lippmann, *Neural Computation*, 3, 461-483, 1991.

To many, a neural network is a black box that can be trained for arbitrary purposes, so it may be hard to believe that there are good reasons to consider these approaches to have a sound theoretical basis. This paper explains how, provided a network is trained in the right way, you have good reason to expect its outputs to estimate meaningful quantities, specifically Bayesian a posteriori probabilities. This result is fundamentally useful, since by extension it also explains how neural networks can be used as a principled solution to the problem of data fusion. You can think of it as akin to the more recent method of `boosting', but with theoretical validity and 15 years earlier. NAT 8/3/2005
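The claim can be checked numerically in a case where the true posterior is known. The following is an illustrative sketch, not code from the paper: a one-layer sigmoid "network" (logistic regression) is trained by gradient descent on a cross-entropy cost, one of the training regimes under which outputs are expected to converge to a posteriori probabilities. The data are two equal-prior Gaussian classes with means ±1 and unit variance, for which the exact Bayes posterior is P(class 1 | x) = sigmoid(2x), so the trained output can be compared against it directly. All names and parameter choices here are the sketch's own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two classes, equal priors: class 0 ~ N(-1, 1), class 1 ~ N(+1, 1)
n = 4000
y = rng.integers(0, 2, n)
x = rng.normal(2.0 * y - 1.0, 1.0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One-layer "network": output = sigmoid(w*x + b), trained by full-batch
# gradient descent on the cross-entropy cost, whose gradient w.r.t. the
# pre-activation is simply (output - target).
w, b = 0.0, 0.0
lr = 0.5
for _ in range(3000):
    p = sigmoid(w * x + b)
    err = p - y
    w -= lr * np.mean(err * x)
    b -= lr * np.mean(err)

# Exact Bayes posterior for this mixture: P(1|x) = sigmoid(2x),
# since log[N(x; 1, 1) / N(x; -1, 1)] = 2x with equal priors.
xs = np.linspace(-2, 2, 9)
net = sigmoid(w * xs + b)
bayes = sigmoid(2.0 * xs)
print(np.max(np.abs(net - bayes)))  # small: the output tracks the posterior
```

With enough data and training, the fitted parameters approach w ≈ 2, b ≈ 0, i.e. the network output converges to the a posteriori probability rather than to a hard class label; that is the sense in which such outputs are meaningful quantities that can then be combined across sensors or classifiers for data fusion.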

Page last modified on March 05, 2012, at 03:51 PM