Limitations and Future Trends in Neural Computation
This work presents critical analyses of complexity issues in the continuum setting and of generalization to new examples, two basic milestones in learning from examples with connectionist models. The problem of loading the weights of neural networks, often framed as continuous optimization, has drawn many criticisms, since the potential solution of any learning problem is limited by the presence of local minima in the error function. The notion of an efficient solution needs to be formalized so as to allow useful comparisons with the traditional theory of computational complexity in the discrete setting. The book also covers up-to-date developments in computational mathematics.
- Hardback | 254 pages
- 167.6 x 243.8 x 20.3mm | 612.36g
- 01 Aug 2003
- IOS Press
- IOS Press, US
- Amsterdam, Netherlands