Neural Networks for Perception: Computation, Learning and Architecture v. 2

Edited by 

List price: US$49.95

Description

These volumes explore recent research in neural networks that has advanced our understanding of human and machine perception. Contributions from international researchers address both theoretical and practical issues concerning the feasibility of neural network models for explaining human perception and for implementing machine perception. Volume 1 covers models for understanding human perception in terms of distributed computation, as well as examples of neural network models for machine perception. Volume 2 examines computational and adaptational problems in the use of neural systems and discusses the corresponding hardware architectures needed to implement neural networks for perception.

Product details

  • Hardback | 384 pages
  • 161 x 230 x 23mm | 690g
  • Academic Press Inc
  • San Diego, United States
  • English
  • index
  • ISBN-10: 0127412522
  • ISBN-13: 9780127412528

Table of contents

  • Computation, learning and architectures: learning visual behaviours, D. Ballard and S. Whitehead
  • non-parametric regression analysis using self-organizing topological maps, V. Cherkassky and H. Lari-Najafi
  • theory of the backpropagation neural network
  • Hopfield model and optimization problems, B. Kamgar-Parsi and B. Kamgar-Parsi
  • DAM regression analysis and attentive recognition, W. Polzleitner
  • intelligent code machine, V. Stern
  • cycling logarithmically convergent networks that flow information to behave (perceive) and learn, L. Uhr
  • computation and learning in the context of neural network capacity, S. Venkatesh
  • competitive and cooperative multimode dynamics in photorefractive ring circuits, D. Anderson et al
  • hybrid neural networks and algorithms, D. Casasent
  • the use of fixed holograms for massively-interconnected, low-power neural networks, H. Jeon et al
  • electronic circuits for adaptive synapses, J. Mann and J. Raffel
  • neural network computations on a fine grain array processor, S. J. Wilson