Connectionist Approaches to Language Learning



Description

arise automatically as a result of the recursive structure of the task and the continuous nature of the SRN's state space. Elman also introduces a new graphical technique for studying network behavior based on principal components analysis. He shows that sentences with multiple levels of embedding produce state space trajectories with an intriguing self-similar structure. The development and shape of a recurrent network's state space is the subject of Pollack's paper, the most provocative in this collection. Pollack looks more closely at a connectionist network as a continuous dynamical system. He describes a new type of machine learning phenomenon: induction by phase transition. He then shows that under certain conditions, the state space created by these machines can have a fractal or chaotic structure, with a potentially infinite number of states. This is graphically illustrated using a higher-order recurrent network trained to recognize various regular languages over binary strings. Finally, Pollack suggests that it might be possible to exploit the fractal dynamics of these systems to achieve a generative capacity beyond that of finite-state machines.
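The two techniques described above can be illustrated together in a minimal sketch: a Pollack-style higher-order (second-order) recurrent network whose input symbol selects the map applied to the current state, followed by an Elman-style principal-components projection of the resulting state-space trajectory. The weights here are random rather than trained, and the state dimension, readout unit, and weight scales are illustrative assumptions, not the models from the papers.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 4                                      # state dimension (assumed)
W = rng.normal(scale=1.5, size=(2, N, N))  # one weight matrix per input symbol 0/1
b = rng.normal(scale=0.5, size=(2, N))     # per-symbol bias (assumed)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def run(string, h0=None):
    """Iterate the second-order update over a binary string and
    return the trajectory of hidden states (initial state included)."""
    h = np.full(N, 0.5) if h0 is None else h0
    traj = [h]
    for ch in string:
        k = int(ch)                       # the input symbol selects the map:
        h = sigmoid(W[k] @ h + b[k])      # this is what makes the net "higher-order"
        traj.append(h)
    return np.array(traj)

traj = run("0110100")
accept = bool(traj[-1][0] > 0.5)          # read acceptance off one state unit (assumed)

# Elman-style view: project the trajectory onto its top two
# principal components to visualize the state-space path.
centered = traj - traj.mean(axis=0)
_, _, Vt = np.linalg.svd(centered, full_matrices=False)
pcs = centered @ Vt[:2].T                 # shape (len(string) + 1, 2)
```

With training (e.g. gradient descent on labeled strings), the symbol-indexed maps can carve the unit hypercube into the fractal regions Pollack describes; untrained random weights already show how each input symbol applies a different contraction to the state.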

Product details

  • Hardback | 149 pages
  • 155 x 235 x 11.18mm | 880g
  • Dordrecht, Netherlands
  • English
  • Reprinted from Machine Learning, Volume 7:2/3
  • IV, 149 p.
  • 0792392167
  • 9780792392163

Table of contents

Learning Automata from Ordered Examples.- SLUG: A Connectionist Architecture for Inferring the Structure of Finite-State Environments.- Graded State Machines: The Representation of Temporal Contingencies in Simple Recurrent Networks.- Distributed Representations, Simple Recurrent Networks, and Grammatical Structure.- The Induction of Dynamical Recognizers.