Learning a Class of Large Finite State Machines with a Recurrent Neural Network

 LEE GILES C.
 NEC Research Institute

 HORNE B. G.
 NEC Research Institute

 LIN T.
 NEC Research Institute
Journal

 Neural Netw.

Neural Netw. 8(9), 1359-1365, 1995-12-01
References: 30

1
 Connectionist learning for control

BARTO A. G.
Neural networks for control, 1990
Cited by (1)

2
 Training a 3-node neural network is NP-complete

BLUM A.
Proceedings of the Computational Learning Theory (COLT) Conference, 9-18, 1988
Cited by (1)

3
 <no title>

CLOUSE D. S.
Learning large De Bruijn automata with feedforward neural networks, 1994
Cited by (2)

4
 Finding Structure in Time

ELMAN J. L.
Cognitive Science 14, 179-211, 1990
Cited by (53)

5
 <no title>

GILES C. L.
Learning a class of large finite state machines with a recurrent neural network, 1994
Cited by (1)

6
 <no title>

HOPCROFT J. E.
Introduction to Automata Theory, Languages, and Computation, 1979
Cited by (66)

7
 Attractor dynamics and parallelism in a connectionist sequential machine

JORDAN M. I.
Proceedings of the Eighth Conference of the Cognitive Science Society, 531-546, 1986
Cited by (1)

8
 <no title>

KOHAVI Z.
Switching and finite automata theory, 1978
Cited by (4)

9
 Finite state automata that recurrent cascade-correlation cannot represent

KREMER S. C.
Advances in Neural Information Processing Systems 8, 1996
Cited by (1)

10
 A simple weight decay can improve generalization.

KROGH A.
Advances in Neural Information Processing Systems No. 4, 950-957, 1992
Cited by (5)

11
 Random DFAs can be approximately learned from sparse uniform examples

LANG K.
Proceedings of the Fifth ACM Workshop on Computational Learning Theory, 1992
Cited by (1)

12
 Nonlinear prediction of speech signals using memory neuron networks. In B. H. Juang

PODDAR P.
Neural Networks for Signal Processing: Proceedings of the 1991 IEEE Workshop, 1-10, 1991
Cited by (1)

13
 Static and dynamic error propagation networks with application to speech coding

ROBINSON A. J.
Neural Information Processing Systems, 632-641, 1988
Cited by (1)

14
 On the computational power of neural networks

SIEGELMANN H. T.
Proceedings of the Fifth ACM Workshop on Computational Learning Theory, 440-449, 1992
Cited by (1)

15
 On the Complexity of Minimum Inference of Regular Sets

ANGLUIN D.
Information and Control 39, 337-350, 1978
Cited by (5)

16
 FIR and IIR synapses, a new neural network architecture for time series modelling

BACK A.
Neural Computation 3, 375-385, 1991
DOI Cited by (5)

17
 Properties of neural networks with applications to modelling non-linear dynamical systems

BILLINGS S. A.
Int. J. Control 55, 193-224, 1992
Cited by (4)

18
 Finite state automata and simple recurrent networks

CLEEREMANS A.
Neural Computation 1, 372-381, 1989
DOI Cited by (26)

19
 Local feedback multilayered networks

FRASCONI P.
Neural Computation 4, 120-130, 1992
DOI Cited by (7)

20
 Grammatical inference: Introduction and survey, Part I

FU K. S.
IEEE Trans. Syst. Man Cybern. 5, 95-111, 1975
Cited by (3)

21
 Learning and extracting finite state automata with second-order recurrent neural networks

GILES C. L.
Neural Computation 4(4), 393-405, 1992
DOI Cited by (18)

22
 Constructive Learning of Recurrent Neural Networks: Limitations of Recurrent Cascade Correlation and a Simple Solution

GILES C. L.
IEEE TRANSACTIONS ON NEURAL NETWORKS, 1995
Cited by (6)

23
 Discovering the structure of a reactive environment by exploration

MOZER M. C.
Neural Comput. 2(4), 447-457, 1990
DOI Cited by (4)

24
 Identification and control of dynamical systems using neural networks

NARENDRA K. S.
IEEE Trans. Neural Networks 1(1), 4-27, 1990
DOI Cited by (166)

25
 The induction of dynamical recognizers

POLLACK J. B.
Machine Learning 7, 227-252, 1991
DOI Cited by (21)

26
 Learning and Extracting Initial Mealy Automata with a Modular Neural Network Model

TINO P.
Neural Computation 7, 822-844, 1995
DOI Cited by (7)

27
 The Gamma Model: A New Neural Model for Temporal Processing

DE VRIES B.
Neural Networks 5(4), 565-576, 1992
Cited by (5)

28
 Induction of Finite-State Languages Using Second-Order Recurrent Networks

WATROUS R. L.
Neural Computation 4, 406-414, 1992
DOI Cited by (7)

29
 A learning algorithm for continually running fully recurrent neural networks

WILLIAMS R. J.
Neural Computation 1(2), 270-280, 1989
DOI Cited by (77)

30
 An efficient gradient-based algorithm for on-line training of recurrent network trajectories

WILLIAMS R. J.
Neural Computation 2(4), 490-501, 1990
DOI Cited by (6)
Cited by: 4

1
 Extraction of Rules from Discrete-time Recurrent Neural Networks

OMLIN Christian W., LEE GILES C.
Neural Networks 9(1), 41-52, 1996-01-01
DOI References (32) Cited by (8)

2
 Effective learning in recurrent max-min neural networks

TEOW Loo-Nin, LOE Kia-Fock
Neural Networks: the official journal of the International Neural Network Society 11(3), 535-547, 1998-04-01
References (29)

3
 Learning Protein Structures and Expressing State Space using a Recurrent Neural Network [in Japanese]

SAITO Shigeru , SHIOYA Hiroyuki , DATE Tsutomu
IEICE technical report. Neurocomputing 100(618), 9-15, 2001-02-02
References (13)

4
 Group-Linking Method: A Unified Benchmark for Machine Learning with Recurrent Neural Network

LIN Tsungnan, GILES C. Lee
IEICE transactions on fundamentals of electronics, communications and computer sciences 90(12), 2916-2929, 2007-12-01
References (65)