A pruning method for the recursive least squared algorithm

 LEUNG Chi-Sing
 Department of Electronic Engineering, City University of Hong Kong

 WONG Kwok-Wo
 Department of Electronic Engineering, City University of Hong Kong

 SUM Pui-Fai
 Department of Computing, The Hong Kong Polytechnic University

 CHAN Lai-Wan
 Department of Computer Science and Engineering, The Chinese University of Hong Kong
Journal

 Neural Networks

Neural Networks 14(2), 147-174, 2001-03-01
References: 35

1
 The natural gradient learning for neural networks

AMARI S. I.
Proc. International Workshop TANC'97, 1-16, 1997
Cited by (1)

2
 Optimal Filtering

ANDERSON B. D. O.
1979
Cited by (34)

3
 Second-order derivatives for network pruning: optimal brain surgeon

HASSIBI B.
Advances in neural information processing, 164-171, 1993
Cited by (1)

4
 Adaptive Filter Theory

HAYKIN S.
1991
Cited by (105)

5
 Optimal brain damage

LE CUN Y.
Advances in neural information processing 1, 396-404, 1989
Cited by (1)

6
 Note on generalization, regularization and architecture selection in nonlinear systems

MOODY J. E.
Proc. IEEE Workshop on Neural Networks for Signal Processing, 1-10, 1991
Cited by (1)

7
 Optimal, Predictive and Adaptive Control

MOSCA E.
1995
Cited by (5)

8
 Learning internal representations by error propagation

RUMELHART D. E.
Parallel distributed processing: explorations in the microstructure of cognition, Volume 1: Foundations, 318-362, 1986
Cited by (1)

9
 Training feedforward networks with the extended Kalman filter

SINGHAL S.
Advances in neural information processing, 133-140, 1989
Cited by (2)

10
 Training recurrent networks using the extended Kalman filter

WILLIAMS R. J.
Proc. IJCNN'92 IV, 241-246, 1992
Cited by (1)

11
 The sample complexity of pattern classification with neural networks: the size of the weights is more important than the size of the network

BARTLETT P. L.
IEEE Trans. Inf. Theory 44(2), 525-536, 1998
DOI Cited by (10)

12
 Recurrent Radial Basis Function Networks for Adaptive Noise Cancellation

BILLINGS Steve A. , FUNG Chi F.
Neural Networks 8(2), 273-290, 1995-03-01
References (37) Cited by (4)

13
 Training with noise is equivalent to Tikhonov regularization

BISHOP C. M.
Neural Computation 7(1), 108-116, 1995
Cited by (8)

14
 On self-organizing algorithms and networks for class-separability features

CHATTERJEE C.
IEEE Trans. Neural Networks 8(3), 663-678, 1997
Cited by (3)

15
 Parallel recursive error algorithm for training layered neural networks

CHEN S.
Int. J. Control 51(6), 1215-1228, 1990
Cited by (2)

16
 Recurrent neural networks and robust time series prediction

CONNOR J. T.
IEEE Trans. Neural Networks 5(2), 240-254, 1994
DOI Cited by (18)

17
 Learning Algorithm of Layered Neural Networks via Extended Kalman Filters

WATANABE K.
Int. J. Systems Sci. 22(4), 753-768, 1991
Cited by (7)

18
 Pruning from adaptive regularization

HANSEN L. K.
Neural Computation 6, 1223-1232, 1994
DOI Cited by (2)

19
 A Real-Time Learning Algorithm for a Multilayered Neural Network Based on the Extended Kalman Filter

IIGUNI Y.
IEEE Trans. on Signal Processing 40(4), 959-966, 1992
Cited by (18)

20
 Structural Learning with Forgetting

ISHIKAWA Masumi
Neural Networks 9(3), 509-521, 1996-04-01
DOI References (26) Cited by (97)

21
 A novel noise robust fourth-order cumulants cost function

LEUNG C. T.
Neurocomputing 16, 139-147, 1997
Cited by (1)

22
 Recursive algorithms for principal component extraction

LEUNG C. S.
Network: Computation in Neural Systems 8, 323-334, 1997
Cited by (1)

23
 Bayesian interpolation

MACKAY D. J. C.
Neural Computation 4(3), 415-447, 1992
DOI Cited by (62)

24
 A practical Bayesian framework for backpropagation networks

MACKAY D. J. C.
Neural Computation 4, 448-472, 1992
DOI Cited by (37)

25
 Regularization in the selection of radial basis function centers

ORR M. J. L.
Neural Computation 7, 606-623, 1995
DOI Cited by (1)

26
 Neural networks and dynamical systems

NARENDRA K. S.
International Journal of Approximate Reasoning 6, 109-131, 1992
Cited by (1)

27
 Fast exact multiplication by the Hessian

PEARLMUTTER B.
Neural Computation 6(1), 147-160, 1994
DOI Cited by (5)

28
 Neurocontrol of nonlinear dynamical systems with Kalman filter trained recurrent networks

PUSKORIUS G.
IEEE Trans. Neural Networks 5, 279-297, 1994
Cited by (2)

29
 Pruning Algorithms - A Survey

REED R.
IEEE Trans. Neural Networks 4(5), 740-747, 1993
Cited by (81)

30
 A stochastic approximation method

ROBBINS H.
Ann. Math. Stat. 22, 400-407, 1951
DOI Cited by (55)

31
 A Fast New Algorithm for Training Feedforward Neural Networks

SCALERO R. S.
IEEE Trans. on Signal Processing 40(1), 202-210, 1992
Cited by (23)

32
 Optimal filtering algorithm for fast learning in feedforward neural networks

SHAH S.
Neural Networks 5, 779-787, 1992
Cited by (6)

33
 Averaging regularized estimators

TANIGUCHI M.
Neural Computation 9, 1163-1178, 1997
Cited by (4)

34
 Recurrent neural networks: a constructive algorithm, and its properties

TSOI A. C.
Neurocomputing 15, 309-326, 1997
Cited by (2)

35
 Recursive least squares approach to combining principal and minor components analyses

WONG A. S. Y.
Electronics Letters 34, 1074-1076, 1998
Cited by (1)
Cited by: 1

1
 Dual extended Kalman filtering in recurrent neural networks

LEUNG Chi-Sing, CHAN Lai-Wan
Neural Networks 16(2), 223-239, 2003-03-01
References (27)