Model Reduction with Time Delay Combining Least-Squares Method with Artificial Bee Colony Algorithm
-
- Hachino Tomohiro
- Kagoshima University
-
- Sameshima Kosuke
- Kagoshima University
-
- Takata Hitoshi
- Kagoshima University
-
- Nakayama Shigeru
- Kagoshima University
-
- Iimura Ichiro
- Prefectural University of Kumamoto
-
- Fukushima Seiji
- Kagoshima University
-
- Igarashi Yasutaka
- Kagoshima University
Abstract
In this paper, we propose a novel method of model reduction with a time delay for single-input, single-output continuous-time systems using a separable least-squares (LS) approach. The reduced-order model is determined by minimizing the integral of the squared magnitude of the transfer function error. The denominator parameters and time delay of the reduced-order model are represented by the positions of the food sources of the employed bees and searched for by the artificial bee colony (ABC) algorithm, while the numerator parameters are estimated by the linear LS method for each candidate set of denominator parameters and time delay. The best parameters and time delay of the reduced-order model are obtained through the search carried out by the employed, onlooker, and scout bees. Simulation results show that the accuracy of the proposed method is superior to that of the genetic algorithm (GA)-based model reduction algorithm.
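The separable structure described in the abstract can be illustrated with a minimal sketch. The full system, reduced-model structure, frequency grid, and all ABC settings below are illustrative assumptions, not taken from the paper: a third-order lag is reduced to a first-order-plus-delay model, the numerator gain is obtained in closed form by linear LS for each candidate (pole, delay) pair, and a small ABC with employed, onlooker, and scout phases searches over the pole and delay.

```python
import cmath
import random

random.seed(0)

# Assumed full system to reduce (illustrative, not from the paper):
# G(s) = 1 / ((s + 1)(s + 2)(s + 3))
def G(s):
    return 1.0 / ((s + 1) * (s + 2) * (s + 3))

# Frequency grid approximating the integral of the squared transfer-function error
W = [0.01 * 1.25 ** k for k in range(40)]

def cost(a0, L):
    """Separable LS step: for a fixed pole a0 and delay L, the numerator gain b0
    of Gr(s) = b0 e^{-Ls} / (s + a0) is found by linear least squares."""
    num = den = 0.0
    g, phi = [], []
    for w in W:
        s = 1j * w
        p = cmath.exp(-L * s) / (s + a0)   # regressor for the linear parameter b0
        gv = G(s)
        num += (p.conjugate() * gv).real
        den += abs(p) ** 2
        g.append(gv)
        phi.append(p)
    b0 = num / den                          # closed-form real LS solution
    err = sum(abs(gv - b0 * pv) ** 2 for gv, pv in zip(g, phi))
    return err, b0

# Minimal ABC over the nonlinear parameters (a0, L); bounds are assumptions
N, LIMIT, ITERS = 10, 15, 200
lo, hi = [0.1, 0.0], [5.0, 2.0]
foods = [[random.uniform(lo[d], hi[d]) for d in range(2)] for _ in range(N)]
fits = [cost(x[0], x[1])[0] for x in foods]
trials = [0] * N

def visit(i):
    """Generate a neighbour of food source i; keep it greedily if it improves."""
    k = random.choice([j for j in range(N) if j != i])
    d = random.randrange(2)
    x = foods[i][:]
    x[d] += random.uniform(-1, 1) * (foods[i][d] - foods[k][d])
    x[d] = min(max(x[d], lo[d]), hi[d])
    f = cost(x[0], x[1])[0]
    if f < fits[i]:
        foods[i], fits[i], trials[i] = x, f, 0
    else:
        trials[i] += 1

for _ in range(ITERS):
    for i in range(N):                      # employed bee phase
        visit(i)
    inv = [1.0 / (1.0 + f) for f in fits]   # fitness-proportional probabilities
    tot = sum(inv)
    for _ in range(N):                      # onlooker bee phase
        r, acc = random.uniform(0, tot), 0.0
        for i in range(N):
            acc += inv[i]
            if acc >= r:
                visit(i)
                break
    for i in range(N):                      # scout bee phase: abandon stale sources
        if trials[i] > LIMIT:
            foods[i] = [random.uniform(lo[d], hi[d]) for d in range(2)]
            fits[i] = cost(foods[i][0], foods[i][1])[0]
            trials[i] = 0

best = min(range(N), key=lambda i: fits[i])
a0, L = foods[best]
err, b0 = cost(a0, L)
print(f"Gr(s) = {b0:.4f} exp(-{L:.4f}s) / (s + {a0:.4f}), squared error {err:.6f}")
```

The key design point mirrored from the abstract is the separation: only the pole and delay are searched stochastically, while the numerator parameter drops out as a one-dimensional linear LS problem at every candidate, which shrinks the search space the bees must cover.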
Journal
-
- Journal of Signal Processing
-
Journal of Signal Processing 17 (5), 189-198, 2013
Research Institute of Signal Processing, Japan
Details
-
- CRID
- 1390282679440114944
-
- NII Article ID
- 130004849303
-
- ISSN
- 1880-1013
- 1342-6230
-
- Text Lang
- en
-
- Data Source
-
- JaLC
- Crossref
- CiNii Articles
-
- Abstract License Flag
- Disallowed