Model Reduction with Time Delay Combining Least-Squares Method with Artificial Bee Colony Algorithm

Abstract

In this paper, we propose a novel method of model reduction with a time delay for single-input, single-output continuous-time systems based on a separable least-squares (LS) approach. The reduced-order model is determined by minimizing the integral of the squared magnitude of the transfer function error. The denominator parameters and time delay of the reduced-order model are represented by the positions of the food sources of the employed bees and searched for by the artificial bee colony (ABC) algorithm, while the numerator parameters are estimated by the linear LS method for each candidate set of denominator parameters and time delay. The best parameters and time delay of the reduced-order model are obtained through the search performed by the employed, onlooker, and scout bees. Simulation results show that the accuracy of the proposed method is superior to that of a genetic algorithm (GA)-based model reduction algorithm.
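The separable structure described above can be illustrated with a short sketch: an outer population-based search proposes candidate denominator coefficients and a time delay, and for each candidate the numerator coefficients follow from an ordinary linear least-squares fit, so only the nonlinear parameters need to be searched heuristically. The sketch below is illustrative only and is not the authors' implementation: a plain random search stands in for the employed/onlooker/scout phases of the ABC algorithm, the cost integral is approximated by a sum over a frequency grid, and all names (numerator_ls, cost, reduce_model) and parameter ranges are assumptions.

```python
# Illustrative sketch of a separable LS model-reduction scheme (assumptions noted above).
import numpy as np

def numerator_ls(G_full, den, delay, w):
    """Given a candidate denominator `den` (coefficients of s^0..s^m, monic) and a
    `delay`, fit the numerator coefficients b (s^0..s^(m-1)) by linear least squares
    so that B(jw) * exp(-jw*delay) / A(jw) matches G_full(jw) on the grid `w`."""
    jw = 1j * w
    A = np.polyval(den[::-1], jw)                       # A(jw); coeffs stored low -> high
    phase = np.exp(-jw * delay)
    m = len(den) - 1
    # Regressor columns: (jw)^k * exp(-jw*delay) / A(jw), k = 0..m-1
    Phi = np.column_stack([(jw ** k) * phase / A for k in range(m)])
    # Solve a real-valued LS problem by stacking real and imaginary parts
    Phi_ri = np.vstack([Phi.real, Phi.imag])
    g = G_full(jw)
    g_ri = np.concatenate([g.real, g.imag])
    b, *_ = np.linalg.lstsq(Phi_ri, g_ri, rcond=None)
    return b

def cost(G_full, den, delay, b, w):
    """Approximate the integral of |G_full - G_reduced|^2 by a sum over the grid."""
    jw = 1j * w
    err = G_full(jw) - np.polyval(b[::-1], jw) * np.exp(-jw * delay) / np.polyval(den[::-1], jw)
    dw = np.diff(w, prepend=w[0])
    return float(np.sum(np.abs(err) ** 2 * dw))

def reduce_model(G_full, order, w, n_iter=2000, seed=0):
    """Outer search over denominator coefficients and delay. A random search is
    used here merely as a stand-in for the ABC algorithm's bee phases."""
    rng = np.random.default_rng(seed)
    best = None
    for _ in range(n_iter):
        den = np.append(rng.uniform(0.1, 10.0, order), 1.0)   # monic A(s), assumed range
        delay = rng.uniform(0.0, 2.0)                          # assumed delay range
        b = numerator_ls(G_full, den, delay, w)
        J = cost(G_full, den, delay, b, w)
        if best is None or J < best[0]:
            best = (J, den, delay, b)
    return best

if __name__ == "__main__":
    # Example: reduce a third-order plant to a first-order-plus-delay model
    G = lambda s: 6.0 / ((s + 1.0) * (s + 2.0) * (s + 3.0))
    w = np.linspace(0.01, 10.0, 400)
    J, den, delay, b = reduce_model(G, order=1, w=w)
    print("cost:", J, "den (low->high):", den, "delay:", delay, "num:", b)
```

The key point of the separable formulation is that, for a fixed denominator and delay, the error is linear in the numerator coefficients, so the inner fit reduces to an ordinary linear LS problem and the heuristic search only has to cover the denominator parameters and the delay.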

Details

  • CRID: 1390282679440114944
  • NII Article ID: 130004849303
  • DOI: 10.2299/jsp.17.189
  • ISSN: 1880-1013, 1342-6230
  • Text Lang: en
  • Data Source: JaLC, Crossref, CiNii Articles
  • Abstract License Flag: Disallowed
