Specific and Class Object Recognition for Service Robots through Autonomous and Interactive Methods
Copyright (c) 2008 IEICE. Ministry of Internal Affairs and Communications, Strategic Information and Communications R&D Promotion Programme (SCOPE), priority-area R&D: next-generation human interface and content technology. "Human-Robot Communication through Visual Information in Face-to-Face and Network Modes" (051303007), FY2005-FY2007. SCOPE Final Project Report, March 2008. Principal investigator: Yoshinori Kuno (Professor, Graduate School of Science and Engineering, Saitama University).
Service robots need to be able to recognize and identify objects located within complex backgrounds. Since no single method works in every situation, several methods need to be combined, and the robot has to select the appropriate one automatically. In this paper we propose a scheme to classify situations depending on the characteristics of the object of interest and the user's demand. We classify situations into four groups and employ a different technique for each. We use the Scale-Invariant Feature Transform (SIFT) and Kernel Principal Component Analysis (KPCA) combined with a Support Vector Machine (SVM), using intensity, color, and Gabor features, for five object categories. We show that the choice of appropriate features is important when applying KPCA- and SVM-based techniques to different kinds of objects. Through experiments we show that, by using our categorization scheme, a service robot can select an appropriate feature and method and considerably improve its recognition performance. Yet recognition is not perfect. We therefore propose to combine the autonomous method with an interactive method that allows the robot to recognize the user's request for a specific object or class when the robot fails to recognize the object. We also propose an interactive way to update, using the user's feedback, the object model used for recognition when recognition fails.
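As a rough illustration of the KPCA feature-extraction step mentioned in the abstract, the following is a minimal sketch of kernel PCA with an RBF kernel in plain NumPy. The kernel choice, `gamma` value, and number of components here are illustrative assumptions, not the paper's settings, and the paper's actual inputs (intensity, color, and Gabor features feeding an SVM) are not reproduced.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    # Pairwise squared Euclidean distances, then the RBF (Gaussian) kernel
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * d2)

def kernel_pca(X, n_components=2, gamma=0.5):
    """Project the rows of X onto the leading kernel principal components."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    # Center the kernel matrix in feature space
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one
    # Eigendecomposition of the centered kernel; keep the largest eigenvalues
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Scale eigenvectors so the projection has the standard KPCA normalization
    alphas = vecs / np.sqrt(np.maximum(vals, 1e-12))
    return Kc @ alphas  # (n, n_components) projected training data

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(30, 5))  # stand-in for image feature vectors
    Z = kernel_pca(X, n_components=2, gamma=0.1)
    print(Z.shape)  # each sample is now a 2-D nonlinear feature
```

In a pipeline like the one the abstract describes, the projected features `Z` would then be fed to an SVM classifier; that step is omitted here to keep the sketch dependency-free.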
IEICE Transactions on Information and Systems 91(6), 1793-1803, 2008-06-01
The Institute of Electronics, Information and Communication Engineers