Target-Tracking based on Fusion of Unsynchronized Sensor Data from Vision System and Thermal Imaging Sensor


Abstract

This paper proposes a method for integrating sensor data from a thermal imaging sensor and a vision system, such as a camera, so that autonomous robots can perform probabilistic target tracking. Person tracking is essential for robots that must interact with people and follow a target person. Because the recognition performance of a vision system alone is limited, multiple types of sensor data must be integrated to improve it. However, the sensor data from the imaging devices are captured asynchronously, and the acquisition intervals of the sensors differ. We therefore use a stochastic model in which the model of the tracking target is updated asynchronously with measurement data from the camera and the thermal imaging sensor. The tracking model is implemented using RT-Middleware, a platform for constructing distributed networked robots. The experimental results indicate the feasibility of the proposed method.
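The abstract does not specify the particular stochastic filter used, so the following is only a minimal sketch of the general idea of asynchronous measurement updating: a constant-velocity Kalman filter that predicts the target state to each measurement's timestamp and then applies a sensor-specific update, so that unsynchronized camera and thermal-sensor measurements arriving at different intervals can be fused in time order. The sensor names, noise values, and measurement stream below are illustrative assumptions, not values from the paper.

```python
# Sketch of asynchronous sensor fusion for target tracking (assumed setup,
# not the authors' implementation): both sensors are taken to provide 2-D
# position measurements of the target at their own, unsynchronized rates.
import numpy as np

# State: [x, y, vx, vy]; measurement matrix extracts position.
H = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0]])
R = {"camera": np.eye(2) * 0.05,    # assumed camera measurement noise
     "thermal": np.eye(2) * 0.20}   # assumed thermal-sensor measurement noise

x = np.zeros(4)        # initial state estimate
P = np.eye(4)          # initial covariance
q = 0.1                # assumed process-noise intensity
t_last = 0.0

def predict(x, P, dt):
    """Propagate a constant-velocity model over an arbitrary interval dt."""
    F = np.eye(4)
    F[0, 2] = F[1, 3] = dt
    Q = q * np.diag([dt**3 / 3, dt**3 / 3, dt, dt])
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, Rk):
    """Standard Kalman measurement update with sensor-specific noise Rk."""
    y = z - H @ x
    S = H @ P @ H.T + Rk
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ y, (np.eye(4) - K @ H) @ P

# Unsynchronized measurements (timestamp, sensor, position), merged by time.
measurements = [
    (0.033, "camera",  np.array([0.10, 0.00])),
    (0.066, "camera",  np.array([0.21, 0.01])),
    (0.100, "thermal", np.array([0.32, 0.02])),
    (0.099, "camera",  np.array([0.30, 0.01])),
]

for t, sensor, z in sorted(measurements, key=lambda m: m[0]):
    x, P = predict(x, P, t - t_last)   # advance the model to this timestamp
    x, P = update(x, P, z, R[sensor])  # fuse the measurement, whichever sensor
    t_last = t

print("estimated position:", x[:2], "velocity:", x[2:])
```

The key point illustrated is that each sensor's data is incorporated when it arrives: the model is propagated forward by the elapsed time since the last update rather than by a fixed common sampling period, which is what makes fusion of unsynchronized sensors with different acquisition intervals possible.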
