Control of multiple robots using vision sensors
Authors
Bibliographic Information
Control of multiple robots using vision sensors
(Advances in industrial control)
Springer, 2017
- Format: hardcover (hbk)
Held by 4 university libraries
Description and Table of Contents
Description
This monograph introduces novel methods for the control and navigation of mobile robots using multiple 1-D view models obtained from omni-directional cameras. This approach overcomes field-of-view and robustness limitations while enhancing accuracy and simplifying deployment on real platforms. The authors also address coordinated motion tasks for multiple robots, exploring different system architectures, in particular the use of multiple aerial cameras to drive robot formations on the ground. Again, this brings benefits of simplicity, scalability, and flexibility. Coverage includes details of:
a method for visual robot homing based on a memory of omni-directional images;
a novel vision-based pose-stabilization methodology for nonholonomic ground robots based on sinusoidally varying control inputs;
an algorithm that recovers a generic motion between two 1-D views without requiring a third view;
a novel multi-robot setup where multiple camera-carrying unmanned aerial vehicles are used to observe and control a formation of ground mobile robots; and
three coordinate-free methods for decentralized mobile robot formation stabilization.
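For flavor, the sinusoidal-input idea mentioned in the pose-stabilization item above can be sketched with a minimal unicycle simulation. This is an illustrative textbook-style example, not the controller from the book; the input amplitudes and frequencies below are arbitrary choices:

```python
import math

def simulate_unicycle(v_fn, w_fn, T=2.0 * math.pi, dt=1e-3):
    """Euler-integrate unicycle kinematics: x' = v cos(th), y' = v sin(th), th' = w."""
    x = y = th = 0.0
    steps = int(round(T / dt))
    for i in range(steps):
        t = i * dt
        v, w = v_fn(t), w_fn(t)
        x += v * math.cos(th) * dt
        y += v * math.sin(th) * dt
        th += w * dt
    return x, y, th

# Sinusoidal inputs at the same frequency: v(t) = sin(t), w(t) = 0.1 cos(t).
# Over one period the heading and forward coordinate return (nearly) to zero,
# while the robot accumulates a net sideways displacement of about 0.1*pi.
# This is the mechanism sinusoidal steering exploits to move nonholonomic
# vehicles in directions they cannot drive directly.
x, y, th = simulate_unicycle(lambda t: math.sin(t), lambda t: 0.1 * math.cos(t))
```

The net lateral drift produced by periodic inputs is what makes pose stabilization of nonholonomic vehicles possible despite the motion constraint.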
The performance of the different methods is evaluated both in simulation and experimentally with real robotic platforms and vision sensors.
Control of Multiple Robots Using Vision Sensors will serve both academic researchers studying visual control of single and multiple robots and robotics engineers seeking to design control systems based on visual sensors.
Table of Contents
Introduction
Angle-based Navigation using the 1D Trifocal Tensor
Vision-based Control for Nonholonomic Vehicles
Controlling Mobile Robot Teams from 1D Homographies
Control of Mobile Robot Formations using Aerial Cameras
Coordinate-free Control of Multirobot Formations
Conclusions and Directions for Future Research
From "Nielsen BookData"