Tsukuba Challenge 2017 Dynamic Object Tracks Dataset for Pedestrian Behavior Analysis
- Lambert Jacob, Graduate School of Informatics, Nagoya University
- Liang Leslie, Institute of Innovation for Future Society, Nagoya University
- Morales Luis Yoichi, Institute of Innovation for Future Society, Nagoya University
- Akai Naoki, Institute of Innovation for Future Society, Nagoya University
- Carballo Alexander, Institute of Innovation for Future Society, Nagoya University
- Takeuchi Eijiro, Graduate School of Informatics, Nagoya University
- Narksri Patiphon, Institute of Innovation for Future Society, Nagoya University
- Seiya Shunya, Graduate School of Informatics, Nagoya University
- Takeda Kazuya, Graduate School of Informatics, Nagoya University
Abstract
Navigation in social environments, in the absence of traffic rules, is the difficult task at the core of the annual Tsukuba Challenge. In this context, a better understanding of the soft rules that influence social dynamics is key to improving robot navigation. Prior research attempts to model social behavior through microscopic interactions, but the resulting emergent behavior depends heavily on the initial conditions, in particular the macroscopic setting. As such, data-driven studies of pedestrian behavior in a fixed environment may provide key insight into this macroscopic aspect, but appropriate data is scarce. To support this stream of research, we release an open-source dataset of dynamic object trajectories localized in a map of the 2017 Tsukuba Challenge environment. A data collection platform equipped with lidar, camera, IMU, and odometry repeatedly navigated the challenge's course, recording observations of passersby. Using a background map, we localized the platform in the environment, removed the static background from the point cloud data, clustered the remaining points into dynamic objects, and tracked their movements over time. In this work, we present the Tsukuba Challenge Dynamic Object Tracks dataset, which features nearly 10,000 trajectories of pedestrians, cyclists, and other dynamic agents, in particular autonomous robots. We provide a 3D map of the environment used as the global frame for all trajectories. For each trajectory, we provide, at regular time intervals, an estimated position, velocity, heading, and rotational velocity, as well as bounding boxes and segmented lidar point clouds for each object. As an additional contribution, we provide a discussion of some discernible macroscopic patterns in the data.
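The clustering step in the pipeline above (grouping background-subtracted lidar returns into dynamic objects) is commonly done with Euclidean clustering. The following is a minimal, self-contained sketch of that idea, not the authors' implementation; the function name, the distance threshold `eps`, and the `min_points` noise filter are illustrative assumptions.

```python
from math import hypot

def euclidean_cluster(points, eps=0.5, min_points=3):
    """Group 2D points into clusters: two points share a cluster if they
    are within `eps` metres of each other, directly or transitively.
    Clusters smaller than `min_points` are discarded as noise."""
    n = len(points)
    visited = [False] * n
    clusters = []
    for seed in range(n):
        if visited[seed]:
            continue
        # Flood-fill expansion of the cluster around `seed`.
        stack, members = [seed], []
        visited[seed] = True
        while stack:
            i = stack.pop()
            members.append(i)
            x, y = points[i]
            for j in range(n):
                if not visited[j] and hypot(points[j][0] - x,
                                            points[j][1] - y) <= eps:
                    visited[j] = True
                    stack.append(j)
        if len(members) >= min_points:
            clusters.append([points[i] for i in members])
    return clusters

# Hypothetical foreground returns: two pedestrians plus one stray point.
foreground = [(0.0, 0.0), (0.2, 0.1), (0.1, 0.3),
              (5.0, 5.0), (5.1, 5.2), (4.9, 5.1),
              (20.0, 0.0)]
print(len(euclidean_cluster(foreground)))  # 2 (the stray point is noise)
```

Each surviving cluster would then be handed to the tracker, which associates clusters across frames to estimate the per-object position, velocity, heading, and rotational velocity reported in the dataset.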
Published in
- Journal of Robotics and Mechatronics 30 (4), 598-612, 2018-08-20
- Fuji Technology Press Ltd. (富士技術出版株式会社)
Details
- CRID: 1390845712986550144
- NII Article ID: 130007437238
- NII Bibliographic ID: AA10809998
- ISSN: 18838049, 09153942
- NDL Bibliographic ID: 029149110
- Text language code: en
- Data source types: JaLC, NDL, Crossref, CiNii Articles
- Abstract license flag: Unavailable