A View-Based Method of Obtaining Personal Positioning and Orientation Information for Wearable Augmented Reality Systems (Special Issue: Mixed Reality 2)

Bibliographic Details

Alternative Titles
  • A View-Based Method of Obtaining Personal Positioning and Orientation Information for Wearable Augmented Reality Systems
  • ウェアラブル拡張現実システムのための利用者のビューベースト位置・方位取得手法
  • ウェアラブル カクチョウ ゲンジツ システム ノ タメ ノ リヨウシャ ノ ビューベースト イチ ホウイ シュトク シュホウ

Abstract

In this paper, we describe an improved method of personal positioning and orientation estimation that uses image registration between input video frames and panoramic images captured beforehand. In our previous work, we proposed an image-registration method based on an affine transform; in general, however, an affine transform cannot register a frame to a panorama. We therefore extended the method to estimate projective transform parameters without a severe increase in computational cost. We also made the method robust to lighting changes by using a weighted sum of the absolute differences of both brightness and its gradient between images. Inertial sensors attached to the camera further improve robustness, processing throughput, and delay. We confirmed that the improved method can estimate image registration parameters under conditions that defeated the previous method; its computational cost increased by only 10-20%, and its software implementation was capable of real-time processing.
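The lighting-robust matching cost described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the weights, the gradient operator (NumPy finite differences), and the function name `weighted_sad` are assumptions for exposition.

```python
import numpy as np

def weighted_sad(frame, panorama_patch, w_brightness=0.5, w_gradient=0.5):
    """Dissimilarity between a video frame and a panorama patch, computed
    as a weighted sum of the absolute differences of brightness and of the
    brightness gradient. Weights and gradient operator are illustrative."""
    # Brightness term: mean absolute intensity difference.
    sad_b = np.mean(np.abs(frame - panorama_patch))
    # Gradient term: mean absolute difference of the image gradients
    # (central finite differences along both axes).
    gfy, gfx = np.gradient(frame)
    gpy, gpx = np.gradient(panorama_patch)
    sad_g = np.mean(np.abs(gfy - gpy)) + np.mean(np.abs(gfx - gpx))
    return w_brightness * sad_b + w_gradient * sad_g
```

Because a uniform brightness offset leaves the gradient unchanged, the gradient term is insensitive to global lighting shifts, which is what makes the combined cost more robust than brightness alone.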

Published In

Cited By (5) *Note

References (13) *Note
