Novel View Telepresence Using Multiple Omni-directional Videos

  • 石川 智也 (Tomoya Ishikawa)
    Graduate School of Information Science, Nara Institute of Science and Technology
  • 山澤 一誠 (Kazumasa Yamazawa)
    Graduate School of Information Science, Nara Institute of Science and Technology
  • 横矢 直和 (Naokazu Yokoya)
    Graduate School of Information Science, Nara Institute of Science and Technology

Bibliographic Information

Alternative Titles
  • Telepresence Using Multiple Omni-directional Videos
  • フクスウ ノ ゼンホウイ ドウガゾウ オ モチイタ ジユウ シテン テレプレゼンス


Abstract

The advent of high-speed networks and high-performance PCs has prompted research into networked telepresence, which allows a user to see a virtualized real scene at a remote place. View-dependent representation, which presents the user with views rendered for an arbitrary viewpoint on an HMD or an immersive display, is especially effective in creating a rich sense of telepresence. The goal of this study is novel view telepresence that enables a user to control both the viewpoint and the view direction by virtualizing real dynamic environments. We describe a view-generation method based on image-based rendering from multiple omni-directional images captured at different positions, and we evaluate the quality of the generated images in a simulated environment. We also describe our prototype system and an experiment in which the novel view telepresence was used in a real environment. The prototype constructs a virtualized environment from live real videos: it synthesizes a view that matches the user's viewpoint and view direction, as measured by a magnetic sensor attached to an HMD, and presents the generated view on the HMD. The system can generate the user's view in real time because corresponding points are specified and camera parameters are estimated in advance.
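To make the view-synthesis step concrete, the following is a minimal Python sketch of generating a novel perspective view from multiple omni-directional images. It is an assumption-laden illustration, not the authors' method: it assumes equirectangular panoramas, capture positions expressed in a common world frame, and a simple plane-at-fixed-depth proxy in place of the paper's correspondence-based image-based rendering; all function and parameter names (synthesize_view, proxy_depth, etc.) are hypothetical.

import numpy as np

def ray_directions(width, height, fov_deg, yaw_deg):
    """Unit ray directions for every pixel of a virtual pinhole view.

    Assumptions (not from the paper): square pixels, rotation only about
    the vertical axis by yaw_deg, horizontal field of view fov_deg.
    """
    f = (width / 2.0) / np.tan(np.radians(fov_deg) / 2.0)
    xs = np.arange(width) - width / 2.0 + 0.5
    ys = np.arange(height) - height / 2.0 + 0.5
    u, v = np.meshgrid(xs, ys)
    d = np.stack([u, -v, np.full_like(u, f)], axis=-1)   # camera frame: x right, y up, z forward
    d /= np.linalg.norm(d, axis=-1, keepdims=True)
    c, s = np.cos(np.radians(yaw_deg)), np.sin(np.radians(yaw_deg))
    rot = np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])
    return d @ rot.T

def sample_equirectangular(pano, dirs):
    """Nearest-neighbour lookup of unit directions in an equirectangular panorama."""
    h, w = pano.shape[:2]
    lon = np.arctan2(dirs[..., 0], dirs[..., 2])          # longitude in [-pi, pi]
    lat = np.arcsin(np.clip(dirs[..., 1], -1.0, 1.0))     # latitude in [-pi/2, pi/2]
    x = ((lon / (2 * np.pi) + 0.5) * w).astype(int) % w
    y = ((0.5 - lat / np.pi) * h).astype(int).clip(0, h - 1)
    return pano[y, x]

def synthesize_view(panos, cam_positions, view_pos, yaw_deg,
                    size=(480, 640), fov_deg=90.0, proxy_depth=5.0):
    """Blend re-projected omni-directional images into a novel view.

    A crude plane-at-proxy_depth proxy stands in for the paper's
    correspondence-based warping; blending weights fall off with the
    distance between the virtual viewpoint and each capture position.
    """
    h, w = size
    dirs = ray_directions(w, h, fov_deg, yaw_deg)
    points = view_pos + dirs * proxy_depth                 # world points on the proxy surface
    acc = np.zeros((h, w, 3))
    wsum = 0.0
    for pano, cam_pos in zip(panos, cam_positions):
        to_point = points - cam_pos
        to_point /= np.linalg.norm(to_point, axis=-1, keepdims=True)
        weight = 1.0 / (np.linalg.norm(view_pos - cam_pos) + 1e-3)
        acc += weight * sample_equirectangular(pano, to_point).astype(float)
        wsum += weight
    return (acc / wsum).astype(np.uint8)

Given panos as a list of HxWx3 equirectangular frames and cam_positions as their capture positions, a call such as synthesize_view(panos, cam_positions, np.array([0.5, 1.6, 0.0]), yaw_deg=30.0) would return a perspective image for that (hypothetical) head pose; in the paper's system the viewpoint and view direction would instead be driven by the magnetic sensor attached to the HMD.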
