TY - GEN
T1 - An Instant See-Through Vision System Using a Wide Field-of-View Camera and a 3D-Lidar
AU - Oishi, Kei
AU - Mori, Shohei
AU - Saito, Hideo
N1 - Funding Information:
This work was supported in part by JSPS Grant Number 16J05114.
Publisher Copyright:
© 2017 IEEE.
PY - 2017/10/27
Y1 - 2017/10/27
N2 - Diminished reality (DR) enables us to see through real objects occluding some areas in our field of view. This interactive display has various applications, such as see-through vision to visualize invisible areas, work-area visualization in surgery, and landscape simulation. In this paper, we address two underlying problems in see-through vision, in which hidden areas are observed in real time. First, see-through vision methods require a common area to calibrate every camera in the environment; however, the field of view is limited, and many approaches rely on time-consuming calibration, additional sensors, or fiducial markers. Second, see-through vision applications assume that the background is planar to ease image alignment. We therefore present a place-and-play see-through vision system using a wide field-of-view RGB-D camera. We validated the accuracy and robustness of our system and show results in various environments to demonstrate its applicability.
AB - Diminished reality (DR) enables us to see through real objects occluding some areas in our field of view. This interactive display has various applications, such as see-through vision to visualize invisible areas, work-area visualization in surgery, and landscape simulation. In this paper, we address two underlying problems in see-through vision, in which hidden areas are observed in real time. First, see-through vision methods require a common area to calibrate every camera in the environment; however, the field of view is limited, and many approaches rely on time-consuming calibration, additional sensors, or fiducial markers. Second, see-through vision applications assume that the background is planar to ease image alignment. We therefore present a place-and-play see-through vision system using a wide field-of-view RGB-D camera. We validated the accuracy and robustness of our system and show results in various environments to demonstrate its applicability.
KW - 3D-Lidar
KW - Diminished reality
KW - Fish-eye camera
UR - http://www.scopus.com/inward/record.url?scp=85040237698&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85040237698&partnerID=8YFLogxK
U2 - 10.1109/ISMAR-Adjunct.2017.99
DO - 10.1109/ISMAR-Adjunct.2017.99
M3 - Conference contribution
AN - SCOPUS:85040237698
T3 - Adjunct Proceedings of the 2017 IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2017
SP - 344
EP - 347
BT - Adjunct Proceedings of the 2017 IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2017
A2 - Broll, Wolfgang
A2 - Regenbrecht, Holger
A2 - Bruder, Gerd
A2 - Servieres, Myriam
A2 - Sugimoto, Maki
A2 - Swan, J Edward
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 16th Adjunct IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2017
Y2 - 9 October 2017 through 13 October 2017
ER -