TY - GEN
T1 - Easy extrinsic calibration of VR system and multi-camera based marker-less motion capture system
AU - Takahashi, Kosuke
AU - Mikami, Dan
AU - Isogawa, Mariko
AU - Sun, Siqi
AU - Kusachi, Yoshinori
N1 - Publisher Copyright:
© 2019 IEEE.
PY - 2019/10
Y1 - 2019/10
N2 - This paper proposes a novel easy extrinsic calibration algorithm for an off-the-shelf VR system and a multi-camera based marker-less motion capture system. To realize interactions between 3D user motions and virtual objects reconstructed from multi-view videos in a common 3D space, the extrinsic calibration of the VR system and the multiple cameras must be conducted beforehand. This calibration, which involves estimating the pose and position of each coordinate system, is a key technology for handling 3D information in a system with various types of input sources. In general, extrinsic calibration is carried out by identifying and utilizing some common 3D points. However, since most off-the-shelf VR systems do not include any imaging device, it is difficult to apply conventional automatic calibration approaches as there are no common points shared with the cameras. To address this problem, this paper introduces an easy calibration algorithm that generates corresponding points from the trajectories of the user's motion with VR devices and the 3D human pose reconstructed from multi-view videos. Our study provides the following two contributions: (1) our method does not need to introduce additional devices, such as a chessboard, and (2) our method does not need manual processes, as the extrinsic parameters are automatically estimated. We demonstrate the performance of the proposed method in a practical scenario.
AB - This paper proposes a novel easy extrinsic calibration algorithm for an off-the-shelf VR system and a multi-camera based marker-less motion capture system. To realize interactions between 3D user motions and virtual objects reconstructed from multi-view videos in a common 3D space, the extrinsic calibration of the VR system and the multiple cameras must be conducted beforehand. This calibration, which involves estimating the pose and position of each coordinate system, is a key technology for handling 3D information in a system with various types of input sources. In general, extrinsic calibration is carried out by identifying and utilizing some common 3D points. However, since most off-the-shelf VR systems do not include any imaging device, it is difficult to apply conventional automatic calibration approaches as there are no common points shared with the cameras. To address this problem, this paper introduces an easy calibration algorithm that generates corresponding points from the trajectories of the user's motion with VR devices and the 3D human pose reconstructed from multi-view videos. Our study provides the following two contributions: (1) our method does not need to introduce additional devices, such as a chessboard, and (2) our method does not need manual processes, as the extrinsic parameters are automatically estimated. We demonstrate the performance of the proposed method in a practical scenario.
KW - Calibration
KW - Motion
KW - Multi view
UR - http://www.scopus.com/inward/record.url?scp=85078799249&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85078799249&partnerID=8YFLogxK
U2 - 10.1109/ISMAR-Adjunct.2019.00036
DO - 10.1109/ISMAR-Adjunct.2019.00036
M3 - Conference contribution
AN - SCOPUS:85078799249
T3 - Adjunct Proceedings of the 2019 IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2019
SP - 83
EP - 88
BT - Adjunct Proceedings of the 2019 IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2019
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 18th IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2019
Y2 - 14 October 2019 through 18 October 2019
ER -