TY - JOUR
T1 - CorsNet: 3D Point Cloud Registration by Deep Neural Network
T2 - IEEE Robotics and Automation Letters
AU - Kurobe, Akiyoshi
AU - Sekikawa, Yusuke
AU - Ishikawa, Kohta
AU - Saito, Hideo
N1 - Funding Information:
Manuscript received September 10, 2019; accepted January 9, 2020. Date of publication February 3, 2020; date of current version April 20, 2020. This letter was recommended for publication by Associate Editor E. Ricci and Editor E. Marchand upon evaluation of the reviewers’ comments. This work was supported by the Japan Science and Technology Agency (JST) under Grant CREST-JPMJCR1683. (Corresponding author: Akiyoshi Kurobe.) A. Kurobe and H. Saito are with the Department of Science and Technology, Keio University, Yokohama 223-8522, Japan (e-mail: kurobe@hvrl.ics.keio.ac.jp; saito@hvrl.ics.keio.ac.jp).
Publisher Copyright:
© 2020 IEEE.
PY - 2020/7
Y1 - 2020/7
N2 - Point cloud registration is a key problem for the robotics and computer vision communities. It consists of estimating a rigid transform that aligns one point cloud to another. Iterative closest point (ICP) is a well-known classical method for this problem, yet it generally achieves accurate alignment only when the source and template point clouds are roughly pre-aligned. If the point clouds are far apart or contain repetitive structures, registration often fails by falling into a local minimum. Recently, inspired by PointNet, several deep-learning-based methods have been developed. PointNetLK is a representative approach that directly optimizes the distance between aggregated features using a gradient method based on the Jacobian. In this paper, we propose a deep-learning-based point cloud registration system: CorsNet. CorsNet concatenates local features with global features and regresses correspondences between point clouds, rather than regressing the pose or aggregated features directly, so it integrates more useful information than conventional approaches. For comparison, we also developed a novel deep learning approach (DirectNet) that directly regresses the pose between point clouds. Our experiments show that CorsNet achieves higher accuracy than not only the classic ICP method but also the recently proposed learning-based PointNetLK and DirectNet, on both seen and unseen categories.
AB - Point cloud registration is a key problem for the robotics and computer vision communities. It consists of estimating a rigid transform that aligns one point cloud to another. Iterative closest point (ICP) is a well-known classical method for this problem, yet it generally achieves accurate alignment only when the source and template point clouds are roughly pre-aligned. If the point clouds are far apart or contain repetitive structures, registration often fails by falling into a local minimum. Recently, inspired by PointNet, several deep-learning-based methods have been developed. PointNetLK is a representative approach that directly optimizes the distance between aggregated features using a gradient method based on the Jacobian. In this paper, we propose a deep-learning-based point cloud registration system: CorsNet. CorsNet concatenates local features with global features and regresses correspondences between point clouds, rather than regressing the pose or aggregated features directly, so it integrates more useful information than conventional approaches. For comparison, we also developed a novel deep learning approach (DirectNet) that directly regresses the pose between point clouds. Our experiments show that CorsNet achieves higher accuracy than not only the classic ICP method but also the recently proposed learning-based PointNetLK and DirectNet, on both seen and unseen categories.
KW - Computer vision for other robotic applications
KW - deep learning in robotics and automation
KW - perception for grasping and manipulation
UR - http://www.scopus.com/inward/record.url?scp=85084107962&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85084107962&partnerID=8YFLogxK
U2 - 10.1109/LRA.2020.2970946
DO - 10.1109/LRA.2020.2970946
M3 - Article
AN - SCOPUS:85084107962
SN - 2377-3766
VL - 5
SP - 3960
EP - 3966
JO - IEEE Robotics and Automation Letters
JF - IEEE Robotics and Automation Letters
IS - 3
M1 - 8978671
ER -
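
The abstract above describes CorsNet's core idea: concatenate per-point local features with a pooled global feature, regress point-to-point correspondences (rather than the pose directly), and recover the rigid transform from those correspondences. The following is a minimal PyTorch sketch of that idea, not the authors' implementation: the module name CorsNetSketch, the layer widths, and the closed-form Kabsch/SVD alignment step are assumptions made for illustration, and the training loss used in the paper is omitted.

import torch
import torch.nn as nn


class CorsNetSketch(nn.Module):
    def __init__(self, feat_dim: int = 64, global_dim: int = 1024):
        super().__init__()
        # PointNet-style shared MLP producing per-point local features.
        self.local_mlp = nn.Sequential(
            nn.Conv1d(3, feat_dim, 1), nn.ReLU(),
            nn.Conv1d(feat_dim, feat_dim, 1), nn.ReLU(),
        )
        # Lifts local features to the dimension that is max-pooled into a global feature.
        self.global_mlp = nn.Sequential(
            nn.Conv1d(feat_dim, global_dim, 1), nn.ReLU(),
        )
        # Regresses a 3D correspondence offset for every source point from its
        # local feature concatenated with the global feature.
        self.offset_mlp = nn.Sequential(
            nn.Conv1d(feat_dim + global_dim, 256, 1), nn.ReLU(),
            nn.Conv1d(256, 3, 1),
        )

    def forward(self, source: torch.Tensor, template: torch.Tensor):
        # source, template: (B, 3, N) point clouds.
        local = self.local_mlp(source)                                   # (B, F, N)
        glob = self.global_mlp(local).max(dim=2, keepdim=True).values    # (B, G, 1)
        fused = torch.cat([local, glob.expand(-1, -1, local.shape[2])], dim=1)
        corres = source + self.offset_mlp(fused)                         # regressed correspondences (B, 3, N)

        # Closed-form rigid alignment (Kabsch/SVD) from source to the correspondences.
        src_c = source - source.mean(dim=2, keepdim=True)
        dst_c = corres - corres.mean(dim=2, keepdim=True)
        H = src_c @ dst_c.transpose(1, 2)                                # (B, 3, 3) cross-covariance
        U, _, Vt = torch.linalg.svd(H)
        d = torch.sign(torch.linalg.det(Vt.transpose(1, 2) @ U.transpose(1, 2)))
        D = torch.diag_embed(torch.stack([torch.ones_like(d), torch.ones_like(d), d], dim=1))
        R = Vt.transpose(1, 2) @ D @ U.transpose(1, 2)
        t = corres.mean(dim=2) - (R @ source.mean(dim=2, keepdim=True)).squeeze(2)
        return R, t, corres

Under these assumptions, calling the module on two (B, 3, N) tensors returns an estimated rotation R, translation t, and the regressed correspondences; the template would enter through the (omitted) loss that supervises the correspondences and the resulting transform.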