TY - GEN
T1 - 3D semantic segmentation for high-resolution aerial survey derived point clouds using deep learning (demonstration)
AU - Xiu, Haoyi
AU - Vinayaraj, Poliyapram
AU - Kim, Kyoung Sook
AU - Nakamura, Ryosuke
AU - Yan, Wanglin
PY - 2018/11/6
Y1 - 2018/11/6
N2 - Three-dimensional (3D) semantic segmentation of aerial-survey-derived point clouds aims at assigning each point to a semantic class such as building, tree, or road. Accurate 3D segmentation results provide essential information for constructing 3D city models and for assessing urban expansion and economic conditions. However, fine-grained semantic segmentation of high-resolution point clouds is challenging because points are irregularly distributed, unlike the regular pixels of an image. In this demonstration, we present a case study applying PointNet, a novel deep learning network, to outdoor aerial-survey-derived point clouds by considering intensity (depth) as well as spectral information (RGB). PointNet was originally designed for indoor point cloud data and builds on the permutation invariance of 3D points. We first fuse two survey datasets, Light Detection and Ranging (LiDAR) and aerial images, to generate multi-source aerial point clouds (RGB-DI). Then, each point of the fused data is classified by a reworked PointNet into one of the semantic classes: ordinary building, public facility, apartment, factory, transportation network, park, and water. Our deep learning approach achieves about 0.88 accuracy and 0.64 F-measure for semantic segmentation on the fused RGB-DI data, outperforming a Support Vector Machine (SVM) approach based on geometric features of linearity, planarity, scattering, and verticality of a set of 3D points.
AB - Three-dimensional (3D) semantic segmentation of aerial-survey-derived point clouds aims at assigning each point to a semantic class such as building, tree, or road. Accurate 3D segmentation results provide essential information for constructing 3D city models and for assessing urban expansion and economic conditions. However, fine-grained semantic segmentation of high-resolution point clouds is challenging because points are irregularly distributed, unlike the regular pixels of an image. In this demonstration, we present a case study applying PointNet, a novel deep learning network, to outdoor aerial-survey-derived point clouds by considering intensity (depth) as well as spectral information (RGB). PointNet was originally designed for indoor point cloud data and builds on the permutation invariance of 3D points. We first fuse two survey datasets, Light Detection and Ranging (LiDAR) and aerial images, to generate multi-source aerial point clouds (RGB-DI). Then, each point of the fused data is classified by a reworked PointNet into one of the semantic classes: ordinary building, public facility, apartment, factory, transportation network, park, and water. Our deep learning approach achieves about 0.88 accuracy and 0.64 F-measure for semantic segmentation on the fused RGB-DI data, outperforming a Support Vector Machine (SVM) approach based on geometric features of linearity, planarity, scattering, and verticality of a set of 3D points.
KW - 3D-segmentation
KW - Aerial images
KW - Deep learning
KW - Point cloud
KW - PointNet
UR - http://www.scopus.com/inward/record.url?scp=85058658588&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85058658588&partnerID=8YFLogxK
U2 - 10.1145/3274895.3274950
DO - 10.1145/3274895.3274950
M3 - Conference contribution
AN - SCOPUS:85058658588
T3 - GIS: Proceedings of the ACM International Symposium on Advances in Geographic Information Systems
SP - 588
EP - 591
BT - 26th ACM SIGSPATIAL International Conference on Advances in Geographic Information Systems, ACM SIGSPATIAL GIS 2018
A2 - Xiong, Li
A2 - Tamassia, Roberto
A2 - Banaei-Kashani, Farnoush
A2 - Güting, Ralf Hartmut
A2 - Hoel, Erik
PB - Association for Computing Machinery
T2 - 26th ACM SIGSPATIAL International Conference on Advances in Geographic Information Systems, ACM SIGSPATIAL GIS 2018
Y2 - 6 November 2018 through 9 November 2018
ER -