Residential College | false
Status | Published
Title | Sparse-to-Dense Matching Network for Large-scale LiDAR Point Cloud Registration
Author | Lu,Fan1; Chen,Guang1; Liu,Yinlong2; Zhan,Yibing3; Li,Zhijun4; Tao,Dacheng5; Jiang,Changjun6
Date Issued | 2023
Source Publication | IEEE Transactions on Pattern Analysis and Machine Intelligence |
ISSN | 0162-8828 |
Volume | 45 | Issue | 9 | Pages | 11270-11282
Abstract | Point cloud registration is a fundamental problem in 3D computer vision. Previous learning-based methods for LiDAR point cloud registration can be categorized into two schemes: dense-to-dense matching methods and sparse-to-sparse matching methods. However, for large-scale outdoor LiDAR point clouds, solving dense point correspondences is time-consuming, whereas sparse keypoint matching is prone to keypoint detection errors. In this paper, we propose SDMNet, a novel Sparse-to-Dense Matching Network for large-scale outdoor LiDAR point cloud registration. Specifically, SDMNet performs registration in two sequential stages: a sparse matching stage and a local-dense matching stage. In the sparse matching stage, we sample a set of sparse points from the source point cloud and then match them to the dense target point cloud using a spatial-consistency-enhanced soft matching network and a robust outlier rejection module. Furthermore, a novel neighborhood matching module is developed to incorporate local neighborhood consensus, significantly improving performance. The local-dense matching stage then follows for fine-grained registration, where dense correspondences are obtained efficiently by matching points within the local spatial neighborhoods of high-confidence sparse correspondences. Extensive experiments on three large-scale outdoor LiDAR point cloud datasets demonstrate that the proposed SDMNet achieves state-of-the-art performance with high efficiency.
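The two-stage pipeline summarized in the abstract can be illustrated with a minimal sketch. The NumPy-only example below is not the authors' SDMNet: the learned soft matching network, outlier rejection module, and neighborhood matching module are replaced (as an assumption for illustration) by plain nearest-neighbor search with a distance threshold; only the sparse-then-local-dense matching structure and a final least-squares rigid fit are kept.

import numpy as np

def rigid_from_correspondences(src, tgt):
    # Least-squares rigid transform (Kabsch / SVD) from matched 3D points.
    src_c, tgt_c = src.mean(axis=0), tgt.mean(axis=0)
    H = (src - src_c).T @ (tgt - tgt_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # resolve reflection ambiguity
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = tgt_c - R @ src_c
    return R, t

def sparse_to_dense_register(source, target, n_sparse=256, radius=1.0):
    # Stage 1 (sparse matching): sample sparse source points and match each to
    # the dense target. Nearest neighbor plus a distance threshold stand in for
    # the paper's learned soft matcher and outlier rejection (assumption).
    idx = np.random.choice(len(source), size=min(n_sparse, len(source)), replace=False)
    sparse_src = source[idx]
    dists = np.linalg.norm(sparse_src[:, None, :] - target[None, :, :], axis=-1)
    nn = dists.argmin(axis=1)
    keep = dists[np.arange(len(nn)), nn] < radius
    sparse_src, sparse_tgt = sparse_src[keep], target[nn[keep]]

    # Stage 2 (local-dense matching): around each confident sparse pair, match
    # source points inside a local neighborhood to the corresponding target patch.
    dense_src, dense_tgt = [], []
    for s, t in zip(sparse_src, sparse_tgt):
        s_nbr = source[np.linalg.norm(source - s, axis=1) < radius]
        t_nbr = target[np.linalg.norm(target - t, axis=1) < radius]
        if len(s_nbr) == 0 or len(t_nbr) == 0:
            continue
        local = np.linalg.norm(s_nbr[:, None, :] - t_nbr[None, :, :], axis=-1)
        dense_src.append(s_nbr)
        dense_tgt.append(t_nbr[local.argmin(axis=1)])
    dense_src, dense_tgt = np.concatenate(dense_src), np.concatenate(dense_tgt)
    return rigid_from_correspondences(dense_src, dense_tgt)

As a usage check, applying a known rotation and translation to a synthetic point cloud and calling sparse_to_dense_register(source, target) should recover that transform up to sampling noise; in the actual method, the hand-crafted matching steps above are learned modules operating on deep point features.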
Keyword | Cloud Computing; Correspondence; Feature Extraction; Feature Matching; Filtering; Laser Radar; Lidar; Optimal Transport; Pipelines; Point Cloud Compression; Point Cloud Registration; Rigid; Sparse-to-dense; Three-dimensional Displays
DOI | 10.1109/TPAMI.2023.3265531 |
Indexed By | SCIE |
Language | English
WOS Research Area | Computer Science ; Engineering |
WOS Subject | Computer Science, Artificial Intelligence ; Engineering, Electrical & Electronic |
WOS ID | WOS:001045832200043 |
Scopus ID | 2-s2.0-85153334619 |
Document Type | Journal article |
Collection | THE STATE KEY LABORATORY OF INTERNET OF THINGS FOR SMART CITY (UNIVERSITY OF MACAU) |
Corresponding Author | Chen,Guang |
Affiliation | 1. School of Automotive Engineering and Department of Computer Science, Tongji University, Shanghai, China; 2. State Key Laboratory of Internet of Things for Smart City (SKL-IOTSC), University of Macau, Macao SAR, China; 3. JD Explore Academy, Beijing, China; 4. Hefei Comprehensive National Science Center, Department of Automation, Institute of Artificial Intelligence, University of Science and Technology of China, Hefei, China; 5. School of Computer Science, Faculty of Engineering, The University of Sydney, 6 Cleveland St, Darlington, NSW, Australia; 6. Department of Computer Science, Tongji University, Shanghai, China
Recommended Citation GB/T 7714 | Lu,Fan,Chen,Guang,Liu,Yinlong,et al. Sparse-to-Dense Matching Network for Large-scale LiDAR Point Cloud Registration[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023, 45(9), 11270-11282. |
APA | Lu,Fan., Chen,Guang., Liu,Yinlong., Zhan,Yibing., Li,Zhijun., Tao,Dacheng., & Jiang,Changjun (2023). Sparse-to-Dense Matching Network for Large-scale LiDAR Point Cloud Registration. IEEE Transactions on Pattern Analysis and Machine Intelligence, 45(9), 11270-11282. |
MLA | Lu,Fan,et al."Sparse-to-Dense Matching Network for Large-scale LiDAR Point Cloud Registration".IEEE Transactions on Pattern Analysis and Machine Intelligence 45.9(2023):11270-11282. |
Files in This Item: | There are no files associated with this item. |