Use of lidar-derived NDTI and intensity for rule-based object-oriented extraction of building footprints

Cited by: 4
Authors
Zhao, Ting [1]
Wang, Jinfei [1,2]
Affiliations
[1] Univ Western Ontario, Dept Geog, London, ON N6A 5C2, Canada
[2] Beijing Normal Univ, State Key Lab Earth Surface Proc & Resource Ecol, Beijing 100875, Peoples R China
Funding
Natural Sciences and Engineering Research Council of Canada
Keywords
land-cover classification; aerial imagery; fusion; generation
DOI
10.1080/01431161.2013.871394
Chinese Library Classification (CLC)
TP7 [Remote Sensing Technology]
Discipline classification codes
081102; 0816; 081602; 083002; 1404
Abstract
Buildings play an essential role in urban infrastructure, planning, and climate. Precise knowledge of building footprints not only serves as a primary source for interpreting complex urban characteristics, but also provides regional planners with more realistic, multidimensional scenarios for urban management. Recently developed airborne light detection and ranging (lidar) technology offers a very promising alternative for building-footprint measurement. In this study, lidar intensity data, a normalized digital surface model (nDSM) of the first and last returns, and the normalized difference tree index (NDTI) derived from the two returns are used to extract building footprints through rule-based object-oriented classification. The study area, in London, Ontario, is chosen for its various types of buildings surrounded by trees. An integrated segmentation approach and a hierarchical rule-based classification strategy are proposed. The results indicate that the proposed object-based classification is a very effective semi-automatic method for building-footprint extraction, with buildings and trees successfully separated. An overall accuracy of 94.0%, a commission error of 6.3%, and a kappa value of 0.84 are achieved. Lidar-derived NDTI and intensity data are of great importance in object-based building extraction: the kappa value of the proposed method is double that of the object-based method without NDTI or intensity.
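The abstract does not spell out how NDTI is computed or how the hierarchical rules are ordered. As a rough illustration only, the sketch below assumes NDTI takes the usual normalized-difference form applied to the first- and last-return nDSMs, and every threshold is a made-up placeholder; the paper itself applies its rules to segmented image objects rather than to individual pixels.

import numpy as np

# Illustrative sketch only. Assumptions not taken from the paper: NDTI is the
# normalized difference of the first- and last-return nDSMs, and all threshold
# values below are hypothetical placeholders.

def ndti(ndsm_first, ndsm_last):
    """Normalized difference tree index from first- and last-return nDSMs."""
    denom = ndsm_first + ndsm_last
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = (ndsm_first - ndsm_last) / denom
    return np.where(denom > 0, ratio, 0.0)

def classify(ndsm_first, ndsm_last, intensity,
             height_thr=2.5, ndti_thr=0.1, intensity_range=(30, 200)):
    """Toy hierarchical rules: keep elevated pixels, then split trees from buildings.

    Labels: 0 = ground/other, 1 = building, 2 = tree.
    """
    elevated = ndsm_first > height_thr                    # above-ground objects only
    tree_like = ndti(ndsm_first, ndsm_last) > ndti_thr    # canopy lets the last return drop below the first
    lo, hi = intensity_range
    roof_like = (intensity >= lo) & (intensity <= hi)     # hypothetical roof-material intensity range
    labels = np.zeros(ndsm_first.shape, dtype=np.uint8)
    labels[elevated & tree_like] = 2
    labels[elevated & ~tree_like & roof_like] = 1
    return labels

In the described workflow, the equivalent rules would be evaluated on object-level statistics (for example, mean NDTI and mean intensity per segment) produced by the integrated segmentation step, rather than per pixel as shown here.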
Pages: 578-597
Number of pages: 20