Rapid Characterization of Vegetation Structure with a Microsoft Kinect Sensor

Cited: 84
Authors
Azzari, George [1 ]
Goulden, Michael L. [1 ]
Rusu, Radu B. [2 ]
Affiliations
[1] Univ Calif Irvine, Dept Earth Syst Sci, Irvine, CA 92697 USA
[2] Open Percept Inc, Menlo Pk, CA 94025 USA
Source
SENSORS | 2013, Vol. 13, Issue 2
Keywords
terrestrial ecology; field measurements; canopy structure; biomass; LIDAR; Microsoft Kinect; point clouds; depth images; convex hulls; concave hulls; GROUND BIOMASS; FOREST;
DOI
10.3390/s130202384
Chinese Library Classification: O65 [Analytical Chemistry]
Discipline codes: 070302; 081704
Abstract
The importance of vegetation structure and biomass in controlling land-atmosphere exchange is widely recognized, but measurements of canopy structure are challenging, time consuming, and often rely on destructive methods. The Microsoft Kinect is an infrared sensor designed for video gaming that outputs synchronized color and depth images and that has the potential to allow rapid characterization of vegetation structure. We compared depth images from a Kinect sensor with manual measurements of plant structure and size for two species growing in a California grassland. The depth images agreed well with manual horizontal and vertical measurements of plant size. Similarly, plant volumes calculated with a three-dimensional convex hull approach were well related to plant biomass. The Kinect showed some limitations for ecological observation, including a short measurement range and susceptibility to daytime light contamination. Nonetheless, the Kinect's light weight, fast acquisition time, low power requirement, and low cost make it a promising tool for rapid field surveys of canopy structure, especially in small-statured vegetation.
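The volume-to-biomass step described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a depth image has already been converted into a segmented (N, 3) point cloud in metric coordinates for a single plant, uses SciPy's ConvexHull to compute the enclosing hull volume, and stands in a simple linear fit for the volume-biomass regression. The numeric arrays are hypothetical illustrative values, not data from the paper.

import numpy as np
from scipy.spatial import ConvexHull

def convex_hull_volume(points):
    # points: (N, 3) array of x, y, z coordinates in meters for one plant
    hull = ConvexHull(points)   # Qhull computes the enclosing convex hull
    return hull.volume          # hull volume in cubic meters

# Example: hull volume of a synthetic 1000-point cloud inside a 0.5 m cube
cloud = np.random.rand(1000, 3) * 0.5
print("example hull volume: %.4f m^3" % convex_hull_volume(cloud))

# Relating hull volume to harvested biomass with a simple linear fit
# (hypothetical per-plant values, illustrative only)
volumes = np.array([0.012, 0.034, 0.051, 0.080])    # hull volumes, m^3
biomass = np.array([55.0, 140.0, 210.0, 330.0])     # harvested dry mass, g
slope, intercept = np.polyfit(volumes, biomass, 1)  # degree-1 polynomial fit
print("biomass ~ %.1f * volume + %.1f g" % (slope, intercept))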
Pages: 2384-2398
Page count: 15