Direct derivation of maize plant and crop height from low-cost time-of-flight camera measurements
© The Author(s) 2016
Received: 9 June 2016
Accepted: 22 November 2016
Published: 28 November 2016
In agriculture, information about the spatial distribution of crop height is valuable for applications such as biomass and yield estimation, or increasing field work efficiency in terms of fertilizing, applying pesticides, irrigation, etc. Established methods for capturing crop height are often restricted in terms of cost and time efficiency, flexibility, and temporal and spatial resolution of measurements. Furthermore, crop height is mostly derived from a measurement of the bare terrain prior to plant growth and measurements of the crop surface while plants are growing, resulting in the need for multiple field campaigns. In our study, we examine a method to derive crop heights directly from data of a plot of full-grown maize plants captured in a single field campaign. We assess continuous raster crop height models (CHMs) and individual plant heights derived from data collected with the low-cost 3D camera Microsoft® Kinect® for Xbox One™ based on a comprehensive comparison to terrestrial laser scanning (TLS) reference data.
We examine single measurements captured with the 3D camera as well as a combination of these single measurements, i.e. a combination of multiple perspectives. The quality of both the CHMs and the individual plant heights is improved by combining the measurements. R2 values of CHMs derived from single measurements range from 0.48 to 0.88; combining all measurements leads to an R2 of 0.89. For individual plant heights, an R2 of 0.98 is achieved for the combined measurements (R2 = 0.44 for the single measurements). The crop heights derived from the 3D camera measurements show an average underestimation of 0.06 m compared to TLS reference values.
We recommend the combination of multiple low-cost 3D camera measurements, the removal of measurement artefacts, and the inclusion of correction functions to improve the quality of crop height measurements. Operating low-cost 3D cameras under field conditions on agricultural machines or on autonomous platforms can offer time and cost efficient tools for capturing the spatial distribution of crop heights directly in the field and thus advance agricultural efficiency and productivity. More generally, all processes which include the 3D geometry of natural objects can profit from low-cost methods producing 3D geodata.
Keywords: Precision agriculture; Site-specific crop management; Continuous raster crop height model; Individual plant height; 3D geodata; Low-cost time-of-flight camera
Information about crop height and its spatial distribution is of high value for agriculture. By including this information in management and field work processes, agricultural productivity and efficiency can be improved, which in turn can be a means of improving global food supply and of tackling challenges related to climatic changes [2, 3].
Examples for the usage of crop height models (CHMs) are site-specific crop management [4, 5], plant nitrogen estimates, and yield and biomass estimations [7–9]. In addition to CHM raster models continuously covering a whole crop stand, the height of an individual plant is of high value for agricultural research. Freeman et al., for example, present a high correlation between maize plant height and biomass. Similarly, models for corn yield estimation are improved by including plant height [11, 12], and Muharam et al. report significant correlations between plant height and nitrogen nutrition status.
Approaches for a non-invasive collection of 3D geodata as a basis for deriving crop height models vary widely. High-end airborne laser scanning (ALS) is used to capture the crop height of maize fields in the study presented by Li et al., with high correlations stated between the ALS data and manual field measurements. Friedli et al. apply terrestrial laser scanning (TLS) to monitor crop growth, and Crommelinck and Höfle examine the requirements in terms of TLS sensor resolution for deriving CHMs, aiming at low-cost devices for permanent crop monitoring. Following a low-cost photogrammetric approach to generate 3D geodata, Li et al. and Bareth et al. present crop surface models generated on the basis of image collections captured from unmanned aerial vehicle platforms. Marx et al. describe subjective crop height data collection using smartphone devices by non-experts and successfully derive seamless crop height models of high quality when compared to TLS reference data. Another approach is suggested in [20, 21], where the crop height is directly derived via the distance between a LiDAR device and the crop surface.
The methods for gathering 3D geodata applied in the mentioned studies have their particular advantages and restrictions. Laser scanners are active systems that do not depend on specific lighting conditions. Furthermore, laser beams can penetrate vegetation, so that measurements of the terrain are possible in vegetated areas. However, static terrestrial laser scanning is prone to occlusion of the terrain by vegetation and can include unfavorable scanning geometries. Airborne laser scanning offers an advantageous perspective close to nadir which minimizes occlusion of the terrain by plants, but the method is restricted in terms of temporal and spatial resolution. Regarding photogrammetric approaches, data acquisition and derivation is straightforward, but the sensors are passive and consequently sensitive to different lighting conditions. Furthermore, crops can occlude the terrain in photogrammetrically analyzed images, which leads to restrictions in terms of seamless crop height derivation.
Additionally, most of the mentioned studies comprise at least two field campaigns: one for capturing data of the terrain without vegetation, and subsequent campaigns for capturing the crop surface. In contrast, Li et al. and Luo et al. profit from ALS measurements reaching the terrain through gaps in the crop canopy and successfully achieve a direct CHM derivation from only one measurement campaign. Similarly, Grenzdörffer tests a direct CHM derivation from low-cost photogrammetry point clouds, but concludes that this approach is less reliable compared to the usage of a digital terrain model (DTM) captured before plant growth, due to the terrain being highly occluded by the crop canopy in the used images.
The motivation for our study draws from the idea of directly deriving crop height models and individual plant heights without the need of a prior DTM, using an active low-cost sensor scanning from a nadir perspective and thus minimizing terrain occlusion by plants. The device used in this study for capturing 3D data of crops is the time-of-flight 3D camera Microsoft® Kinect® for Xbox One™ (i.e. the second Kinect® generation, abbreviated in our study as ‘K2’). Similar to Marinello et al., who apply the first Kinect generation for dynamic soil surface characterization under field conditions, a setup of K2 devices mounted in nadir perspective on autonomous mobile platforms or in arrays along booms of agricultural machines can be imagined, offering a time and cost efficient tool for capturing the distribution of crop heights directly in the field.
The aim of this study is to assess (1) raster crop height models and (2) individual plant heights directly derived from K2 data without prior measurements of the bare soil. The CHMs and plant heights are calculated from single and combined K2 measurements to examine the improvement of derivatives via the combination of multiple K2 perspectives. A TLS dataset provides the reference for comparing the CHMs on a raster cell level and the plant heights on a point cloud level. We address advantages and limitations of using the K2 especially for capturing vegetation objects such as agricultural crops.
The Kinect® for Xbox One™ sensor
The K2 measures the distances between sensor and objects within a field of view (FOV) of 70° × 60° by actively emitting a near infrared signal (850 nm) and measuring the time shift between signal emission and backscattered signal detection for each of the 512 × 424 sensor pixels. With the resulting point cloud consisting of 217,088 XYZ coordinates derived from a depth image, distances from 0.5 to 4.5 m can be covered [25–27]. Calculated from the FOV angles and the number of sensor pixels, both the horizontal and the vertical theoretical resolution (spacing of range measurements) range from 0.0014 m at 0.5 m scanning range to 0.0109 m at 4.0 m scanning range. For our study, the device was operated with the software toolkit KinectPV2.
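The stated theoretical point spacing can be reproduced from the FOV angles and pixel counts; the following short sketch (ours, not part of the KinectPV2 toolkit) divides the footprint width at a given range by the number of pixels along the respective axis:

```python
import math

def point_spacing(fov_deg, pixels, range_m):
    """Theoretical spacing of range measurements: footprint width
    at the given range divided by the number of sensor pixels."""
    footprint = 2.0 * range_m * math.tan(math.radians(fov_deg / 2.0))
    return footprint / pixels

# Horizontal: 70 deg across 512 pixels; vertical: 60 deg across 424 pixels.
for r in (0.5, 4.0):
    print(r,
          round(point_spacing(70, 512, r), 4),
          round(point_spacing(60, 424, r), 4))
```

Both axes yield approximately 0.0014 m at 0.5 m and 0.0109 m at 4.0 m range, matching the values stated above.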
Performance of K2 sensor
To assess the performance of the K2 sensor, experiments are performed under controlled conditions (Fig. 1). We examine precision (repeatability), accuracy (conformity of measurements to the true value), and measurement artefacts, i.e. 3D coordinates produced by the sensor when recording a completely empty scene.
To test the device for measurement artefacts, an empty scene is measured 100 times under four different lighting conditions (night; diffuse light, i.e. shadow; direct sunlight with the sensor facing away from the sun; direct sunlight with the sensor facing into the sun). The distribution of measurement artefacts within the sensor’s FOV is assessed by calculating the distance of each recorded XYZ coordinate to a mesh of the FOV edges (Additional file 1) after minimizing the distance between the K2 point clouds and the FOV edges via the iterative closest point algorithm [29, 30]. The mean number of measurement artefacts is derived, and their spatial distribution is assessed by calculating the median, maximum, and standard deviation of the distances between the points and the FOV edges.
Precision and accuracy of the device used in this study are examined on the basis of K2 measurements of the center of a planar screen [26, 31]. K2 measurements of the screen are taken from 0.5 to 4.0 m distance in 0.5 m steps, covering the minimum measurement range given by the manufacturer and the maximum scanning range applied in our field study. The point cloud captured from 0.5 m distance contains a data gap of approximately 50% in the center area of the point cloud so that, additionally, one dataset was captured from 0.80 m distance, which was found to be the minimum distance to provide a seamless point cloud of the measured area. To exclude pincushion distortion effects at the outer edges of K2 measurements, only the inner third of the K2 field of view is considered for the experiment. The precision of the K2 device is expressed as standard deviation (SD) of residual distances to a plane fitted into the point cloud via a robust random sample consensus (RANSAC) algorithm. The accuracy is examined via the root mean square of differences between the given range and the mean of the actually measured range values (RMSE).
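The two quality measures can be sketched as follows. This is a minimal NumPy implementation of a RANSAC plane fit with a least-squares refit of the consensus set; the function names and parameters are ours, and the processing chain actually used in the study may differ:

```python
import numpy as np

def fit_plane_ransac(points, n_iter=200, threshold=0.01, seed=0):
    """RANSAC: sample 3 points, count inliers within the distance
    threshold, keep the best consensus set, then refit the plane to
    all inliers by least squares (smallest principal component).
    Returns the SD of residual plane distances (precision)."""
    rng = np.random.default_rng(seed)
    best = np.zeros(len(points), dtype=bool)
    for _ in range(n_iter):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        if np.linalg.norm(n) < 1e-12:
            continue  # degenerate (collinear) sample
        n = n / np.linalg.norm(n)
        inliers = np.abs((points - sample[0]) @ n) < threshold
        if inliers.sum() > best.sum():
            best = inliers
    inl = points[best]
    centroid = inl.mean(axis=0)
    _, _, vt = np.linalg.svd(inl - centroid)
    normal = vt[-1]  # plane normal = direction of smallest variance
    residuals = (inl - centroid) @ normal
    return residuals.std()

def range_rmse(nominal_ranges, measured_means):
    """Accuracy: RMSE between given ranges and mean measured ranges."""
    d = np.asarray(measured_means, float) - np.asarray(nominal_ranges, float)
    return float(np.sqrt(np.mean(d ** 2)))
```

For a screen measurement, `fit_plane_ransac` corresponds to the precision value, and `range_rmse` over the nominal distances (0.5–4.0 m) corresponds to the accuracy value.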
Direct derivation of crop height models
To also capture small parts of the maize plants, the applied terrestrial laser scanner Riegl VZ-400 collected the reference dataset at a high horizontal and vertical angular resolution of 0.029°, corresponding to a point spacing of 2.5 mm at 5 m scanning range. The TLS device offers a range measurement precision of 3 mm and an accuracy of 5 mm at 100 m scanning range, and it was mounted approximately 3.50 m above ground. To account for the device’s field of view restriction of 50° relative to nadir, the scanner was mounted on a tilted platform (Additional file 2). To cover the ground completely, the field was scanned from 5 TLS scan positions (Fig. 3).
To prepare both the K2 and the TLS data for the analyses, several pre-processing steps are applied (Fig. 1). First, the TLS point clouds are registered and georeferenced. The single TLS scan positions are registered by means of corresponding tie points which were manually defined at distinct corners of the 3D markers. The registration of the single TLS scan positions is achieved via 11–21 tie point pairs, resulting in a standard deviation of 0.20–0.35 cm for the residual 3D distances between the used tie points. The mean cloud-to-cloud distances on selected 3D marker surfaces range from 0.10 to 0.30 cm. Additionally, overlapping areas on stable objects such as the 3D registration marker pipes are visually inspected regarding shifts between the point clouds, and also the visual control indicates a high TLS registration quality.
To level and georeference the TLS data, a 3D transformation with parameters for translation and rotation is applied. These parameters are derived by picking the local coordinates of 9 distinct maize plant positions in the registered TLS point cloud and subsequently linking the local coordinates to their respective global coordinates surveyed with a high-end RTK GNSS Leica Viva GS10/GS15. A standard deviation of 3D distance residuals of 3.00 cm is achieved, which is a valid result especially regarding the sole aim of the georeferencing step, i.e. the leveling of the TLS data.
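Estimating such a rigid transformation (rotation plus translation) from corresponding point pairs can be sketched with the well-known Kabsch/Horn least-squares approach; this is an illustrative implementation, not the software actually used in the study:

```python
import numpy as np

def rigid_transform(local, world):
    """Least-squares rotation R and translation t mapping local -> world
    coordinates (Kabsch algorithm on centered point sets)."""
    local, world = np.asarray(local, float), np.asarray(world, float)
    cl, cw = local.mean(axis=0), world.mean(axis=0)
    H = (local - cl).T @ (world - cw)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cw - R @ cl
    return R, t

def residual_sd(local, world, R, t):
    """SD of the residual 3D distances after applying the transformation,
    i.e. the quality measure reported for the georeferencing step."""
    res = np.linalg.norm((R @ np.asarray(local, float).T).T + t - world, axis=1)
    return float(res.std())
```

With the 9 plant positions as `local` and their GNSS coordinates as `world`, `residual_sd` corresponds to the 3.00 cm standard deviation stated above.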
The workflow of the second pre-processing step, i.e. the co-registration of the K2 point clouds onto the TLS reference data, corresponds to the process used for TLS registration. For the co-registration, 5–14 tie point pairs are used. The standard deviation of residual 3D distances ranges from 0.50 to 1.60 cm. The mean cloud-to-cloud distances between 3D markers in the TLS and K2 data are between 0.20 and 3.00 cm. The achieved registration and co-registration accuracy has to be kept in mind when interpreting the comparison between TLS and K2-based CHMs.
To exclude data of plants not being completely within the FOV and subsequently not being relevant for CHM or plant height derivations, each K2 point cloud is clipped to the extent of the FOV at the distance between sensor and the highest plant within the measured scene (Additional file 3). Finally, all point clouds are clipped to the area of interest with an extent of 2.5 m × 8.0 m. To exclude measurement artefacts from the point clouds, a statistical outlier filter is applied on both the TLS and the K2 point clouds. The filter removes all points that are spatially isolated, i.e. whose mean distance to the five closest neighbors is larger than the standard deviation of these distances.
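The statistical outlier filter can be sketched as follows. The implementation below uses a brute-force neighbor search for clarity (a k-d tree would be used for large clouds) and the common mean-plus-one-SD thresholding variant; the exact threshold of the filter applied in the study may differ:

```python
import numpy as np

def sor_filter(points, k=5, n_sigma=1.0):
    """Statistical outlier removal: drop points whose mean distance to
    the k nearest neighbours exceeds the global mean of these mean
    distances plus n_sigma standard deviations."""
    points = np.asarray(points, float)
    # Brute-force pairwise distances (O(n^2)); fine for small clouds.
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    d_sorted = np.sort(d, axis=1)[:, 1:k + 1]   # skip self-distance 0
    mean_d = d_sorted.mean(axis=1)
    keep = mean_d <= mean_d.mean() + n_sigma * mean_d.std()
    return points[keep]
```

Applied to a dense cloud with a single far-off artefact point, the artefact's mean neighbor distance is far above the threshold and the point is removed, while the dense points are kept.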
The number of points in the original data and in the final point clouds after outlier removal is reported per dataset as: number of points in the original point cloud, after cropping to the FOV, after the SOR filter, and as percentage of the original point cloud.
The pre-processed point clouds are the basis for deriving 11 CHMs: one CHM for each of the 8 single K2 scan positions, each partly covering the maize plot according to the respective FOV, and 3 CHMs for the combined scan positions 1-3-5-7, 2-4-6-8, and 1-8, extending over the whole maize plot. The TLS reference CHMs are calculated on the basis of all TLS point clouds combined in order to achieve the best possible coverage of ground and plants.
The crop height models of 0.25 m × 0.25 m cell size are derived with the software package OPALS by normalizing a digital surface model (DSM) with a digital terrain model (DTM). The raster cell size was chosen based on plant spacing in order to achieve a seamless CHM and to avoid multiple plant tips within the same raster cell. The DSMs are derived by assigning the maximum elevation of all points within a raster cell to the respective cell value. The DTM values correspond to the lowest point within the respective cell. The outermost cells of each CHM are removed to exclude cells covered only partly by the point clouds. The final CHMs consist of 42–246 raster cells.
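The per-cell DSM/DTM normalization can be sketched as follows; this is a simplified NumPy version of the rasterization (OPALS was used in the study):

```python
import numpy as np

def chm_raster(points, cell=0.25):
    """CHM per raster cell: maximum z (DSM) minus minimum z (DTM)
    of the points falling into that cell; empty cells become NaN."""
    pts = np.asarray(points, float)
    ix = np.floor((pts[:, 0] - pts[:, 0].min()) / cell).astype(int)
    iy = np.floor((pts[:, 1] - pts[:, 1].min()) / cell).astype(int)
    nx, ny = ix.max() + 1, iy.max() + 1
    dsm = np.full((nx, ny), -np.inf)
    dtm = np.full((nx, ny), np.inf)
    np.maximum.at(dsm, (ix, iy), pts[:, 2])  # unbuffered per-cell max
    np.minimum.at(dtm, (ix, iy), pts[:, 2])  # unbuffered per-cell min
    chm = dsm - dtm
    chm[~np.isfinite(chm)] = np.nan          # cells without points
    return chm
```

Note that this directly assumes the lowest point per cell represents the terrain, which is exactly the single-campaign assumption examined in this study.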
Direct derivation of plant heights
In addition to the CHM analyses, individual plant heights are derived directly from the point clouds. The plants are extracted according to two scenarios: (1) the K2 point clouds are not georeferenced and measurements are available only for one specific date, and (2) the K2 point clouds are georeferenced and the plant positions are extracted from measurements at an early plant development stage (e.g., ).
To derive the plant height for scenario 1 (i.e., the K2 point clouds are not georeferenced), points representing the local maximum height of the canopy surface are selected. For each local maximum point, the lowest point within a search radius of 0.125 m is extracted. The plant height is subsequently calculated by subtracting the local minimum height from the local maximum height. All difference values below 0.500 m are excluded based on the a priori knowledge that all plants in the maize plot are larger. The extracted K2 plant heights are compared to the nearest local maximum point in the TLS cloud within a radius of 0.125 m. In case of scenario 2 (i.e., the K2 point clouds are georeferenced and the plant positions are known), the local maxima and minima and subsequently the plant heights are extracted from the K2 and TLS point clouds from within a radius of 0.125 m around the known plant position. The plant heights are compared based on the coefficient of determination of linear models fitted into the data pairs, and additionally on the median, standard deviation, and RMSE of plant height difference values.
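The scenario 1 procedure can be sketched as follows (an illustrative, brute-force implementation of the local maximum/minimum extraction; the radius and minimum height follow the text, while the neighborhood handling of the actual implementation may differ):

```python
import numpy as np

def plant_heights(points, radius=0.125, min_height=0.5):
    """Scenario 1: for each local canopy maximum, subtract the lowest
    point within the horizontal search radius; discard differences
    below min_height (a priori knowledge about the maize plot)."""
    pts = np.asarray(points, float)
    heights = []
    for p in pts:
        d = np.linalg.norm(pts[:, :2] - p[:2], axis=1)
        nb = pts[d <= radius]
        if p[2] < nb[:, 2].max():
            continue  # not a local maximum of the canopy surface
        h = p[2] - nb[:, 2].min()
        if h >= min_height:
            heights.append(float(h))
    return heights
```

For scenario 2, the same maxima/minima extraction would instead be anchored at the known, georeferenced plant positions.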
Performance of K2 sensor
The results of the measurement artefact experiments (100 measurements per lighting condition) are reported per lighting condition of the scanned empty scene (night, diffuse light, direct sunlight with the sensor facing away from the sun, and direct sunlight with the sensor facing into the sun) as: average count of measurement artefacts (SD) and distances between measurement artefacts and FOV edges (m).
The lab experiments for determining precision and accuracy of range measurements result in precision values from 0.001 m at 0.80 m distance to 0.003 m at 4.0 m distance. The RMSE values representing accuracy range from 0.005 m (0.80 m distance) to 0.024 m (4.0 m distance).
Direct derivation of crop height models
The calculated accuracy measures are summarized for scan positions 1 (low plants), 8 (high plants), 1-3-5-7 combined, and all combined in Fig. 5 (for the complete list covering all scan positions see Additional file 4). The RMSE of the single K2 point clouds increases in accordance with the increasing number and magnitude of blunders stated above (Fig. 4), whereas the RMSEs of the combined point clouds tend to be lower. However, large CHM difference values also occur for the combined point clouds, ranging from −1.65 to 1.17 m in case of the CHMs derived from all point clouds combined.
The mean CHM differences are negative in all cases, also when excluding CHM difference values larger than three times the RMSE. The largest values for mean deviation occur for scan positions in the plot area with higher plants, where more and larger blunders occur in the CHMs. Assuming an average plant height of 2.08 m for the area captured from SP8, a mean CHM underestimation of 11.54% is calculated, whereas in the best case (all point clouds combined), the mean CHM difference results in an underestimation of 3.39%, assuming an average plant height of 1.77 m for the covered area. Also regarding the standard deviation, combining K2 point clouds captured from different scan positions results in lower values compared to almost all of the single point clouds. The highest R2 of the Q–Q plots is derived for the CHM of scan position 2, indicating the best correspondence of the CHM difference value distribution to a normal distribution, whereas the values derived from the point clouds of other scan positions are lower due to cells containing pronounced CHM underestimations.
To achieve an accuracy assessment more robust against blunders, Höhle and Höhle recommend comparisons based on quantiles and the NMAD. Compared to the 68.3% quantile values, the NMAD is larger for all CHMs. Differences are largest for two point clouds captured in the area with the highest plants. The 95% quantiles are more than two times larger than the 68.3% quantiles in 5 of 11 cases (scan positions 2–6), which can be attributed to the occurrence of pronounced CHM differences.
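The robust measures used here are straightforward to compute; as a short sketch (ours, following the standard definition of the NMAD as a robust counterpart of the standard deviation):

```python
import numpy as np

def nmad(dh):
    """Normalized median absolute deviation of height differences:
    1.4826 * median(|dh - median(dh)|)."""
    dh = np.asarray(dh, float)
    return 1.4826 * float(np.median(np.abs(dh - np.median(dh))))

def abs_quantiles(dh, qs=(0.683, 0.95)):
    """Quantiles of the absolute height differences (e.g. 68.3%, 95%)."""
    return np.quantile(np.abs(np.asarray(dh, float)), qs)
```

For normally distributed differences the NMAD approximates the SD; heavy-tailed difference distributions, as caused by blunders, inflate the 95% quantile relative to the 68.3% quantile.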
The median CHM differences (i.e., the 50% quantiles) range from −0.11 m (scan position 7) to −0.03 m (scan position 1). The CHM difference values of all datasets except for scan position 7 can be regarded as being within the performance of the sensors in terms of precision and accuracy. The tendency stated above for the K2-based CHMs to underestimate crop height is reflected in all of the median values being negative.
The R2 of corresponding CHMs offers an insight into the CHM quality on a cell level. In accordance with the other accuracy measures, combining the single K2 point clouds again results in a strongly improved R2. When regarding only the single K2 point clouds, scan position 3 shows the highest coefficient of determination (0.88) and scan positions 5 and 7 the lowest (0.48).
Direct derivation of plant heights
When examining the heights of individual plants extracted directly from the point clouds, the results also indicate a general height underestimation in the K2 data and an improvement of plant height derivations by combining K2 point clouds.
The plant heights derived via the extraction of local maxima and minima (scenario 1: plant positions are not known) lead to similar results, with an R2 of 0.96 and an RMS of residuals of 0.01 m in case of the combined K2 point clouds (n = 13), as well as R2 = 0.73 and RMS = 0.07 m in case of the single K2 point clouds (n = 44). The number of plant height values extracted for scenario 1 is relatively low because of the restriction that only TLS plant heights within a radius of 0.125 m around a K2 local maximum are taken into account.
Performance of K2 sensor
The scans of empty scenes in different lighting conditions show that most of the measurement artefacts occur at the FOV edges. Similar to , filtering the outermost pixels of the depth image can be recommended to exclude most of the measurement artefacts. Additionally, algorithms to remove remaining measurement artefacts within the FOV volume, such as the statistical outlier filter applied in our study, should be included in studies working with K2 data. The major difference between the data captured in various lighting conditions was found in the number of measurement artefacts. Apart from removing artefacts via filters, a strategy to reduce the number of artefacts can, thus, be to capture data at night, which could be achieved by deploying an autonomous mobile system to the fields.
Regarding precision and accuracy derived from the lab experiment, the values correspond to the findings of other studies, for example precision values below 0.016 m for distances between 0.5 and 4.0 m, or from 0.003 m at 0.8 m distance to 0.016 m at 3.0 m distance. The accuracy values are similar, with, for example, Sarbolandi et al. reporting an RMSE of 0.004 m at a distance of 1.3 m. Consequently, the used K2 device is considered to capture data of sufficient quality for the derivation of crop height models and plant heights, especially as the results may contain small errors such as residual tilting between K2 and screen despite a thorough measurement setup.
In our workflow, the measurement artefacts along the FOV faces and the distorted FOV corners of far measurements are removed by clipping the K2 data to the extent of the upper FOV defined by the highest plant within the measured scene. We recommend including this pre-processing step in all studies working with K2 data to achieve measurements of high accuracy and precision. Furthermore, the minimum scanning range should be 0.80 m or larger to avoid data gaps in the central FOV area, so that the uppermost parts of the plants are included in the point clouds.
Direct derivation of crop height models
The results of our field experiments indicate a general underestimation of crop heights, similar to Li et al., who directly derive CHMs from ALS data. Also Crommelinck and Höfle report CHM underestimations on the basis of TLS data and CHMs derived from DTMs and DSMs. In contrast, the mean CHM deviation in ranges from an underestimation of 14.55% to an overestimation of 17.95%, with the CHMs derived from TLS point clouds of rice paddies via interpolated DTMs and CSMs. Taking the mean CHM differences (Fig. 5) as an example, the maximum relative CHM underestimation is 11.33% of the crop height when assuming an average crop height of 2.08 m for scan position 8. It then has to be decided whether or not an underestimation of that order can be accepted within the frame of a study or application. A possible approach to tackle the deviations is to remove systematic errors via empirical, site-specific, and crop height-adaptive correction functions that need to be trained with reference samples.
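Such an empirical correction could, in its simplest form, be a linear model trained on reference pairs of K2-derived and reference heights; the sketch below is illustrative only, and the training pairs are made-up numbers mimicking a few-percent underestimation, not data from this study:

```python
import numpy as np

def fit_height_correction(k2_heights, tls_heights):
    """Least-squares linear correction h_corr = a * h_k2 + b, trained
    on reference sample pairs; coefficients are site-specific."""
    a, b = np.polyfit(k2_heights, tls_heights, deg=1)
    return lambda h: a * np.asarray(h, float) + b

# Hypothetical training pairs (K2 height, TLS reference height) in m:
k2 = np.array([1.50, 1.70, 1.90, 2.00])
tls = np.array([1.56, 1.77, 1.98, 2.08])
correct = fit_height_correction(k2, tls)
```

In practice, such a function would be retrained per site and growth stage, and a height-adaptive (e.g. piecewise or nonlinear) form may be needed when the underestimation scales with plant height.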
Despite single cells containing pronounced differences, the TLS and K2 CHMs generally exhibit high accordance, as indicated by high R2 values especially for the combined K2 frames. Comparable R2 values are reached in other studies, such as Tilly et al. (0.72–0.91), or Tilly et al. with the same or higher coefficients of determination (0.88–0.98) for TLS-based crop surface models and manual measurements on an averaged plot level.
Direct derivation of plant heights
Regarding the methods applied to the field data, the quality of the co-registration influences the derived measures: if the tips of plant organs reach into certain CHM cells in the TLS reference data but not in the K2 data due to a minor relative displacement between the datasets, the respective CHM cells contain different crop height values. Accordingly, the choice of the CHM raster cell size further affects the CHM quality. Similarly, the TLS and K2 datasets were not captured at the same time, so that despite the totally calm weather, movements of the plants may be included in the datasets. In case of the analyses on an individual plant level, the mentioned issue of raster cell size is overcome, but the determination of plant positions via local maxima as well as the extraction of maxima and minima around a certain position can also be affected by effects such as the movement of plants.
In our study we show that deriving crop height models of a maize plot directly from K2 point clouds, without the need of prior or supplementary measurements, is feasible, offering data of high value for site-specific crop management and precision agriculture. The examined CHMs exhibit a general underestimation of crop height and include some cells with pronounced differences to the TLS reference. The derivation of individual plant heights directly from the point clouds also comprises plant height underestimations. Combining the K2 point clouds leads to improved estimations both for the CHMs and for the individual plant heights, as the combination reduces both the underestimation of the maximum plant extent and the overestimation of the terrain elevation. By combining multiple K2 point clouds, differences between K2 and TLS amount to an average underestimation of 0.06 m (3.39% of the mean plant height of 1.77 m) for CHMs and the individual plant heights. To achieve a combination of multiple K2 point clouds in operational use, promising approaches for on-line registration are available [41, 42].
The advantage of fewer measurement artefacts when capturing data in darkness can potentially be exploited by operating autonomous mobile platforms, collecting crop heights at night as a preparation for field treatments the following day. Using unmanned aerial vehicles as platforms can be feasible, but may involve issues regarding downdraft-induced plant movement, especially when the measuring range restricts the platform’s height above ground.
Similarly, the maize plot examined in this study required the K2 to be mounted relatively high. Assessing the performance of low-cost devices for crop types with other growth characteristics in terms of height thus opens further research paths. Follow-up studies examining crop types with other densities, plant organ morphologies, etc. are also of high importance, especially regarding the idea of direct CHM derivation, because different crop and plant geometries can strongly influence the visibility of the bare ground.
In any case, devices such as the K2 can contribute to the analyses of growth dynamics via collecting 3D geodata of high temporal and spatial resolution [16, 43]. Further applications, for instance the monitoring of soil erosion, can also use information derived from data originally captured for CHM monitoring. More generally, all processes which include a change in the 3D geometry of natural objects and which can be captured in terms of temporal and spatial scale can profit from low-cost methods producing 3D geodata.
ALS: airborne laser scanning
CHM: crop height model
DSM: digital surface model
DTM: digital terrain model
FOV: field of view
K2: Microsoft® Kinect® for Xbox One™
NMAD: normalized median absolute difference
Q–Q plot: quantile–quantile plot
RANSAC: random sample consensus
RMS: root mean square
TLS: terrestrial laser scanning
MH and BH designed the experiments. Experiments were performed by MH. MH drafted the manuscript with help and contributions from BH. Analyses, figures and tables were mainly performed and produced by MH. Both authors read and approved the final manuscript.
We want to thank Sebastian Bechtold for his support concerning the software used to operate the K2 device, and Florian Klopfer and Luisa Griesbaum for helping in the campaigns. Frank Korn permitted access to the research crop plots of the Botanical Garden of Heidelberg University, Michael Schilbach allowed us to capture and analyze the 3D data of the maize plants originally raised for studies at the Center for Organismal Studies of Heidelberg University.
The authors declare that they have no competing interests.
This research was performed within the research project ‘4D Near Real-Time Environmental Monitoring (4DEMON)’ funded by the Federal Ministry of Science, Research and Arts (MWK), Baden-Wuerttemberg, Germany. The first author received substantial funding from the Graduate School CrowdAnalyser. We acknowledge financial support of the Deutsche Forschungsgemeinschaft and Ruprecht-Karls-Universität Heidelberg within the funding program Open Access Publishing.
Open AccessThis article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.
- Rosell-Polo JR, Auat Cheein F, Gregorio E, Andújar D, Puigdomènech L, Masip J, Escolà A. Advances in structured light sensors applications in precision agriculture and livestock farming. In: Sparks DL, editor. Advances in agronomy, vol. 133. Cambridge: Elsevier Academic Press; 2015. p. 71–112.
- Auernhammer H. Precision farming—the environmental challenge. Comput Electron Agric. 2001;30:31–43.
- Anwar MR, Liu DL, Macadam I, Kelly G. Adapting agriculture to climate change: a review. Theor Appl Climatol. 2013;113:225–45.
- Zhang N, Wang M, Wang N. Precision agriculture—a worldwide overview. Comput Electron Agric. 2002;36:113–32.
- Schellberg J, Hill MJ, Gerhards R, Rothmund M, Braun M. Precision agriculture on grassland: applications, perspectives and constraints. Eur J Agron. 2008;29:59–71.
- Eitel JUH, Magney TS, Vierling LA, Brown TT, Huggins DR. LiDAR based biomass and crop nitrogen estimates for rapid, non-destructive assessment of wheat nitrogen status. Field Crop Res. 2014;159:21–32.
- Bendig J, Yu K, Aasen H, Bolten A, Bennertz S, Broscheit J, Gnyp ML, Bareth G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int J Appl Earth Obs Geoinf. 2015;39:79–87.
- Li W, Niu Z, Chen H, Li D, Wu M, Zhao W. Remote estimation of canopy height and aboveground biomass of maize using high-resolution stereo images from a low-cost unmanned aerial vehicle system. Ecol Indic. 2016;67:637–48.
- Hoffmeister D, Waldhoff G, Korres W, Curdt C, Bareth G. Crop height variability detection in a single field by multi-temporal terrestrial laser scanning. Precis Agric. 2016;17(3):296–312.
- Freeman K, Girma K, Arnall D, Mullen R, Martin K, Teal R, Raun W. By-plant prediction of corn forage biomass and nitrogen uptake at various growth stages using remote sensing and plant height. Agron J. 2007;99:530–6.
- Sharma LK, Bu H, Franzen DW, Denton A. Use of corn height measured with an acoustic sensor improves yield estimation with ground based active optical sensors. Comput Electron Agric. 2014;124:254–62.
- Yin X, Hayes RM, McClure MA, Savoy HJ. Assessment of plant biomass and nitrogen nutrition with plant height in early- to mid-season corn. J Sci Food Agric. 2012;92:2611–7.
- Muharam FM, Bronson KF, Maas SJ, Ritchie GL. Inter-relationships of cotton plant height, canopy width, ground cover and plant nitrogen status indicators. Field Crop Res. 2014;169:58–69.
- McCarthy CL, Hancock NH, Raine SR. Applied machine vision of plants: a review with implications for field deployment in automated farming operations. Intell Serv Robot. 2010;3:209–17.
- Li W, Niu Z, Huang N, Wang C, Gao S, Wu C. Airborne LiDAR technique for estimating biomass components of maize: a case study in Zhangye City, Northwest China. Ecol Indic. 2015;57:486–96.
- Friedli M, Kirchgessner N, Grieder C, Liebisch F, Mannale M, Walter A. Terrestrial 3D laser scanning to track the increase in canopy height of both monocot and dicot crop species under field conditions. Plant Methods. 2016;12(1):1–15.
- Crommelinck S, Höfle B. Simulating an autonomously operating low-cost static terrestrial LiDAR for multitemporal maize crop height measurements. Remote Sens. 2016;8(3):205.
- Bareth G, Bendig J, Tilly N, Hoffmeister D, Aasen H, Bolten A. A comparison of UAV- and TLS-derived plant height for crop monitoring: using polygon grids for the analysis of crop surface models (CSMs). Photogramm Fernerkund Geoinf. 2016;2:85–94.
- Marx S, Hämmerle M, Klonner C, Höfle B. 3D participatory sensing with low-cost mobile devices for crop height assessment—a comparison with terrestrial laser scanning data. PLoS ONE. 2016;11(4):1–22.
- Ehlert D, Adamek R, Horn HJ. Vehicle based laser range finding in crops. Sensors. 2009;9:3679–94.
- Ehlert D, Heisig M, Adamek R. Suitability of a laser rangefinder to characterize winter wheat. Precis Agric. 2010;11:650–63.
- Grenzdörffer GJ. Crop height determination with UAS point clouds. Int Arch Photogramm Remote Sens Spatial Inf Sci. 2014;XL-1:135–40.
- Luo S, Chen JM, Wang C, Xi X, Zeng H, Peng D, Li D. Effects of LiDAR point density, sampling size and height threshold on estimation accuracy of crop biophysical parameters. Opt Express. 2016;24:11578–93.
- Marinello F, Pezzuolo A, Gasparini F, Arvidsson J, Sartori L. Application of the Kinect sensor for dynamic soil surface characterization. Precis Agric. 2015;16(6):601–12.
- Gonzalez-Jorge H, Rodríguez-Gonzálvez P, Martínez-Sánchez J, González-Aguilera D, Arias P, Gesto M, Díaz-Vilariño L. Metrological comparison between Kinect I and Kinect II sensors. Measurement. 2015;70:21–6.
- Sarbolandi H, Lefloch D, Kolb A. Kinect range sensing: structured-light versus time-of-flight Kinect. Comput Vis Image Underst. 2015;139:1–20.
- Microsoft. Kinect for Xbox One hardware specifications. https://dev.windows.com/en-us/kinect/hardware. Accessed 10 Oct 2016.
- Lengeling T. Kinect for Windows v2 library for processing (provided under MIT license). https://github.com/ThomasLengeling/KinectPV2. Accessed 10 Oct 2016.
- Besl PJ, McKay HD. A method for registration of 3-D shapes. IEEE Trans Pattern Anal Mach Intell. 1992;14(2):239–56.
- Chen Y, Medioni G. Object modelling by registration of multiple range images. Image Vis Comput. 1992;10(3):145–55.
- Boehm J. Accuracy investigation for structured-light based consumer 3D sensors. Photogramm Fernerkund Geoinf. 2014;2:117–27.
- Riegl Measurement Systems GmbH. VZ-400 data sheet. 2014. http://www.riegl.com/uploads/tx_pxpriegldownloads/10_DataSheet_VZ-400_2014-09-19.pdf. Accessed 10 Oct 2016.
- Point Cloud Library. Removing outliers using a StatisticalOutlierRemoval filter. http://pointclouds.org/documentation/tutorials/statistical_outlier.php. Accessed 10 Oct 2016.
- Pfeifer N, Mandlburger G, Otepka J, Karel W. OPALS—a framework for airborne laser scanning data analysis. Comput Environ Urban Syst. 2014;45:125–36.
- Höhle J, Höhle M. Accuracy assessment of digital elevation models by means of robust statistical methods. ISPRS J Photogramm Remote Sens. 2009;64(4):398–406.
- Höfle B. Radiometric correction of terrestrial LiDAR point cloud data for individual maize plant detection. IEEE Geosci Remote Sens Lett. 2014;11(1):94–8.
- Lachat E, Macher H, Mittet MA, Landes T, Grussenmeyer P. First experiences with Kinect v2 sensor for close range 3D modelling. Int Arch Photogramm Remote Sens Spatial Inf Sci. 2015;XL-5/W4:93–100.
- Ball D, Ross P, English A, Patten T, Upcroft B, Fitch R, Sukkarieh S, Wyeth G, Corke P. Robotics for sustainable broad-acre agriculture. In: Proceedings of 9th international conference on field and service robotics. 2013. p. 439–453.
- Tilly N, Hoffmeister D, Cao Q, Lenz-Wiedemann V, Miao Y, Bareth G. Transferability of models for estimating paddy rice biomass from spatial plant height data. Agriculture. 2015;5:538–60.
- Tilly N, Aasen H, Bareth G. Fusion of plant height and vegetation indices for the estimation of barley biomass. Remote Sens. 2015;7:11449–80.
- Avian M, Wujanz D. Movement of Hinteres Langtalkar rock glacier 2009–2013 by using the ICProx-algorithm at very high resolution point clouds from terrestrial laserscanning. In: EGU General Assembly conference abstracts, vol. 16. 2014. p. 12152.
- Nüchter A, Borrmann D, Koch P, Kühn M, May S. A man-portable, IMU-free mobile mapping system. ISPRS Ann Photogramm Remote Sens Spatial Inf Sci. 2015;II-3/W5:17–23.
- Höfle B, Canli E, Schmitz E, Crommelinck S, Hoffmeister D, Glade T. 4D near real-time environmental monitoring using highly temporal LiDAR. In: Geophysical research abstracts, vol. 18(EGU2016-11295-2). 2016. p. 1.
- Maize plant drawing (provided under creative commons license CC0 1.0). http://all-free-download.com/free-vector/download/maize_plant_143968_download.html. Accessed 10 Oct 2016.