
Phenotypic techniques and applications in fruit trees: a review

Abstract

Phenotypic information is of great significance for irrigation management, disease prevention and yield improvement. Interest in the evaluation of phenotypes has grown with the goal of enhancing the quality of fruit trees. Traditional techniques for monitoring fruit tree phenotypes are destructive and time-consuming. The development of advanced technology is the key to rapid and non-destructive detection. This review describes several techniques applied to fruit tree phenotypic research in the field, including visible and near-infrared (VIS–NIR) spectroscopy, digital photography, multispectral and hyperspectral imaging, thermal imaging, and light detection and ranging (LiDAR). The applications of these technologies are summarized in terms of architecture parameters, pigment and nutrient contents, water stress, biochemical parameters of fruits and disease detection. These techniques have been shown to play important roles in fruit tree phenotypic research.

Background

Plant phenotype describes the expression of plant traits. Phenotypes are studied at multiple levels, including cells, tissues, organs, individual plants and the whole orchard [1]. Plant phenotypic traits include but are not limited to plant height, biomass content, water state, and yield [2]. The expression of phenotypic traits is controlled by a large number of genetic factors. Therefore, accurate analysis of phenotypic traits is of great significance for the selection of dominant genes and marker-assisted selection [3, 4].

Fruit tree planting is an important part of agricultural production. In some cases, studies on fruit tree phenotypes have shown great reference value for accurate irrigation [5, 6], disease control [7, 8], and fruit quality evaluation [9]. In the past, digital callipers were used to measure tree height and crown diameter [10]. Physicochemical methods were applied to detect the pigment and nutrient contents of leaves, for example, the Kjeldahl method for the measurement of nitrogen (N) and the oven-drying method for the determination of moisture [11]. These methods are valuable but time-consuming and destructive to the plant.

With the development of technology, researchers began to develop rapid and non-destructive methods for the study of plant phenotypes. Spectroscopy has been found to be able to detect contents of biochemical substances [12]. Visible and near-infrared (VIS–NIR) spectrometers have become an effective instrument for spectral data collection because of their convenience [13, 14]. Some imaging devices are being used to speed up information acquisition [15, 16]. These techniques help to extend research from the level of single leaf to the level of the whole orchard, promoting the study of high-throughput phenotypes [3, 15, 17]. The present research not only focuses on the study of phenotypic information but also seeks to describe the spatial distribution of phenotypic traits. Light detection and ranging (LiDAR) scanning [18] can measure the spatial coordinates of monitoring points and provide reliable location information for describing the spatial variability in phenotypic traits. In addition, many efforts have been made to replace manual labour with automated mechanical equipment to build automated phenotypic platforms [2, 19].

This paper aims to provide a comprehensive and in-depth review of the techniques for fruit tree phenotypic studies. We summarize the technologies and their field applications around five aspects of fruit tree phenotypes (Fig. 1). The development trends and future challenges of phenotypic techniques are discussed at the end of this paper.

Fig. 1
figure1

Five aspects and related phenotypic parameters of fruit trees

VIS–NIR spectroscopy

VIS–NIR spectroscopy is a rapid, non-destructive measurement technique. Because different substances reflect and radiate differently within the same spectral bands, VIS–NIR spectroscopy is widely used to detect chemical substances [20, 21], soil [22], minerals [23] and food [24]. This section describes the principle of VIS–NIR spectroscopy and its application in the study of fruit tree phenotypes.

The principle of VIS–NIR spectroscopy

Electromagnetic waves in the range of 400–2500 nm are typically used in VIS–NIR spectroscopy [25]. Some chemical groups in a substance, especially those containing hydrogen (C–H, O–H, N–H), absorb energy in this region, producing changes in the reflected or transmitted spectrum [26]. Samples with different substance contents therefore generate different spectral curves. The spectrum comprises broad bands arising from overlapping absorptions [27, 28]. A relationship between the spectra and the parameters to be measured can thus be established from spectral features, enabling quantitative analysis of those parameters.

The application of VIS–NIR spectroscopy

The portable spectrometer is a frequently used instrument in VIS–NIR spectroscopy studies and can be applied for non-destructive detection. The sample can be measured directly with a light probe [27]. The most convenient feature of portable spectrometers is that the spectrometer is moved to the sample, rather than the sample to the laboratory [28]. Different spectrometers cover different spectral ranges, so selecting a suitable one for detection is critical. Table 1 summarizes the application of VIS–NIR spectroscopy in fruit tree phenotypic studies in the field; details are described in the following sections.

Table 1 The applications of VIS–NIR spectroscopy in the study of fruit tree phenotypes

Detection of pigment and nutrient contents

Spectra are recorded by spectrometers at nanometre-scale sampling resolution, so hundreds or thousands of spectral variables are obtained for each sample. Such a large number of variables often makes prediction of the dependent variable unreliable. Many variable selection methods have been developed to eliminate variables containing mostly noise, such as partial least squares (PLS), artificial neural networks (ANN) and genetic algorithms (GA). For a detailed introduction to these methods, the reader is referred to [29].

Photosynthesis is an essential process in the growth of green plants. Chlorophyll absorbs light energy, which drives the conversion of water and carbon dioxide into carbohydrates during photosynthesis. The chlorophyll content in leaves can therefore reflect the photosynthetic capacity and growth status of fruit trees [30]. An optical fiber spectrometer covering 500–1100 nm was used to determine the chlorophyll content in apple tree leaves [31]. The backward interval partial least squares (BiPLS) algorithm was applied to spectral data processing. From 1490 measured bands, 71 bands with valid information were selected as input variables of the chlorophyll prediction model, and the value of R was 0.91. Wang et al. used the first derivative (FD) to pre-process spectral data [30]. Wavelengths of 530 nm, 581 nm, 697 nm and 734 nm were selected as sensitive wavelengths. The FDs were treated by ratio and normalization methods, and four new parameters, FD530, FD734 − FD530, (FD734 − FD530)/(FD734 + FD530) and FD697 − FD581, were chosen to establish the PLS model. The PLS model exhibited an R2 value of 0.6213 for estimating the chlorophyll content in young apple leaves.
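The first-derivative treatment above can be sketched in a few lines. The sketch below uses the wavelengths reported in the study, but the spectrum is synthetic and the helper `fd_at` is an illustrative assumption, not the authors' code:

```python
import numpy as np

# Sensitive-wavelength FD parameters, computed on a synthetic
# red-edge-like reflectance curve (illustrative data only).
wl = np.arange(400, 1001, 1.0)                # wavelengths, nm
refl = 0.5 / (1 + np.exp(-(wl - 700) / 30))   # synthetic sigmoid spectrum

fd = np.gradient(refl, wl)                    # first derivative dR/dlambda

def fd_at(nm):
    """First-derivative value at the band nearest to `nm` (hypothetical helper)."""
    return fd[np.argmin(np.abs(wl - nm))]

FD530, FD581, FD697, FD734 = (fd_at(x) for x in (530, 581, 697, 734))

# The four FD-based parameters fed to the PLS model in the study
params = {
    "FD530": FD530,
    "FD734-FD530": FD734 - FD530,
    "norm(FD734,FD530)": (FD734 - FD530) / (FD734 + FD530),
    "FD697-FD581": FD697 - FD581,
}
```

These four values would then serve as predictors in a PLS regression against measured chlorophyll contents.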

The vegetation index (VI) is the integration of spectral data from two or more bands after a certain mathematical transformation [32]. Researchers usually establish mathematical prediction models by calculating the spectral value of the sensitive wavelength or by calculating and optimizing vegetation indices defined in botany.

Zarco-Tejada et al. used an optical USB2000 spectrometer to detect the chlorophyll and carotenoid contents in grape leaves [33]. The spectrometer has a sampling interval of 0.5 nm which is beneficial for calculating narrow-band spectral indices. Several indices calculated within the range of 700–750 nm yielded good results with R2 value of 0.8–0.9 for chlorophyll estimation. The Structure Insensitive Pigment Index (SIPI) calculated by R430/R680 was used to estimate carotenoids with R2 = 0.49. The Photochemical Reflectance Index (PRI), calculated by (R570 − R539)/(R570 + R539), had a clear correlation with chlorophyll-carotenoid ratios. The results of the experiments above indicate that some spectral bands within the spectral ranges of green light (490–560 nm) and red light (620–780 nm) are significantly correlated with pigment contents.
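Narrow-band indices of this kind are simple ratios of reflectance at specific wavelengths. The sketch below computes SIPI and PRI as formulated in the study (R430/R680 and (R570 − R539)/(R570 + R539)) on a synthetic spectrum with the 0.5 nm sampling interval mentioned above; the spectrum itself is an illustrative assumption:

```python
import numpy as np

# Synthetic leaf-like spectrum with a green reflectance peak near 550 nm
wl = np.arange(400, 801, 0.5)                                   # 0.5 nm sampling
refl = 0.1 + 0.4 * np.exp(-((wl - 550) ** 2) / (2 * 40 ** 2))

def R(nm):
    """Reflectance at the band nearest to `nm`."""
    return refl[np.argmin(np.abs(wl - nm))]

SIPI = R(430) / R(680)                        # index as formulated in the study
PRI = (R(570) - R(539)) / (R(570) + R(539))   # PRI as formulated in the study
```

On a real spectrum, these scalar indices would be regressed against laboratory pigment measurements to obtain the reported R2 values.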

For the estimation of nutrient contents, Ordonez et al. used a FieldSpec 3 to characterize the components of vine leaves [34]. They applied functional nonparametric methods to establish prediction models, fitting a curve to the discrete spectral data across wavelengths by a smoothing process. Moisture and nitrogen were predicted with high R2 values (R2 = 0.96 and R2 = 0.95, respectively). The relationship between leaf Ca content and reflectance was weak, which may be due to the lack of comparative experiments with different fertilizers. The functional model uses all the spectral features detected by the spectrometer instead of selected characteristic wavelengths, so the utilization rate of hyperspectral information is improved [35].

The accuracy of portable spectrometers is lower than that of laboratory spectrometers, but they are affordable, small and easy to use, making them practical for non-specialists [28]. VIS–NIR spectroscopy could thus be an effective diagnostic tool for predicting nutrient deficiencies in fruit leaves [34] and implementing reasonable fertilization management.

Detection of water stress

The evaluation of water stress aims to determine the status of the water deficit in orchards. Reasonable irrigation, applied before water stress significantly affects the trees, reduces the degree of damage. Stomatal conductance (gs) and stem water potential (Ψs) are two representative indicators of vegetation water status.

A FieldSpec Pro with a spectral range of 350–2500 nm was applied to detect water status in citrus trees [36]. The study found significant differences in the monthly mean reflectance of the citrus canopy between summer (~ 22%) and winter (~ 15%), which indicated that canopy reflectance can indicate the water condition of citrus trees. Rallo et al. used a portable spectrometer to collect spectral information from leaf blades on one-year-old shoots in olive groves [37]. The spectrometer was placed on an aluminium mast mounted on a horizontal arm 1 m above the canopy. The sensor viewed vertically downward, covering an area of approximately 0.12 m2 of the canopy. The results showed that the optimized indices, the Normalized Difference Greenness Vegetation Index (NDGI) and the Normalized Difference Water Index (NDWI), had strong correlations with leaf water potential (Ψleaf).

For field spectrometers, the spectral resolution can reach 2 nm, which means that many closely spaced bands carry similar information. Selecting the most suitable wavelengths for mathematical modelling improves the accuracy of phenotypic parameter estimation. Rallo et al. utilized the NDWI and Moisture Spectral Index (MSI) to evaluate water content [37]. In the calculation, they found that the central wavelength of the NIR band should be set at 715 nm for estimation at the canopy level and 750 nm at the leaf level, both lower than the standard of 858 nm. Pôças et al. also analysed the optimal wavelengths when evaluating the water status in a vineyard [38]. The wavelengths 520 nm (blue), 539 nm (green) and 586 nm (red) were selected as the best wavelengths for calculating the Visible Atmospherically Resistant Index (VARI). González-Fernández et al. used a FieldSpec 4 to detect the water status in a vineyard [39]. Spectral acquisition experiments were carried out at both the leaf and canopy levels; canopy measurements were made at nadir, 0.30 m above the canopy. The researchers used continuum removal analysis to highlight the absorption and reflection characteristics of the spectral curves. In relation to the equivalent water thickness of the leaf, the band area at 1450 nm yielded a higher correlation than those at 1200 nm or 1950 nm.
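Continuum removal can be sketched as dividing the spectrum by its upper convex hull; the absorption "band area" is then the area between the continuum-removed curve and 1. The spectrum below is a synthetic 1450 nm absorption feature, and the hull routine is a generic monotone-chain implementation, not the cited authors' code:

```python
import numpy as np

# Synthetic spectrum with a Gaussian water-absorption dip near 1450 nm
wl = np.arange(1300, 1601, 1.0)
refl = 0.6 - 0.25 * np.exp(-((wl - 1450) ** 2) / (2 * 30 ** 2))

def upper_hull(x, y):
    """Indices of the upper convex hull of (x, y), scanned left to right."""
    hull = []
    for i in range(len(x)):
        while len(hull) >= 2:
            i1, i2 = hull[-2], hull[-1]
            cross = (x[i2] - x[i1]) * (y[i] - y[i1]) - (y[i2] - y[i1]) * (x[i] - x[i1])
            if cross >= 0:          # last hull point lies below the new chord: drop it
                hull.pop()
            else:
                break
        hull.append(i)
    return hull

idx = upper_hull(wl, refl)
continuum = np.interp(wl, wl[idx], refl[idx])   # continuum line over the spectrum
cr = refl / continuum                           # continuum-removed spectrum (<= 1)
band_area = float(np.sum(1.0 - cr))             # band area, 1 nm spacing
```

The band area computed this way for windows centred at 1200, 1450 and 1950 nm is the quantity that was correlated with equivalent water thickness.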

The advent of field spectrometers allowed spectral detection in the field, but these instruments are too large and heavy to carry comfortably. Handheld spectrometers make it possible to measure without complex optical fiber connections and backpacks. Diago et al. used a handheld digital transform spectrometer working in the spectral range of 1600–2400 nm to detect water stress at the leaf level in different vineyards [40]. Reliable predictions of Ψs and leaf relative water content (RWC) were achieved from regression models. Handheld spectrometers can also be used at the canopy level. As with portable spectrometers in canopy-level experiments, the sensor is held above the canopy with the angle of view vertically downward. The diameter of the detected area should be smaller than the canopy diameter to eliminate interference from the soil [38].

Using portable and handheld spectrometers in the field avoids destroying vegetation and speeds up collection; however, gathering large amounts of data still demands considerable time and manpower. Automating data collection is the key to research on high-throughput fruit tree phenotypes. Diago et al. installed a VIS–NIR spectrometer on an all-terrain vehicle, with the sensor head mounted at a height of 1.40 m above the ground and fixed at a distance of 25–50 cm from the canopy [41]. When detecting the spectral information of grape leaves, the vehicle first needed to be stopped. In the following year, the same team, using a similar device, performed spectral measurements while the vehicle was in continuous motion [42]. In this case, the raw spectra also contain information about voids, wood, metal, etc. To filter canopy information from the raw data, static leaf spectral characteristics should be collected before the experiment. A spectral instrument mounted on a vehicle for contactless detection is called on-the-go spectroscopy [41]. Instead of a human worker, the vehicle carries the spectrometer, and this automation of acquisition greatly improves the efficiency of data collection.

Compared with traditional methods, applying VIS–NIR spectroscopy to evaluate phenotypic information reduces damage to fruit trees. At the leaf level, detection is rapid and effective. At the canopy level, a spectrometer on a tripod is used to detect spectral information for individual trees. The emergence of on-the-go spectroscopy speeds up data collection and contributes to the study of high-throughput phenotypes. On-the-go spectroscopy can take measurements across multiple rows and enables mapping of the variability in fruit tree water status in orchards, which is of great value for formulating reasonable irrigation measures. An orchard can be divided into differentiated zones according to the variability in water status; different watering schedules and doses for different zones can greatly reduce water waste [43], in line with sustainable development policies.

Detection of biochemical parameters of fruits

Chlorophyll, carotenoids, total soluble solids (TSS) content and titratable acidity (TA) are biochemical parameters of fruit that change gradually as the fruit grows [44]. Accurate prediction of biochemical parameters helps judge fruit maturity and determine whether the fruit is ready for harvest. This section mainly focuses on the detection of fruit still on the tree.

Elsayed et al. used a handheld spectrometer with wavelengths of 302–1148 nm to test the biochemical parameters of mangoes [45]. The optical fiber probe was placed at a zenith angle of 30 degrees and 0.15 m above the mango fruit for non-contact detection. A contour map was made for the coefficients of determination of all biochemical parameters of mango fruits with all possible wavelength (302–1048 nm) combinations. Twelve wavelengths (810, 780, 760, 750, 730, 720, 710, 686, 620, 570, 550 and 540 nm) were selected to estimate TSS (R2 = 0.72) and TA (R2 = 0.64). The results of partial least square regression (PLSR) models revealed that the newly developed index (NDVI − VARI)/(NDVI + VARI) (NDVI: Normalized Difference Vegetation Index) showed a close association with chlorophyll meter readings (R2 = 0.78).
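The contour-map idea, computing a coefficient of determination for every wavelength pair, can be sketched with simulated data. Here a normalized-difference index is formed for each band pair and correlated with synthetic TSS values; the spectra, noise levels and coarse band grid are all illustrative assumptions, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)
wl = np.arange(500, 901, 10.0)                 # coarse band grid, nm
n = 40                                          # number of fruit samples
tss = rng.uniform(8, 18, n)                     # synthetic TSS values

# Simulated spectra whose NIR (>= 750 nm) reflectance tracks TSS, plus noise
spectra = 0.2 + 0.02 * rng.standard_normal((n, wl.size))
spectra[:, wl >= 750] += 0.01 * tss[:, None]

# R^2 between TSS and the normalized difference of every band pair
r2 = np.zeros((wl.size, wl.size))
for i in range(wl.size):
    for j in range(wl.size):
        if i == j:
            continue
        nd = (spectra[:, i] - spectra[:, j]) / (spectra[:, i] + spectra[:, j])
        r = np.corrcoef(nd, tss)[0, 1]
        r2[i, j] = r * r

i_best, j_best = np.unravel_index(np.argmax(r2), r2.shape)
best_pair = (wl[i_best], wl[j_best])            # most predictive band pair
```

Plotting `r2` as a 2D map over the two wavelength axes reproduces the contour-map visualization described above.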

In addition to the detection of specific points, on-the-go spectroscopy has successfully realized continuous spatial detection. A PSS 1050 spectrometer operating in the 570–990 nm spectral range was installed on an all-terrain vehicle [46]. To align the detection probe to the position of the grape cluster, the height of the spectrometer sensor was adjusted to 0.8 m above the ground, the angle was adjusted to level, and the sensor had a distance of 0.3 m from the canopy. According to the spectral characteristics of the grape clusters obtained artificially, the threshold was constantly adjusted to separate out the true berry spectrum from the raw data. TSS was estimated with R2 value of 0.95.

On-the-go spectroscopy has proven feasible for detecting canopy water stress and fruit biochemical parameters in vineyards. It should be noted that the vineyard canopy is continuous, unlike that of citrus or apple orchards. Extracting effective spectral information is the key data-processing step when applying on-the-go devices in orchards with discontinuous canopies. Calculating spectral indices from sensitive wavelengths is convenient, but the spectral characteristics of other wavelengths are neglected in this process. Establishing prediction models for fruit tree phenotypes based on the full spectrum would greatly improve the utilization of spectral information and yield results with higher accuracy.

Digital photography

With the rapid development of digital computer and image processing technology, digital photography is becoming increasingly popular in scientific research and daily life. The approach of obtaining plant colour and spatial information from digital images has been successfully applied in the study of plant phenotypes [47,48,49].

The principle of digital photography

The charge-coupled device (CCD) is a semiconductor device applied in imaging technology as an image capture component. A CCD directly converts optical signals into analogue current signals and realizes image acquisition and reproduction through analogue-to-digital conversion. With the continuous progress of chip technology, the complementary metal oxide semiconductor (CMOS) sensor has gradually replaced the CCD, offering low energy consumption at a moderate price [50]. Digital photography is an image acquisition technology for colour communication [51]. Digital images can be taken instantly and are easily transmitted and edited. Owing to these advantages, digital photography has been rapidly adopted in scientific research. This section mainly focuses on the application of digital photography to the study of fruit tree phenotypes in the field.

The application of digital photography

In the study of fruit tree phenotypes, digital photography is mainly used to determine canopy structural and biochemical parameters. Fisheye photography and digital cover photography are two techniques that use different lenses; both are useful in plant phenotypic analysis, especially in the determination of the leaf area index (LAI) [52, 53]. A summary of applications of digital photography in fruit tree phenotypic studies is given in Table 2.

Table 2 The applications of digital photography in the study of fruit tree phenotypes

Detection of architecture parameters

Digital images have high resolution, which is valuable for calculating canopy architecture parameters. These parameters include tree height, crown diameter, crown volume (Cv), leaf area (LA) and LAI. LAI is the total one-sided area of leaf tissue per unit ground surface area [54]. It can be regarded as a reliable basis for pruning branches and leaves to improve light transmittance and promote growth.

Digital hemispherical photography (DHP) is a type of digital imaging with fisheye lenses. In phenotypic research, pictures are usually acquired from beneath the canopy towards the zenith, or from above the canopy looking downward. Jonckheere et al. reviewed the methods for indirect measurement of LAI using DHP technology [47]. An advantage of DHP is that several integrated commercial instruments have been developed for LAI estimation, whose built-in image processing reduces operator intervention. Each system contains a specific imaging device and free analysis software [54, 55].

Illumination conditions and shooting distance are intuitive factors affecting image quality. To improve the accuracy of LAI estimation, Knerl et al. conducted multiple experiments to determine the optimal shooting environment [56]. Two kinds of coloured anti-hail nets (blue and pearl) were installed over the apple trees to mimic uniform overcast and ideal clear-sky conditions. The images were taken beneath the canopy at 10, 20 and 40 cm above the ground. The Otsu algorithm was selected for threshold prediction. The results showed that when images were taken of a tree group from approximately 10 cm above the ground in a net-free environment, the predicted LAI deviated least from the destructively measured LAI. Regarding threshold selection, Zarate-Valdez et al. [57] found that the contrast threshold for distinguishing leaves from the sky needed to be verified many times to generate reliable LAI.
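Otsu thresholding and a gap-fraction LAI estimate can be sketched as follows. The "image" is a synthetic bimodal grayscale array, and the Beer–Lambert inversion with extinction coefficient k = 0.5 is a common simplification assumed here, not the calibration used in the cited studies:

```python
import numpy as np

# Synthetic upward-looking grayscale image: dark foliage, bright sky
rng = np.random.default_rng(1)
canopy = rng.normal(60, 10, 7000)       # dark foliage pixel values
sky = rng.normal(200, 15, 3000)         # bright sky pixel values
img = np.clip(np.concatenate([canopy, sky]), 0, 255).astype(np.uint8)

def otsu_threshold(gray):
    """Otsu's method: the threshold maximizing between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                        # class-0 probability
    mu = np.cumsum(p * np.arange(256))          # class-0 cumulative mean
    mu_t = mu[-1]                               # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    return int(np.nanargmax(sigma_b))

t = otsu_threshold(img)
gap_fraction = np.mean(img > t)                 # fraction of sky (gap) pixels
k = 0.5                                         # assumed extinction coefficient
lai = -np.log(gap_fraction) / k                 # Beer-Lambert inversion
```

In practice the segmentation runs on real canopy photographs, and k depends on leaf angle distribution and view zenith angle.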

Digital cover photography (DCP) has become a substitute for DHP owing to its higher resolution. DCP uses a narrow field-of-view lens aimed at the zenith for imaging [58]. Compared with hemispherical photography, DCP is not sensitive to image exposure; however, software for automatic processing of digital cover images is lacking [53, 59].

To improve the automation of the analysis methods for cover images, Fuentes et al. used a script written in MATLAB 7.4 to replace the manual technique for LAI estimation of eucalyptus woodland [60]. The developed script can directly connect the laptop to the digital camera to obtain cover photographs and LAI analysis in real time. In subsequent research, the script was also applied to determine the LAI of fruit trees in apple orchards and vineyards [61]. In addition, Fuentes et al. added an automated module to the original code, and frames (images) were extracted from videos by commands from the Image Analysis Toolbox [62]. The new script could be successfully applied to analyse the LAI of grape trees from videos.

The development of specific software and automation programs for hemispherical and digital cover images provides an accurate and rapid method for the determination of the LAI of fruit trees. However, some studies have indicated that an ordinary consumer digital camera without special sensors can also be used to detect phenotypic information of fruit trees with its ability to perceive colour information.

Taking advantage of the high resolution of digital images, Klodt et al. presented an image segmentation method based on colour information [63]. Image pairs with overlapping content were obtained from different locations for each plant. A depth map was constructed by calculating depth from the displacement of target points between the image pairs. Fruits, leaves, stems and background were segmented according to colour information. Using the depth information, the pixel sizes in the segmented image were weighted to calculate the vine leaf area. This method has been successfully applied to the calculation of LA and fruit-to-leaf ratios in vineyards.

In addition, structure from motion (SfM) can be applied to orchard imagery, which is convenient for detecting the canopy volume of fruit trees. Haris et al. obtained low-altitude images of a citrus orchard by UAV and generated a 3D map of the orchard [64]. They proposed a method that divides the 3D representation of trees into a collection of voxels for estimating canopy volume; a voxel is a volume element, the 3D analogue of a pixel. The canopy volume was obtained by multiplying the number of voxels occupied by each canopy by the volume of a single voxel. Canopy volumes of 78 trees could be measured in 15 min by this method, a significant improvement in efficiency over manual measurement (10 min per tree).
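The voxel-counting step can be sketched directly: quantize canopy points to voxel indices, count the distinct occupied voxels, and multiply by the single-voxel volume. The point cloud and 0.25 m voxel size below are illustrative assumptions:

```python
import numpy as np

# Synthetic canopy point cloud (e.g., from a photogrammetric 3D map), metres
rng = np.random.default_rng(2)
points = rng.uniform([0, 0, 1], [3, 3, 3], size=(5000, 3))

voxel = 0.25                                    # assumed voxel edge length, m
ijk = np.floor(points / voxel).astype(int)      # voxel index of each point
occupied = np.unique(ijk, axis=0)               # distinct occupied voxels
canopy_volume = occupied.shape[0] * voxel ** 3  # canopy volume, m^3
```

The voxel size trades off detail against robustness: smaller voxels follow the crown surface more closely but leave more unoccupied gaps inside sparse canopies.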

LAI is a dimensionless quantity characterizing the canopy and a significant parameter for quantitative analysis of ecosystem productivity [54]. In traditional approaches, LAI is measured as the cumulative leaf area collected over the leaf-fall period in a known collection area [65]. Although this method obtains the most realistic results, the process is lengthy. Studies have shown that digital photography is a reliable method for measuring LAI. Moreover, the estimation of LA and Cv by digital imaging can help farmers monitor the growth condition of fruit trees.

Detection of biochemical parameters of fruits

The colour digital image represented by red, green and blue components is called RGB image [50]. RGB images can accurately reflect the colour information of the target. Extracting the three colour components of R, G and B is the key to RGB image processing [66]. Some vegetation indices (VIs) expressed by colour components can be used to predict biochemical parameters of fruits.

Elsayed et al. proposed a method to determine the chlorophyll content of mango fruits using the NDVI and the VARI, calculated as (R − B)/(R + B) and (G − R)/(G + R − B), respectively [45]. According to the PLSR models, the newly developed index (NDVI − VARI)/(NDVI + VARI) showed close and highly significant associations with chlorophyll a and chlorophyll t (the sum of chlorophyll a and chlorophyll b). In addition, the index (R − B)/(R + B) was a good predictor of TA.
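These colour indices are direct arithmetic on the channel values. The sketch below uses illustrative mean channel values for a fruit region (not data from the study), with VARI as commonly defined, (G − R)/(G + R − B), and the (R − B)/(R + B) index described above:

```python
# Mean R, G, B channel values of a fruit region of interest (illustrative)
R, G, B = 120.0, 150.0, 60.0

ndvi_rgb = (R - B) / (R + B)              # RGB-based NDVI-style index
vari = (G - R) / (G + R - B)              # Visible Atmospherically Resistant Index
combined = (ndvi_rgb - vari) / (ndvi_rgb + vari)   # the study's combined index
```

In a real workflow the fruit region would first be segmented from the image, and the indices would then be regressed against laboratory pigment or acidity measurements.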

Determining phenotypic information of fruit trees by digital photography causes no damage to the trees, and the ability to view images instantly, without developing film, brings great convenience to data acquisition. In addition, the segmentation of fruit trees and backgrounds based on colour information provides a new method for image processing.

Multispectral and hyperspectral imaging

Spectral imaging is a technique that splits the electromagnetic radiation of ground objects into several narrow spectral segments and acquires information for different bands of the same target simultaneously by photography or scanning. Spectral imaging sensors can detect information in spectral bands beyond the visible range, such as infrared wavelengths, providing researchers with additional raw data [67].

The principle of multispectral and hyperspectral imaging

The visible to long-wave infrared spectral range (0.4–14 µm) is commonly used in scientific research. The electromagnetic waves in this range can be divided into four categories: the VIS band (400–700 nm), NIR band (700–1000 nm), short-wave infrared band (1000–2500 nm) and long-wave infrared band (7.5–14 µm) [68].

Spectral imaging is a technology that can simultaneously obtain two-dimensional spatial information and one-dimensional spectral information of a target, drawing on disciplines such as spectroscopy, optics, computer technology, electronics and precision machinery [69]. Multispectral imaging adopts parallel sensor arrays and detects reflection over a small number of broad wavelength bands, generally three to six discontinuous bands. Hyperspectral imaging detects reflection in hundreds of continuous spectral bands, each narrower than a multispectral band [5]. Therefore, hyperspectral imaging can yield in-depth information about specimens that is easily lost in multispectral imaging.

The application of multispectral and hyperspectral imaging

As computer technology and new optical equipment have evolved, many kinds of multispectral and hyperspectral imaging devices have been developed. The spectral imager needs to be stable during image acquisition. A darkroom and halogen lamps are usually used for spectral image acquisition in the laboratory [16, 70]. Ground-based spectral imaging systems are suitable for experiments in the field, with tripods and vehicles serving as bearing devices for the camera [36, 71]. To quickly obtain spectral data for a whole orchard, unmanned aerial vehicles (UAVs) have been applied for imaging [72]: as the UAV flies along its route, the spectral camera takes continuous images at regular intervals [19]. In addition, spectral cameras mounted on manned spacecraft and satellites can capture spectral images on a large scale. The acquisition and processing methods of multispectral and hyperspectral imaging in the study of fruit tree phenotypes are shown in Fig. 2. The application of spectroscopy in phenotypic studies has a long history [16]; this review mainly focuses on research over the last 5 years. A summary is listed in Table 3, and some details are described in the following section.

Fig. 2
figure2

The acquisition and processing methods of multispectral and hyperspectral imaging in the study of fruit tree phenotypes. The analysis has four steps, as shown in the figure

Table 3 The applications of multispectral and hyperspectral imaging in the study of fruit tree phenotypes

Detection of architecture parameters

It is useful to establish digital terrain models (DTMs) of orchards using low-altitude images and the global positioning system (GPS) to identify canopy architectural features. A DTM is an ordered numerical array describing the spatial distribution of information on the Earth’s surface. A DTM without ground objects is referred to as a digital elevation model (DEM), and one with ground objects is known as a digital surface model (DSM). Agisoft PhotoScan is a computer vision software package that can automatically identify and match features across multiple images and build a DTM of the research area by combining ground control point parameters, GPS positioning and the internal parameters of the camera. Matese et al. measured the canopy height of vine rows by constructing DSMs and DTMs [73]. Images of the vineyard in the R, G and NIR bands were obtained by a multispectral camera. The canopy height model, representing the relief of the vine row surface, was obtained by subtracting the DTM from the DSM. The estimated canopy height was approximately 0.5 m lower than the actual canopy height. They also built an NDVI map of the vineyard and found a good correlation between NDVI values and canopy heights in areas with tall canopies. This finding suggests a way to estimate canopy architecture parameters using VIs.
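The canopy height model (CHM) computation is a per-pixel subtraction of co-registered rasters. The sketch below uses tiny synthetic DSM/DTM arrays with an assumed 0.5 m ground sampling distance; the 0.5 m height cutoff for canopy pixels is likewise an illustrative choice:

```python
import numpy as np

# Synthetic co-registered rasters, elevations in metres
dtm = np.full((5, 5), 100.0)            # flat bare-earth terrain model
dsm = dtm.copy()
dsm[1:4, 1:4] += 2.0                    # a 2 m tall canopy patch on the surface model

chm = dsm - dtm                         # canopy height model
tree_height = chm.max()                 # tallest canopy point, m

pixel_size = 0.5                        # assumed ground sampling distance, m
canopy_area = np.count_nonzero(chm > 0.5) * pixel_size ** 2   # m^2 above 0.5 m
```

With real photogrammetric rasters, the same subtraction yields per-vine height profiles once the rows are masked out of the CHM.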

Pixel-based segmentation results are prone to salt-and-pepper noise because the size of a single pixel is much smaller than the detected object. Therefore, object-based image segmentation techniques are increasingly used in phenotypic studies [74]. Díaz-Varela used multi-resolution segmentation and supervised classification algorithms to segment the olive canopy and background from UAV images captured by a modified RGB camera [75]. The segmentation of single crowns was performed by the watershed algorithm. The canopy was isolated by a segmented contour line, and tree height was retrieved from the DSM based on the identification of local maxima. As a result, crown diameter was predicted with R2 = 0.58 and R2 = 0.22 in discontinuous and continuous canopies, respectively, and tree height was estimated with R2 = 0.07 and R2 = 0.58. Koc-San et al. proposed a circular Hough transform algorithm to extract citrus trees from the DSM [76]. Combined with the specific canopy size and spacing, the images were processed by threshold analysis, median filtering and edge detection to obtain the edge of the tree shadow. Then, according to the azimuth of the sun, the circular shadow was shifted to obtain the exact boundary of the tree crowns. This method is of great value for distinguishing tree crowns from other plants with similar radiation conditions, and it suggests that the circular Hough transform is suitable for identifying and extracting features of fruit trees with green, round and compact crowns. Torres-Sánchez et al. classified vegetation and bare land based on vegetation index values, and a DSM layer was applied to separate trees from the surrounding soil according to the difference in height [77]. This method provides a good estimation of tree height (R2 = 0.90) and canopy area (R2 = 0.94).
By considering spatial and contextual features, the object-oriented classification method takes clusters of pixels rather than single pixels as the classification unit, which makes it suitable for high-resolution image processing.
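As a rough sketch of the two-criterion separation used by Torres-Sánchez et al., an NDVI mask followed by a height threshold, the following assumes co-registered reflectance and elevation rasters; the threshold values are illustrative, not the published ones:

```python
import numpy as np

def segment_trees(red, nir, dsm, dtm, ndvi_thresh=0.3, height_thresh=0.5):
    """Two-step tree segmentation sketch: pixels must be vegetated
    (NDVI above a threshold) AND tall (DSM - DTM above a height
    threshold, in metres) to be classified as tree canopy."""
    ndvi = (nir - red) / (nir + red + 1e-9)  # avoid division by zero
    height = dsm - dtm
    return (ndvi > ndvi_thresh) & (height > height_thresh)

# Toy 2x2 rasters: one tall high-NDVI tree pixel, plus soil and grass
red = np.array([[0.30, 0.05], [0.28, 0.06]])
nir = np.array([[0.35, 0.60], [0.36, 0.55]])
dsm = np.array([[100.1, 103.0], [100.0, 100.2]])
dtm = np.full((2, 2), 100.0)
mask = segment_trees(red, nir, dsm, dtm)
print(mask)  # only the high-NDVI, tall pixel is True
```

Low vegetation (high NDVI but low height) and bare soil (low NDVI) are both rejected, which is the point of combining the two layers.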

RGB images have high spatial resolution, which is conducive to the accurate acquisition and matching of ground control points when modelling DTMs. The spatial resolution of multispectral images is somewhat lower, so feature correspondences are easily lost during the matching of multispectral images. However, multispectral cameras can detect reflection beyond the RGB bands, which is valuable for segmenting vegetation from background pixels given their strong contrast in the infrared bands. The segmentation of canopy and background pixels is an important part of image processing, and an algorithm suited to the distribution characteristics of the fruit trees will help to obtain ideal results. In summary, selecting the appropriate technology for phenotypic research requires a balance between the accuracy of DTMs and the complexity of image processing.

Biomass is one of the most important parameters of canopy management, and architecture parameters can serve as a basis for assessing it [73]. Estimating the architecture parameters of fruit trees with UAV imaging at the orchard level allows maps of orchard heterogeneity to be created and zones with different tree sizes to be observed, which is a prerequisite for precision agriculture.

Detection of pigment and nutrient contents

At different growth stages, the pigment and nutrient contents of fruit tree leaves change accordingly, generating different reflection under light radiation. Spectral imaging records the spectral information of the target, which can be used to analyse the growth conditions of the plant.

The Transformed Chlorophyll Absorption in Reflectance Index (TCARI) and the Optimized Soil-Adjusted Vegetation Index (OSAVI) are usually applied to minimize the effects of soil and LAI during pigment estimation. Zarco-Tejada et al. estimated the leaf carotenoid content of vineyards using UAV multispectral and hyperspectral images [78]. The combination of the R515/R570 and TCARI/OSAVI indices provided good predictions of carotenoid content. However, multispectral imagery yielded a lower R2 value (R2 = 0.43) than hyperspectral imagery (R2 = 0.48). A possible reason is that multispectral cameras have independent lenses, which introduce pixel-matching errors between wavebands.
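For reference, the TCARI/OSAVI ratio can be computed from band reflectances using the commonly published forms of the two indices; the reflectance values in the example are illustrative, not measurements from [78]:

```python
def tcari(r550, r670, r700):
    # Transformed Chlorophyll Absorption in Reflectance Index,
    # in its commonly published form
    return 3 * ((r700 - r670) - 0.2 * (r700 - r550) * (r700 / r670))

def osavi(r670, r800):
    # Optimized Soil-Adjusted Vegetation Index with soil factor 0.16
    return (1 + 0.16) * (r800 - r670) / (r800 + r670 + 0.16)

def tcari_osavi(r550, r670, r700, r800):
    # Ratio used to reduce soil-background and LAI effects
    return tcari(r550, r670, r700) / osavi(r670, r800)

# Illustrative reflectances for a healthy canopy pixel
print(round(tcari_osavi(0.08, 0.05, 0.12, 0.45), 3))  # 0.217
```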

Chlorophyll fluorescence is a probe for the study of photosynthesis; it reflects the photochemical reaction process and is related to the chlorophyll content. The quantification of chlorophyll fluorescence aims to evaluate photosynthesis. The nonuniformity of the canopy affects the measurement of the fluorescence signal, so to extract the pure canopy fluorescence emission from clustered pixels, the coverage of each pixel should be fully considered [79]. The Fraunhofer line depth (FLD) principle is the fundamental principle of chlorophyll fluorescence detection. Zarco-Tejada et al. captured multispectral images of a citrus orchard from a UAV [80]. Irradiance spectra at wavelengths of 763, 750 and 780 nm were selected as model parameters. They compared fluorescence retrieval models established with structural indices and a chlorophyll index against the FLD model and found that the FLD model gave clearly better predictions.

N is the main mineral nutrient needed for chlorophyll production and other plant cell components (proteins, nucleic acids and amino acids) [81]. Determining N can support the timely management of nitrogen in orchards to ensure growth vitality. Xuefeng et al. obtained spectral images of a citrus orchard at a height of 100 m above the canopy using a multispectral camera mounted on a UAV [82]. The camera had eleven spectral channels at wavelengths of 490, 550, 570, 671, 680, 700, 720, 800, 840, 900 and 950 nm. Mature and young leaf areas were selected manually in the images. The PLS model based on the original spectrum was the best prediction model for total nitrogen content, with R2 = 0.6469, and a model combining the support vector machine (SVM) and least squares methods could estimate the starch content of mature leaves with R = 0.6822. In a red-blush pear orchard, Perry et al. used a six-band (550, 660, 710, 720, 730 and 810 nm, each band 10 nm wide) multispectral camera to collect canopy images with a UAV [83]. They proposed a new index for the assessment of canopy nitrogen, the Modified Canopy Chlorophyll Content Index (M3CI_710 nm), calculated as (RNIR + RRed − RRE)/(RNIR − RRed + RRE), where RNIR is the measured reflectance in the 810-nm band, RRed that in the 660-nm band, and RRE that in the 710-nm band. Regression results showed the highest R2 value (R2 = 0.67) for leaf %N with the new index.
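The M3CI_710 nm formula translates directly into code; the reflectance values in the example are illustrative, not measurements from [83]:

```python
def m3ci(r_nir, r_red, r_re):
    """Modified Canopy Chlorophyll Content Index (M3CI_710nm):
    r_nir = 810 nm band, r_red = 660 nm band, r_re = 710 nm
    red-edge band reflectance."""
    return (r_nir + r_red - r_re) / (r_nir - r_red + r_re)

# Illustrative canopy reflectances
print(round(m3ci(0.45, 0.05, 0.15), 3))  # 0.636
```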

Spectral cameras mounted on UAVs can capture canopy images of an orchard in a short time, but flights are constrained by air traffic control and battery capacity. Remote sensing satellites are man-made satellites used as remote sensing platforms in outer space, capable of covering the Earth or designated areas, and satellite data can likewise be used for agricultural research.

Multispectral sensors carried by satellites mainly include blue, green, red and NIR bands. Sentinel-2, launched by the European Space Agency, also carries sensors in the red-edge bands. Li et al. used Sentinel-2A remote sensing images to estimate the chlorophyll content of apple canopies [32]. The sum (NDVIgreen + NDVIred + NDVIre) was the best index combination for determining chlorophyll content, and the SVM model provided better predictions (R2 = 0.729) than the back-propagation neural network (BPNN) method.
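A sketch of the summed-NDVI combination, assuming the usual Sentinel-2 band assignments (B3 green, B4 red, B5 red edge, B8 NIR); the reflectances are illustrative, not values from [32]:

```python
def ndvi(nir, band):
    # Generic normalized difference: (NIR - band) / (NIR + band)
    return (nir - band) / (nir + band)

def ndvi_sum(nir, green, red, red_edge):
    """Sum of green-, red- and red-edge-based NDVIs, the band
    combination Li et al. found best for apple canopy chlorophyll."""
    return ndvi(nir, green) + ndvi(nir, red) + ndvi(nir, red_edge)

# Illustrative surface reflectances for one canopy pixel
print(round(ndvi_sum(0.40, 0.08, 0.05, 0.20), 3))  # 1.778
```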

The above research results indicate that spectral imaging has great value in monitoring the pigment and nutrient contents of fruit trees. Satellite spectral remote sensing has a broad field of vision and can record macro features of large areas on the ground; nonetheless, the spatial resolution of the images is much lower than that of UAV images. The spectral imaging sensors carried by a UAV have more bands than satellite sensors. Thus, spectral imaging with a UAV is an available method for agricultural phenotypic research when time and space permit.

Compared with VIS–NIR spectroscopy, spectral imaging technology can obtain information more quickly and with less labour. It is noteworthy that spectral imaging cannot obtain spectral data directly, so complex image processing techniques are needed to extract spectral information from the images.

Detection of biochemical parameters of fruits

Experiments on fruit detection using spectral imaging are mainly carried out in the laboratory under controlled conditions, including illumination, temperature and distance [84,85,86]. Fruits are tested separately, which takes a long time when there are many samples. Recently, on-the-go spectral imaging devices have been successfully applied to fruit detection in the field [87].

Gutiérrez et al. installed a hyperspectral camera (400–1000 nm) on an all-terrain vehicle to obtain dynamic hyperspectral images of a vineyard [87]. A relation matrix was established between all pixels in the spectral image and the characteristic spectrum of the grape, and pixels whose correlation coefficients reached a predetermined value were selected as grape pixels. The epsilon-SVM algorithm was applied to predict TSS (R2 = 0.91) and anthocyanin concentration (R2 = 0.72). On-the-go hyperspectral imaging thus accomplished the detection of fruit components in the field, with results comparable to those obtained under laboratory conditions.
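The pixel-selection step can be sketched as a per-pixel correlation against a reference spectrum; the cube layout, threshold and spectra below are illustrative assumptions, not details from [87]:

```python
import numpy as np

def select_target_pixels(cube, reference, corr_thresh=0.95):
    """Correlate every pixel spectrum in a hyperspectral cube
    (H x W x bands) with a reference spectrum and keep pixels whose
    Pearson correlation exceeds the threshold."""
    h, w, b = cube.shape
    flat = cube.reshape(-1, b)
    fc = flat - flat.mean(axis=1, keepdims=True)   # centre each pixel spectrum
    rc = reference - reference.mean()              # centre the reference
    corr = (fc @ rc) / (np.linalg.norm(fc, axis=1) * np.linalg.norm(rc) + 1e-12)
    return (corr > corr_thresh).reshape(h, w)

# Toy 1x2 image with 4 bands: one grape-like and one soil-like spectrum
reference = np.array([0.1, 0.2, 0.4, 0.8])
cube = np.array([[[0.11, 0.21, 0.39, 0.79],    # matches the reference shape
                  [0.50, 0.48, 0.52, 0.49]]])  # flat, soil-like spectrum
mask = select_target_pixels(cube, reference)
print(mask)  # only the first pixel is selected
```

Correlating spectral shapes rather than absolute values makes the selection somewhat robust to illumination differences between pixels.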

Replacing all-terrain vehicles with field robotics, Wendel et al. implemented a driverless, automatic spectral scanner to predict the dry matter (DM) content of mangoes [71]. They developed an analytical method that unified the classification and regression analysis of hyperspectral images based on a convolutional neural network (CNN) and the PLS algorithm. The DM content was predicted not for individual fruit but as an average over each tree. The prediction results revealed that the CNN model had a higher prediction accuracy (R2 = 0.64) than the PLS model (R2 = 0.58). To estimate mango yield more accurately, the research team also counted the number of mangoes on each tree [88]. RGB images and hyperspectral images of mango trees were obtained simultaneously. After classifying mango and non-mango pixels, the width and height of each local area of mango pixels were parameterized to determine local maxima, and the number of mangoes was given by the number of local maxima. The estimation of mango counts showed that the accuracy of hyperspectral counting was lower than that of RGB imaging.
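Local-maximum counting of the kind described can be sketched with standard image filters; the score map, window size and threshold here are illustrative assumptions, not the published pipeline:

```python
import numpy as np
from scipy.ndimage import maximum_filter, label

def count_fruit_maxima(score_map, size=3, min_score=0.5):
    """Count fruit as connected local maxima of a per-pixel
    fruit-likeness map (e.g. from pixel classification): a pixel is a
    peak if it equals the maximum of its neighbourhood and its score
    exceeds min_score."""
    local_max = score_map == maximum_filter(score_map, size=size)
    peaks = local_max & (score_map > min_score)
    _, n = label(peaks)   # each connected peak region counts once
    return n

# Toy 5x5 fruit-likeness map with two well-separated fruit
score = np.zeros((5, 5))
score[1, 1] = 0.9
score[3, 4] = 0.8
print(count_fruit_maxima(score))  # 2
```

Grouping peaks with `label` avoids double counting when one fruit produces a small plateau of maximal pixels.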

Although the resolution of RGB imaging is higher than that of spectral imaging, which is more conducive to image segmentation, spectral imaging can be applied in many aspects of phenotypic research, bringing much more information to researchers than RGB imaging. The estimation of the ripeness and the number of fruits by spectral imaging is beneficial for farmers to make a detailed harvest plan and maximize the benefits [88].

Detection of diseases

Plant diseases can cause considerable losses of plant quality and yield. Hence, effective identification methods should be adopted to prevent disease aggravation and infection [7]. Traditional detection methods are visual feature analysis and microbiological laboratory methods [89, 90]. However, these methods require specialized pathological knowledge and a long detection time, so the best opportunity for treatment may be missed. Non-invasive spectral imaging technology provides a rapid, non-destructive method for plant disease detection. This section mainly focuses on the applications of hyperspectral and multispectral imaging for the disease detection of fruit trees in the field.

Verticillium wilt (VW), caused by the soil-borne fungus Verticillium dahliae Kleb., is the most limiting disease in all traditional olive-growing regions worldwide. To detect VW, Calderón et al. captured airborne thermal, multispectral and hyperspectral images of a 7-ha commercial orchard. Through general linear model analysis, visible ratios (B/BG/BR) and a fluorescence index (FLD3) were found to be effective in detecting VW at early stages of disease development [91]. To verify the applicability of spectral imaging methods in large-scale orchards, the research team carried out VW detection experiments in a 3000-ha commercial olive area. A manned aircraft replaced the UAV for image acquisition, since the UAV could not remain in flight for a long time. Linear discriminant analysis (LDA) and SVM algorithms were used to classify healthy and diseased trees. For the whole data set, SVM achieved a high classification accuracy of 79.2%, while LDA achieved 59.0%. FLD3 was a good indicator for identifying olive trees at the early stages of disease development at the orchard scale and even larger scales [92]. López-López et al. used the same analytical algorithms to detect red leaf blotch disease in an almond orchard [93]. Pigment indices (chlorophyll and carotenoid) and chlorophyll fluorescence identified infected trees effectively in the early stage.

Laurel wilt (LW) is a lethal disease that has spread throughout the southeastern United States and severely affected the avocado industry. A digital colour camera was modified by adding a 37-mm filter ring to the front nose to capture images in the blue (390–520 nm), green (470–570 nm) and red-edge (670–750 nm) bands [94]. The M-statistic was applied to evaluate the separability of healthy and diseased trees. According to the analysis of variance of the spectral images of the avocado canopy, the B/G ratio was capable of separating healthy trees from laurel wilt-affected trees with M = 1.53. However, the researchers suggested using a camera of higher spectral resolution to improve the classification accuracy. A Tetracam mini-MCA-6 multispectral camera with six individual digital sensors (green: 580 nm; red: 650 nm; red edge: 740 and 750 nm; NIR: 760 and 850 nm; bandwidths of 10 nm, except 40 nm for the 850-nm band) was applied to obtain spectral images of an avocado orchard [95]. To make the tests more accurate, the researchers divided the degree of infection into four stages. The VIs TCARI760–650, NIR/G and red edge/G were able to discriminate LW at each developmental stage, with M values of up to 2.1. Although the modified digital camera offered a significant cost reduction, the multispectral camera had more bands and narrower bandwidths, so more spectral information could be applied to the classification of diseased trees to achieve improved accuracy. Pérez-Bueno et al. mounted a multispectral camera limiting the radiation to bands at 560, 660 and 830 nm on a UAV [96]. ANN, logistic regression analysis (LRA), LDA and SVM models were trained on NDVI to identify white root rot disease in avocado orchards. The four algorithms showed comparable discrimination capability overall, although the sensitivity of the LDA model (55.5%) was lower than that of the ANN and SVM models (78.6%), and LRA had higher generality and a lower false-negative rate than SVM. These conclusions can serve as a reference for the selection of classification models.
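The M-statistic used in these studies is commonly defined as the difference of class means divided by the sum of class standard deviations; the B/G ratios below are invented for illustration, not data from [94]:

```python
import statistics

def m_statistic(class_a, class_b):
    """Separability of two classes: M = |mean_a - mean_b| / (sd_a + sd_b).
    Values above ~1 indicate that the two histograms are reasonably
    well separated."""
    m_a, m_b = statistics.mean(class_a), statistics.mean(class_b)
    s_a, s_b = statistics.stdev(class_a), statistics.stdev(class_b)
    return abs(m_a - m_b) / (s_a + s_b)

# Illustrative B/G ratios for healthy vs diseased canopies
healthy = [0.62, 0.60, 0.64, 0.61, 0.63]
diseased = [0.75, 0.78, 0.74, 0.77, 0.76]
print(round(m_statistic(healthy, diseased), 2))  # 4.43
```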

When infected fruit trees show different response characteristics from healthy trees, spectral imaging technology can provide reliable information for the identification of infected fruit trees. Various forms of VIs can be indicators for identification. Effective identification of disease facilitates the implementation of healthy control and yield optimization measures, rather than relying on the chemical action of pesticides [90].

Multispectral cameras have separate sensors for each spectral band, and a multispectral image provides information on all pixels in the corresponding bands. Hyperspectral cameras adopt the push-broom method to obtain the full spectrum for all pixels [67]. A hyperspectral image is in essence a cube composed of a large number of images: two dimensions are spatial pixels, and the third dimension is the spectrum of each pixel [97]. For multispectral images, high precision is needed when matching the pixels of images obtained simultaneously from different sensors. We can conclude that spectral imaging is an effective method for contactless and spatially continuous monitoring in fruit tree phenotypic studies at the orchard level.

Thermal imaging

Thermal imaging can produce digital images and draw a thermal map of the scene in false colour [98]. Traditionally, temperature is measured with thermometers, thermocouples, thermistors, and temperature detectors. These techniques are limited to measurements at specific points, whereas thermal imaging enables spatially continuous monitoring [99].

The principle of thermal imaging

Everything in nature whose temperature is above absolute zero can emit infrared radiation, and this infrared radiation carries information about the characteristics of the object. Thermal motion of molecules or atoms will be more intense with increasing temperature, and the infrared radiation will also be enhanced [99]. The core of a thermal imaging camera is the infrared detector, which absorbs the infrared energy emitted by the object and converts it into voltage or current [100]. Thermal imaging technology can visualize the temperature information of the detected object, which has played an important role in the analysis of meteorological disaster management [101, 102], animal behaviour recognition [103, 104], and medical research [105, 106].

The application of thermal imaging

The application of thermal imaging in the study of fruit tree phenotypes over recent years is summarized in Table 4. Some details and analysis are shown in the following section, especially focusing on the detection of water stress and disease.

Table 4 The applications of thermal imaging in the study of fruit tree phenotypes

Detection of water stress

A fruit tree lacking sufficient moisture can be considered under water stress. Water stress is the most harmful environmental stress to the development and production of fruit trees and can affect cell division and vegetative growth. A decrease in plant water content reduces the photosynthetic rate and increases stomatal closure, which results in reduced CO2 uptake and transpiration and thus a rise in plant temperature [107]. Although stomatal conductance (gs) cannot be measured directly by thermal imaging, measuring canopy temperature (Tc) to reflect stomatal status is feasible [108].

To reduce the influence of field variability, Struthers et al. adjusted irrigation amounts and conducted a controlled experiment on 30 pear trees [107]. The stress treatment included 18 canopies, and the control treatment (normal irrigation) included 12 canopies. A long-wave thermal imager (7.5–13 µm) was attached to a mechanical lift, and thermal images of the canopy were acquired at nadir with a 25-degree field of view from 1.3 m above the canopy. Multivariate analysis showed that Tc obtained by thermal imaging varied with gs, but this change may lag owing to the influence of air temperature (Ta) and vapour pressure deficit.

The Crop Water Stress Index (CWSI) is a reasonable quantitative parameter for evaluating crop water stress from evaporative losses [109,110,111]. The CWSI can be calculated with the following formula:

$$CWSI = \frac{{\left( {T_{c} - T_{a} } \right) - \left( {T_{c} - T_{a} } \right)_{ll} }}{{\left( {T_{c} - T_{a} } \right)_{ul} - \left( {T_{c} - T_{a} } \right)_{ll} }}$$
(1)

where Tc − Ta represents the temperature difference between the crop canopy and the air; (Tc − Ta)ul is the upper limit of (Tc − Ta), corresponding to a completely dried, non-transpiring canopy; and (Tc − Ta)ll is the lower limit, corresponding to a canopy under good irrigation [112]. The estimation of (Tc − Ta)ll and (Tc − Ta)ul must be careful and accurate, as both play important roles in the calculation.
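Equation (1) translates directly into code; the temperatures and baselines below are illustrative, not field measurements:

```python
def cwsi(tc, ta, dt_ll, dt_ul):
    """Crop Water Stress Index (Eq. 1): tc and ta are canopy and air
    temperatures; dt_ll and dt_ul are the lower (well-watered) and
    upper (non-transpiring) limits of (Tc - Ta). Returns 0 for a
    fully irrigated canopy and 1 for a fully stressed one."""
    return ((tc - ta) - dt_ll) / (dt_ul - dt_ll)

# Example: canopy 2 K above air, baselines -3 K (wet) and +5 K (dry)
print(cwsi(tc=32.0, ta=30.0, dt_ll=-3.0, dt_ul=5.0))  # 0.625
```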

Remote and proximal sensing measurements were compared with plant physiological variables by Matese et al. [113]. A small thermal imaging camera (7.5–13 µm) was mounted on a UAV as the remote sensing device, and images were collected at 70 m above the ground with a resolution of 9 cm/pixel. Proximal sensing images were collected at a 1.5 m distance from the lateral canopy with an infrared thermal imaging camera (8–14 µm). In the calculation of the CWSI, the researcher revised the formula as follows according to the actual situation:

$$CWSI = \frac{{T_{leaf} - T_{wet} }}{{T_{dry} - T_{wet} }}$$
(2)

Tdry and Twet represent the temperatures of a stressed leaf and an unstressed wet leaf, respectively, while Tleaf, the leaf surface temperature, replaces Tc − Ta. Leaves were coated with petroleum jelly or wetted to provide the stressed and wet references. The results showed that the remote sensing data were as valuable as the proximal data. The CWSI increases when the net photosynthesis (Pn) rate decreases under water stress; therefore, the CWSI can be used as an indicator of vineyard water status. The research team also tracked the seasonal variation of the water state in the vineyard [114], where the CWSI correlated well with Ψs (R2 = 0.6931) and gs (R2 = 0.7061). These results suggest that high-resolution thermal images can create great value for precise vineyard management.

Egea et al. proposed a method to calculate the CWSI at different times of day using Non-Water-Stressed Baselines (NWSBs) [115]. The NWSB was derived from Tc measured by infrared sensors mounted above olive trees and depends on weather variables such as solar radiation; its slope and intercept change over the course of a day. To prevent the influence of rainy weather on leaf temperature and humidity, NWSB measurements were made only on consecutive sunny days. This method is practical for simplifying the calculation of the CWSI at different times. García-Tejero et al. evaluated NWSBs in an orchard with three almond varieties [116]. The results for the different varieties showed that the NWSB slopes were similar but the intercepts differed, again indicating that the NWSB intercept is related to weather conditions. The definition of the NWSB provides a reference for irrigation treatment under different water stress levels.

Thermal imagery is a spatial image with many mixed pixels, similar to spectral imagery, so separating the study area from the background is still the critical step in image processing. Moller et al. aligned a digital colour image with a thermal image and used the segmentation of the digital image as a mask to perform a statistical analysis of the temperatures in the thermal image [117]. Agisoft PhotoScan has been used to create a 3D point cloud and DEM from thermal images and GPS positions, so that pixels of soil and leaves could be separated by a height threshold [113, 114]. All of these steps require a high level of image processing expertise and related procedures. Salgadoe et al. proposed a method for automatically segmenting canopy pixels from temperature histograms [118]. A histogram gradient threshold with a pre-defined local gradient was set to identify the highest and lowest canopy temperatures. Compared with segmentation methods that operate on specific pixels, histogram-based thresholding saves time and labour, suits images of various resolutions and can be a reliable method for fast, standardized thermal analysis.
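A minimal sketch of histogram-based canopy segmentation in the spirit of Salgadoe et al.; the bin count, gradient threshold and synthetic temperatures are illustrative assumptions, not the published settings:

```python
import numpy as np

def canopy_temperature_bounds(temps, bins=50, grad_thresh=0.15):
    """Locate the dominant (canopy) peak of a temperature histogram by
    finding where the normalised bin-to-bin gradient exceeds a
    pre-defined threshold; pixels outside [lo, hi] (e.g. hot soil)
    can then be discarded."""
    hist, edges = np.histogram(temps, bins=bins)
    freq = hist / hist.max()                 # normalise to the tallest bin
    grad = np.abs(np.diff(freq))             # local histogram gradient
    steep = np.where(grad > grad_thresh)[0]  # steep flanks of the canopy peak
    return edges[steep.min()], edges[steep.max() + 1]

# Synthetic image: many cool canopy pixels plus a smaller hot-soil mode
rng = np.random.default_rng(42)
temps = np.concatenate([rng.normal(25, 1, 2000),   # canopy ~25 °C
                        rng.normal(40, 1, 200)])   # soil ~40 °C
lo, hi = canopy_temperature_bounds(temps)
print(lo < 25 < hi)  # the bounds bracket the canopy peak, excluding soil
```

Because only the tallest mode produces steep normalised gradients, the smaller soil mode falls outside the returned bounds without any per-pixel labelling.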

Although thermal cameras have contributed significantly to canopy temperature and water stress assessments, their cost is a burden for ordinary farmers. To reduce this cost, García-Tejero et al. used a thermal imaging camera connected to a smartphone (Flir One) alongside a conventional Flir SC600 thermal imaging camera to capture images of almond trees [119]. The Flir One has a lower resolution (80 × 60 pixels) than the Flir SC600 (640 × 480 pixels), yet the Tc values obtained by the two cameras agreed strongly (R2 = 0.90), indicating that the Flir One is suitable for water status assessment. Thermal imaging devices connected to mobile phones not only speed up the monitoring process but also facilitate use by fruit farmers.

Traditionally, plant water status is estimated with diffusion porometers or pressure chambers [6], but these manual methods are not timely. Thermal imaging technology can analyse the water status of fruit trees in a short time through the evaluation of Tc. By using thermal imaging to monitor the spatial variation in orchard water status, data from a large orchard area can be obtained quickly without installing an unreasonable number of on-site sensors. In addition, UAV-based thermal imaging can map the water status of the whole orchard, providing a more detailed reference for modulated irrigation strategies.

Detection of diseases

Plant disease pathogens may damage the cuticular cell structure of plant tissues, affect stomatal conductance and transpiration, and cause changes in leaf temperature [120]. The ability of thermal imaging to evaluate canopy temperature makes it possible to detect plant diseases.

The apple scab pathogen grows under the epidermis of apple leaves, absorbing nutrients from the subcuticular space and destroying the cuticle, which causes water loss and temperature changes. Oerke et al. found significant differences in the thermal images corresponding to different stages of disease severity [121]. The maximum temperature difference (MTD) between the infected area and the healthy area increased with the degree of infection and was correlated with the infected area (R2 = 0.85) and overall infection severity (R2 = 0.71). Polystigma amygdalinum PF Cannon is likewise a fungus that lives on the leaf surface, causing red leaf blotch disease in almond trees. López-López et al. [93] collected thermal images in an almond orchard and found that Tc − Ta increased with the severity of the disease, especially at moderate or severe infection stages.

When a plant is affected by VW, the vascular system is damaged, which impedes the flow of water and results in water stress [122, 123]. Calderón et al. identified VW severity levels in olive orchards with airborne thermal imagery [91]. The gs was measured in the leaf and near-canopy fields at the tree level, while Tc and Ta were estimated from the thermal images. The measurements showed that Tc − Ta became higher and gs became lower as the severity level increased, which proved that crown temperature estimated by thermal imaging is effective in detecting VW in the early stage of disease development. The team then expanded the experiment by selecting nine areas in a larger commercial olive area [92]. The nine areas covered different tree species, tree ages, planting densities and soil management techniques. The results showed that Tc − Ta remained an effective indicator for VW detection in large-scale orchards.

The studies mentioned above suggest that the changes in Tc caused by disease can be monitored by thermal imaging techniques. Thermal imaging can help to separate healthy trees from infected trees, but it lacks diagnostic capability, as it is difficult to determine whether a temperature change is caused by disease [121]. Combining thermal imaging with other imaging techniques to resolve this ambiguity is the current focus of fruit tree disease detection.

LiDAR scanning

Radar (radio detection and ranging) is an electronic device that transmits electromagnetic waves toward a target and receives the echo to obtain the distance and orientation of the target relative to the transmission point. LiDAR (light detection and ranging) is an analogous system that transmits a laser beam to detect the position, velocity and other characteristics of a target [124, 125].

The principle of LiDAR

A LiDAR system consists of a single narrowband laser and a receiving system [126]. The laser fires a pulse of light at the target, and the reflected wave is picked up by the receiver, which accurately measures the propagation time of the pulse from transmission to reception. Since light pulses travel at the speed of light, the distance from the laser to the target can be calculated from the speed of light and the propagation time, and the position of the target can then be determined from the height and scanning angle of the laser [127].
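The time-of-flight calculation is simple: the pulse covers the laser-target distance twice, so d = c·t/2. A minimal sketch:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def lidar_range(t_round_trip_s):
    """Range from time of flight: the pulse travels to the target and
    back, so distance = c * t / 2."""
    return C * t_round_trip_s / 2.0

# A pulse returning after ~66.7 ns corresponds to a target ~10 m away
print(round(lidar_range(66.7e-9), 2))  # 10.0
```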

The application of LiDAR

Because of the ability to detect distance, LiDAR provides great value in estimating architecture parameters of fruit trees [128,129,130,131]. The application of LiDAR in phenotypic analysis has been reviewed by Colaço et al. [18]. This section mainly focuses on the combination of LiDAR and other technologies.

In the study of chlorophyll content, Ma et al. proposed a method to estimate the chlorophyll content in areas of different light intensity by using 3D models with colour characteristics [132]. A 3D laser scanner equipped with an internal colour camera was used to acquire 3D data of apple trees and build a colour 3D model in which the colours represented different light intensities. They found that the colour index (R − B)/(R + B) was suitable for describing the chlorophyll content under different light conditions. Similarly, a fusion method combining a multispectral camera and 3D portable LiDAR images was proposed by Hosoi et al. [133]. The multispectral camera was placed on the lines connecting the sample and the LiDAR to ensure that the spectral images had the same angle of view as the LiDAR data. The VI value of each pixel was added to the LiDAR projection image as an additional attribute reflecting the spatial distribution of chlorophyll. This method provides both the horizontal and vertical distribution of chlorophyll content over the canopy.

The combination of LiDAR and colour imaging is beneficial for fruit detection. Underwood et al. used a mobile ground robot equipped with a 2D LiDAR and a machine vision camera to scan almond trees [134]. Within the LiDAR-based canopy mask, image classification was performed on the images associated with each tree to estimate canopy volume. To reduce the error caused by fruit occlusion, Stein et al. collected data from multiple viewpoints [135]. Based on spatial position coordinates, the fruits in each image were matched across viewpoints to avoid double counting. The error between the number of fruits calculated by this method and the true value was 1.36%, a high precision.

In addition to precise 3D coordinates, LiDAR systems also record "intensity", roughly defined as the backscattering intensity of the echo per test point, i.e. the amplitude of the returned signal [125]. Different spectral reflectance properties result in different backscattered intensities. Gené-Mola et al. converted the backscattering intensity at a laser wavelength of 905 nm into reflectance to separate apple fruits from canopy branches [136]. Exploiting the fact that apples reflect more strongly than leaves and branches at 905 nm, the points corresponding to leaves and branches were removed from the point clouds, and the remaining points were clustered to count the apples. The fusion of reflectance information and LiDAR data produced results comparable to those of colour imagery. In terms of obtaining plant reflectance, LiDAR is less affected by illumination conditions than spectral imaging.
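The reflectance-threshold idea can be sketched as a simple point-cloud filter; the coordinates, reflectances and threshold value are illustrative, not values from [136]:

```python
import numpy as np

def filter_fruit_points(points, reflectance, thresh=0.6):
    """Keep point-cloud returns whose 905-nm reflectance exceeds a
    threshold, on the premise that fruit reflects more strongly than
    leaves and branches at that wavelength. `points` is an (N, 3)
    array of XYZ coordinates; `reflectance` is the per-point value."""
    return points[reflectance > thresh]

# Toy cloud: two apple-surface returns and one leaf return
points = np.array([[0.0, 0.0, 1.5],   # apple surface
                   [0.1, 0.0, 1.6],   # apple surface
                   [0.5, 0.2, 2.0]])  # leaf
reflectance = np.array([0.8, 0.75, 0.3])
fruit = filter_fruit_points(points, reflectance)
print(len(fruit))  # 2 candidate fruit points remain
```

In the published workflow the surviving points are subsequently clustered so that each cluster corresponds to one apple.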

When the spatial information of orchards is detected with UAV or satellite imagery, the spatial resolution is limited by the flight altitude, and the observation angle is restricted to an overhead view. The integration of ground-based LiDAR with other technologies facilitates the study of phenotypic characteristics of fruit trees from multiple lateral perspectives.

Discussion

So far, much progress has been made in the phenotypic study of fruit trees, but efforts are still needed in the combination of technologies and the improvement of equipment. Future research should pay more attention to the practicability of the technology so that it can make a real contribution to the development of agriculture. To this end, we propose the following focuses and challenges for future fruit tree phenotypic research.

The applications of spectrometers and spectral imagers indicate that changes in fruit tree pigment contents and water state cause clear spectral responses in the VIS, NIR and short-wave infrared bands. Beyond these bands, hyperspectral sensors in the ultraviolet (UV) range have been used to detect salt stress in barley leaves [137], and UV–VIS spectroscopy has been used to classify tea types [138]. Whether spectral information in the UV or other bands is useful for the study of fruit tree phenotypes remains to be verified in the future.

Cost reduction of optical imaging sensors will be an emphasis of fruit tree phenotypic techniques, enabling them to serve farmers rather than only scientists. The Flir One camera mentioned in the thermal imaging section is a good example [119]: it costs less than professional optical imaging devices yet can satisfy research demands in agriculture. Maintaining high resolution while keeping cost low is a challenge in fabrication. In addition, it is necessary to develop image processing software of broad applicability so that mobile phones can replace computers in calculating the phenotypic characteristics of fruit trees.

LiDAR and imaging systems are complementary techniques for creating spatial coordinate descriptions and 3D image displays of plants [139]. LiDAR systems provide precise elevation information, which benefits the establishment of DSMs and DTMs. Wang et al. used airborne LiDAR together with optical remote sensing imagery to identify tree species in urban forests, and classification accuracy improved greatly compared with optical image analysis alone [74]. Consequently, identifying fruit trees by combining airborne LiDAR with optical imaging may be a promising new approach in fruit tree phenotypic studies.
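The DSM and DTM products mentioned above relate simply: subtracting the bare-ground DTM from the DSM yields a canopy height model, from which per-tree heights can be read off. A minimal sketch of this step (the toy 3 × 3 rasters are hypothetical values for illustration, not data from any cited study):

```python
import numpy as np

def canopy_height_model(dsm, dtm):
    """Derive a canopy height model (CHM) by subtracting the digital
    terrain model (DTM, bare ground elevation) from the digital surface
    model (DSM, ground plus vegetation). Small negative residuals from
    interpolation noise are clipped to zero."""
    chm = np.asarray(dsm, dtype=float) - np.asarray(dtm, dtype=float)
    return np.clip(chm, 0.0, None)

# Hypothetical 3x3 elevation rasters in metres: one tree crown
# standing on gently sloping terrain.
dsm = np.array([[102.0, 103.5, 104.0],
                [102.5, 106.0, 104.5],
                [103.0, 104.0, 105.0]])
dtm = np.array([[102.0, 102.5, 103.0],
                [102.5, 103.0, 103.5],
                [103.0, 103.5, 104.0]])

chm = canopy_height_model(dsm, dtm)
print(chm.max())  # tallest canopy point: 3.0 m at the crown centre
```

In practice the same subtraction is applied to full orchard rasters, and tree height or crown volume is then extracted from the CHM with local-maximum or segmentation methods.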

Conclusion

We have reviewed the non-destructive technologies applied in the field study of fruit tree phenotypes, including VIS–NIR spectroscopy, digital photography, multispectral and hyperspectral imaging, thermal imaging, and LiDAR. These techniques are feasible and valuable for applications in phenotypic studies of fruit trees, such as the detection of architecture parameters, pigment and nutrient contents, water status, biochemical parameters of fruits, and plant disease. In particular, combining the data obtained by LiDAR and imaging techniques can promote the evaluation of phenotypic characteristics of fruit trees in three-dimensional space. Spatial characteristics contribute greatly to monitoring the spatial variability of pigment contents, detecting fruit locations and predicting fruit yield.

The combination of non-destructive monitoring technology and automatic machinery enables the automation of phenotypic research equipment. Ground-based devices are used for detailed study of fruit trees at the tree level; however, surveying large orchard areas with terrestrial devices is time-consuming. Imaging techniques based on UAVs and satellites have facilitated high-throughput phenotypic studies. The study of fruit tree phenotypes will benefit rational irrigation, disease prevention, and yield improvement. Furthermore, phenotypic information can serve as the basis for screening excellent fruit tree species and promoting planting research on fruit trees.

Availability of data and materials

Not applicable.

Abbreviations

ANN:

Artificial neural network

BiPLS:

Backward interval partial least squares

BPNN:

Back-propagation neural network

CCD:

Charge-coupled device

CMOS:

Complementary metal oxide semiconductor

CNN:

Convolutional neural network

Chl:

Chlorophyll

Chl-b:

Chlorophyll-b

Chl-a:

Chlorophyll-a

Chl-t:

Total chlorophyll

Cv :

Crown volume

CWSI:

Crop water stress index

DCP:

Digital cover photography

DEM:

Digital elevation model

DHP:

Digital hemispherical photography

DM:

Dry matter

DSM:

Digital surface model

DTM:

Digital terrain model

EWT:

Equivalent water thickness

FD:

First derivative

FLD:

Fraunhofer line depth principle

FLD3:

Fraunhofer line depth principle based on three spectral bands

FLDn:

FLD3 normalized

GA:

Genetic algorithm

GPS:

Global positioning system

gs :

Stomatal conductance

LA:

Leaf area

LAI:

Leaf area index

LDA:

Linear discriminant analysis

LiDAR:

Light detection and ranging

LRA:

Logistic regression analysis

LW:

Laurel wilt

M3CI:

Modified canopy chlorophyll content index

MSI:

Moisture spectral index

MTD:

Maximum temperature difference

N:

Nitrogen

NDGI:

Normalized difference greenness vegetation index

NDVI:

Normalized difference vegetation index

NDWI:

Normalized difference water index

NIR:

Near-infrared

NWSB:

Non-Water-Stressed Baseline

OSAVI:

Optimized Soil-adjusted Vegetation Index

PAI:

Plant area index

PLS:

Partial least squares

PLSR:

Partial least squares regression

Pn :

Net photosynthesis

PRI:

Photochemical reflectance index

UV:

Ultraviolet

R:

Red (spectral band of red)

R:

Correlation coefficient (parameter of performance evaluation)

R2 :

Coefficient of determination

Rcv :

Cross validation correlation coefficient

RMSE:

Root mean square error

RMSEP:

Root mean square error of prediction

RWC:

Relative water content

SfM:

Structure from Motion

SIPI:

Structure Insensitive Pigment Index

Spec:

Spectrometer

SVM:

Support vector machine

Ta :

Air temperature

TA:

Titratable acidity

Tc :

Canopy temperature

TCARI:

Transformed Chlorophyll Absorption in Reflectance Index

TSS:

Total soluble solids

UAV:

Unmanned aerial vehicle

VARI:

Visible atmospherically resistant index

VI:

Vegetation index

VIS:

Visible

VIs:

Vegetation indices

VW:

Verticillium wilt

Ψleaf :

Leaf water potential

Ψpd :

Predawn leaf water potential

Ψs :

Stem water potential

References

1. Dhondt S, Wuyts N, Inzé D. Cell to whole-plant phenotyping: the best is yet to come. Trends Plant Sci. 2013;18(8):428–39.

2. Shakoor N, Lee S, Mockler TC. High throughput phenotyping to accelerate crop breeding and monitoring of diseases in the field. Curr Opin Plant Biol. 2017;38(C):184–92.

3. Mir RR, Reynolds M, Pinto F, et al. High-throughput phenotyping for crop improvement in the genomics era. Plant Sci. 2019;282(SI):60–72.

4. Mahlein A. Plant disease detection by imaging sensors: parallels and specific demands for precision agriculture and plant phenotyping. Plant Dis. 2016;100(2):241–51.

5. Chetty K, Govender M, Bulcock H. A review of hyperspectral remote sensing and its application in vegetation and water resource studies. Water SA. 2007;33(2):145–51.

6. Jones HG. Irrigation scheduling: advantages and pitfalls of plant-based methods. J Exp Bot. 2004;55(407):2427–36.

7. Alemu K. Detection of diseases, identification and diversity of viruses: a review. J Biol Agric Healthcare. 2015;5(1):204–13.

8. Ali MM, Bachik NA, Muhadi NA, et al. Non-destructive techniques of detecting plant diseases: a review. Physiol Mol Plant P. 2019;108:101426.

9. Qin J, Chao K, Kim MS, et al. Hyperspectral and multispectral imaging for evaluating food safety and quality. J Food Eng. 2013;118(2):157–71.

10. Morgan KT, Scholberg JMS, Obreza TA, et al. Size, biomass, and nitrogen relationships with sweet orange tree growth. J Am Soc Hortic Sci. 2006;131(1):149.

11. Zhang Y, Zheng L, Sun H. An optical detector for determining chlorophyll and nitrogen concentration based on photoreaction in apple tree leaves. Intell Autom Soft Co. 1995;21(3):409–21.

12. Sari M, Sonmez NK, Karaca M. Relationship between chlorophyll content and canopy reflectance in Washington navel orange trees (Citrus sinensis (L.) Osbeck). Pak J Bot. 2006;38(4):1093–102.

13. Fernández-Novales J, Garde-Cerdán T, Tardáguila J, et al. Assessment of amino acids and total soluble solids in intact grape berries using contactless Vis and NIR spectroscopy during ripening. Talanta. 2019;199:244–53.

14. Wang H, Peng J, Xie C, et al. Fruit quality evaluation using spectroscopy technology: a review. Sensors. 2015;15(5):11889–927.

15. Raychaudhuri B. Imaging spectroscopy: origin and future trends. Appl Spectrosc Rev. 2016;51(1):23–35.

16. Mishra P, Asaari MSM, Herrero-Langreo A, et al. Close range hyperspectral imaging of plants: a review. Biosyst Eng. 2017;164:49–67.

17. Zhao C, Zhang Y, Du J, et al. Crop phenomics: current status and perspectives. Front Plant Sci. 2019;10:714.

18. Colaço AF, Molin JP, Rosell-Polo JR, et al. Application of light detection and ranging and ultrasonic sensors to high-throughput phenotyping and precision horticulture: current status and challenges. Hortic Res-England. 2018;5(1):35.

19. Roth L, Hund A, Aasen H. PhenoFly planning tool: flight planning for high-resolution optical remote sensing with unmanned aerial systems. Plant Methods. 2018;14(1):116.

20. Wagner A, Hilgert S, Kattenborn T, et al. Proximal VIS-NIR spectrometry to retrieve substance concentrations in surface waters using partial least squares modelling. Water Sci Tech-W Sup. 2019;9(4):1204–11.

21. Czechlowski M, Marcinkowski D, Golimowska R, et al. Spectroscopy approach to methanol detection in waste fat methyl esters. Spectrochim Acta Part A Mol Biomol Spectrosc. 2019;210:14–20.

22. Wang J, Wang J, Chen Z, et al. Development of multi-cultivar models for predicting the soluble solid content and firmness of European pear (Pyrus communis L.) using portable vis–NIR spectroscopy. Postharvest Biol Tec. 2017;29:143–51.

23. Yang E, Ge S, Wang S. Characterization and identification of coal and carbonaceous shale using visible and near-infrared reflectance spectroscopy. J Spectrosc. 2018;2018:1–13.

24. You H, Kim Y, Lee J, et al. Food powder classification using a portable visible-near-infrared spectrometer. J Electromagn Eng Sci. 2017;17(4):186–90.

25. Xie LJ, Wang AC, Xu HR, et al. Applications of near-infrared systems for quality evaluation of fruits: a review. T Asabe. 2016;59(2):399–419.

26. Arendse E, Fawole OA, Magwaza LS, et al. Non-destructive prediction of internal and external quality attributes of fruit with thick rind: a review. J Food Eng. 2018;217:11–23.

27. Nicolaï BM, Beullens K, Bobelyn E, et al. Nondestructive measurement of fruit and vegetable quality by means of NIR spectroscopy: a review. Postharvest Biol Tec. 2007;46(2):99–118.

28. Crocombe RA. Portable spectroscopy. Appl Spectrosc. 2018;72(12):1701–51.

29. Xiaobo Z, Jiewen Z, Povey MJW, et al. Variables selection methods in near-infrared spectroscopy. Anal Chim Acta. 2010;667(1–2):14–32.

30. Wang Z, Zhu X, Fang X, et al. Hyperspectral models for estimating chlorophyll content of young apple tree leaves. Intell Autom Soft Co. 2015;21(3):383–93.

31. Guo Z, Zhao C, Huang W, et al. Nondestructive quantification of foliar chlorophyll in an apple orchard by visible/near-infrared reflectance spectroscopy and partial least squares. Spectrosc Lett. 2014;47(6):481–7.

32. Li C, Zhu X, Wei Y, et al. Estimating apple tree canopy chlorophyll content based on Sentinel-2A remote sensing imaging. Sci Rep-UK. 2018;8(1):3756.

33. Zarco-Tejada PJ, Berjón A, López-Lozano R, et al. Assessing vineyard condition with hyperspectral indices: leaf and canopy reflectance simulation in a row-structured discontinuous canopy. Remote Sens Environ. 2005;99(3):271–87.

34. Ordonez C, Rodriguez-Perez JR, Moreira JJ, et al. Using hyperspectral spectrometry and functional models to characterize vine-leaf composition. IEEE T Geosci Remote. 2013;51(5):2610–8.

35. Ordoñez C, Martínez J, Matías JM, et al. Functional statistical techniques applied to vine leaf water content determination. Math Comput Model. 2010;52(7–8):1116–22.
36. Dzikiti S, Verreynne SJ, Stuckens J, et al. Seasonal variation in canopy reflectance and its application to determine the water status and water use by citrus trees in the Western Cape, South Africa. Agr Forest Meteorol. 2011;151(8):1035–44.

37. Rallo G, Minacapilli M, Ciraolo G, et al. Detecting crop water status in mature olive groves using vegetation spectral measurements. Biosyst Eng. 2014;128:52–68.

38. Pôças I, Rodrigues A, Gonçalves S, et al. Predicting grapevine water status based on hyperspectral reflectance vegetation indices. Remote Sens-Basel. 2015;7(12):16460–79.

39. González-Fernández AB, Rodríguez-Pérez JR, Marcelo V, et al. Using field spectrometry and a plant probe accessory to determine leaf water content in commercial vineyards. Agr Water Manage. 2015;156:43–50.

40. Diago MP, Tardaguila J, Fernández-Novales J, et al. Non-destructive assessment of grapevine water status in the field using a portable NIR spectrophotometer. J Sci Food Agr. 2017;97(11):3772–80.

41. Diago MP, Bellincontro A, Scheidweiler M, et al. Future opportunities of proximal near infrared spectroscopy approaches to determine the variability of vineyard water status. Aust J Grape Wine R. 2017;23(3):409–14.

42. Diago MP, Fernández-Novales J, Tardaguila J, et al. In field quantification and discrimination of different vineyard water regimes by on-the-go NIR spectroscopy. Biosyst Eng. 2018;165:47–58.

43. Diago MP, Fernández-Novales J, Gutiérrez S, et al. Development and validation of a new methodology to assess the vineyard water status by on-the-go near infrared spectroscopy. Front Plant Sci. 2018;9:59.

44. Cruz-Hernandez A, Paredes-Lopez O. Fruit quality: new insights for biotechnology. Crit Rev Food Sci Nutr. 2012;52(3):272–89.

45. Elsayed S, Galal H, Allam A, et al. Passive reflectance sensing and digital image analysis for assessing quality parameters of mango fruits. Sci Hortic-Amsterdam. 2016;212:136–47.

46. Fernandez-Novales J, Tardaguila J, Gutierrez S, et al. On-the-go VIS + SW-NIR spectroscopy as a reliable monitoring tool for grape composition within the vineyard. Molecules. 2019;24(15):2795.

47. Jonckheere I, Fleck S, Nackaerts K, et al. Review of methods for in situ leaf area index determination. Agr Forest Meteorol. 2004;121(1–2):19–35.

48. Madec S, Baret F, de Solan B, et al. High-throughput phenotyping of plant height: comparing unmanned aerial vehicles and ground LiDAR estimates. Front Plant Sci. 2017;8:2002.

49. Watanabe K, Guo W, Arai K, et al. High-throughput phenotyping of sorghum plant height using an unmanned aerial vehicle and its application to genomic prediction modeling. Front Plant Sci. 2017;8:421.

50. Kazlauciunas A. Digital imaging - theory and application Part 1: theory. Surf Coat Int. 2001;84(B1):1–9.

51. Hong G, Luo MR, Rhodes PA. A study of digital camera colorimetric characterization based on polynomial modeling. Color Res Appl. 2001;26(1):76–84.

52. Macfarlane C, Hoffman M, Eamus D, et al. Estimation of leaf area index in eucalypt forest using digital photography. Agr Forest Meteorol. 2007;143(3–4):176–88.

53. Macfarlane C, Grigg A, Evangelista C. Estimating forest leaf area using cover and fullframe fisheye photography: thinking inside the circle. Agr Forest Meteorol. 2007;146(1–2):1–12.

54. Breda NJJ. Ground-based measurements of leaf area index: a review of methods, instruments and current controversies. J Exp Bot. 2003;54(392):2403–17.

55. Liu C, Kang S, Li F, et al. Canopy leaf area index for apple tree using hemispherical photography in arid region. Sci Hortic-Amsterdam. 2013;164:610–5.

56. Knerl A, Anthony B, Serra S, et al. Optimization of leaf area estimation in a high-density apple orchard using hemispherical photography. HortScience. 2018;53(6):799–804.

57. Zarate-Valdez JL, Whiting ML, Lampinen BD, et al. Prediction of leaf area index in almonds by vegetation indexes. Comput Electron Agr. 2012;85:24–32.

58. Pekin B, Macfarlane C. Measurement of crown cover and leaf area index using digital cover photography and its application to remote sensing. Remote Sens-Basel. 2009;1(4):1298–320.

59. Alivernini A, Fares S, Ferrara C, et al. An objective image analysis method for estimation of canopy attributes from digital cover photography. Trees. 2018;32(3):713–23.

60. Fuentes S, Palmer AR, Taylor D, et al. An automated procedure for estimating the leaf area index (LAI) of woodland ecosystems using digital imagery, MATLAB programming and its application to an examination of the relationship between remotely sensed and field measurements of LAI. Funct Plant Biol. 2008;35(10):1070.

61. Poblete-Echeverría C, Fuentes S, Ortega-Farias S, et al. Digital cover photography for estimating leaf area index (LAI) in apple trees using a variable light extinction coefficient. Sensors-Basel. 2015;15(2):2860–72.

62. Fuentes S, Poblete-Echeverría C, Ortega-Farias S, et al. Automated estimation of leaf area index from grapevine canopies using cover photography, video and computational analysis methods. Aust J Grape Wine R. 2014;20(3):465–73.

63. Klodt M, Herzog K, Töpfer R, et al. Field phenotyping of grapevine growth using dense stereo reconstruction. BMC Bioinform. 2015;16(1):143.

64. Haris M, Ishii K, Ziyang L, et al. Construction of a high-resolution digital map to support citrus breeding using an autonomous multicopter. Acta Hort. 2016;1135:73–84.

65. Chason JW, Baldocchi DD, Huston MA. A comparison of direct and indirect methods for estimating forest canopy leaf area. Agr Forest Meteorol. 1991;57(1):107–28.

66. Pei S, Cheng C. Extracting color features and dynamic matching for image database retrieval. IEEE T Circ Syst Vid. 1999;9(3):501.

67. Carlsohn MF. Spectral imaging in real-time: imaging principles and applications. Real-Time Imag. 2005;11(2):71–3.

68. Araus JL, Kefauver SC, Zaman-Allah M, et al. Translating high-throughput phenotyping into genetic gain. Trends Plant Sci. 2018;23(5):451–66.

69. Garini Y, Young IT, McNamara G. Spectral imaging: principles and applications. Cytom Part A. 2006;69A(8):735–47.

70. Oerke E, Herzog K, Toepfer R. Hyperspectral phenotyping of the reaction of grapevine genotypes to Plasmopara viticola. J Exp Bot. 2016;67(18):5529–43.
71. Wendel A, Underwood J, Walsh K. Maturity estimation of mangoes using hyperspectral imaging from a ground based mobile platform. Comput Electron Agr. 2018;155:298–313.

72. Zhang C, Kovacs JM. The application of small unmanned aerial systems for precision agriculture: a review. Precis Agric. 2012;13(6):693–712.

73. Matese A, Di Gennaro SF, Berton A. Assessment of a canopy height model (CHM) in a vineyard using UAV-based multispectral imaging. Int J Remote Sens. 2017;38(8–10):2150–60.

74. Wang K, Wang T, Liu X. A review: individual tree species classification using integrated airborne LiDAR and optical imagery with a focus on the urban environment. Forests. 2019;10(1):1.

75. Díaz-Varela R, de la Rosa R, León L, et al. High-resolution airborne UAV imagery to assess olive tree crown parameters using 3D photo reconstruction: application in breeding trials. Remote Sens-Basel. 2015;7(4):4213–32.

76. Koc-San D, Selim S, Aslan N, et al. Automatic citrus tree extraction from UAV images and digital surface models using circular Hough transform. Comput Electron Agr. 2018;150:289–301.

77. Torres-Sánchez J, López-Granados F, Serrano N, et al. High-throughput 3-D monitoring of agricultural-tree plantations with unmanned aerial vehicle (UAV) technology. PLoS ONE. 2015;10(6):e0130479.

78. Zarco-Tejada PJ, Guillén-Climent ML, Hernández-Clemente R, et al. Estimating leaf carotenoid content in vineyards using high resolution hyperspectral imagery acquired from an unmanned aerial vehicle (UAV). Agr Forest Meteorol. 2013;171–172:281–94.

79. Zarco-Tejada PJ, Suarez L, Gonzalez-Dugo V. Spatial resolution effects on chlorophyll fluorescence retrieval in a heterogeneous canopy using hyperspectral imagery and radiative transfer simulation. IEEE Geosci Remote S. 2013;10(4):937–41.

80. Zarco-Tejada PJ, González-Dugo MV, Fereres E. Seasonal stability of chlorophyll fluorescence quantified from airborne hyperspectral imagery as an indicator of net photosynthesis in the context of precision agriculture. Remote Sens Environ. 2016;179:89–103.

81. Islam MS. Sensing and uptake of nitrogen in rice plant: a molecular view. Rice Sci. 2019;26(6):343–55.

82. Xuefeng L, Qiang L, Shaolan H, et al. Estimation of carbon and nitrogen contents in citrus canopy by low-altitude remote sensing. Int J Agric Biol Eng. 2016;9(5):149–57.

83. Perry EM, Goodwin I, Cornwall D. Remote sensing using canopy and leaf reflectance for estimating nitrogen status in red-blush pears. HortScience. 2018;53(1):78–83.

84. Inácio MRC, de Lima KMG, Lopes VG, et al. Total anthocyanin content determination in intact açaí (Euterpe oleracea Mart.) and palmitero-juçara (Euterpe edulis Mart.) fruit using near infrared spectroscopy (NIR) and multivariate calibration. Food Chem. 2013;136(3–4):1160–4.

85. Galvez-Sola L, García-Sánchez F, Pérez-Pérez JG, et al. Rapid estimation of nutritional elements on citrus leaves by near infrared reflectance spectroscopy. Front Plant Sci. 2015;6:571.

86. Nagy A, Riczu P, Tamás J. Spectral evaluation of apple fruit ripening and pigment content alteration. Sci Hortic-Amsterdam. 2016;201:256–64.

87. Gutiérrez S, Tardaguila J, Fernández-Novales J, et al. On-the-go hyperspectral imaging for the in-field estimation of grape berry soluble solids and anthocyanin concentration. Aust J Grape Wine R. 2019;25(1):127–33.

88. Gutiérrez S, Wendel A, Underwood J. Ground based hyperspectral imaging for extensive mango yield estimation. Comput Electron Agr. 2019;157:126–35.

89. Zhang J, Huang Y, Pu R, et al. Monitoring plant diseases and pests through remote sensing technology: a review. Comput Electron Agr. 2019;165:104943.

90. Mahlein AK, Kuska MT, Thomas S, Bohnenkamp D, Alisaac E, Behmann J, Wahabzada M, Kersting K. Plant disease detection by hyperspectral imaging: from the lab to the field. Adv Animal Biosci. 2017;8(2):238–43.

91. Calderón R, Navas-Cortés JA, Lucena C, et al. High-resolution airborne hyperspectral and thermal imagery for early detection of Verticillium wilt of olive using fluorescence, temperature and narrow-band spectral indices. Remote Sens Environ. 2013;139:231–45.

92. Calderón R, Navas-Cortés J, Zarco-Tejada P. Early detection and quantification of Verticillium wilt in olive using hyperspectral and thermal imagery over large areas. Remote Sens-Basel. 2015;7(5):5584–610.

93. López-López M, Calderón R, González-Dugo V, et al. Early detection and quantification of almond red leaf blotch using high-resolution hyperspectral and thermal imagery. Remote Sens-Basel. 2016;8(4):276.

94. de Castro AI, Ehsani R, Ploetz RC, et al. Detection of laurel wilt disease in avocado using low altitude aerial imaging. PLoS ONE. 2015;10(4):e124642.

95. De Castro AI, Ehsani R, Ploetz R, et al. Optimum spectral and geometric parameters for early detection of laurel wilt disease in avocado. Remote Sens Environ. 2015;171:33–44.

96. Perez-Bueno ML, Pineda M, Vida C, et al. Detection of white root rot in avocado trees by remote sensing. Plant Dis. 2019;103(6):1119–25.

97. Hagen N, Kudenov MW. Review of snapshot spectral imaging technologies. Opt Eng. 2013;52(9):90901.

98. Tattersall GJ. Infrared thermography: a non-invasive window into thermal physiology. Comp Biochem Physiol A Mol Integr Physiol. 2016;202:78–98.

99. Vadivambal R, Jayas DS. Applications of thermal imaging in agriculture and food industry: a review. Food Bioprocess Tech. 2011;4(2):186–99.

100. Meola C, Carlomagno GM. Recent advances in the use of infrared thermography. Meas Sci Technol. 2004;9(15):27–58.

101. Berger C, Rosentreter J, Voltersen M, et al. Spatio-temporal analysis of the relationship between 2D/3D urban site characteristics and land surface temperature. Remote Sens Environ. 2017;193:225–43.

102. Stow D, Riggan P, Schag G, et al. Assessing uncertainty and demonstrating potential for estimating fire rate of spread at landscape scales based on time sequential airborne thermal infrared imaging. Int J Remote Sens. 2019;40(13):4876–97.

103. Kays R, Sheppard J, Mclean K, et al. Hot monkey, cold reality: surveying rainforest canopy mammals using drone-mounted thermal infrared sensors. Int J Remote Sens. 2019;40(2):407–19.

104. Giro A, Pezzopane JRM, Barioni Junior W, et al. Behavior and body surface temperature of beef cattle in integrated crop-livestock systems with or without tree shading. Sci Total Environ. 2019;684:587–96.

105. Koprowski R. Automatic analysis of the trunk thermal images from healthy subjects and patients with faulty posture. Comput Biol Med. 2015;62:110–8.
  106. 106.

    Childs C, Siraj MR, Fair FJ, et al. Thermal territories of the abdomen after caesarean section birth: infrared thermography and analysis. J Wound Care. 2016;25(9):499–512.

    CAS  PubMed  Google Scholar 

  107. 107.

    Struthers R, Ivanova A, Tits L, et al. Thermal infrared imaging of the temporal variability in stomatal conductance for fruit trees. Int J Appl Earth Obs. 2015;39:9–17.

    Google Scholar 

  108. 108.

    Ballester C, Jiménez-Bello MA, Castel JR, et al. Usefulness of thermography for plant water stress detection in citrus and persimmon trees. Agr Forest Meteorol. 2013;168:120–9.

    Google Scholar 

  109. 109.

    Jackson RD, Idso SB, Reginato RJ, et al. Canopy temperature as a crop water stress indicator. Water Resour Res. 1981;17(4):1133–8.

    Google Scholar 

  110. 110.

    Ben-Gal A, Agam N, Alchanatis V, et al. Evaluating water stress in irrigated olives: correlation of soil water status, tree water status, and thermal imagery. Irrigation Sci. 2009;27(5):367–76.

    Google Scholar 

  111. 111.

    Zarco-Tejada P, Gonzalez-Dugo V, Nicolás E, et al. Using high resolution UAV thermal imagery to assess the variability in the water status of five fruit tree species within a commercial orchard. Precis Agric. 2013;14(6):660–78.

    Google Scholar 

  112. 112.

    Jackson RD, Kustas WP, Choudhury BJ. A reexamination of the crop water stress index. Irrigation Sci. 1988;9(4):309–17.

    Google Scholar 

  113. 113.

    Matese A, Baraldi R, Berton A, et al. Estimation of water stress in grapevines using proximal and remote sensing methods. Remote Sens-Basel. 2018;10(1):114.

    Google Scholar 

  114. 114.

    Santesteban LG, Di Gennaro SF, Herrero-Langreo A, et al. High-resolution UAV-based thermal imaging to estimate the instantaneous and seasonal variability of plant water status within a vineyard. Agr Water Manage. 2017;183:49–59.

    Google Scholar 

  115. 115.

    Egea G, Padilla-Díaz CM, Martinez-Guanter J, et al. Assessing a crop water stress index derived from aerial thermal imaging and infrared thermometry in super-high density olive orchards. Agr Water Manage. 2017;187:210–21.

    Google Scholar 

  116. 116.

    García-Tejero IF, Gutiérrez-Gordillo S, Ortega-Arévalo C, et al. Thermal imaging to monitor the crop-water status in almonds by using the non-water stress baselines. Sci Hortic-Amsterdam. 2018;238:91–7.

    Google Scholar 

  117. 117.

    Moller M, Alchanatis V, Cohen Y, et al. Use of thermal and visible imagery for estimating crop water status of irrigated grapevine. J Exp Bot. 2006;58(4):827–38.

    PubMed  Google Scholar 

  118. 118.

    Salgadoe A, Robson A, Lamb D, et al. A non-reference temperature histogram method for determining tc from ground-based thermal imagery of orchard tree canopies. Remote Sens-Basel. 2019;11(6):714.

    Google Scholar 

  119. 119.

    García-Tejero I, Ortega-Arévalo C, Iglesias-Contreras M, et al. Assessing the crop-water status in almond (Prunus dulcis Mill.) trees via thermal imaging camera connected to smartphone. Sensors-Basel. 2018;18(4):e1050.

    PubMed  Google Scholar 

  120. 120.

    Kaim W, Fiedler J. Spectroelectrochemistry: the best of two worlds. Chem Soc Rev. 2009;38(12):3373–82.

    CAS  PubMed  Google Scholar 

  121. 121.

    Oerke EC, Fröhling P, Steiner U. Thermographic assessment of scab disease on apple leaves. Precis Agric. 2011;12(5):699–715.

    Google Scholar 

  122. 122.

    Tsror Lahkim L. Epidemiology and control of Verticillium wilt on olive. Israel J Plant Sci. 2011;59(1):59–69.

    Google Scholar 

  123. 123.

    Jiménez-Díaz RM, Cirulli M, Bubici G, Jiménez-Gasco LM, et al. Verticillium wilt, a major threat to olive production: current status and future prospects for its management. Plant Dis. 2012;96(3):304–29.

    PubMed  Google Scholar 

  124. 124.

    Colaço A, Trevisan R, Molin J, et al. A method to obtain orange crop geometry information using a mobile terrestrial laser scanner and 3D modeling. Remote Sens-Basel. 2017;9(8):763.

    Google Scholar 

  125. 125.

    Kashani AG, Olsen MJ, Parrish CE, et al. A review of LIDAR radiometric processing: from ad hoc intensity correction to rigorous radiometric calibration. Sensors. 2015;15(11):28099–128.

    PubMed  Google Scholar 

  126. 126.

    Gondal MA, Mastromarino J. Lidar system for remote environmental studies. Talanta. 2000;53(1):147–54.

    CAS  PubMed  Google Scholar 

  127. 127.

    Lim K, Treitz P, Wulder M, et al. LiDAR remote sensing of forest structure. Progress Phys Geography Earth Environ. 2016;27(1):88–106.

    Google Scholar 

  128. 128.

    Del-Moral-Martínez I, Rosell-Polo J, Company J, et al. Mapping vineyard leaf area using mobile terrestrial laser scanners: should rows be scanned on-the-go or discontinuously sampled? Sensors-Basel. 2016;16(1):119.

    PubMed Central  Google Scholar 

  129. 129.

    Chakraborty M, Khot LR, Sankaran S, et al. Evaluation of mobile 3D light detection and ranging based canopy mapping system for tree fruit crops. Comput Electron Agr. 2019;158:284–93.

  130.

    Pfeiffer SA, Guevara J, Cheein FA, et al. Mechatronic terrestrial LiDAR for canopy porosity and crown surface estimation. Comput Electron Agr. 2018;146:104–13.

  131.

    Arnó J, Escolà A, Masip J, et al. Influence of the scanned side of the row in terrestrial laser sensor applications in vineyards: practical consequences. Precis Agric. 2015;16(2):119–28.

  132.

    Ma X, Feng J, Guan H, et al. Prediction of chlorophyll content in different light areas of apple tree canopies based on the color characteristics of 3D reconstruction. Remote Sens-Basel. 2018;10(3):429.

  133.

    Hosoi F, Umeyama S, Kuo K. Estimating 3D chlorophyll content distribution of trees using an image fusion method between 2D camera and 3D portable scanning lidar. Remote Sens-Basel. 2019;11(18):2134.

  134.

    Underwood JP, Hung C, Whelan B, et al. Mapping almond orchard canopy volume, flowers, fruit and yield using lidar and vision sensors. Comput Electron Agr. 2016;130:83–96.

  135.

    Stein M, Bargoti S, Underwood J. Image based mango fruit detection, localisation and yield estimation using multiple view geometry. Sensors-Basel. 2016;16(11):1915.

  136.

    Gené-Mola J, Gregorio E, Guevara J, et al. Fruit detection in an apple orchard using a mobile terrestrial laser scanner. Biosyst Eng. 2019;187:171–84.

  137.

    Brugger A, Behmann J, Paulus S, et al. Extending hyperspectral imaging for plant phenotyping to the UV-range. Remote Sens-Basel. 2019;11(12):1401.

  138.

    Dankowska A, Kowalewski W. Tea types classification with data fusion of UV-Vis, synchronous fluorescence and NIR spectroscopies and chemometric analysis. Spectrochim Acta Part A Mol Biomol Spectrosc. 2019;211:195–202.

  139.

    Rosell JR, Sanz R. A review of methods and applications of the geometric characterization of tree crops in agricultural activities. Comput Electron Agr. 2012;81:124–41.

Acknowledgements

The authors thank American Journal Experts (AJE) for editing the language of this paper.

Funding

The authors are grateful for the financial support from the Hebei Provincial Department of Science and Technology (Grant number 19227211D).

Author information

Contributions

H-YR collected and analysed the references and drafted the manuscript; R-ZH provided financial support; R-ZH and L-DM proposed the subject and revised the manuscript; L-X offered substantial help during the revision process. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Zhenhui Ren.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Huang, Y., Ren, Z., Li, D. et al. Phenotypic techniques and applications in fruit trees: a review. Plant Methods 16, 107 (2020). https://doi.org/10.1186/s13007-020-00649-7

Keywords

  • Phenotype
  • VIS–NIR spectroscopy
  • Spectral imaging
  • Thermal imaging
  • LiDAR