
A novel 3D imaging system for strawberry phenotyping

Abstract

Background

Accurate and quantitative phenotypic data are vital in plant breeding programmes to assess the performance of genotypes and to make selections. Traditional strawberry phenotyping relies on the human eye to assess most external fruit quality attributes, which is time-consuming and subjective. 3D imaging is a promising high-throughput technique that allows multiple external fruit quality attributes to be measured simultaneously.

Results

A low-cost multi-view stereo (MVS) imaging system was developed, which captured data from 360° around a target strawberry fruit. A 3D point cloud of the sample was derived and analysed with custom-developed software to estimate berry height, length, width, volume, calyx size, colour and achene number. Analysis of these traits in 100 fruits showed good concordance with manual assessment methods.

Conclusion

This study demonstrates the feasibility of an MVS-based 3D imaging system for the rapid and quantitative phenotyping of seven agronomically important external strawberry traits. With further improvement, this method could be applied in strawberry breeding programmes as a cost-effective phenotyping technique.

Background

A successful strawberry breeding programme generates and selects genotypes with traits suitable for the industry in its target geographic region [1]. As genotypes often cannot be observed directly, traditional breeding selects on the basis of a weighted selection index of phenotypes [2]. In order to maximise the accuracy of selection, heritable traits of interest must be measured precisely and accurately. Currently, most external fruit quality phenotyping approaches in strawberry breeding rely on the human eye to make assessments [1]. This approach is labour-intensive, prone to human bias and typically generates ordinal data that are less suitable for the most powerful quantitative statistical models [3].

Use of image analysis has the potential to overcome some of these limitations, with previous studies showing success in utilising 2D high-throughput imaging systems to assess external fruit quality [4]. Most studies were focussed on colour analysis of fruits, including apple [5], citrus [6], mango [7] and banana [8], but some systems have assessed morphological attributes, including the size of apples [9] and the shape of oranges [10]. For strawberry, an automated grading system was developed by Liming et al. [11] that assesses colour, size and four degrees of shape. In another 2D strawberry imaging system, developed by Nagata et al., the maximum fruit diameter could be derived by automatically identifying the axis from the top of the calyx to the tip of the nose [12]. However, 2D image analysis is not always a reliable fruit phenotyping method due to uneven colour distribution of fruit and occlusion of morphology from different viewing perspectives [13].

Recently, 3D imaging has been increasingly explored, as the cost of hardware decreases and reassembly techniques improve [14], with a range of sensors deployed for plant phenotyping. Light detection and ranging (LiDAR) was used to generate detailed 3D models of plants [15, 16], but is currently expensive, time-consuming and complex to implement [17]. Binocular stereovision is a low-cost solution for 3D plant canopy reconstruction [18], but with only two viewing perspectives, is insufficient to model the entire target. Other techniques, including time-of-flight (TOF) [19, 20] and structured light [21], have similar limitations in gathering 360° information from the target.

Studies of 3D imaging based phenotyping of fruit are limited. A 3D model of mango was generated using four cameras and the shape-from-silhouettes reconstruction method, but it did not encompass 360° of the fruit. Five parameters were extracted from the 3D model including length, width, thickness, volume and surface area in order to sort the mangoes by size. Image based sorting accuracy was comparable to manual sorting, but no comparison of individual trait data to a “gold standard” was shown [22]. Shape-from-silhouettes was also successfully applied to the 3D reconstruction of tomato seedlings with ten calibrated cameras. Stem height and leaf area were accurately measured after geometry based segmentation [23].

Multi-view stereovision (MVS), which originated from binocular stereovision, is a promising approach for fruit phenotyping, capturing images from multiple overlapping viewpoints [24]. For the determination of the intrinsic camera parameters and the positions of uncalibrated cameras, Structure from Motion (SfM) is a widely used technique (Fig. 1) [25]. SfM detects feature points, called keypoints, from all the input 2D images using the Scale-Invariant Feature Transform (SIFT) algorithm. The number of keypoints is determined by image quality, including factors such as resolution and texture. The relative pose and camera locations are determined by matching keypoints across all images and iteratively refined by bundle adjustment, resulting in a point cloud [24]. The coordinate system generated by SfM is always in an arbitrary image space, making it necessary to transform the coordinate system into an object space by using a known standard [26]. This method has been demonstrated to be low cost, highly precise and easy to implement; it generates 360° colour information and requires no camera calibration. MVS and SfM have been successfully utilised to generate estimates of leaf and stem dimensions of paprika [17].

Fig. 1

Flowchart of the SfM method

In this study, a novel 3D imaging based approach for phenotyping strawberry fruit was explored. MVS and SfM were applied to generate a 3D model of each strawberry, and software was developed to measure seven agronomically important external strawberry traits. This method promises to facilitate strawberry breeding by providing a high-throughput, objective and low-cost phenotyping system.

Methods

Fruit material

One hundred strawberries of 10 different varieties were purchased from local supermarkets to represent the diverse range of commercially available strawberry phenotypes. All fruits were assessed before their “best before” dates. Fruits would likely have been chilled to 4 °C within 4 h of harvest and kept at that temperature throughout the supply chain until sale. Fruits were stored at 4 °C until assessment.

Manual assessment

In order to validate the results of the 3D analysis, phenotypic data were collected manually (Table 1) immediately after imaging. Measurements of dimensions were performed using a pair of digital callipers and measurement of volume was performed using an overflow can and a measuring cylinder.

Table 1 Manual scoring metrics for seven external fruit quality traits

Image capture

The sample was pinned onto a dark blue holder (38 mm × 19 mm × 19 mm) placed in the middle of a turntable and rotated at 0.02 Hz. A single lens reflex (SLR) camera (Canon EOS 1200D, Canon Inc., Tokyo, Japan) was placed facing the sample with a focal length of 55 mm, so that the field of view was large enough to accommodate the largest strawberry sample. The distance between the lens and the sample was set to 50 cm with a viewing angle of 35° to the horizontal, which allowed maximum visualisation of the strawberry body without occlusion of the calyx. The relative positions of the camera and holder were fixed for all samples. The sample was illuminated with two white LED light sources against a white background (Fig. 2a). In total, 146 images were captured per sample over 50 s, with an ISO speed rating of 800, a shutter speed of 1/125 s and an aperture value of 5.38 EV. With this configuration, no blurring was found in any image.

Fig. 2

a Mechanical structure of the proposed imaging system; b point cloud of the strawberry and holder

3D point cloud reconstruction

A Dell desktop computer (Intel® Xeon® X5560 CPU @ 2.80 GHz × 16, Intel Co., Santa Clara, CA, USA) with a graphics card (Quadro K2200 GPU, NVIDIA Co., Santa Clara, CA, USA) running Linux Ubuntu 14.04 was used in this study for both software development and point cloud processing.

The point cloud reconstruction was implemented with commercial software (Agisoft PhotoScan, Agisoft LLC, St. Petersburg, Russia; licence required), utilising the Structure from Motion (SfM) algorithm [17] (Fig. 2b). Due to the high overlap between adjacent images and the high resolution (5184 × 3456) of each image, pre-processing software was developed to automatically reduce the number of images by discarding three frames from every four. The retained images were found to be the minimum number needed to reconstruct all the 3D models successfully. Additionally, each image was rescaled to a resolution of 1000 × 1450, which greatly increased the processing speed while maintaining satisfactory point cloud quality.
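The pre-processing step can be illustrated with a minimal sketch. The original tool is not described in detail, so the use of OpenCV, the function name and the output naming scheme below are assumptions for illustration only.

```cpp
// Hypothetical pre-processing sketch (assumes OpenCV): keep one frame in
// every four and rescale to 1000 x 1450 before passing the set to the SfM
// software. Function and file names are illustrative, not the authors' code.
#include <opencv2/opencv.hpp>
#include <string>
#include <vector>

void preprocess(const std::vector<std::string>& image_paths, const std::string& out_dir)
{
    for (std::size_t i = 0; i < image_paths.size(); ++i)
    {
        if (i % 4 != 0)
            continue;                                    // discard three frames from every four
        cv::Mat full = cv::imread(image_paths[i]);
        if (full.empty())
            continue;                                    // skip unreadable files
        cv::Mat reduced;
        cv::resize(full, reduced, cv::Size(1000, 1450)); // resolution quoted in the text
        cv::imwrite(out_dir + "/reduced_" + std::to_string(i) + ".jpg", reduced);
    }
}
```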

Automated 3D image analysis

The automated point cloud analysis software was developed in C++ with the Point Cloud Library (PCL) [27]. The software automatically loads all point cloud files in order and processes them in batch, implementing the point cloud segmentation and external quality attribute measurement algorithms described below.

Point cloud segmentation

Each point cloud was first converted from Red Green Blue (RGB) space to Hue Saturation Value (HSV) space. Using an arbitrary threshold on the hue channel, hue being the attribute of a visual sensation whereby an area appears similar to one of the perceived colours [28], the point cloud was segmented into calyx, body, achenes and holder (Fig. 3a–d).
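A minimal sketch of this step is shown below, using PCL's built-in RGB-to-HSV point conversion. The hue limits are placeholders, as the thresholds actually used in the study are not reported, and the function name is illustrative.

```cpp
// Illustrative hue-based segmentation with PCL. The hue band passed in is a
// placeholder; the study's thresholds were chosen empirically per segment
// (calyx, body, achenes, holder) and are not reproduced here.
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/point_types_conversion.h>

pcl::PointCloud<pcl::PointXYZHSV>::Ptr
segmentByHue(const pcl::PointCloud<pcl::PointXYZRGB>& cloud, float hue_min, float hue_max)
{
    // Convert every RGB point to HSV (PCL expresses hue in degrees, 0-360).
    pcl::PointCloud<pcl::PointXYZHSV> hsv;
    pcl::PointCloudXYZRGBtoXYZHSV(cloud, hsv);

    pcl::PointCloud<pcl::PointXYZHSV>::Ptr segment(new pcl::PointCloud<pcl::PointXYZHSV>);
    for (const auto& p : hsv.points)
        if (p.h >= hue_min && p.h <= hue_max)          // keep points inside the hue band
            segment->push_back(p);
    return segment;
}
```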

Fig. 3

a Bounding box fitted to the point cloud of the strawberry body and holder; b bounding box fitted to the point cloud of the holder; c point cloud of the strawberry body; d point cloud of the calyx with a red line marking the maximum distance; e mesh of the strawberry; f identification of achenes

Oriented bounding box (OBB) fitting

OBBs were fitted to the holder segment and to the combined holder and fruit body segment for size measurement. The major eigenvector of the covariance matrix of the points in a point cloud defines the major axis of its OBB [29]. The second axis was determined by finding the maximum Euclidean distance between points in the point cloud orthogonal to the major axis. The final axis was orthogonal to both other axes.
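The axis-finding step can be sketched as follows. This is a simplified illustration of the covariance-eigenvector idea using PCL and Eigen, not the authors' implementation, and only the major axis is shown.

```cpp
// Sketch of the OBB major-axis computation: the eigenvector associated with
// the largest eigenvalue of the point cloud's covariance matrix is taken as
// the major (vertical) axis. The second and third axes would follow as
// described in the text.
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/common/centroid.h>
#include <Eigen/Eigenvalues>

Eigen::Vector3f majorAxis(const pcl::PointCloud<pcl::PointXYZ>& cloud)
{
    Eigen::Matrix3f covariance;
    Eigen::Vector4f centroid;
    pcl::computeMeanAndCovarianceMatrix(cloud, covariance, centroid);

    // SelfAdjointEigenSolver sorts eigenvalues in increasing order, so the
    // last column of the eigenvector matrix corresponds to the largest one.
    Eigen::SelfAdjointEigenSolver<Eigen::Matrix3f> solver(covariance);
    return solver.eigenvectors().col(2).normalized();
}
```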

Height, length and width

An OBB was fitted to the point cloud of the combined fruit body and holder segments. The OBB was not fitted directly to the body, as its irregular shape often resulted in misidentification of the vertical axis. The height of the combined fruit body and holder was always the largest dimension, so the magnitude of the OBB major axis was assumed to be equivalent to the height of the fruit body and holder model. As the fruit body was always longer and wider than the holder, the second and third dimensions of the OBB represented length and width respectively. The height of the holder was estimated by fitting an OBB to its point cloud; the difference in height between this OBB and the combined fruit body and holder OBB was assumed to be the height of the fruit. Ratios between the three fruit body dimensions and the height of the holder were multiplied by the true height of the holder to derive the strawberry height, width and length.
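The scale recovery can be expressed as a short helper. The assumption that the 38 mm dimension of the holder is its vertical height, and the struct and function names, are illustrative only.

```cpp
// Illustrative scale recovery: dimensions measured in the arbitrary SfM image
// space are converted to millimetres via the known holder height. Treating
// 38 mm as the holder's vertical height is an assumption for this sketch.
struct Dimensions { double height, length, width; };

Dimensions toMillimetres(const Dimensions& fruit_image_space,
                         double holder_height_image_space,
                         double holder_height_mm = 38.0)
{
    const double scale = holder_height_mm / holder_height_image_space;
    return { fruit_image_space.height * scale,     // fruit height already has the
             fruit_image_space.length * scale,     // holder height subtracted, as
             fruit_image_space.width  * scale };   // described above
}
```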

Volume

The mesh of the strawberry body (Fig. 3e) was constructed from the point cloud using Poisson Surface Reconstruction [30], which produces an enclosed mesh without any open edges or large holes. The mesh volume was calculated by summing the volumes of the triangle-based pyramids (tetrahedra) formed between each face of the mesh and the origin of the point cloud [31].
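A minimal sketch of the volume calculation is given below, assuming a closed triangular pcl::PolygonMesh as produced by Poisson reconstruction; the function name is illustrative.

```cpp
// Volume of a closed triangular mesh as the sum of signed tetrahedron volumes
// formed by each face and the origin (the approach cited as [31]).
#include <pcl/PolygonMesh.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/conversions.h>
#include <cmath>

double meshVolume(const pcl::PolygonMesh& mesh)
{
    pcl::PointCloud<pcl::PointXYZ> vertices;
    pcl::fromPCLPointCloud2(mesh.cloud, vertices);   // unpack mesh vertices

    double volume = 0.0;
    for (const auto& face : mesh.polygons)           // each face is a triangle
    {
        const Eigen::Vector3f a = vertices[face.vertices[0]].getVector3fMap();
        const Eigen::Vector3f b = vertices[face.vertices[1]].getVector3fMap();
        const Eigen::Vector3f c = vertices[face.vertices[2]].getVector3fMap();
        volume += a.dot(b.cross(c)) / 6.0;           // signed tetrahedron volume
    }
    return std::abs(volume);                         // magnitude of the summed signed volumes
}
```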

Calyx size

The edges of the calyx segment were identified by applying a convex hull algorithm [32], enabling rapid calculation of the maximum Euclidean distance across the calyx (Fig. 3d). The ratio between the calyx maximum distance and the height of the holder OBB was multiplied by the true height of the holder to estimate calyx size.
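One possible implementation of this step with PCL's ConvexHull class is sketched below; the authors' actual code may differ, and the brute-force distance search is kept deliberately simple.

```cpp
// Sketch of the calyx-size measurement: the convex hull reduces the calyx
// segment to its boundary points, and the maximum pairwise Euclidean distance
// (in image-space units) is then found by brute force.
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/surface/convex_hull.h>
#include <algorithm>

double calyxMaxDistance(const pcl::PointCloud<pcl::PointXYZ>::Ptr& calyx)
{
    pcl::PointCloud<pcl::PointXYZ>::Ptr hull_points(new pcl::PointCloud<pcl::PointXYZ>);
    pcl::ConvexHull<pcl::PointXYZ> hull;
    hull.setInputCloud(calyx);
    hull.reconstruct(*hull_points);                  // boundary points of the segment

    double max_dist = 0.0;
    for (std::size_t i = 0; i < hull_points->size(); ++i)
        for (std::size_t j = i + 1; j < hull_points->size(); ++j)
        {
            const double d = ((*hull_points)[i].getVector3fMap() -
                              (*hull_points)[j].getVector3fMap()).norm();
            max_dist = std::max(max_dist, d);
        }
    return max_dist;                                 // scaled by the holder ratio afterwards
}
```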

Achene number

Achenes were segmented from the point cloud by identifying points in the body segment that fell within an arbitrary range in the hue channel of HSV space. A clustering algorithm based on Euclidean distances between points was implemented to group points corresponding to the same achene (Fig. 3f) [33], and the number of clusters was counted automatically.
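One way to realise the clustering step is PCL's Euclidean cluster extraction, sketched below; the tolerance and minimum cluster size are placeholders rather than the values used in the study, and the function name is illustrative.

```cpp
// Sketch of achene counting: hue-filtered achene points are grouped by
// Euclidean distance and the number of clusters is taken as the achene count.
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/search/kdtree.h>
#include <pcl/segmentation/extract_clusters.h>
#include <vector>

std::size_t countAchenes(const pcl::PointCloud<pcl::PointXYZ>::Ptr& achene_points,
                         double tolerance)           // cluster distance, image-space units
{
    pcl::search::KdTree<pcl::PointXYZ>::Ptr tree(new pcl::search::KdTree<pcl::PointXYZ>);
    tree->setInputCloud(achene_points);

    std::vector<pcl::PointIndices> clusters;
    pcl::EuclideanClusterExtraction<pcl::PointXYZ> ec;
    ec.setClusterTolerance(tolerance);
    ec.setMinClusterSize(5);                         // placeholder: reject isolated noise points
    ec.setSearchMethod(tree);
    ec.setInputCloud(achene_points);
    ec.extract(clusters);

    return clusters.size();                          // one cluster per detected achene
}
```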

Colour

As the hue value in HSV space represents the visual sensation of perceived colour [28], the mean hue across all points in the body segment was calculated to assess strawberry colour.

Statistics

In this study, the concordance correlation coefficient (CCC) was used to measure the concordance between the manually derived and the 3D image derived external fruit quality traits [34]. Additionally, the coefficient of determination (r2) was calculated to estimate correlation between the sets of values. Statistical analysis was performed using R [35]. Linear models and associated coefficients were derived using the “lm” function, the root mean square error (RMSE) was derived using the “Metrics” package [36] and the concordance correlation coefficient (CCC) was derived using the “Agreement” package [37].
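For reference, Lin's concordance correlation coefficient between two sets of measurements x and y can be written as

ρ_c = 2ρσ_xσ_y / (σ_x² + σ_y² + (μ_x − μ_y)²),

where ρ is the Pearson correlation coefficient and μ and σ² denote the respective means and variances. Unlike r2, the CCC penalises systematic deviation from the identity line y = x as well as scatter, which is why it is reported alongside r2 here.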

Results

In order to evaluate the measurement of seven external strawberry fruit quality parameters using 3D imaging (hereafter referred to as automated assessment), 100 berries were assessed both automatically and manually. Reconstruction was successful for all samples using a minimum of 37 images per berry, though the nose of the fruit was often missing due to occlusion at the shooting angle. With the described setup, data capture took approximately 60 s, including 10 s of operator action per sample. Model reconstruction took approximately 15 min and parameter derivation took approximately 50 s; both operations were fully automated.

In order to validate the 3D reconstruction, the point cloud of the holder segment was manually measured in MeshLab [38], an open source software package for 3D mesh visualisation. Although its absolute size in image space was inconsistent between samples (range 0.36–1.73; mean 0.78; SD 0.27), the ratios among height, width and length were similar to the true ratios. As there was no evidence of distortion, the known height of the holder was used as a standard for fruit dimension measurements. Moreover, incorporation of the holder point cloud ensured that the vertical axis was always greater than any other axis, allowing the major eigenvector of the point cloud covariance matrix to consistently define the vertical axis.

To validate the measurements, the seven traits were measured on a sample of 100 fruits using both manual and automated assessment (Fig. 4). Concordance and correlation were assessed using CCC and r2 respectively. Good concordance (CCC > 0.9) and correlation (r2 > 0.9) were found between the measurements of fruit dimensions and volume (Fig. 4a–d). Weaker concordance (CCC = 0.86) and correlation (r2 = 0.87) were found between the measurements of calyx size (Fig. 4e), possibly because the soft calyx was moved during assessment. Weak concordance (CCC = 0.67) and correlation (r2 = 0.77) were found between the measurements of achene number (Fig. 4f), possibly due to the lack of information gathered regarding the nose of the fruit. Weak correlation (r2 = 0.68) was found between the measurements of colour (Fig. 4g), with high variance in the manual scores. This was likely due to the variability of colour on each fruit and the subjective nature of the score.

Fig. 4

Regression analysis for height (a), length (b), width (c), volume (d), calyx size (e), achene number (f) and colour (g) as measured by automated assessment and manual assessment. Sample size = 100 for all measurements, except achene number, where sample size = 10. Red lines are least squares linear regression lines and black lines represent the ideal 1:1 relationship (y = x)

Discussion

Good concordance between manual and automated measurements of calyx size, height, length, width and volume, and promising results for achene number and colour were achieved. It is suggested that the qualitative traits of strawberry currently used in breeding can be represented using the measurements generated from this study. For instance, a “long conic” [1] fruit has a large ratio of height to width and measurement of “Cap size” [1] can be defined by the ratio of calyx size to fruit width and length.

With further development, automated assessment could be suitable for integration into existing strawberry breeding programmes, bringing a range of advantages. Firstly, the quantitative, accurate and unbiased measurements would increase the accuracy of selection in strawberry breeding. The precise measurements would be particularly suitable for input into models of genomic selection, which attempt to quantify small-effect quantitative trait loci (QTLs) associated with polygenic traits [39, 40]. Secondly, automated assessment has the potential to improve the speed of assessment. The described setup requires approximately 10 s of human operator time per sample, approximately 20-fold faster than making the equivalent manual measurements. Thirdly, the low cost and wide availability of hardware mean that this approach can be introduced into existing breeding programmes with minimal capital expenditure.

Measurement error may have arisen from a range of sources. During manual assessment, the axis of measurement was determined by eye, potentially resulting in non-maximal distances or non-orthogonal axes. As the calyx is soft, errors may have been induced in the operation of the callipers. Correlation between the measurements of colour may be weak as manual assessment is subjective and it is difficult to assess fruit with uneven colour distributions.

This imaging system could be developed to reduce the duration of data capture by using alternative imagers, such as scientific cameras or webcams with programmable shutter speeds and resolutions. Reducing the resolution to 1000 × 1450 greatly increases processing speed compared with the original images, but further investigation is needed to identify the minimum resolution that still generates satisfactory point clouds. Use of multiple calibrated cameras to capture information from different viewpoints simultaneously could also be explored to improve the quality of 3D reconstruction, particularly around the nose of the fruit, and to increase the data capture speed.

As both fruit body and achenes have a range of colours, our current algorithm of arbitrary hue thresholding is unlikely to be reliable in identifying achenes from a range of cultivars. More sophisticated adaptive or texture based thresholding algorithms would likely improve the cluster identification.

It is believed that more traits could be derived from the gathered dataset. Firstly, algorithms exist that can calculate the surface area of a 3D mesh [31], which together with reliable achene counts could be used to quantify achene density. Secondly, rotational symmetry could be quantified by considering the distribution of the Euclidean distances of points to the principal axis in 2D slices of the point cloud orthogonal to the principal axis. Thirdly, it may be possible to quantify the morphology of the fruit body at the neck of the fruit to derive information regarding the neck line.

Conclusion

In this study, an MVS-based 3D reconstruction pipeline was developed and utilised to generate in silico models of strawberries. Automated 3D point cloud analysis software was developed in house to derive achene count, calyx size, colour, height, length, width and volume from each model. This study found good correlation between the automated and manual assessment techniques for dimension measurements and volume, suggesting that automated assessment is a promising technique to be utilised in place of manual assessment for these traits.

The focus of this study was the investigation of the use of 3D imaging to phenotype strawberries for commercial breeding. With further improvement, this system could provide quantitative, accurate and rapid measurements and could be integrated into existing strawberry breeding programmes with little capital investment. This approach could also be developed further for strawberry quality control, as its high precision is particularly suited to assessing differences within single cultivars, a situation frequently encountered in pack houses.

References

  1. Mathey MM, Mookerjee S, Gündüz K, Hancock JF, Iezzoni AF, Mahoney LL, et al. Large-scale standardized phenotyping of strawberry in RosBREED. J Am Pomol Soc. 2013;67:205–16.

  2. Chandler C, Folta K, Dale A, Whitacker VM, Herrington M. Chapter 9—strawberry. In: Badeneses ML, Byrne DH, editors. Fruit breeding. Berlin: Springer; 2012. p. 305–25.

  3. Goddard ME, Hayes BJ. Genomic selection. J Anim Breed Genet. 2007;124:323–30.

  4. Dadwal M, Banga VK. Color image segmentation for fruit ripeness detection: a review. In: 2nd International Conference on Electrical Electronics and Civil Engineering; 2012. p. 190–3.

  5. Throop JA, Aneshansley DJ, Anger WC, Peterson DL. Quality evaluation of apples based on surface defects: development of an automated inspection system. Postharvest Biol Technol. 2005;36:281–90.

  6. Blasco J, Aleixos N, Moltó E. Computer vision detection of peel defects in citrus by means of a region oriented segmentation algorithm. J Food Eng. 2007;81:535–43.

  7. Kang SP, East AR, Trujillo FJ. Colour vision system evaluation of bicolour fruit: a case study with “B74” mango. Postharvest Biol Technol. 2008;49:77–85.

  8. Mendoza F, Aguilera JM. Application of image analysis for classification of ripening bananas. Food Eng Phys Prop. 2004;69:471–7.

  9. Blasco J, Aleixos N, Moltó E. Machine vision system for automatic quality grading of fruit. Biosyst Eng. 2003;85:415–23.

  10. Costa C, Menesatti P, Paglia G, Pallottino F, Aguzzi J, Rimatori V, et al. Quantitative evaluation of Tarocco sweet orange fruit shape using optoelectronic elliptic Fourier based analysis. Postharvest Biol Technol. 2009;54:38–47.

  11. Liming X, Yanchao Z. Automated strawberry grading system based on image processing. Comput Electron Agric. 2010;71S:32–9.

  12. Nagata M, Bato PM, Mitarai M, Qixin C, Kitahara T. Study on sorting system for strawberry using machine vision (part 1). Development of software for determining the direction of strawberry (Akihime variety). J Jpn Soc Agric Mach. 2000;62:100–10.

  13. Paulus S, Behmann J, Mahlein AK, Plümer L, Kuhlmann H. Low-cost 3D systems: suitable tools for plant phenotyping. Sensors. 2014;14:3001–18.

  14. Vázquez-Arellano M, Griepentrog HW, Reiser D, Paraforos DS. 3-D imaging systems for agricultural applications—a review. Sensors. 2016;16:1–24.

  15. Kjaer KH, Ottosen C-O. 3D laser triangulation for plant phenotyping in challenging environments. Sensors. 2015;15:13533–47.

  16. Paulus S, Dupuis J, Mahlein A, Kuhlmann H. Surface feature based classification of plant organs from 3D laserscanned point clouds for plant phenotyping. BMC Bioinformatics. 2013;14:238.

  17. Zhang Y, Teng P, Shimizu Y, Hosoi F, Omasa K. Estimating 3D leaf and stem shape of nursery paprika plants by a novel multi-camera photography system. Sensors. 2016;16:874.

  18. Klodt M, Herzog K, Töpfer R, Cremers D. Field phenotyping of grapevine growth using dense stereo reconstruction. BMC Bioinformatics. 2015;16:143.

  19. Alenyà G, Dellen B, Torras C. 3D modelling of leaves from color and ToF data for robotized plant measuring. In: Proceedings of IEEE international conference on robotics and automation; 2011. p. 3408–14.

  20. Klose R, Penlington J, Ruckelshausen A. Usability study of 3D time-of-flight cameras for automatic plant phenotyping. Image Analysis for Agricultural Products and Processes. 2011;69:93–105

  21. Chéné Y, Rousseau D, Lucidarme P, Bertheloot J, Caffier V, Morel P, et al. On the use of depth camera for 3D phenotyping of entire plants. Comput Electron Agric. 2012;82:122–7.

  22. Chalidabhongse T, Yimyam P, Sirisomboon P. 2D/3D vision-based mango’s feature extraction and sorting. In: 9th international conference on Control, Automation, Robotics and Vision, 2006, ICARCV’06; 2006.

  23. Golbach F, Kootstra G, Damjanovic S, Otten G, van de Zedde R. Validation of plant part measurements using a 3D reconstruction method suitable for high-throughput seedling phenotyping. Mach Vis Appl. 2015;27:663–80.

  24. Rose JC, Paulus S, Kuhlmann H. Accuracy analysis of a multi-view stereo approach for phenotyping of tomato plants at the organ level. Sensors. 2015;15:9651–65.

  25. Fonstad MA, Dietrich JT, Courville BC, Jensen JL, Carbonneau PE. Topographic structure from motion: a new development in photogrammetric measurement. Earth Surf Proc Land. 2013;38:421–30.

  26. Westoby MJ, Brasington J, Glasser NF, Hambrey MJ, Reynolds JM. “Structure-from-motion” photogrammetry: a low-cost, effective tool for geoscience applications. Geomorphology. 2012;179:300–14.

  27. Rusu RB, Cousins S. 3D is here: point cloud library. In: Point Cloud Library http://pointclouds.org/. Accessed 15 July 2017.

  28. Wu D, Sun DW. Colour measurements by computer vision for food quality control—a review. Trends Food Sci Technol. 2013;29:5–20.

  29. Ding S, Mannan MA, Poo AN. Oriented bounding box and octree based global interference detection in 5-axis machining of free-form surfaces. Comput Aided Des. 2004;36:1281–94.

  30. Kazhdan M, Bolitho M, Hoppe H. Poisson surface reconstruction. In: Eurographics symposium on geometry processing; 2006. p. 61–70.

  31. Zhang C, Chen T. Efficient feature extraction for 2D/3D objects in mesh representation. In: Proceedings of international conference on image processing; 2001, p. 1–4.

  32. Cupec R, Nyarko E, Filko D. Fast 2.5D mesh segmentation to approximately convex surfaces. In: 5th European conference on mobile robotics; 2011. p. 3–8.

  33. Dixon SJ, Brereton RG. Comparison of performance of five common classifiers represented as boundary methods: Euclidean distance to centroids, linear discriminant analysis, quadratic discriminant analysis, learning vector quantization and support vector machines, as dependent on data structure. Chemom Intell Lab Syst. 2009;95:1–17.

  34. Lin LI. A concordance correlation coefficient to evaluate reproducibility. Biometrics. 1989;45:255–68.

  35. R Core Team. R: A language and environment for statistical computing; 2017.

  36. Hamner B. Package “Metrics” version 0.1.2. In: The Comprehensive R Archive Network. 2017. http://cran.r-project.org/web/packages/Metrics/Metrics.pdf. Accessed 12 July 2017.

  37. Yu Y, Lawrence L. Package “Agreement” version 0.8-1. In: The Comprehensive R Archive Network. 2015. http://cran.r-project.org/web/packages/Agreement/Agreement.pdf. Accessed 12 July 2017.

  38. Cignoni P, Callieri M, Corsini M, Delepiane M, Ganovelli F, Ranzuglia G. MeshLab: an open-source mesh processing tool. Sixth Eurographics Italian chapter conference; 2008. p. 129–36.

  39. Meuwissen THE, Hayes BJ, Goddard ME. Prediction of total genetic value using genome-wide dense marker maps. Genetics. 2001;157:1819–29.

  40. Gezan SA, Osorio LF, Verma S, Whitaker VM. An experimental validation of genomic selection in octoploid strawberry. Hortic Res. 2017;4:1–9.

Authors’ contributions

BL devised experiments, developed software and guided the data analysis. JQH assisted in experimental design, conducted the manual assessment and analysed the data. BL and JQH prepared the manuscript. RJH assisted in experimental design and editorial oversight. All authors read and approved the final manuscript.

Competing interests

The authors declare that they have no competing interests.

Availability of data and materials

The software developed and datasets generated and analysed during the current study are available from the Image-processing repository at the East Malling Github site (www.github.com/organizations/eastmallingresearch/).

Consent for publication

All authors of the manuscript have read and agreed to its content and are accountable for all aspects of the accuracy and integrity of the manuscript.

Ethics approval and consent to participate

Not applicable.

Funding

This research was supported by the Agriculture and Horticulture Development Board (AHDB) through Studentship Project CP163, by Biotechnology and Biological Sciences Research Council (BBSRC) project BB/M01200X/1 awarded to RJH, and by Engineering and Physical Sciences Research Council (EPSRC) project EP/R005583/1.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Author information

Corresponding author

Correspondence to Bo Li.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

Cite this article

He, J.Q., Harrison, R.J. & Li, B. A novel 3D imaging system for strawberry phenotyping. Plant Methods 13, 93 (2017). https://doi.org/10.1186/s13007-017-0243-x
