A robot-assisted imaging pipeline for tracking the growth of maize ears and silks in a high-throughput phenotyping platform
© The Author(s) 2017
Received: 4 August 2017
Accepted: 25 October 2017
Published: 8 November 2017
In maize, silks are the hundreds of filaments that simultaneously emerge from the ear to collect pollen over a period of 1–7 days, a duration that largely determines grain number, especially under water deficit. Silk growth is a major trait for drought tolerance in maize, but phenotyping it at the throughput needed for genetic analyses is difficult.
We have developed a reproducible pipeline that follows ear and silk growth daily for hundreds of plants, based on an ear detection algorithm that drives a robotized camera to obtain detailed images of ears and silks. We first select, among 12 whole-plant side views, those best suited for detecting ear position. Images are segmented, stem pixels are labelled and the ear position is identified based on changes in width along the stem. A mobile camera is then automatically positioned in real time at 30 cm from the ear for a detailed picture, in which silks are identified based on texture and colour. This allows analysis of the time course of ear and silk growth for thousands of plants. The pipeline was tested on a panel of 60 maize hybrids in the PHENOARCH phenotyping platform. Across 360 plants, ear position was correctly estimated in 86% of cases, before it could be assessed visually. Silk growth rate, estimated on all plants, decreased with time, consistent with the literature. The pipeline allowed clear identification of the effects of genotype and water deficit on the rate and duration of silk growth.
The pipeline presented here, which combines computer vision, machine learning and robotics, provides a powerful tool for large-scale genetic analyses of the response of reproductive growth to changes in environmental conditions, in a non-invasive and automatized way. It is available as Open Source software in the OpenAlea platform.
Maize (Zea mays L.) silks are styles emerging from modified leaf sheaths (husks) that enclose the ear. Silk emergence and growth largely determine the final number of ovaries that develop into grains [1–3]. This is of particular importance under water deficit, because grain abortion is largely controlled in this case by the time during which silks elongate outside the husks. This time can range from 1 day in water deficit, associated with abortion rates of 70–90%, to 7 days in well-watered plants with low abortion rate [1, 4, 5]. The drought-dependent abortion rate is one of the main causes of the high sensitivity of maize to water deficit, so a precise characterization of silk growth and of its response to water deficit is crucial for estimating the degree of sensitivity of maize varieties to water deficit.
Silk number and growth have been measured by cutting cross sections of the silk bundle emerged from husks, then counting and measuring silk segments by image analysis [6–8]. This method is invasive and laborious because silks need to be sampled daily (up to 15 min per sample). Silk growth can also be followed with displacement transducers, thereby providing precise measurements of silk growth dynamics and of its response to water deficit [10, 11]. This method, however, is time-consuming and can be performed on a few tens of plants at most. Hence, current methods provide accurate estimates of silk number and growth but cannot be used at the throughput required for genetic analyses.
Phenotyping platforms based on computer vision are powerful tools for capturing, at high throughput, a number of traits related to the structure and function of plants [12–14], such as the detection, counting and quantification of morphological features of oat inflorescences, maize tassels [16, 17] and rice panicles. Most imaging methods are based on camera viewpoints at fixed positions. This limits the possibilities for extracting complete information from complex images, and thus requires manual selection of the best views containing useful information [12, 19, 20]. This is the case for maize ears, whose position along the stem differs among genotypes and treatments and which are often hidden by leaves. Three problems need to be solved for automating image analysis of ear and silk growth, namely (1) detecting the position of the ear along the stem before the ear is visible (a non-intuitive detection that requires the skills of maize experts), (2) identifying the best viewpoints for capturing silk growth dynamics and (3) following silk growth during the 1–7 days during which silks elongate outside the husks. Robot-assisted imaging may help solve these three problems by establishing a loop between image acquisition, analysis and de novo positioning of sensors [12, 21–23]. Thus, partial information recovered from an initial set of fixed viewpoints can be used to calculate new viewpoints containing maximum information and to guide a robot to acquire new images.
In this paper, we have combined computer vision methods, machine learning and robotics to develop a non-invasive, reproducible, and automatized pipeline for detecting maize ears and silks and monitoring silk growth dynamics in a high-throughput phenotyping platform. The methods presented here were tested in a panel of 60 maize genotypes subjected to different water availabilities in the PHENOARCH phenotyping platform (http://bioweb.supagro.inra.fr/phenoarch).
The pipeline presented here involved six steps, namely (1) multi-view whole plant acquisition, (2) image segmentation, (3) detection of side view images containing maximum information, (4) detection of potential ear position, (5) robot-assisted movement of a camera near the ear and (6) ear and silk image acquisition and analysis.
Step 1: Multi-view whole plant acquisition
RGB colour images (2056 × 2454 pixels) of each plant were taken daily with thirteen views (twelve side views at 30° rotational increments and one top view) using the imaging units of the PHENOARCH platform. Each unit is composed of a cabin housing top and side RGB cameras (Grasshopper3, Point Grey Research, Richmond, BC, Canada) equipped with a 12.5–75 mm TV zoom lens (Pentax, Ricoh Imaging, France) and LED illumination (5050–6500 K colour temperature). Images were captured while the plant was rotating at a constant rate (20 rpm) driven by a brushless motor (Rexroth, Germany).
Step 2: Image segmentation
For top view images (Fig. 1d), plant pixels were segmented from the background (Fig. 1f) using a decision tree classifier fed with seven colour spaces (RGB, HSV, Luv, Lab, HLS, xyz, Yuv) (Fig. 1e), implemented in R. The classifier had previously been trained on a set of contrasting top-view images involving plants of different genotypes and growth stages.
Step 3: Selection of side view images containing maximum information
Step 4: Detection of the most likely position of the ear in selected side view images
Knowing that the primary ear is located in the upper half of the stem, we estimated a reference internode width in the lower half of the stem and used it to detect the position of thinner internodes in the upper half. To that end, we ordered width values in the lower half and kept the 15th percentile, thereby eliminating artefactual width peaks corresponding to leaf junctions, to leaves occluding the stem and to errors related to the image of the stake supporting the plant (Additional file 1). The most likely position of the ear was then estimated in the upper half by detecting (1) long peaks, presumably corresponding to the ear position, followed by (2) internodes with significantly lower width than that in the lower half of the stem. Because artefacts such as wide leaf junctions or broken leaves along the stem may affect apparent internode width, the two criteria were weighted by 1 and 2, respectively. In cases where multiple side view images were selected, the last step consisted in keeping the most represented positions by iteratively discarding those with the highest deviation.
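The width-profile heuristic above can be sketched as follows; the function name, the simplified combination of the two criteria and the thinness threshold are assumptions for illustration, not the pipeline's exact weighting scheme.

```python
# Sketch of step 4: take the 15th percentile of stem widths in the lower
# half as a robust reference (discarding leaf-junction peaks), then scan
# the upper half for a wide peak (candidate ear) followed by clearly
# thinner internodes. Threshold and names are illustrative.
import numpy as np

def candidate_ear_index(widths, thin_ratio=0.8):
    """widths: stem width (px) sampled from stem base (index 0) to top."""
    n = len(widths)
    lower = np.asarray(widths[: n // 2], dtype=float)
    upper = np.asarray(widths[n // 2 :], dtype=float)
    # Robust reference width: the 15th percentile of the lower half
    ref = np.percentile(lower, 15)
    for i in range(len(upper) - 1):
        wide_peak = upper[i] > ref                        # criterion 1
        thinner_above = upper[i + 1:].mean() < thin_ratio * ref  # criterion 2
        if wide_peak and thinner_above:
            return n // 2 + i        # index of the candidate ear along the stem
    return None                      # no position satisfying both criteria
```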
Step 5: Moving a camera close to the ear
In the second cabin, plants were automatically oriented so that the plane containing the leaves (as identified in step 3) was perpendicular to the camera axis, using a brushless motor (Rexroth, Germany) that rotates plants with a precision of 0.1° (Fig. 4b). A robotized arm carrying the camera (Fig. 4a) was automatically positioned at 30 cm from the ear (Fig. 4c). This movement was driven by a linear profile axis able to move in the x, y, z directions (1500, 1000, 4000 mm moving range, respectively), equipped with electric synchronous servomotors (Rexroth, Germany) (Fig. 4a). Ears and silks were imaged with an RGB camera (Grasshopper3, Point Grey Research, Richmond, BC, Canada) equipped with a C-mount 50 mm fixed focal length lens (Computar, CBC Group, USA), carried by the arm (Fig. 4d).
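As a purely hypothetical sketch of how the detected ear position could drive the arm, the ear's pixel coordinates can be converted into millimetre targets for the x, y, z axes; the scale factor, origin and axis conventions below are assumptions for illustration, not the platform's actual control code.

```python
# Hypothetical sketch of step 5: convert the ear position detected in a
# side-view image (pixels) into platform coordinates so the arm can place
# the camera at a fixed 30 cm standoff from the ear.
def camera_target(ear_px, image_origin_px, mm_per_px, plant_base_mm,
                  standoff_mm=300.0):
    """Return (y, z, x) arm target in mm; all reference values are assumed."""
    u, v = ear_px
    u0, v0 = image_origin_px
    y = plant_base_mm[0] + (u - u0) * mm_per_px   # horizontal offset
    z = plant_base_mm[1] + (v0 - v) * mm_per_px   # image rows grow downward
    return y, z, standoff_mm                       # x: fixed 30 cm standoff
```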
Step 6: Analysis of ear and silk images
RGB images (2048 × 2448 pixels) of ears and silks (Fig. 4d) were then analysed using two different methods. First, all pixels corresponding to the plant were extracted from the background by thresholding in HSV colour space. In a second method, pixels corresponding to silks were extracted from the output of the previous step using a random forest classification based on colour (Gaussian smoothing of 5 px) and texture (structure tensor eigenvalues of 1.6 px) (Ilastik software, version 1.1.7). We used for that a machine learning procedure based on a training set of contrasting ear images involving plants of different genotypes at different ear and silk developmental stages (Additional file 3). Finally, the time courses of pixels corresponding to silk bundles were fitted individually for each plant using the R 'segmented' package, and the maximum rate and the duration of silk growth were extracted.
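The broken-line fit performed with the R 'segmented' package can be approximated in Python as a sketch: fit a two-phase model (linear growth then plateau) by scanning candidate breakpoints, then report the slope of the first phase (maximum silk growth rate) and the breakpoint (duration of silk growth). This is a simplified stand-in, not the pipeline's actual fitting code.

```python
# Simplified broken-line fit for the silk pixel time course (step 6):
# phase 1 is a linear growth, phase 2 a constant plateau; the breakpoint
# minimising the residual sum of squares is retained.
import numpy as np

def broken_line_fit(days, pixels):
    days = np.asarray(days, dtype=float)
    pixels = np.asarray(pixels, dtype=float)
    best = None
    for k in range(2, len(days) - 1):          # candidate breakpoint index
        a1, b1 = np.polyfit(days[: k + 1], pixels[: k + 1], 1)  # growth phase
        plateau = pixels[k:].mean()                              # plateau phase
        pred = np.where(days <= days[k], a1 * days + b1, plateau)
        sse = ((pixels - pred) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, a1, days[k])
    _, rate, duration = best
    return rate, duration          # max growth rate (px/day), duration (days)
```

For each plant, `rate` and `duration` would then feed the genotype × treatment analyses described below.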
Plant material, growth conditions and measured traits
The methods presented here were tested in an experiment involving a set of 60 commercial maize hybrids representative of breeding history in Europe over the last 60 years. This material covers a wide range of plant architecture, growth and development, leading to appreciable variability of performance in the field. The experiment was conducted in the PHENOARCH phenotyping platform hosted at M3P, the Montpellier Plant Phenotyping Platforms (https://www6.montpellier.inra.fr/lepse/M3P), which allows non-destructive measurements of plant architecture and growth via automatic image acquisition (see  for platform details). Plants were sown in 9 L pots filled with a 30:70 (v/v) mixture of clay and organic compost. They were grown until 10 days after silk emergence. Two levels of soil water content were imposed: (1) retention capacity (WW, soil water potential of − 0.05 MPa) and (2) water deficit (WD, soil water potential of − 0.3 MPa). Soil water content in pots was maintained at target values by compensating transpired water three times per day, based on individual measurements of each plant. Each genotype was replicated three times. Greenhouse temperature was maintained at 25 ± 3 °C during the day and 20 °C during the night. Supplemental light (150 µmol m−2 s−1) was provided to extend the photoperiod to 16 h per day, and during daytime when solar radiation dropped below 300 W m−2 (400 W HPS Plantastar lamps, OSRAM, Munich, Germany). Micro-meteorological conditions were monitored every 15 min at eight positions in the greenhouse at the top of the plant canopy.
Phenological stages, including anthesis and silk appearance, were scored individually for each plant in the platform. All collected phenotypic, experimental and environmental data were stored in the PHIS information system (http://web.supagro.inra.fr/phis/web/index.php).
Two-way analyses of variance (ANOVA) were performed using the 'lm' procedure to test the effects of water treatment and genotype. All statistical tests and graphs were produced with R 3.1.3.
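The two-way ANOVA performed with R's 'lm' can be sketched in plain numpy for a balanced design; the function name is illustrative, and the no-interaction, one-observation-per-cell layout is a simplification of the experiment's three replicates per genotype.

```python
# Balanced two-way ANOVA without interaction, one observation per cell:
# values[i, j] holds the trait value for genotype i under treatment j.
import numpy as np

def two_way_anova(values):
    """Return (F_genotype, F_treatment) for a balanced two-way design."""
    y = np.asarray(values, dtype=float)
    a, b = y.shape
    grand = y.mean()
    ss_rows = b * ((y.mean(axis=1) - grand) ** 2).sum()   # genotype effect
    ss_cols = a * ((y.mean(axis=0) - grand) ** 2).sum()   # treatment effect
    ss_tot = ((y - grand) ** 2).sum()
    ss_err = ss_tot - ss_rows - ss_cols                   # residual
    ms_rows = ss_rows / (a - 1)
    ms_cols = ss_cols / (b - 1)
    ms_err = ss_err / ((a - 1) * (b - 1))
    return ms_rows / ms_err, ms_cols / ms_err             # F statistics
```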
Results and discussion
Selection of side images containing maximum information
Number of side view images selected for detecting ear positions, based on the successive use of two robust major axis regressions (step 3). Table columns: 1st regression (%); 2nd regression (%).
Detection of most likely position of the ear in each side view image
The ‘stem width’ curves of all studied plants consisted of alternating low values, corresponding to internodes, and peaks, corresponding to leaves (numbered in red in Fig. 6b). Stem width in the lower half of the plant (nodes 5–9) could be identified with acceptable accuracy (20.6 ± 4.8 mm) in 85% of plants. This was the case when the reference stem width was calculated as the 15th percentile of values, whereas using mean values resulted in erroneous estimates in 40% of cases. In the plant shown in Fig. 6, the wider internode between leaves 10 and 11 corresponded to the ear (first criterion for ear detection). This was consistent with the second criterion, a clear decrease in internode width from internode 11 onwards (vertical arrow, Fig. 6b). This analysis was performed for each selected side image of each plant to estimate potential ear positions (red dot, Fig. 6a).
Percentages of ear position correctly detected as a function of days before the appearance of silks. Table rows: days before silking.
XYZ positioning of a mobile camera and image acquisition
Dynamic monitoring of silk growth traits and differences between genotypes and water treatments
The imaging procedure presented above resulted in ear images with a high spatial and temporal resolution, and allowed us to monitor silk growth dynamics from silk emergence until silk senescence in primary ears (Fig. 7a; see video in Additional file 8).
Output of an analysis of variance performed on maximum rates of silk growth and duration of silk growth. Table columns: silk growth rate (pixel day−1); silk growth duration (day).
By combining computer vision methods and robotics, the pipeline presented here provides, for the first time, an automatic and non-invasive procedure for monitoring silk growth dynamics at high throughput in a phenotyping platform. It automatically detected ear position and evaluated silk growth in a panel of maize genotypes with contrasting sizes and architectures. It therefore provides a powerful tool for large-scale genetic analyses of the response of reproductive growth to changes in environmental conditions, in a non-invasive and automatized way.
NB, OT, FT, CW and LCB planned and designed the research. NB and LCB performed the experiments and analysed data. CF, CP and OS provided advice on the conception of the pipeline. NB, CF and SA implemented the code. NB, FT and LCB wrote the manuscript. All authors read and approved the manuscript.
Authors are grateful to Nathalie Luchaire, Benoît Suard, Thomas Laisné, Luciana Galizia, Alexandra Manset-Sarcos, Awaz Mohamed and Adel Meziane for their help in conducting the experiment.
The authors declare that they have no competing interests.
Availability of data and materials
The source code and examples are available on GitHub (https://github.com/openalea/eartrack) under an Open Source license (CeCILL-C). The code has been integrated as a reusable package in the OpenAlea platform [54, 55]. User and developer documentation is available at http://eartrack.readthedocs.io. A subset of whole plant images is available at https://zenodo.org/record/1002675, and a subset of ear images, the Ilastik project and its outputs are available at https://zenodo.org/record/1002173. The pipeline requires Python 2.7 and the OpenCV library.
Consent for publication
All the authors have approved the manuscript and have made all required statements and declarations.
This work was supported by the “Infrastructure Biologie Santé” Phenome supported by the National Research Agency and the “Programme d’Investissements d’Avenir” (PIA) (ANR-11-INBS-0012).
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.
- Oury V, Tardieu F, Turc O. Ovary apical abortion under water deficit is caused by changes in sequential development of ovaries and in silk growth rate in maize. Plant Physiol. 2016;171:986–96.
- Edmeades G, Bolaños J, Hernandez M, Bello S. Causes for silk delay in a lowland tropical maize population. Crop Sci. 1993;33:1029–35.
- Edmeades GO, Bolanos J, Elings A, Ribaut J-M, Bänziger M, Westgate ME. The role and regulation of the anthesis-silking interval in maize. In: Westgate M, Boote K, editors. Physiology and modeling kernel set in maize. CSSA Spec. Publ. 29. Madison: CSSA and ASA; 2000. p. 43–73. doi:10.2135/cssaspecpub29.c4.
- Fuad-Hassan A, Tardieu F, Turc O. Drought-induced changes in anthesis-silking interval are related to silk expansion: a spatio-temporal growth analysis in maize plants subjected to soil water deficit. Plant Cell Environ. 2008;31:1349–60.
- Bolanos J, Edmeades GO, Martinez L. Eight cycles of selection for drought tolerance in lowland tropical maize. III. Responses in drought-adaptive physiological and morphological traits. Field Crops Res. 1993;31:269–86.
- Anderson SR, Farrington RL, Goldman DM, Hanselman TA, Hausmann NJ, Schussler JR. Methods for counting corn silks or other plural elongated strands and use of the count for characterizing the strands or their origin. U.S. Patent Application No. 12/545,266; 2009.
- Bassetti P, Westgate ME. Floral asynchrony and kernel set in maize quantified by image analysis. Agron J. 1994;86:699–703.
- Carcova J, Uribelarrea M, Borrás L, Otegui ME, Westgate ME. Synchronous pollination within and between ears improves kernel set in maize. Crop Sci. 2000;40:1056–61.
- Monneveux P, Ribaut J-M, Okono A. Drought phenotyping in crops: from theory to practice. Frontiers E-books; 2014.
- Fuad-Hassan A, Tardieu F, Turc O. Drought-induced changes in anthesis-silking interval are related to silk expansion: a spatio-temporal growth analysis in maize plants subjected to soil water deficit. Plant Cell Environ. 2008;31:1349–60.
- Turc O, Bouteillé M, Fuad-Hassan A, Welcker C, Tardieu F. The growth of vegetative and reproductive structures (leaves and silks) respond similarly to hydraulic cues in maize. New Phytol. 2016;212:377–88.
- Tardieu F, Cabrera-Bosquet L, Pridmore T, Bennett M. Plant phenomics, from sensors to knowledge. Curr Biol. 2017;27:R770–83.
- Bucksch A, Atta-Boateng A, Azihou AF, Battogtokh D, Baumgartner A, Binder BM, Braybrook SA, Chang C, Coneva V, DeWitt TJ. Morphological plant modeling: unleashing geometric and topological potential within the plant sciences. Front Plant Sci. 2017;8:900.
- Cabrera-Bosquet L, Crossa J, von Zitzewitz J, Serret MD, Araus JL. High-throughput phenotyping and genomic selection: the frontiers of crop breeding converge. J Integr Plant Biol. 2012;54:312–20.
- Boyle R, Corke F, Howarth C. Image-based estimation of oat panicle development using local texture patterns. Funct Plant Biol. 2015;42:433–43.
- Tang W, Zhang Y, Zhang D, Yang W, Li M. Corn tassel detection based on image processing. In: 2012 International workshop on image processing and optical engineering. International Society for Optics and Photonics; 2011. p. 83350J.
- Gage JL, Miller ND, Spalding EP, Kaeppler SM, de Leon N. TIPS: a system for automated image-based phenotyping of maize tassels. Plant Methods. 2017;13:21.
- Duan L, Huang C, Chen G, Xiong L, Liu Q, Yang W. Determination of rice panicle numbers during heading by multi-angle imaging. Crop J. 2015;3:211–9.
- Gibbs JA, Pound M, French AP, Wells DM, Murchie E, Pridmore T. Approaches to three-dimensional reconstruction of plant shoot topology and geometry. Funct Plant Biol. 2017;44:62.
- Duan T, Chapman S, Holland E, Rebetzke G, Guo Y, Zheng B. Dynamic quantification of canopy structure to characterize early plant vigour in wheat genotypes. J Exp Bot. 2016;67:4523–34.
- Paulus S, Schumann H, Kuhlmann H, Léon J. High-precision laser scanning system for capturing 3D plant architecture and analysing growth of cereal plants. Biosys Eng. 2014;121:1–11.
- Jahnke S, Roussel J, Hombach T, Kochs J, Fischbach A, Huber G, Scharr H. phenoSeeder—a robot system for automated handling and phenotyping of individual seeds. Plant Physiol. 2016;172:1358–70.
- Vadez V, Kholova J, Hummel G, Zhokhavets U, Gupta SK, Hash CT. LeasyScan: a novel concept combining 3D imaging and lysimetry for high-throughput phenotyping of traits controlling plant water budget. J Exp Bot. 2015;66:5581–93.
- Cabrera-Bosquet L, Fournier C, Brichet N, Welcker C, Suard B, Tardieu F. High-throughput estimation of incident light, light interception and radiation-use efficiency of thousands of plants in a phenotyping platform. New Phytol. 2016;212:269–81.
- Comaniciu D, Meer P. Mean shift: a robust approach toward feature space analysis. IEEE Trans Pattern Anal Mach Intell. 2002;24:603–19.
- Fournier C, Artzet S, Chopard J, Mielewczik M, Brichet N, Cabrera L, Sirault X, Cohen-Boulakia S, Pradal C. Phenomenal: a software framework for model-assisted analysis of high throughput plant phenotyping data. In: IAMPS 2015 (international workshop on image analysis methods for the plant sciences). Louvain-la-Neuve; 2015.
- Van Rossum G, Drake FL. Python language reference manual. Network Theory; 2003. ISBN 0954161785.
- Sural S, Qian G, Pramanik S. Segmentation and histogram generation using the HSV color space for image retrieval. In: Proceedings of the 2002 international conference on image processing. IEEE; 2002.
- Breiman L, Friedman JH, Olshen RA, Stone CJ. Classification and regression trees. Monterey: Wadsworth & Brooks; 1984.
- R Core Team. R: a language and environment for statistical computing. Vienna: R Foundation for Statistical Computing; 2015.
- Hinich MJ, Talwar PP. Simple method for robust regression. J Am Stat Assoc. 1975;70:113–9.
- Lejeune P, Bernier G. Effect of environment on the early steps of ear initiation in maize (Zea mays L.). Plant Cell Environ. 1996;19:217–24.
- van der Walt S, Schönberger JL, Nunez-Iglesias J, Boulogne F, Warner JD, Yager N, Gouillart E, Yu T. scikit-image: image processing in Python. PeerJ. 2014;2:e453.
- Dijkstra E. A note on two problems in connexion with graphs. Numer Math. 1959;1:269–71.
- Maurer CR, Qi R, Raghavan V. A linear time algorithm for computing exact Euclidean distance transforms of binary images in arbitrary dimensions. IEEE Trans Pattern Anal Mach Intell. 2003;25:265–70.
- Salvi J, Armangué X, Batlle J. A comparative review of camera calibrating methods with accuracy evaluation. Pattern Recogn. 2002;35:1617–35.
- Sommer C, Straehle C, Kothe U, Hamprecht FA. Ilastik: interactive learning and segmentation toolkit. In: 8th IEEE international symposium on biomedical imaging, Chicago. IEEE; 2011. p. 230–3.
- Muggeo VM. segmented: an R package to fit regression models with broken-line relationships. R News. 2008;8(1):20–5.
- Hartmann A, Czauderna T, Hoffmann R, Stein N, Schreiber F. HTPheno: an image analysis pipeline for high-throughput plant phenotyping. BMC Bioinform. 2011;12:148.
- Golzarian MR, Frick RA, Rajendran K, Berger B, Roy S, Tester M, Lun DS. Accurate inference of shoot biomass from high-throughput images of cereal plants. Plant Methods. 2011;7:2.
- Knecht AC, Campbell MT, Caprez A, Swanson DR, Walia H. Image Harvest: an open-source platform for high-throughput plant image processing and analysis. J Exp Bot. 2016;67:3587–99.
- Klukas C, Chen D, Pape J-M. Integrated analysis platform: an open-source information system for high-throughput plant phenotyping. Plant Physiol. 2014;165:506–18.
- Coupel-Ledru A, Lebon E, Christophe A, Gallo A, Gago P, Pantin F, Doligez A, Simonneau T. Reduced nighttime transpiration is a relevant breeding target for high water-use efficiency in grapevine. Proc Natl Acad Sci USA. 2016;113:8963–8.
- Coupel-Ledru A, Lebon É, Christophe A, Doligez A, Cabrera-Bosquet L, Péchier P, Hamard P, This P, Simonneau T. Genetic variation in a grapevine progeny (Vitis vinifera L. cvs Grenache × Syrah) reveals inconsistencies between maintenance of daytime leaf water potential and response of transpiration rate under drought. J Exp Bot. 2014;65:6205–18.
- Coupel-Ledru A, Tyerman S, Masclef D, Lebon E, Christophe A, Edwards EJ, Simonneau T. Abscisic acid down-regulates hydraulic conductance of grapevine leaves in isohydric genotypes only. Plant Physiol. 2017. doi:10.1104/pp.17.00698.
- Lopez G, Pallas B, Martinez S, Lauri P-É, Regnard J-L, Durel C-É, Costes E. Genetic variation of morphological traits and transpiration in an apple core collection under well-watered conditions: towards the identification of morphotypes with high water use efficiency. PLoS ONE. 2015;10:e0145540.
- Maddonni GA, Otegui ME, Andrieu B, Chelle M, Casal JJ. Maize leaves turn away from neighbors. Plant Physiol. 2002;130:1181–9.
- Girardin P, Tollenaar M. Effects of intraspecific interference on maize leaf azimuth. Crop Sci. 1994;34:151–5.
- McCormick RF, Truong SK, Mullet JE. 3D sorghum reconstructions from depth images identify QTL regulating shoot architecture. Plant Physiol. 2016;172:823–34.
- Paulus S, Dupuis J, Mahlein A-K, Kuhlmann H. Surface feature based classification of plant organs from 3D laserscanned point clouds for plant phenotyping. BMC Bioinform. 2013;14:238.
- Burgess AJ, Retkute R, Pound MP, Mayes S, Murchie EH. Image-based 3D canopy reconstruction to determine potential productivity in complex multi-species crop systems. Ann Bot. 2017;119:517–32.
- Bassetti P, Westgate ME. Emergence, elongation, and senescence of maize silks. Crop Sci. 1993;33:271–5.
- Cárcova J, Andrieu B, Otegui M. Silk elongation in maize. Crop Sci. 2003;43:914–20.
- Pradal C, Dufour-Kowalski S, Boudon F, Fournier C, Godin C. OpenAlea: a visual programming and component-based software platform for plant modelling. Funct Plant Biol. 2008;35:751–60.
- Pradal C, Fournier C, Valduriez P, Cohen-Boulakia S. OpenAlea: scientific workflows combining data analysis and simulation. In: Gupta A, Rathbun S, editors. 27th international conference on scientific and statistical database management (SSDBM 2015), San Diego. New York: ACM; 2015. ISBN 978-1-4503-3709-0.