
Deep learning for detecting herbicide weed control spectrum in turfgrass

Abstract

Background

Precision spraying of postemergence herbicides according to the herbicide weed control spectrum can substantially reduce herbicide input. The objective of this research was to evaluate the effectiveness of using deep convolutional neural networks (DCNNs) for detecting and discriminating weeds growing in turfgrass based on their susceptibility to ACCase-inhibiting and synthetic auxin herbicides.

Results

GoogLeNet, MobileNet-v3, ShuffleNet-v2, and VGGNet were trained to discriminate the vegetation into three categories based on the herbicide weed control spectrum: weeds susceptible to ACCase-inhibiting herbicides, weeds susceptible to synthetic auxin herbicides, and turfgrass without weed infestation (no herbicide). ShuffleNet-v2 and VGGNet showed high overall accuracy (≥ 0.999) and F1 scores (≥ 0.998) in the validation and testing datasets for detecting and discriminating weeds susceptible to ACCase-inhibiting and synthetic auxin herbicides. ShuffleNet-v2 achieved an inference speed similar to that of MobileNet-v3 and noticeably faster than those of GoogLeNet and VGGNet. ShuffleNet-v2 was the most efficient and reliable model among the neural networks evaluated.

Conclusion

These results demonstrated that DCNNs trained on the herbicide weed control spectrum can detect and discriminate weeds based on their susceptibility to selective herbicides, allowing particular herbicides to be precision-sprayed on susceptible weeds and thereby reducing herbicide input. The proposed method can be used in the machine vision-based autonomous spot-spraying system of a smart sprayer.

Introduction

Turf is the predominant vegetation cover in urban landscapes, such as athletic fields, institutional and residential lawns, parks, and golf courses [1]. Weeds can be a significant challenge for turf management. Weeds compete with turfgrass for environmental resources such as sunlight, water, and nutrients [2, 3], reducing turf aesthetics and functionality. Herbicides are typically broadcast-applied for weed control [4], resulting in unnecessary application of herbicide to turf areas where weeds do not occur [5, 6]. This is a source of concern because excessive use of synthetic herbicides can pollute the environment [6,7,8,9]. For example, monosodium methyl arsenate (MSMA), an organic arsenical herbicide used to control difficult-to-control weeds in bermudagrass [Cynodon dactylon (L.) Pers.] turf, has been detected in groundwater [10]. In the United States, only a single broadcast application of MSMA per year is permitted on newly constructed golf courses. Application of MSMA on existing golf courses is limited to spot application and should not exceed 25% of the total turf area per year [7]. However, manual spot-spraying of herbicides is time-consuming and labor-intensive, and is thus infeasible for large turf areas.

Machine vision-based precision herbicide spraying can reduce herbicide input and weed control costs [11]. Accurate weed detection is a prerequisite for automatic precision herbicide application [12, 13]. Various visual characteristics have been studied for weed detection and classification through image processing techniques, such as color [14], morphological [15], and textural features [16]. However, none of them can reliably detect and discriminate weeds because crops and weeds may exhibit similar morphological characteristics [2, 17]. In recent years, deep learning, especially deep convolutional neural networks (DCNNs), has made significant advancements in image classification and object detection [18, 19]. Deep learning technologies have an extraordinary ability to automatically learn representations from raw data, without hand-coded rules or human domain knowledge, and to extract complex features from images with a high level of accuracy [11, 20]. Deep learning has proven to be a powerful tool in computer vision [18, 21, 22], natural language processing [23, 24], and speech recognition [25, 26].

In agriculture, previous studies demonstrated the effectiveness of using DCNNs for weed detection [27, 28], disease detection [29, 30], yield prediction [31, 32], insect damage recognition [33, 34], and crop quality examination [35,36,37]. A large number of studies have investigated the feasibility of using DCNNs for weed detection in various cropping systems, such as vegetable [38], corn (Zea mays L.) [39], soybean [Glycine max (L.) Merr.] [40], wheat (Triticum aestivum L.) [41], and turf [5, 7, 42, 43]. Kamilaris et al. concluded that deep learning techniques generally outperformed traditional image processing methods for weed detection and classification [44].

The feasibility of using deep learning technology for weed detection and classification in turf was first reported by Yu et al. [42, 43], who compared three image classification neural networks including AlexNet, GoogLeNet, and VGGNet, and found that VGGNet effectively detected various broadleaf weeds including common chickweed [Stellaria media (L.) Vill.], dandelion (Taraxacum officinale F. H. Wigg.), henbit (Lamium amplexicaule L.), purple deadnettle (Lamium purpureum L.), and white clover (Trifolium repens L.) growing in dormant bermudagrass [42]. In another investigation, VGGNet also effectively detected grassy weeds including crabgrass (Digitaria spp.), doveweed [Murdannia nudiflora (L.) Brenan], dallisgrass (Paspalum dilatatum Poir.), and tropical signalgrass [Urochloa distachya (L.) T.Q. Nguyen] growing in bermudagrass turf [43].

Despite these recent successes, none of the previous studies attempted to train deep learning models to detect and discriminate different weed species growing in turf based on their susceptibility to particular herbicides. To achieve selective herbicide spraying, the machine vision system of an automatic herbicide sprayer carrying multiple herbicides must be able to determine which herbicides need to be sprayed. Therefore, the outputs of weed species neural networks cannot be used to guide and control the sprayers directly. Effective discrimination of weed species based on the herbicide weed control spectrum allows a smart sprayer to apply the particular herbicide that controls the susceptible weeds, thereby reducing herbicide input. Crabgrass (Digitaria ischaemum L.), dallisgrass, dollarweed (Hydrocotyle spp.), goosegrass (Eleusine indica L.), old world diamond-flower (Hedyotis corymbosa L.), tropical signalgrass, Virginia buttonweed (Diodia virginiana L.), and white clover are the most common turf weeds in the Southeast United States. The performances of DCNNs for detecting and discriminating these weed species in turf were evaluated with the ultimate goal of selective herbicide application based on the herbicide weed control spectrum. The objectives of this research were to (1) investigate the feasibility of using DCNNs to detect and discriminate weeds growing in bermudagrass turf based on their susceptibility to ACCase-inhibiting and synthetic auxin herbicides, (2) evaluate and compare the performance of DCNNs for discriminating individual weed species, and (3) determine the best herbicide weed control spectrum neural network by jointly analyzing overall accuracy, F1 score, and inference time.

Materials and method

Overview

In this study, the DCNNs were trained according to the herbicide weed control spectrum with the ultimate goal of autonomous spot-spraying of herbicides. Four image classification DCNNs, including GoogLeNet [45], MobileNet [46], ShuffleNet [47], and VGGNet [48], were evaluated for detecting and discriminating weeds growing in bermudagrass turf. GoogLeNet is built on the inception architecture; it reduces the number of neurons and parameters by averaging across channels immediately before the dense layer. MobileNet has a streamlined architecture that uses depth-wise separable convolutions to build lightweight neural networks, providing efficient, low-power models for mobile devices. ShuffleNet is designed for mobile applications with minimal computing power requirements; it uses pointwise group convolution and channel shuffle to reduce computational cost while maintaining accuracy. VGGNet, also known as VGG-16, comprises 13 convolutional layers and 3 fully connected layers, favoring smaller filters with greater depth over large filters. These architectures were used to classify whether the sub-images contained weeds susceptible to particular herbicides or exclusively bermudagrass turf without weed infestation.

Image acquisition

The training images of dallisgrass, goosegrass, Virginia buttonweed, and white clover growing in bermudagrass turf were acquired at the University of Georgia Griffin Campus in Griffin, Georgia, United States (33.26° N, 84.28° W), while the testing images were primarily taken at multiple golf courses in Peachtree City, Georgia, United States (33.39° N, 84.59° W). The training images of crabgrass, dollarweed, old world diamond-flower, and tropical signalgrass were taken at multiple golf courses in Bradenton (27.49° N, 82.47° W), Tampa (27.95° N, 82.45° W), Riverview (27.86° N, 82.32° W), and Sun City, Florida (27.71° N, 82.35° W), while the testing images were taken at multiple institutional lawns and golf courses in Lakeland, Florida (28.03° N, 81.94° W). The training and testing images of crabgrass, dallisgrass, dollarweed, goosegrass, old world diamond-flower, tropical signalgrass, Virginia buttonweed, and white clover were taken multiple times from April to November 2018 using a digital camera (DSC-HX1, SONY® Cyber-Shot Digital Still Camera, SONY Corporation, Minato, Tokyo, Japan) with a 16:9 aspect ratio and an original dimension of 1920 × 1080 pixels. The camera was set to automatic modes for parameters including exposure, focus, and white balance. During image acquisition, the camera height was adjusted to obtain a ground-sampling distance of 0.05 cm pixel−1. The images were taken from 9:00 AM to 5:00 PM under various illumination conditions, including cloudy, partly cloudy, and sunny days.

Training and testing

Images containing a single weed species were selected and used for training and testing. Images containing crabgrass, dallisgrass, dollarweed, goosegrass, old world diamond-flower, tropical signalgrass, Virginia buttonweed, or white clover growing in bermudagrass turf were cropped into 40 sub-images (5 rows × 8 columns, 40 grid cells) with a resolution of 240 × 216 pixels using ImageJ (version 2.1.0, an open-source software available at https://github.com/imagej/imagej). Sub-images of crabgrass, dallisgrass, goosegrass, and tropical signalgrass (Fig. 1) and of dollarweed, old world diamond-flower, Virginia buttonweed, and white clover (Fig. 2) at varying growth stages and densities, along with sub-images of bermudagrass (Fig. 3) under varying turf management regimes, including different mowing heights and surface conditions, were distributed evenly and used for training and testing the neural networks.
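The 5 × 8 grid cropping described above can be sketched in a few lines of Python using Pillow instead of ImageJ; the function name and choice of library are illustrative assumptions.

```python
from PIL import Image

def crop_to_grid(img: Image.Image, rows: int = 5, cols: int = 8) -> list:
    """Crop a field image into rows x cols sub-images (grid cells).
    For a 1920 x 1080 input this yields 40 tiles of 240 x 216 pixels,
    matching the 5-row x 8-column grid used in this study."""
    width, height = img.size
    cell_w, cell_h = width // cols, height // rows
    tiles = []
    for r in range(rows):
        for c in range(cols):
            box = (c * cell_w, r * cell_h, (c + 1) * cell_w, (r + 1) * cell_h)
            tiles.append(img.crop(box))
    return tiles

# e.g. tiles = crop_to_grid(Image.open("plot.jpg"))
```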

Fig. 1

The training and testing images of crabgrass, dallisgrass, goosegrass, and tropical signalgrass at different growth stages and densities

Fig. 2

The training and testing images of dollarweed, old world diamond-flower, Virginia buttonweed, and white clover at different growth stages and densities

Fig. 3

The training and testing images of bermudagrass at different turfgrass management regimes, mowing heights, and surface conditions

The herbicide weed control spectrum neural networks were trained using a dataset containing three classes of sub-images: weeds susceptible to ACCase-inhibiting herbicides, weeds susceptible to synthetic auxin herbicides, and turf without weed infestation. To constitute the training dataset of the herbicide weed control spectrum neural networks, sub-images containing crabgrass, dallisgrass, goosegrass, or tropical signalgrass (susceptible to ACCase-inhibiting herbicides) were randomly selected, pooled, and labeled ACCase-inhibiting herbicides; sub-images containing dollarweed, old world diamond-flower, Virginia buttonweed, or white clover (susceptible to synthetic auxin herbicides) were randomly selected, pooled, and labeled Synthetic auxin herbicides; and sub-images containing only bermudagrass turf were used as the true negative images and labeled No herbicide (Table 1).

Table 1 The number of sub-images used to constitute the training, validation, and testing datasets of the herbicide weed control spectrum neural networks

A weed species neural network was also trained because we were interested in comparing the performances of the DCNNs for identifying individual weed species growing in bermudagrass turf. To constitute the training dataset of the weed species neural networks, a total of 24,000 sub-images (3000 images per weed species) containing crabgrass, dallisgrass, dollarweed, goosegrass, old world diamond-flower, tropical signalgrass, Virginia buttonweed, or white clover growing in bermudagrass turf were randomly selected and used as the true positive images. A total of 12,000 sub-images containing bermudagrass turf exclusively were randomly selected and used as the true negative images.

To constitute the validation and testing datasets (independent of each other) of the herbicide weed control spectrum neural networks, sub-images containing crabgrass, dallisgrass, goosegrass, or tropical signalgrass were pooled and labeled ACCase-inhibiting herbicides; sub-images containing dollarweed, old world diamond-flower, Virginia buttonweed, or white clover were pooled and labeled Synthetic auxin herbicides; and sub-images containing bermudagrass turf only were used as the true negative images and labeled No herbicide (Table 1). To constitute the validation and testing datasets of the weed species neural networks, a total of 4800 sub-images (600 images per weed species) containing crabgrass, dallisgrass, dollarweed, goosegrass, old world diamond-flower, tropical signalgrass, Virginia buttonweed, or white clover growing in bermudagrass were randomly selected and used as the true positive images. A total of 2400 sub-images containing bermudagrass turf exclusively were randomly selected and used as the true negative images.

The training and testing were performed in the PyTorch open-source deep learning framework (available at https://pytorch.org/; Facebook, San Jose, California, United States) using a graphics processing unit (NVIDIA GeForce RTX 2080 Ti; NVIDIA, Santa Clara, USA). The DCNNs were pre-trained on ImageNet to initialize the weights and biases through a transfer learning approach [49, 50]. The hyper-parameters used for training the DCNNs are presented in Table 2.
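A minimal PyTorch fine-tuning loop consistent with this transfer learning setup might look as follows; the optimizer choice and hyper-parameter defaults are illustrative placeholders, not the values reported in Table 2.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, Dataset

def finetune(model: nn.Module, train_set: Dataset, epochs: int = 1,
             lr: float = 1e-3, batch_size: int = 32,
             device: str = "cpu") -> nn.Module:
    """Fine-tune a pre-trained classifier on the labeled sub-images."""
    loader = DataLoader(train_set, batch_size=batch_size, shuffle=True)
    optimizer = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    criterion = nn.CrossEntropyLoss()
    model.to(device).train()
    for _ in range(epochs):
        for images, labels in loader:
            optimizer.zero_grad()
            loss = criterion(model(images.to(device)), labels.to(device))
            loss.backward()
            optimizer.step()
    return model
```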

Table 2 Values of the hyperparameters for the neural networks

The training and testing results of the image classification DCNNs were arranged in a binary classification confusion matrix consisting of four conditions: true positives (tp), true negatives (tn), false positives (fp), and false negatives (fn). The performances of the DCNNs were evaluated in terms of precision, recall, overall accuracy, and F1 score.

Precision measures the proportion of positive predictions made by the neural network that are correct and was calculated using the following equation [51]:

$$ \text{precision} = \frac{tp}{tp + fp}. $$
(1)

Recall measures the proportion of actual targets that the neural network detects and was computed using the following equation [51]:

$$ \text{recall} = \frac{tp}{tp + fn}. $$
(2)

Overall accuracy measures the ratio of correct predictions to the total number of observations and was defined using the following equation [51]:

$$ \text{Overall accuracy} = \frac{tp + tn}{tp + fp + fn + tn}. $$
(3)

The F1 score measures the overall performance of the neural network and was defined as the harmonic mean of precision and recall, determined using the following equation [51]:

$$ F_1 = \frac{2 \times \text{precision} \times \text{recall}}{\text{precision} + \text{recall}}. $$
(4)

Frames per second (FPS) measures the number of images (frames) processed by the neural network per second; the higher the FPS value, the faster the image processing. FPS was adopted as a quantitative metric to evaluate the speed of the different neural networks.
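The four metrics and the FPS measurement can be computed directly from the confusion matrix counts and wall-clock inference time; the following sketch uses function names of our own choosing.

```python
import time
import torch
import torch.nn as nn

def metrics(tp: int, fp: int, fn: int, tn: int):
    """Precision, recall, overall accuracy, and F1 score (Eqs. 1-4)."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, accuracy, f1

def fps(model: nn.Module, batch: torch.Tensor, n_runs: int = 50) -> float:
    """Images inferred per second over n_runs repeated forward passes."""
    model.eval()
    with torch.no_grad():
        start = time.perf_counter()
        for _ in range(n_runs):
            model(batch)
        elapsed = time.perf_counter() - start
    return n_runs * batch.shape[0] / elapsed

print([round(v, 3) for v in metrics(tp=90, fp=10, fn=10, tn=90)])  # [0.9, 0.9, 0.9, 0.9]
```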

Results and discussion

For the herbicide weed control spectrum neural networks, no obvious differences were observed among GoogLeNet, ShuffleNet-v2, and VGGNet for detecting and discriminating weeds susceptible to ACCase-inhibiting and synthetic auxin herbicides (Table 3). The precision, recall, overall accuracy, and F1 score values of MobileNet-v3 were consistently lower than those of the other neural networks in the validation and testing datasets. In general, the performances of the herbicide weed control spectrum neural networks were slightly reduced in the testing datasets compared with the validation datasets. For detecting and discriminating the sub-images containing bermudagrass turf exclusively, the F1 score of MobileNet-v3 was 0.975 in the testing dataset, while the F1 scores of all other neural networks never fell below 0.998. ShuffleNet-v2 and VGGNet showed high overall accuracy (≥ 0.999) and F1 scores (≥ 0.998) in the validation and testing datasets for detecting and discriminating weeds susceptible to ACCase-inhibiting and synthetic auxin herbicides.

Table 3 The performances of the herbicide weed control spectrum neural networks for detecting and discriminating the sub-images containing weeds susceptible to ACCase-inhibiting herbicides, weeds susceptible to synthetic auxin herbicides, or bermudagrass turf exclusively (no herbicide)

The inference time is critical for real-time weed detection and precision herbicide application. The speed of weed detection, in terms of FPS, is shown in Table 4. The FPS values of the herbicide weed control spectrum neural networks were calculated using images from the testing dataset. VGGNet demonstrated a significant speed advantage (189.10 FPS) over the other herbicide weed control spectrum neural networks (≤ 142.15 FPS) when detecting and discriminating the sub-images (240 × 216 pixels) with a batch size of 1. Since the machine vision sub-system of our smart sprayer prototype captures images at a resolution of 1920 × 1080 pixels, the classification speed on original images was also measured by inferring the sub-images with a batch size of 40. When detecting and discriminating the original images, ShuffleNet-v2, with 58.21 images inferred per second, was 6.61 images per second slower than MobileNet-v3 but noticeably faster than GoogLeNet and VGGNet. MobileNet-v3 and ShuffleNet-v2 exhibited faster inference rates and outperformed the other neural networks in classification efficiency.

Table 4 The inference time of the neural networks evaluated in the study

By jointly analyzing the overall accuracy, F1 score, and FPS, ShuffleNet-v2 demonstrated superiority in both accuracy and computational efficiency compared with the other herbicide weed control spectrum neural networks. This advantage likely stems from its use of pointwise group convolution and channel shuffle [47]. Overall, these results demonstrated that ShuffleNet-v2 was the most efficient and accurate model for detecting and discriminating weeds growing in turf that are susceptible to ACCase-inhibiting and synthetic auxin herbicides. Figure 4 shows the learning curve of ShuffleNet-v2 over 60 training epochs. The loss value quickly dropped to approximately 0.05 within 5 epochs, and the loss curve continued to decline and then stabilized, indicating minimal overfitting.

Fig. 4

The learning curve of ShuffleNet-v2 when it was trained to detect herbicide weed control spectrum

Table 5 presents the metric results when ShuffleNet-v2 was trained to detect and discriminate individual weed species. ShuffleNet-v2 exhibited excellent overall accuracy (≥ 0.997) and F1 scores (≥ 0.980), with high precision and recall values, in the validation datasets for detecting and discriminating the sub-images containing dallisgrass, goosegrass, old world diamond-flower, or Virginia buttonweed growing in bermudagrass turf and the sub-images containing bermudagrass turf exclusively. ShuffleNet-v2 had slightly reduced precision, recall, overall accuracy, and F1 score values in the testing dataset. For detecting and discriminating crabgrass, dollarweed, tropical signalgrass, or white clover, the F1 score of ShuffleNet-v2 never exceeded 0.932 in the validation and testing datasets, even though it was the best-performing herbicide weed control spectrum neural network.

Table 5 Weed detection validation and testing results when ShuffleNet-v2 was trained to detect and discriminate individual weed species

ShuffleNet-v2 was superior at detecting the susceptibility of weed species to herbicides (Fig. 5). In the testing dataset, 51 tropical signalgrass sub-images were misclassified as crabgrass, 18 dallisgrass sub-images as goosegrass, 58 dollarweed sub-images as white clover, and 11 Virginia buttonweed sub-images as old world diamond-flower. These weed species are morphologically similar. Therefore, it can be deduced that training DCNN models according to the herbicide weed control spectrum mitigates the issue of morphological similarity between weed species and thereby increases detection accuracy.

Fig. 5

Confusion matrices when ShuffleNet-v2 was trained as herbicide weed control spectrum neural network (a) and weed species neural network (b), respectively

In the present study, weed vegetation was discriminated into only two categories: weeds susceptible to ACCase-inhibiting herbicides versus weeds susceptible to synthetic auxin herbicides. While the herbicide weed control spectrum neural networks achieved high classification rates, training datasets comprising three or even more herbicide categories are highly desirable. An additional study is needed to evaluate the feasibility of detecting and discriminating three weed vegetation categories, including broadleaf, grass, and nutsedge weeds growing in turf.

It should be noted that diclofop-methyl is the only ACCase-inhibitor that can be used to selectively control grass weeds, such as goosegrass and ryegrass (Lolium spp.), in bermudagrass turf [4, 52], while other ACCase-inhibitors such as fenoxaprop and fluazifop (aryloxyphenoxypropionates) are used to control grassy weeds in cool-season turfgrasses and zoysiagrass (Zoysia spp.) [53, 54], and sethoxydim (a cyclohexanedione) is used to control grassy weeds in centipedegrass [Eremochloa ophiuroides (Munro) Hack.] [55]. The majority of synthetic auxin herbicides (e.g., 2,4-D, dicamba, and mecoprop) are postemergence herbicides that selectively control broadleaf weeds in bermudagrass turf, with only a few exceptions [4, 56, 57]. For example, quinclorac controls both broadleaf weeds and crabgrass in bermudagrass turf, while triclopyr is used to suppress bermudagrass in cool-season turfgrasses [58,59,60].

In this study, all training and testing images were cropped into 40 sub-images (grid cells), and the image classification DCNNs were trained on these sub-images at a resolution of 240 × 216 pixels. Each sub-image (grid cell) represented a physical size of 10 cm × 9 cm. In a practical machine vision system, custom software will be used to build a grid cell map and locate weeds in the input image by identifying whether the grid cells contain weeds susceptible to particular herbicides. The physical size represented by a sub-image should be equal to or slightly smaller than the area covered by one nozzle. In a future study, the trained herbicide weed control spectrum neural networks will be employed to infer whether the grid cells contain weeds; grid cells are marked as spraying areas if the inference indicates they contain weeds. With a subsequent decision-making system, only the nozzles corresponding to the cells infested with weeds susceptible to selective herbicides are turned on, thus realizing smart sensing and spraying.
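The grid-cell-to-nozzle decision logic described above can be sketched as follows; the class names ('accase', 'auxin', 'none') are illustrative stand-ins for the three network outputs.

```python
def spray_map(cell_predictions, rows=5, cols=8):
    """Convert per-cell class predictions (row-major order) into
    per-herbicide boolean spray grids; a True cell means the nozzle
    covering that grid cell should open for that herbicide."""
    maps = {"accase": [], "auxin": []}
    for r in range(rows):
        row = cell_predictions[r * cols:(r + 1) * cols]
        maps["accase"].append([p == "accase" for p in row])
        maps["auxin"].append([p == "auxin" for p in row])
    return maps

# Cells predicted 'none' (weed-free turf) are never marked for spraying.
```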

It should be noted that weeds susceptible to ACCase-inhibiting herbicides may be misclassified as susceptible to synthetic auxin herbicides (or vice versa) during field applications; however, this is unlikely to be a major issue because the weed-infested areas are still detected. The occurrence of this type of erroneous classification can be minimized by increasing the number of training images containing such weed species.

Discriminating different categories of weed species growing in turf based on their susceptibility to selective herbicides allows particular herbicides to be sprayed for weed control, thereby reducing herbicide input. It should be noted that the weed species examined in the present study are the most common turf weeds in the Southeast United States. The training dataset is intended to capture representations of different weeds and the complex field environments that affect the performance of deep learning models applied to natural images. Improving the robustness and adaptability of the developed herbicide weed control spectrum neural networks depends on obtaining diverse training data. An additional study is needed to include a more diverse set of weed species in the training and testing datasets. Based on the high level of performance, the proposed method is highly suitable for ground-based weed detection in turf.

Summary and conclusions

This work demonstrated the feasibility of using image classification DCNNs to detect and discriminate weeds growing in bermudagrass turf based on their susceptibility to ACCase-inhibiting and synthetic auxin herbicides. This is the first study attempting to train DCNNs to detect and discriminate weeds based on their susceptibility to selective herbicides, which will allow particular herbicides to be precision-sprayed on susceptible weeds, thereby reducing herbicide input.

ShuffleNet-v2 and VGGNet showed high overall accuracy (≥ 0.999) and F1 scores (≥ 0.998) in the validation and testing datasets for detecting and discriminating weeds susceptible to ACCase-inhibiting and synthetic auxin herbicides. ShuffleNet-v2 was the best herbicide weed control spectrum neural network, as it exhibited the highest combination of accuracy and computational efficiency among the neural networks evaluated. ShuffleNet-v2 also performed better when discriminating weeds by herbicide susceptibility than when detecting and discriminating individual weed species. The developed herbicide weed control spectrum neural network can be used in the machine vision sub-system of an automatic herbicide sprayer to achieve selective herbicide spraying.

Availability of data and materials

The datasets used in this study are available from the corresponding author on reasonable request.

Abbreviations

DCNNs: Deep convolutional neural networks

MSMA: Monosodium methyl arsenate

FPS: Frames per second

References

  1. Milesi C, Elvidge C, Dietz J, Tuttle B, Nemani R, Running S. A strategy for mapping and modeling the ecological effects of US lawns. J Turfgrass Manag. 2005;1(1):83–97.


  2. Hamuda E, Glavin M, Jones E. A survey of image processing techniques for plant extraction and segmentation in the field. Comput Electron Agric. 2016;125:184–99. https://doi.org/10.1016/j.compag.2016.04.024.


  3. Liu B, Bruch R. Weed detection for selective spraying: a review. Curr Robot Rep. 2020;1(1):19–26.


  4. McElroy J, Martins D. Use of herbicides on turfgrass. Planta Daninha. 2013;31:455–67.


  5. Yu J, Schumann AW, Cao Z, Sharpe SM, Boyd NS. Weed detection in perennial ryegrass with deep learning convolutional neural network. Front Plant Sci. 2019;10:1422–1422. https://doi.org/10.3389/fpls.2019.01422.


  6. Dai X, Xu Y, Zheng J, Song H. Analysis of the variability of pesticide concentration downstream of inline mixers for direct nozzle injection systems. Biosyst Eng. 2019;180:59–69. https://doi.org/10.1016/j.biosystemseng.2019.01.012.


  7. Yu J, Sharpe SM, Schumann AW, Boyd NS. Deep learning for image-based weed detection in turfgrass. Eur J Agron. 2019;104:78–84. https://doi.org/10.1016/j.eja.2019.01.004.


  8. Mennan H, Jabran K, Zandstra BH, Pala F. Non-chemical weed management in vegetables by using cover crops: a review. Agronomy. 2020;10(2):257. https://doi.org/10.3390/agronomy10020257.


  9. Slaughter DC, Giles DK, Downey D. Autonomous robotic weed control systems: a review. Comput Electron Agric. 2008;61(1):63–78. https://doi.org/10.1016/j.compag.2007.05.008.


  10. Mahoney DJ, Gannon TW, Jeffries MD, Matteson AR, Polizzotto ML. Management considerations to minimize environmental impacts of arsenic following monosodium methylarsenate (MSMA) applications to turfgrass. J Environ Manag. 2015;150:444–50.


  11. Liakos KG, Busato P, Moshou D, Pearson S, Bochtis D. Machine learning in agriculture: a review. Sensors. 2018;18(8):2674. https://doi.org/10.3390/s18082674.


  12. Fennimore SA, Slaughter DC, Siemens MC, Leon RG, Saber MN. Technology for automation of weed control in specialty crops. Weed Technol. 2016;30(4):823–37.


  13. Wang A, Zhang W, Wei X. A review on weed detection using ground-based machine vision and image processing techniques. Comput Electron Agric. 2019;158:226–40. https://doi.org/10.1016/j.compag.2019.02.005.


  14. Tang J-L, Chen X-Q, Miao R-H, Wang D. Weed detection using image processing under different illumination for site-specific areas spraying. Comput Electron Agric. 2016;122:103–11. https://doi.org/10.1016/j.compag.2015.12.016.


  15. Perez A, Lopez F, Benlloch J, Christensen S. Colour and shape analysis techniques for weed detection in cereal fields. Comput Electron Agric. 2000;25(3):197–212.


  16. Bakhshipour A, Jafari A, Nassiri SM, Zare D. Weed segmentation using texture features extracted from wavelet sub-images. Biosyst Eng. 2017;157:1–12.


  17. Hasan AM, Sohel F, Diepeveen D, Laga H, Jones MG. A survey of deep learning techniques for weed detection from images. Comput Electron Agric. 2021;184: 106067.


  18. Shi J, Li Z, Zhu T, Wang D, Ni C. Defect detection of industry wood veneer based on NAS and multi-channel mask R-CNN. Sensors. 2020;20(16):4398.


  19. He T, Liu Y, Yu Y, Zhao Q, Hu Z. Application of deep convolutional neural network on feature extraction and detection of wood defects. Measurement. 2020;152: 107357.


  20. Jordan MI, Mitchell TM. Machine learning: trends, perspectives, and prospects. Science. 2015;349(6245):255–60. https://doi.org/10.1126/science.aaa8415.


  21. Gu J, Wang Z, Kuen J, et al. Recent advances in convolutional neural networks. Pattern Recogn. 2018;77:354–77. https://doi.org/10.1016/j.patcog.2017.10.013.


  22. Zhou H, Zhuang Z, Liu Y, Liu Y, Zhang X. Defect classification of green plums based on deep learning. Sensors. 2020;20(23):6993.


  23. Collobert R, Weston J. A unified architecture for natural language processing: deep neural networks with multitask learning. In: Proceedings of the 25th international conference on machine learning; 2008. p. 160–7.

  24. Collobert R, Weston J, Bottou L, Karlen M, Kavukcuoglu K, Kuksa P. Natural language processing (almost) from scratch. J Mach Learn Res. 2011;12:2493–537.

    Google Scholar 

  25. Hinton G, Deng L, Yu D, et al. Deep neural networks for acoustic modeling in speech recognition: the shared views of four research groups. IEEE Signal Process Mag. 2012;29(6):82–97. https://doi.org/10.1109/MSP.2012.2205597.

    Article  Google Scholar 

  26. LeCun Y, Bengio Y, Hinton G. Deep learning. Nature. 2015;521(7553):436–44. https://doi.org/10.1038/nature14539.

    Article  CAS  PubMed  Google Scholar 

  27. Suh HK, Ijsselmuiden J, Hofstee JW, van Henten EJ. Transfer learning for the classification of sugar beet and volunteer potato under field conditions. Biosyst Eng. 2018;174:50–65.

    Article  Google Scholar 

  28. Hu K, Coleman G, Zeng S, Wang Z, Walsh M. Graph weeds net: a graph-based deep learning method for weed recognition. Comput Electron Agric. 2020;174: 105520.

    Article  Google Scholar 

  29. Lee SH, Goëau H, Bonnet P, Joly A. New perspectives on plant disease characterization based on deep learning. Comput Electron Agric. 2020;170: 105220.

    Article  Google Scholar 

  30. Ferentinos KP. Deep learning models for plant disease detection and diagnosis. Comput Electron Agric. 2018;145:311–8.

    Article  Google Scholar 

  31. Khaki S, Wang L. Crop yield prediction using deep neural networks. Front Plant Sci. 2019;10:621.

    Article  Google Scholar 

  32. Nevavuori P, Narra N, Lipping T. Crop yield prediction with deep convolutional neural networks. Comput Electron Agric. 2019;163: 104859.

    Article  Google Scholar 

  33. Liu W, Wu G, Ren F, Kang X. DFF-ResNet: an insect pest recognition model based on residual networks. Big Data Mining Anal. 2020;3(4):300–10.

    Article  Google Scholar 

  34. Rustia DJA, Chao JJ, Chiu LY, Wu YF, Chung JY, Hsu JC, Lin TT. Automatic greenhouse insect pest detection and recognition based on a cascaded deep learning classification method. J Appl Entomol. 2021;145(3):206–22.

    Article  CAS  Google Scholar 

  35. Kamilaris A, Prenafeta-Boldú FX. A review of the use of convolutional neural networks in agriculture. J Agric Sci. 2018;156(3):312–22.

    Article  Google Scholar 

  36. Sahu P, Chug A, Singh AP, Singh D, Singh RP. Deep learning models for crop quality and diseases detection. In: Proceedings of the international conference on paradigms of computing, communication and data sciences. Springer; 2021. p. 843–51.

  37. Perales Gómez ÁL, López-de-Teruel PE, Ruiz A, García-Mateos G, Bernabé García G, García Clemente FJ. FARMIT: continuous assessment of crop quality using machine learning and deep learning techniques for IoT-based smart farming. Clust Comput. 2022;25(3):2163–78.

    Article  Google Scholar 

  38. Jin X, Sun Y, Che J, Bagavathiannan M, Yu J, Chen Y. A novel deep learning-based method for detection of weeds in vegetables. Pest Manag Sci. 2022;78(5):1861–9.

    Article  CAS  Google Scholar 

  39. Ahmad A, Saraswat D, Aggarwal V, Etienne A, Hancock B. Performance of deep learning models for classifying and detecting common weeds in corn and soybean production systems. Comput Electron Agric. 2021;184: 106081.

    Article  Google Scholar 

  40. dos Santos Ferreira A, Matte Freitas D, Gonçalves da Silva G, Pistori H, TheophiloFolhes M. Weed detection in soybean crops using ConvNets. Comput Electron Agric. 2017;143:314–24. https://doi.org/10.1016/j.compag.2017.10.027.

    Article  Google Scholar 

  41. Zhuang J, Li X, Bagavathiannan M, et al. Evaluation of different deep convolutional neural networks for detection of broadleaf weed seedlings in wheat. Pest Manag Sci. 2022;78(2):521–9.

    Article  CAS  Google Scholar 

  42. Yu J, Sharpe SM, Schumann AW, Boyd NS. Detection of broadleaf weeds growing in turfgrass with convolutional neural networks. Pest Manag Sci. 2019;75(8):2211–8.

    Article  CAS  Google Scholar 

  43. Yu J, Schumann AW, Sharpe SM, Li X, Boyd NS. Detection of grassy weeds in bermudagrass with deep convolutional neural networks. Weed Sci. 2020;68(5):545–52.

    Article  Google Scholar 

  44. Kamilaris A, Prenafeta-Boldú FX. Deep learning in agriculture: a survey. Comput Electron Agric. 2018;147:70–90.

    Article  Google Scholar 

  45. Szegedy C, Liu W, Jia Y, Sermanet P, Reed S, Anguelov D, Erhan D, Vanhoucke V, Rabinovich A. Going deeper with convolutions. In: Proceedings of the IEEE conference on computer vision and pattern recognition. 2015. p. 1–9.

  46. Howard AG, Zhu M, Chen B, Kalenichenko D, Wang W, Weyand T, Andreetto M, Adam H. Mobilenets: efficient convolutional neural networks for mobile vision applications. arXiv preprint. arXiv:170404861. 2017.

  47. Zhang X, Zhou X, Lin M, Sun J. Shufflenet: an extremely efficient convolutional neural network for mobile devices. In: Proceedings of the IEEE conference on computer vision and pattern recognition. 2018. p. 6848–56.

  48. Simonyan K, Zisserman A. Very deep convolutional networks for large-scale image recognition. arXiv preprint. arXiv:14091556. 2014.

  49. Lu J, Behbood V, Hao P, Zuo H, Xue S, Zhang G. Transfer learning using computational intelligence: a survey. Knowl Based Syst. 2015;80:14–23.

    Article  Google Scholar 

  50. Deng J, Dong W, Socher R, Li L-J, Li K, Fei-Fei L. Imagenet: a large-scale hierarchical image database. In: 2009 IEEE conference on computer vision and pattern recognition. IEEE; 2009. p. 248–55.

  51. Sokolova M, Lapalme G. A systematic analysis of performance measures for classification tasks. Inf Process Manag. 2009;45(4):427–37.

    Article  Google Scholar 

  52. McCullough PE, Yu J, Raymer PL, Chen Z. First report of ACCase-resistant goosegrass (Eleusine indica) in the United States. Weed Sci. 2016;64(3):399–408.

    Article  Google Scholar 

  53. Neal JC, Bhowmik PC, Senesac AF. Factors influencing fenoxaprop efficacy in cool-season turfgrass. Weed Technol. 1990;4(2):272–8.

    Article  CAS  Google Scholar 

  54. Tate TM, McCullough PE, Harrison ML, Chen Z, Raymer PL. Characterization of mutations conferring inherent resistance to acetyl coenzyme A carboxylase-inhibiting herbicides in turfgrass and grassy weeds. Crop Sci. 2021;61(5):3164–78.

    Article  CAS  Google Scholar 

  55. Ferrell JA, Murphy TR, Vencill WK, Guerke WR. Effects of postemergence herbicides on centipedegrass seed production. Weed Technol. 2003;17(4):871–5.

    Article  CAS  Google Scholar 

  56. Grichar WJ, Baumann PA, Baughman TA, Nerada JD. Weed control and bermudagrass tolerance to imazapic plus 2, 4-D. Weed Technol. 2008;22(1):97–100.

    Article  CAS  Google Scholar 

  57. Reed TV, Yu J, McCullough PE. Aminocyclopyrachlor efficacy for controlling Virginia buttonweed (Diodia virginiana) and smooth crabgrass (Digitaria ischaemum) in tall fescue. Weed Technol. 2013;27(3):488–91.

    Article  CAS  Google Scholar 

  58. Brosnan JT, Breeden GK. Bermudagrass (Cynodon dactylon) control with topramezone and triclopyr. Weed Technol. 2013;27(1):138–42.

    Article  CAS  Google Scholar 

  59. Lewis D, McElroy J, Sorochan J, Mueller T, Samples T, Breeden G. Efficacy and safening of aryloxyphenoxypropionate herbicides when tank-mixed with triclopyr for bermudagrass control in zoysiagrass turf. Weed Technol. 2010;24(4):489–94.

    Article  CAS  Google Scholar 

  60. Yu J, McCullough PE, Czarnota MA. Selectivity and fate of monosodium methylarsenate in bermudagrass, centipedegrass, and seashore paspalum. Crop Sci. 2017;57(S1):S-322-S-330.

    Article  CAS  Google Scholar 

Download references

Acknowledgements

Not applicable.

Funding

This work was supported by the National Natural Science Foundation of China (Grant No. 32072498), the Postgraduate Research & Practice Innovation Program of Jiangsu Province (Grant No. KYCX22_1051), the Key Research and Development Program of Jiangsu Province (Grant No. BE2021016), and the Jiangsu Agricultural Science and Technology Innovation Fund (Grant No. CX(21)3184).

Author information

Authors and Affiliations

Authors

Contributions

All authors made significant contributions to this research. XJ conceived the research ideas and designed the experiments under the guidance of YC and JY. AM and JY collected the data and conducted the data analysis. XJ drafted the manuscript. MB, AM, YC, and JY edited and revised the manuscript. All authors read and approved the final manuscript.

Corresponding authors

Correspondence to Yong Chen or Jialin Yu.

Ethics declarations

Ethics approval and consent to participate

All authors read and approved the manuscript.

Consent for publication

All authors agreed to publish this manuscript.

Competing interests

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Cite this article

Jin, X., Bagavathiannan, M., Maity, A. et al. Deep learning for detecting herbicide weed control spectrum in turfgrass. Plant Methods 18, 94 (2022). https://doi.org/10.1186/s13007-022-00929-4

Keywords