
      Detecting Intra-Field Variation in Rice Yield With Unmanned Aerial Vehicle Imagery and Deep Learning


          Abstract

Unmanned aerial vehicles (UAVs) equipped with multispectral sensors offer high spatial and temporal resolution imagery for monitoring crop stress at early stages of development. Analysis of UAV-derived data with advanced machine learning models could improve real-time management in agricultural systems, but guidance for this integration is currently limited. Here we compare two deep learning-based strategies for early warning detection of crop stress, using multitemporal imagery throughout the growing season to predict field-scale yield in irrigated rice in eastern Arkansas. Both deep learning strategies outperformed traditional statistical learning approaches, including linear regression and gradient-boosted decision trees. First, we explicitly accounted for variation across developmental stages using a 3D convolutional neural network (CNN) architecture that captures both spatial and temporal dimensions of UAV images from multiple time points throughout one growing season. 3D-CNNs achieved low prediction error on the test set, with a root mean squared error (RMSE) of 8.8% of the mean yield. For the second strategy, a 2D-CNN, we considered only spatial relationships among pixels for image features acquired during a single flyover. 2D-CNNs trained on images from a single day were most accurate when images were taken during booting stage or later, with RMSE ranging from 7.4 to 8.2% of the mean yield. A primary benefit of convolutional autoencoder-like models (based on analyses of prediction maps and feature importance) is the spatial denoising effect that corrects yield predictions for individual pixels based on the values of vegetation index and thermal features for nearby pixels. Our results highlight the promise of convolutional autoencoders for UAV-based yield prediction in rice.
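The error metric reported above (RMSE expressed as a percentage of the mean observed yield) can be sketched as follows; the yield values here are hypothetical and chosen purely for illustration, not taken from the study:

```python
import math

def rmse_percent_of_mean(y_true, y_pred):
    """Root mean squared error expressed as a percentage of the mean
    observed value -- the scale-free form of RMSE used in the abstract."""
    n = len(y_true)
    rmse = math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)
    return 100.0 * rmse / (sum(y_true) / n)

# Hypothetical per-pixel yields (observed vs. predicted), illustration only.
observed  = [9.0, 10.0, 11.0, 10.5]
predicted = [8.5, 10.4, 10.6, 11.0]
print(round(rmse_percent_of_mean(observed, predicted), 1))  # 4.5
```

Expressing RMSE relative to the mean makes errors comparable across fields and seasons with different absolute yield levels, which is why values such as "8.8% of the mean yield" can be compared directly between the 3D-CNN and 2D-CNN strategies.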

Most cited references (42)


          Red and photographic infrared linear combinations for monitoring vegetation


            Extracting and composing robust features with denoising autoencoders


              Relationships between leaf chlorophyll content and spectral reflectance and algorithms for non-destructive chlorophyll assessment in higher plant leaves.

Leaf chlorophyll content provides valuable information about the physiological status of plants. Reflectance measurement makes it possible to quickly and non-destructively assess, in situ, the chlorophyll content in leaves. Our objective was to investigate the spectral behavior of the relationship between reflectance and chlorophyll content and to develop a technique for non-destructive chlorophyll estimation in leaves with a wide range of pigment content and composition, using reflectance in a few broad spectral bands. Spectral reflectance of maple, chestnut, wild vine and beech leaves across a wide range of pigment content and composition was investigated. It was shown that reciprocal reflectance (Rλ)^-1 in the spectral ranges λ from 520 to 550 nm and 695 to 705 nm related closely to the total chlorophyll content in leaves of all species. Subtraction of near-infrared reciprocal reflectance, (R_NIR)^-1, from (Rλ)^-1 made the index [(Rλ)^-1 − (R_NIR)^-1] linearly proportional to the total chlorophyll content in the spectral ranges λ from 525 to 555 nm and from 695 to 725 nm, with coefficient of determination r² > 0.94. To adjust for differences in leaf structure, the product of the latter index and NIR reflectance, [(Rλ)^-1 − (R_NIR)^-1] × R_NIR, was used; this further increased the accuracy of the chlorophyll estimation in the ranges λ from 520 to 585 nm and from 695 to 740 nm. Two independent data sets were used to validate the developed algorithms. The root mean square error of the chlorophyll prediction did not exceed 50 μmol/m² in leaves with total chlorophyll ranging from 1 to 830 μmol/m².
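The two indices described in this abstract are simple functions of band reflectances and can be sketched directly; the reflectance values below are hypothetical placeholders for a green-band and a NIR-band measurement, not data from the paper:

```python
def reciprocal_reflectance_index(r_lambda, r_nir):
    """[(R_lambda)^-1 - (R_NIR)^-1]: linearly proportional to total
    chlorophyll in the 525-555 nm and 695-725 nm ranges (per the abstract)."""
    return 1.0 / r_lambda - 1.0 / r_nir

def structure_adjusted_index(r_lambda, r_nir):
    """[(R_lambda)^-1 - (R_NIR)^-1] * R_NIR: the same index scaled by NIR
    reflectance to adjust for differences in leaf structure."""
    return reciprocal_reflectance_index(r_lambda, r_nir) * r_nir

# Hypothetical band reflectances (fractions of incident light).
r_green, r_nir = 0.08, 0.45
print(structure_adjusted_index(r_green, r_nir))
```

Note that the structure-adjusted form reduces algebraically to R_NIR/R_lambda − 1, so it depends only on the ratio of the two bands, which is what makes it robust to overall brightness differences between leaves.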

                Author and article information

                Contributors
Journal
Front. Plant Sci. (Frontiers in Plant Science)
Frontiers Media S.A.
ISSN: 1664-462X
Published: 23 March 2022
Volume: 13
Article: 716506
                Affiliations
                [1] 1Department of Computer Science, Arkansas State University , Jonesboro, AR, United States
                [2] 2Center for No-Boundary Thinking, Arkansas State University , Jonesboro, AR, United States
                [3] 3University of Arkansas System Division of Agriculture , Little Rock, AR, United States
                [4] 4College of Agriculture, Arkansas State University , Jonesboro, AR, United States
[5] 5U.S. Department of Agriculture, Department of Biological and Agricultural Engineering, University of Arkansas , Fayetteville, AR, United States
                [6] 6USDA Agricultural Research Service Delta Water Management Research Unit , Jonesboro, AR, United States
                Author notes

                Edited by: Alessandro Matese, Institute for Bioeconomy (CNR), Italy

                Reviewed by: Tiebiao Zhao, University of California, Merced, United States; Shangpeng Sun, McGill University, Canada

*Correspondence: Emily S. Bellis, ebellis@astate.edu

                These authors share first authorship

                This article was submitted to Technical Advances in Plant Science, a section of the journal Frontiers in Plant Science

Article
DOI: 10.3389/fpls.2022.716506
PMCID: 8984025
PMID: 35401643
                Copyright © 2022 Bellis, Hashem, Causey, Runkle, Moreno-García, Burns, Green, Burcham, Reba and Huang.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

History
Received: 28 May 2021
Accepted: 18 February 2022
                Page count
                Figures: 6, Tables: 5, Equations: 4, References: 47, Pages: 13, Words: 9286
                Categories
                Plant Science
                Original Research

                Plant science & Botany
convolutional autoencoder, remote sensing, UAS (unmanned aerial system), grain crop, precision agriculture
