Automated and cost-effective phenotyping pipelines are needed to efficiently characterize the new lines and hybrids developed in plant breeding programs. In this study, we employ deep neural networks (DNNs) to model individual maize plants in 3D point cloud data derived from unmanned aerial system (UAS) imagery, using the PointNet architecture. The experiment was conducted at the Indiana Corn and Soybean Innovation Center at the Agronomy Center for Research and Education (ACRE) in West Lafayette, Indiana, USA. On June 17, 2020, a flight was carried out over maize trials using a custom-designed UAS platform equipped with a Sony Alpha ILCE-7R photogrammetric sensor. The RGB images were processed with a standard Structure from Motion (SfM) photogrammetric pipeline to reconstruct the study field as a scaled 3D point cloud. Fifty individual maize plants were manually segmented from the point cloud to train the DNN, and individual plants were subsequently extracted over a test trial containing more than 5,000 plants. To reduce overfitting in the fully connected layers, we applied data augmentation not only in translation but also in color intensity. Results show a success rate of 72.4% for the extraction of individual plants. Our test trial demonstrates the feasibility of using deep learning to address the challenge of individual maize plant extraction from UAS data.
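To illustrate the PointNet architecture referenced above, the following minimal PyTorch sketch shows its core mechanism: shared per-point MLPs followed by a symmetric max-pooling aggregation, which makes the network invariant to the ordering of points in the cloud. The layer sizes, the 6-channel XYZ+RGB input, and the class name are illustrative assumptions, not the configuration used in the paper.

```python
# Minimal PointNet-style classifier sketch (illustrative, not the
# authors' exact network). Assumes 6 input channels: XYZ + RGB.
import torch
import torch.nn as nn

class PointNetClassifier(nn.Module):
    def __init__(self, in_channels: int = 6, num_classes: int = 2):
        super().__init__()
        # Shared MLPs applied independently to every point; a Conv1d
        # with kernel size 1 acts as a per-point fully connected layer.
        self.point_mlp = nn.Sequential(
            nn.Conv1d(in_channels, 64, 1), nn.BatchNorm1d(64), nn.ReLU(),
            nn.Conv1d(64, 128, 1), nn.BatchNorm1d(128), nn.ReLU(),
            nn.Conv1d(128, 1024, 1), nn.BatchNorm1d(1024), nn.ReLU(),
        )
        # Fully connected head; the dropout layers here are where the
        # augmentation described in the abstract helps reduce overfitting.
        self.head = nn.Sequential(
            nn.Linear(1024, 512), nn.ReLU(), nn.Dropout(0.3),
            nn.Linear(512, 256), nn.ReLU(), nn.Dropout(0.3),
            nn.Linear(256, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, num_points)
        feats = self.point_mlp(x)
        # Max pooling over the point dimension is the symmetric
        # aggregation that handles unordered point clouds.
        global_feat = torch.max(feats, dim=2).values
        return self.head(global_feat)

# Example: classify a batch of 8 clouds of 2048 points each.
model = PointNetClassifier()
logits = model(torch.randn(8, 6, 2048))
```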
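Similarly, the two augmentations named in the abstract, random translation and color intensity jitter, can be sketched as below; the jitter magnitudes are assumptions rather than values reported in the study.

```python
# Sketch of point cloud augmentation in translation and color
# intensity. Jitter ranges are assumed, not taken from the paper.
import numpy as np

def augment_cloud(points: np.ndarray,
                  max_shift: float = 0.1,
                  color_scale: tuple = (0.8, 1.2)) -> np.ndarray:
    """points: (N, 6) array of XYZ coordinates and RGB intensities in [0, 1]."""
    out = points.copy()
    # Translate the whole plant by a random offset per axis.
    out[:, :3] += np.random.uniform(-max_shift, max_shift, size=3)
    # Scale the color intensities uniformly, clipping to the valid range.
    out[:, 3:] = np.clip(out[:, 3:] * np.random.uniform(*color_scale), 0.0, 1.0)
    return out
```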