Corn classification using Deep Learning with UAV imagery. An operational proof of concept

Fedra Trujillano, Andres Flores, Carlos Saito, Mario Balcazar, Daniel Racoceanu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

12 Scopus citations

Abstract

Climate change is affecting agricultural production in Ancash, Peru, where corn is one of the region's most important crops. It is essential to monitor grain yields continuously and to build statistical models in order to evaluate how climate change will affect food security. This study proposes, as a proof of concept, the use of deep learning techniques to classify near-infrared images acquired by an Unmanned Aerial Vehicle (UAV), in order to estimate corn-growing areas for food security purposes. The results show that using a database balanced across altitudes, seasons, and regions during acquisition improves the performance of the trained system, enabling crop classification over variable and difficult-to-access terrain.
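The paper does not publish its network architecture in this record, but the workflow it describes (tile UAV imagery into patches, classify each patch as corn or not, and aggregate patches into an area estimate) can be sketched as follows. This is a hedged illustration only: the patch size, the placeholder linear scoring model, and all function names are assumptions, standing in for the trained deep learning model used in the actual study.

```python
import numpy as np

def extract_patches(image, patch_size):
    """Tile a single-band (e.g. near-infrared) image into non-overlapping
    square patches. Edge remainders smaller than a patch are discarded."""
    h, w = image.shape
    patches = []
    for y in range(0, h - patch_size + 1, patch_size):
        for x in range(0, w - patch_size + 1, patch_size):
            patches.append(image[y:y + patch_size, x:x + patch_size])
    return np.stack(patches)

def classify_patches(patches, weights, bias, threshold=0.5):
    """Score each flattened patch with a linear model plus sigmoid.
    In the study this role is played by a trained deep network; the
    linear model here is a placeholder. Returns a boolean mask:
    True where a patch is predicted to contain corn."""
    flat = patches.reshape(len(patches), -1).astype(np.float64)
    scores = 1.0 / (1.0 + np.exp(-(flat @ weights + bias)))
    return scores > threshold

def corn_area_fraction(mask):
    """Fraction of patches classified as corn, a coarse area estimate."""
    return float(np.mean(mask))
```

For example, on a synthetic 64x64 image whose top half is bright (high NIR reflectance) and bottom half is dark, tiling into 16x16 patches yields 16 patches, and a simple positive-weight model classifies exactly the bright half as corn, giving an area fraction of 0.5.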

Original language: English
Title of host publication: 2018 IEEE 1st Colombian Conference on Applications in Computational Intelligence, ColCACI 2018 - Proceedings
Editors: Alvaro David Orjuela-Canon, Diana Briceno Rodriguez
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781538667408
DOIs
State: Published - 5 Oct 2018
Event: 1st IEEE Colombian Conference on Applications in Computational Intelligence, ColCACI 2018 - Medellin, Colombia
Duration: 16 May 2018 – 18 May 2018

Publication series

Name: 2018 IEEE 1st Colombian Conference on Applications in Computational Intelligence, ColCACI 2018 - Proceedings

Conference

Conference: 1st IEEE Colombian Conference on Applications in Computational Intelligence, ColCACI 2018
Country/Territory: Colombia
City: Medellin
Period: 16/05/18 – 18/05/18
