Metadata | Description | Language |
---|---|---|
Author(s): dc.contributor | Universidade Federal de Mato Grosso do Sul (UFMS) | - |
Author(s): dc.contributor | Univ Western Sao Paulo | - |
Author(s): dc.contributor | Sao Paulo State Univ | - |
Author(s): dc.contributor | Univ Waterloo | - |
Author(s): dc.contributor | Universidade Estadual Paulista (Unesp) | - |
Author(s): dc.creator | Osco, Lucas Prado | - |
Author(s): dc.creator | Arruda, Mauro dos Santos de | - |
Author(s): dc.creator | Marcato Junior, Jose | - |
Author(s): dc.creator | Silva, Neemias Buceli da | - |
Author(s): dc.creator | Marques Ramos, Ana Paula | - |
Author(s): dc.creator | Saito Moryia, Erika Akemi | - |
Author(s): dc.creator | Imai, Nilton Nobuhiro | - |
Author(s): dc.creator | Pereira, Danillo Roberto | - |
Author(s): dc.creator | Creste, Jose Eduardo | - |
Author(s): dc.creator | Matsubara, Edson Takashi | - |
Author(s): dc.creator | Li, Jonathan | - |
Author(s): dc.creator | Goncalves, Wesley Nunes | - |
Acceptance date: dc.date.accessioned | 2022-02-22T00:11:54Z | - |
Availability date: dc.date.available | 2022-02-22T00:11:54Z | - |
Submission date: dc.date.issued | 2020-12-09 | - |
Submission date: dc.date.issued | 2020-01-31 | - |
Full source of material: dc.identifier | http://dx.doi.org/10.1016/j.isprsjprs.2019.12.010 | - |
Full source of material: dc.identifier | http://hdl.handle.net/11449/197328 | - |
Source: dc.identifier.uri | http://educapes.capes.gov.br/handle/11449/197328 | - |
Description: dc.description | Visual inspection has been a common practice to determine the number of plants in orchards, which is a labor-intensive and time-consuming task. Deep learning algorithms have demonstrated great potential for counting plants on unmanned aerial vehicle (UAV)-borne sensor imagery. This paper presents a convolutional neural network (CNN) approach to address the challenge of estimating the number of citrus trees in highly dense orchards from UAV multispectral images. The method estimates a dense map with the confidence that a plant occurs in each pixel. A flight was conducted over an orchard of Valencia-orange trees planted in a linear fashion, using a multispectral camera with four bands in green, red, red-edge and near-infrared. The approach was assessed considering the individual bands and their combinations. A total of 37,353 trees were adopted as point features to evaluate the method. A variation of σ (0.5, 1.0 and 1.5) was used to generate different ground-truth confidence maps. Different stages (T) were also used to refine the predicted confidence map. To evaluate the robustness of our method, we compared it with two state-of-the-art object detection CNN methods (Faster R-CNN and RetinaNet). The results show better performance with the combination of green, red and near-infrared bands, achieving a Mean Absolute Error (MAE), Mean Square Error (MSE), R² and Normalized Root-Mean-Squared Error (NRMSE) of 2.28, 9.82, 0.96 and 0.05, respectively. This band combination, when adopting σ = 1 and stage T = 8, resulted in an R², MAE, Precision, Recall and F1 of 0.97, 2.05, 0.95, 0.96 and 0.95, respectively. Our method significantly outperforms object detection methods for counting and geolocation. It was concluded that our CNN approach, developed to estimate the number and geolocation of citrus trees in high-density orchards, is satisfactory and is an effective strategy to replace the traditional visual inspection method to determine the number of plants in orchards. (An illustrative sketch of the confidence-map rendering and count metrics follows this metadata table.) | - |
Description: dc.description | Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq) | - |
Description: dc.description | Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES) | - |
Description: dc.description | Fundect | - |
Description: dc.description | Univ Fed Mato Grosso do Sul, Fac Engn Architecture & Urbanism & Geog, Campo Grande, MS, Brazil | - |
Description: dc.description | Univ Fed Mato Grosso do Sul, Fac Comp Sci, Campo Grande, MS, Brazil | - |
Description: dc.description | Univ Western Sao Paulo, Fac Agron, Sao Paulo, Brazil | - |
Description: dc.description | Univ Western Sao Paulo, Fac Engn & Architecture, Sao Paulo, Brazil | - |
Description: dc.description | Sao Paulo State Univ, Dept Cartog Sci, BR-19060900 Presidente Prudente, SP, Brazil | - |
Description: dc.description | Univ Western Sao Paulo, Fac Comp Sci, Sao Paulo, Brazil | - |
Description: dc.description | Univ Waterloo, Dept Geog & Environm Management, Waterloo, ON N2L 3G1, Canada | - |
Description: dc.description | Univ Waterloo, Dept Syst Design Engn, Waterloo, ON N2L 3G1, Canada | - |
Description: dc.description | CNPq: 433783/2018-4 | - |
Description: dc.description | CNPq: 304173/2016-9 | - |
Description: dc.description | CAPES: 88881.311850/2018-01 | - |
Description: dc.description | Fundect: 59/300.066/2015 | - |
Format: dc.format | 97-106 | - |
Language: dc.language | en | - |
Publisher: dc.publisher | Elsevier B.V. | - |
Relation: dc.relation | ISPRS Journal of Photogrammetry and Remote Sensing | - |
Source: dc.source | Web of Science | - |
Keywords: dc.subject | Deep learning | - |
Keywords: dc.subject | Multispectral image | - |
Keywords: dc.subject | UAV-borne sensor | - |
Keywords: dc.subject | Object detection | - |
Keywords: dc.subject | Citrus tree counting | - |
Keywords: dc.subject | Orchard | - |
Title: dc.title | A convolutional neural network approach for counting and geolocating citrus-trees in UAV multispectral imagery | - |
File type: dc.type | digital book | - |
Appears in collections: | Repositório Institucional - Unesp |
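The sketch below illustrates, under stated assumptions, two pieces described in the abstract above: rendering a ground-truth confidence map by placing a 2D Gaussian of standard deviation σ at each annotated tree point, and computing the count metrics quoted there (MAE, MSE, R² and NRMSE). The function names, the max-composition of overlapping Gaussians, and the range-based NRMSE normalization are illustrative assumptions, not the authors' exact implementation.

```python
# Hypothetical sketch (not the paper's code): build a ground-truth confidence
# map from point annotations and compute the count metrics named in the
# abstract. Kernel choice and NRMSE normalization are assumptions.
import numpy as np

def confidence_map(points, shape, sigma=1.0):
    """Render one 2D Gaussian of std `sigma` (in pixels) per annotated tree point (row, col)."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    cmap = np.zeros(shape, dtype=np.float32)
    for (py, px) in points:
        g = np.exp(-((ys - py) ** 2 + (xs - px) ** 2) / (2.0 * sigma ** 2))
        cmap = np.maximum(cmap, g)  # keep peaks at 1.0 where neighboring trees overlap
    return cmap

def count_metrics(y_true, y_pred):
    """MAE, MSE, R² and NRMSE between true and predicted tree counts per plot."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_pred - y_true
    mae = np.mean(np.abs(err))
    mse = np.mean(err ** 2)
    r2 = 1.0 - np.sum(err ** 2) / np.sum((y_true - y_true.mean()) ** 2)
    nrmse = np.sqrt(mse) / (y_true.max() - y_true.min())  # normalized by the count range
    return mae, mse, r2, nrmse

# Toy usage: 3 trees annotated on a 64x64 patch, sigma = 1.0 as in the abstract,
# then metrics over three hypothetical plots.
gt_map = confidence_map([(10, 12), (30, 40), (50, 20)], (64, 64), sigma=1.0)
print(gt_map.shape, float(gt_map.max()))
print(count_metrics([118, 95, 132], [116, 97, 130]))
```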