Vineyard yield estimation by combining remote sensing, computer vision and artificial neural network techniques

J. M. Ramirez-Cuesta;
2020-01-01

Abstract

In viticulture, it is critical to predict the productivity levels of the different vineyard zones in order to undertake appropriate cropping practices. To address this challenge, final yield was predicted by combining vegetation indices (VIs), which sense the health status of the crop, with the vegetated fraction cover (Fc), obtained through computer vision as a measure of plant vigour. Multispectral imagery acquired from an unmanned aerial vehicle (UAV) was used to derive the VIs and Fc, which were then fed into artificial neural networks (ANNs) to model the relationship between VIs, Fc and yield. The proposed methodology was applied in a vineyard subjected to different irrigation and fertilisation doses. The results showed that using computer vision techniques to differentiate between canopy and soil is necessary in precision viticulture to obtain accurate results. In addition, combining VIs (reflectance approach) and Fc (geometric approach) to predict vineyard yield gave higher accuracy (root mean square error (RMSE) = 0.9 kg vine^-1 and relative error (RE) = 21.8% for the image acquired close to harvest) than using VIs alone (RMSE = 1.2 kg vine^-1 and RE = 28.7%). The implementation of machine learning techniques produced considerably more accurate results than linear models (RMSE = 0.5 kg vine^-1 and RE = 12.1%). More precise yield predictions were obtained when images were taken close to the harvest date, although promising results were obtained at earlier stages. Given the perennial nature of grapevines and the multiple environmental and endogenous factors determining yield, seasonal calibration is required for yield prediction.
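As a rough illustration of the pipeline described above (and not the authors' implementation), the Python sketch below derives a vegetation index from synthetic NIR and red reflectance values, estimates the vegetated fraction cover Fc by separating canopy from soil pixels with k-means, and relates mean canopy VI and Fc to per-vine yield with a small neural network (scikit-learn's MLPRegressor). The band reflectance values, the two-cluster segmentation, the toy per-vine data and the network size are all assumptions made for the example.

# Illustrative sketch only: toy data and model choices are assumptions,
# not the configuration used in the article.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def ndvi(nir, red, eps=1e-6):
    # Normalised Difference Vegetation Index per pixel
    return (nir - red) / (nir + red + eps)

def canopy_fraction(ndvi_patch, seed=0):
    # Separate canopy from soil with k-means on NDVI values and return
    # Fc = canopy pixels / total pixels for the patch
    km = KMeans(n_clusters=2, n_init=10, random_state=seed)
    labels = km.fit_predict(ndvi_patch.reshape(-1, 1))
    canopy = int(np.argmax(km.cluster_centers_.ravel()))  # greener cluster
    return float(np.mean(labels == canopy))

rng = np.random.default_rng(42)

# Synthetic 40 x 40 patch: ~40% canopy pixels, the rest bare soil
mask = rng.random((40, 40)) < 0.4
nir = np.where(mask, 0.60, 0.35) + rng.normal(0.0, 0.02, mask.shape)
red = np.where(mask, 0.08, 0.25) + rng.normal(0.0, 0.02, mask.shape)
print(f"Fc of synthetic patch: {canopy_fraction(ndvi(nir, red)):.2f}")

# Toy per-vine features: mean canopy NDVI (reflectance) and Fc (geometry)
n_vines = 60
mean_vi = rng.uniform(0.3, 0.9, n_vines)
fc = rng.uniform(0.2, 0.7, n_vines)
yield_kg = 2.0 + 4.0 * mean_vi * fc + rng.normal(0.0, 0.3, n_vines)  # synthetic yield

X = np.column_stack([mean_vi, fc])
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(8,), solver="lbfgs",
                                 max_iter=5000, random_state=0))
ann.fit(X, yield_kg)
rmse = np.sqrt(np.mean((ann.predict(X) - yield_kg) ** 2))
print(f"training RMSE: {rmse:.2f} kg per vine")

In practice, the VI statistics and Fc would be extracted per vine (or per vineyard zone) from the UAV ortho-images, and the network calibrated against harvest data for that season, as the abstract notes.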
Keywords
Unmanned aerial vehicles
Deficit irrigation
RGB ortho-images
NIR
Red-edge
k-means algorithm
Vitis vinifera

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.11769/552483
Citations
  • Scopus: 49
  • ISI Web of Science: 38