UAV and ground-based imagery analysis detects canopy structure changes after canopy management applications
Abstract
Aim: To analyse unmanned aerial vehicle (UAV)-based imagery to assess canopy structural changes after the application of different canopy management practices in the vineyard.
Methods and results: Four different canopy management practices, i–ii) leaf removal within the bunch zone (eastern side only, or both eastern and western sides), iii) bunch thinning and iv) shoot trimming, were applied to grapevines at veraison in a commercial Cabernet-Sauvignon vineyard in McLaren Vale, South Australia. UAV-based imagery was captured: i) before the canopy treatments, ii) after the treatments and iii) at harvest, to assess the treatment outcomes. Canopy volume, projected canopy area and normalized difference vegetation index (NDVI) were derived from the analysis of RGB and multispectral imagery collected using the UAV. Plant area index (PAI) was calculated using the smartphone app VitiCanopy as a ground-based measurement for comparison with the UAV-derived measurements. All three types of UAV-based measurements detected changes in the canopy structure after the application of canopy management practices, except for the bunch thinning treatment. As expected, ground-based PAI was the only technique to effectively detect the internal canopy structure changes caused by bunch thinning. Canopy volume and PAI detected variations in canopy structure better than NDVI and projected canopy area; the latter two were negatively affected by the interference of the trimmed shoots left on the ground.
Conclusions: UAV-based tools can provide accurate assessments of some canopy management outcomes at the vineyard scale. Among the different UAV-based measurements, canopy volume was more sensitive to changes in canopy structure than NDVI and projected canopy area, and demonstrated greater potential to assess the outcomes of a range of canopy management practices.
Significance and impact of the study: Canopy management practices are widely applied to regulate canopy growth, improve grape quality and reduce disease pressure in the bunch zone. Being able to detect major changes in canopy structure, with some limitations when the practice affects the internal structure (i.e., bunch thinning), UAV-based imagery analysis can be used to measure the outcomes of common canopy management practices and can improve the efficiency of vineyard management.
Introduction
Among the vineyard management practices, canopy management is widely applied to regulate canopy growth, reduce disease pressure, improve bud fertility and improve berry quality (Dry, 2008; Mirás-Avalos et al., 2017; Trought et al., 2017; Wolf et al., 2003). Commonly applied canopy management practices such as leaf removal, shoot trimming and bunch thinning aim to modify the source–sink relationship by reducing leaf density and/or crop load (Smart & Robinson, 1991). By selecting different practices and their levels/intensity of application, canopy treatments can have various outcomes. Low levels of input often have a limited impact on the canopy structure and are inefficient. In contrast, excessive application can negatively impact yield and quality including increasing the risk of exposing the crop to extreme weather conditions, such as heat waves (Caravia et al., 2016; Reynolds et al., 2005; Vasconcelos & Castagnoli, 2000).
To effectively apply canopy management practices, it is crucial to have convenient and accurate assessments of their outcomes. One approach is to directly compare the canopy structure before and after the application of canopy management practices, through the assessment of parameters such as plant area index (PAI: total leaf and cordon area per unit ground area) (Bréda, 2003; De Bei et al., 2016) and canopy volume. However, the most accurate estimations of these parameters involve destructive and labour-intensive sampling in the field (Gower et al., 1999; Jonckheere et al., 2004). To overcome these disadvantages, smartphone apps have recently been developed that analyse upward-looking canopy cover imagery and offer an objective and accurate solution for the measurement of PAI (De Bei et al., 2016; Fuentes et al., 2014; Poblete-Echeverria et al., 2015). These tools, which estimate PAI, have been effectively applied to assess changes in canopy structure during the growing season (De Bei et al., 2019; Wang et al., 2019).
Alternatively, optical remote sensing using unmanned aerial vehicles (UAV), aircraft and satellite platforms can be applied to estimate canopy structure (Hall, 2018). Amongst these platforms, recent advancements in UAV-related research have led to a wide range of UAV applications for monitoring vineyard performance, such as the rate of canopy development, canopy structure spatial variability, disease incidence and canopy water status (Albetis et al., 2018; de Castro et al., 2018; Mathews & Jensen, 2013; Pádua et al., 2018; Romero et al., 2018; Su et al., 2016). Through the collection of high-resolution red/green/blue (RGB), multispectral or hyperspectral imagery, UAV-mounted sensors can provide the data to create high-resolution RGB and spectral index maps at the vineyard scale, e.g., normalized difference vegetation index (NDVI) and plant cell density (PCD) maps (Xue & Su, 2017). In addition, three-dimensional digital models, including vineyard point clouds and digital canopy models (DCM), can be created from overlapping images captured by the UAV (Comba et al., 2018). From such vineyard digital models, parameters including canopy height, projected area and volume can be calculated, providing detailed information on the canopy structure (Matese & Di Gennaro, 2018; Weiss & Baret, 2017). Compared with manned aircraft and satellite-based remote sensing, UAV also offers simple flight preparation and flexible operation options (Khaliq et al., 2019) and is more cost-effective for small and medium-size vineyards (Andújar et al., 2019; Matese et al., 2015). These advantages can help obtain a prompt evaluation of canopy management outcomes during critical developmental stages. Despite the demonstrated potential of these techniques, the application of UAV remote sensing for measuring canopy management outcomes is currently limited.
This study aimed to assess whether UAV remote sensing can detect canopy structure changes after the application of different canopy management practices. These assessments were compared to ground-based PAI measured at the same time as the UAV flights. The advantages and limitations of using UAV as a monitoring platform in evaluating canopy structure changes are discussed.
Materials and methods
2.1. Study Vineyard and Experimental Design
A commercial Cabernet-Sauvignon vineyard in the McLaren Vale wine region, South Australia (Lat -35.218, Long 138.542) was used for the study during the 2017–18 growing season. The climate of the region is classified as Mediterranean with low summer rainfall, and the soil type is red/brown loamy sand (Wine Australia, 2020). In the study vineyard, vine and row spacings were 2 m and 3 m, respectively, and rows were oriented north to south. Vines were spur pruned and developed a sprawling canopy approximately 1.3 m wide with a cordon height of 1.1 m. The vineyard was managed with standardised commercial practices, was drip irrigated, and no cover crop was grown in the mid-row or under-vine.
At veraison (E-L stage 35), four canopy management treatments were applied: i) leaf removal on the eastern side of the canopy, ii) leaf removal on both sides of the canopy, iii) bunch thinning and iv) shoot trimming on the eastern side of the canopy (Table 1). Removing leaves and/or shoot trimming on the eastern side only is often performed to reduce canopy density while protecting the bunches on the western side from the intense afternoon sunlight (Reynolds & Vanden Heuvel, 2009). A randomised replicate design was used to apply the treatments in the vineyard. As shown in Figure 1, each replicate consisted of six vines (two panels) and the different coloured sections in the figure correspond to different treatments. Each treatment was replicated six times; 180 grapevines in total were monitored and measured in the study.
Table 1. Treatment code, description and purpose of the different canopy management practices applied at veraison (E-L stage 35) in the study vineyard, during the 2017–18 growing season.
Code | Treatment Description | Purpose of Treatment
---|---|---
C | Control | No canopy management applied
LR-E | Leaf removal at bunch zone on the east side only | Reduce canopy density and increase light penetration in the bunch zone on the east side
LR-B | Leaf removal at bunch zone on both sides (east and west) | Increase light into bunch zone on both sides
BT | Bunch thinning | Reduce bunch number by 50 %
ST-E | Shoot trimming east side only | Reduce shoot length/canopy size by 50 %
Figure 1. Cabernet-Sauvignon experimental trial vineyard in McLaren Vale, South Australia.

Four different canopy management practices were applied: i) leaf removal on the eastern side of the canopy (LR-E), ii) leaf removal on both sides of the canopy (eastern and western, LR-B), iii) bunch thinning (BT) and iv) shoot trimming on the eastern side of the canopy (ST-E) (colour-coded according to the treatments received). Each treatment was applied to six replicates. The background RGB orthomosaic image was captured at veraison (E-L stage 35).
2.2. Ground and UAV-based Imagery Acquisition and Analysis
To measure the impact of treatments on canopy structure, a combination of ground and aerial imagery analysis was used. Measurements were taken: i) before treatment application (E-L stage 35), ii) one week after treatment application and iii) at harvest (E-L stage 38). For the ground measurement, two images were acquired from the middle vine in each panel, one on the left and one on the right side of the trunk, giving a total of 24 images per treatment at each measurement. From these images, PAI was calculated using the smartphone app VitiCanopy (University of Adelaide, South Australia, Australia). The ground imagery was captured by the front-facing RGB camera of an Apple iPhone 7 at a resolution of 7.2 megapixels (Apple Inc., Cupertino, CA, USA); detailed procedures for image capture and PAI calculation can be found in De Bei et al. (2016). PAI was also calibrated against real LAI, measured by destructive leaf removal at the same study site; the correlations can be found in the supplementary information.
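The gap-fraction principle behind such cover-photography tools can be illustrated with a short sketch. VitiCanopy's actual algorithm (De Bei et al., 2016) is more elaborate (it also accounts for canopy clumping); the simplified Beer-Lambert inversion and extinction coefficient below are illustrative assumptions only.

```python
import math

def gap_fraction_from_pixels(sky_pixels, total_pixels):
    """Fraction of upward-looking cover-image pixels classified as sky."""
    return sky_pixels / total_pixels

def pai_from_gap_fraction(gap_fraction, extinction_coefficient=0.5):
    """Estimate plant area index by inverting a Beer-Lambert
    light-extinction model (a simplification of VitiCanopy's method;
    the extinction coefficient is an assumed illustrative value)."""
    if not 0.0 < gap_fraction <= 1.0:
        raise ValueError("gap fraction must be in (0, 1]")
    return -math.log(gap_fraction) / extinction_coefficient

# Example: 30 % of pixels classified as sky in a cover image
gf = gap_fraction_from_pixels(300_000, 1_000_000)
pai = pai_from_gap_fraction(gf)
print(round(pai, 2))  # 2.41
```

A denser canopy (smaller gap fraction) gives a larger PAI, which is why leaf removal and shoot trimming register as PAI reductions.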
UAV flights were performed to acquire RGB and multispectral imagery of the study vineyard. RGB imagery was captured by the RGB sensor of a Phantom 4 Pro quadcopter (DJI, Shenzhen, China) at a resolution of 20 megapixels. Multispectral imagery was acquired by the Parrot Sequoia multispectral sensor (Parrot SA, Paris, France), recording green (550 nm), red (660 nm), red edge (735 nm) and near-infrared (790 nm) bands at a resolution of 1.2 megapixels. The Sequoia camera was integrated into a Phantom 3 Adv quadcopter (DJI, Shenzhen, China) using a customized integration package (MicaSense, v1.3, Seattle, USA). The multispectral images were radiometrically calibrated using the onboard downwelling sunlight sensor during the flight and a calibrated reflectance panel on the ground (MicaSense, Seattle, USA). Flights were conducted at solar noon on clear days to minimise shadow effects. Single-grid flight routes covering all replicate panels with an 80 % image overlap were used. The aircraft was maintained at a height of 30 m above ground and a constant speed of 2 m/s during the flight. The geographic coordinates of the captured images were recorded by the onboard Global Positioning System (GPS) receiver.
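The flight height largely determines the ground sampling distance (GSD) of the resulting imagery, which can be estimated from the camera geometry. The sensor figures below (13.2 mm sensor width, 5472 px image width, 8.8 mm focal length) are commonly cited Phantom 4 Pro specifications, assumed here for illustration rather than taken from the study.

```python
def ground_sampling_distance_cm(flight_height_m, sensor_width_mm,
                                image_width_px, focal_length_mm):
    """GSD (cm/pixel) = pixel pitch x flight height / focal length."""
    pixel_pitch_m = (sensor_width_mm / image_width_px) * 1e-3
    gsd_m = pixel_pitch_m * flight_height_m / (focal_length_mm * 1e-3)
    return gsd_m * 100.0

# Assumed (commonly cited) Phantom 4 Pro camera specifications:
# 1-inch sensor 13.2 mm wide, 5472 px image width, 8.8 mm focal length
gsd = ground_sampling_distance_cm(30.0, 13.2, 5472, 8.8)
print(round(gsd, 2))  # 0.82
```

At the 30 m flight height this gives roughly 0.82 cm/pixel, consistent with the 0.8 cm RGB GSD reported for the reconstructed orthomosaics.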
The commercial photogrammetry software PIX4Dmapper (v.4.4.12; Pix4D SA, Lausanne, Switzerland) was used for UAV imagery reconstruction: RGB orthomosaic images and digital surface model (DSM) rasters of the vineyard were reconstructed from the RGB images (Figure 2), and single-band orthomosaic images were created from the images of each spectral band. The average ground sampling distances (GSD) of the RGB and single-band orthomosaic images were 0.8 cm and 3.2 cm, respectively. Using a customized processing procedure in ArcGIS (v.10.5.1; ESRI, Redlands, CA, USA), the DSM was normalized to generate the digital canopy model (DCM) rasters. Using the image processing toolbox in Matlab (v.2019b; The MathWorks, Natick, MA, USA), green canopy pixels in the RGB orthomosaic imagery were extracted according to a Lab colour space profile to create the canopy layer for the projected canopy area calculation.
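The green-pixel extraction step can be sketched as a per-pixel sRGB-to-CIELAB conversion followed by a threshold on the a* channel (negative a* indicates green). The threshold value below is a hypothetical illustration, not the Lab profile actually applied in the study.

```python
def srgb_to_lab(r, g, b):
    """Convert an 8-bit sRGB pixel to CIELAB (D65 white point)."""
    def linearize(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = linearize(r), linearize(g), linearize(b)
    # Linear RGB -> XYZ (sRGB matrix), normalized by the D65 white point
    x = (0.4124 * rl + 0.3576 * gl + 0.1805 * bl) / 0.95047
    y = (0.2126 * rl + 0.7152 * gl + 0.0722 * bl) / 1.00000
    z = (0.0193 * rl + 0.1192 * gl + 0.9505 * bl) / 1.08883
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    L = 116 * f(y) - 16
    a = 500 * (f(x) - f(y))
    b_star = 200 * (f(y) - f(z))
    return L, a, b_star

def is_canopy_pixel(r, g, b, a_threshold=-10.0):
    """Classify a pixel as green canopy when a* falls below a threshold
    (the threshold here is illustrative, not the study's profile)."""
    _, a, _ = srgb_to_lab(r, g, b)
    return a < a_threshold

print(is_canopy_pixel(60, 120, 50))   # leafy green -> True
print(is_canopy_pixel(150, 110, 80))  # bare-soil brown -> False
```

Raising the a* threshold makes the classification stricter, which is relevant to the residual-pixel filtering discussed later in the paper.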
Figure 2. Flow chart of the unmanned aerial vehicle (UAV)-based RGB/multispectral imagery processing procedures for generating canopy volume, projected canopy area and normalized difference vegetation index (NDVI) in the current study.

To create the normalized difference vegetation index (NDVI) raster for the study vineyard, a calculation was performed using the single-band orthomosaic images as follows:

NDVI = (NIR − Red) / (NIR + Red)

where:
NIR = near-infrared band orthomosaic image;
Red = red band orthomosaic image.
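Applied per pixel to the near-infrared and red orthomosaics, the NDVI calculation can be sketched as follows (the reflectance values are toy data, not from the study):

```python
def ndvi(nir, red, eps=1e-9):
    """Per-pixel normalized difference vegetation index;
    eps guards against division by zero over dark pixels."""
    return (nir - red) / (nir + red + eps)

# Toy 2x2 reflectance rasters (row-major lists); real inputs would be
# the NIR and red single-band orthomosaic rasters
nir_band = [[0.60, 0.55], [0.10, 0.58]]
red_band = [[0.08, 0.10], [0.09, 0.07]]
ndvi_raster = [[ndvi(n, r) for n, r in zip(nr, rr)]
               for nr, rr in zip(nir_band, red_band)]
print([[round(v, 2) for v in row] for row in ndvi_raster])
```

High values (here ~0.7–0.8) indicate vegetation, while values near zero (the 0.10/0.09 pixel) indicate soil or other background.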
With the NDVI raster for the whole vineyard, unsupervised classification was performed to extract the canopy-related pixels using the Iso Cluster Unsupervised Classification function in ArcGIS. As a result, the raster was classified into two classes: background and grapevine canopy. The canopy class was extracted as the classified NDVI raster for the calculation of mean NDVI per vine.
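The two-class split can be illustrated with a minimal one-dimensional k-means, used here as a stand-in for ArcGIS's Iso Cluster Unsupervised Classification (which implements a migrating-means/isodata procedure); the NDVI values are invented for illustration.

```python
def kmeans_two_class(values, iterations=20):
    """Simple 1-D two-class k-means: centroids start at the data extremes,
    so class 1 ends up as the higher-valued (canopy) cluster."""
    centroids = [min(values), max(values)]
    for _ in range(iterations):
        clusters = ([], [])
        for v in values:
            idx = 0 if abs(v - centroids[0]) <= abs(v - centroids[1]) else 1
            clusters[idx].append(v)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return [0 if abs(v - centroids[0]) <= abs(v - centroids[1]) else 1
            for v in values]

ndvi_values = [0.05, 0.08, 0.10, 0.72, 0.78, 0.69, 0.12, 0.75]
labels = kmeans_two_class(ndvi_values)
canopy_pixels = [v for v, l in zip(ndvi_values, labels) if l == 1]
print(labels)                                              # 0 = background, 1 = canopy
print(round(sum(canopy_pixels) / len(canopy_pixels), 3))   # mean NDVI of canopy class
```

Only the canopy-class pixels are then retained for the per-vine mean NDVI, mirroring the extraction step described above.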
Vineyard rasters of the DCM, the RGB green canopy layer and the classified NDVI containing only canopy-related pixels were stored in tagged image file format (.tiff) files. Rasters were geo-referenced using ground control points, and experimental replicates were marked on the high-resolution RGB imagery by identifying the posts separating individual panels. Polygon vectors for the replicates were then created and used as masks to extract replicate rasters. Based on the fixed planting distance, single grapevine rasters were further separated from the individual replicate rasters. Using the single-vine rasters, canopy volume per vine (m3) was calculated from the DCM as the volume above the cordon-height plane (1.1 m above ground). Projected canopy area per vine (m2) was calculated as the sum of the individual canopy pixel areas in the RGB green pixel layer. The mean NDVI per vine was calculated from the mean NDVI values of all extracted canopy class pixels.
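The per-vine canopy volume and projected canopy area described above reduce to simple sums over the single-vine rasters. A minimal sketch follows, using a toy DCM and canopy mask with hypothetical 1 m cells for arithmetic readability (the study's actual GSD was ~0.8 cm):

```python
def canopy_volume_m3(dcm, gsd_m, cordon_height_m=1.1):
    """Canopy volume: sum over raster cells of canopy height above the
    cordon-height plane times the cell's ground area. Heights at or
    below the cordon plane (ground, trimmed material) contribute zero."""
    cell_area = gsd_m ** 2
    return sum(max(0.0, h - cordon_height_m) * cell_area
               for row in dcm for h in row)

def projected_canopy_area_m2(canopy_mask, gsd_m):
    """Projected canopy area: canopy-pixel count times pixel ground area."""
    return sum(px for row in canopy_mask for px in row) * gsd_m ** 2

# Toy 2x2 digital canopy model (heights in metres above ground)
dcm = [[1.6, 1.1],
       [2.1, 0.4]]
# Matching binary canopy mask from the RGB green-pixel layer
mask = [[1, 0],
        [1, 0]]
print(round(canopy_volume_m3(dcm, gsd_m=1.0), 2))   # (0.5 + 0 + 1.0 + 0) = 1.5
print(projected_canopy_area_m2(mask, gsd_m=1.0))    # 2 pixels x 1 m^2 = 2.0
```

Note how the 0.4 m cell (e.g., trimmed shoots on the ground) adds nothing to the volume but would inflate the area if its pixel were misclassified as canopy.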
2.3. Statistical Analysis
Analysis of variance (ANOVA) was performed to determine if canopy treatments led to significantly different canopy structures (p < 0.05). Statistics were performed in the statistics and machine learning toolbox of Matlab and GraphPad Prism (v.8; GraphPad Software, San Diego, USA). Mean values, standard errors of the mean and statistical significance of measurements have been reported.
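The one-way ANOVA F statistic computed by the Matlab and Prism routines can be sketched in a few lines; the replicate values below are invented for illustration, not the study's raw data.

```python
def one_way_anova_f(groups):
    """One-way ANOVA F statistic: between-group mean square divided by
    within-group mean square; compare against an F critical value (or
    compute a p-value) to test significance at p < 0.05."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, means))
    ss_within = sum((x - m) ** 2
                    for g, m in zip(groups, means) for x in g)
    df_between, df_within = k - 1, n - k
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical canopy-volume replicates (m^3), pre- vs post-treatment
pre = [1.29, 1.31, 1.27]
post = [1.06, 1.04, 1.08]
f_stat = one_way_anova_f([pre, post])
print(round(f_stat, 1))  # a large F indicates the group means differ
```

With more than two groups (e.g., all five treatment codes), the same function applies and a significant F would then be followed by post-hoc comparisons.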
Results and discussion
With the application of canopy treatments, changes in the canopy structure were observed (Figure 3). Significant reductions in the mean canopy volume per vine were observed after leaf removal (LR-E and LR-B) and shoot trimming (ST-E). Leaf removal and shoot trimming applied to one side of the canopy (LR-E and ST-E) lowered canopy volume from 1.29 m3 to 1.06 m3 and from 1.37 m3 to 1.12 m3, respectively, while double-sided leaf removal reduced canopy volume more than the single-sided application (from 1.42 m3 to 0.82 m3). No difference in canopy volume was found between the bunch thinning treatment and the control group. This was expected, as bunch thinning only removed bunches in the bunch zone inside the canopy, which was covered by leaves and shoots, meaning the overall dimensions of the canopy were not altered. At harvest, the canopy volume of all treatment groups had declined to significantly lower levels than after treatment application. This general decline can be explained by the seasonal reduction in canopy development due to leaf senescence leading to leaf fall, and by the reduction in irrigation application (Pádua et al., 2018).
Figure 3. Canopy structure measurements from UAV-captured imagery.

(a) canopy volume (m3) per vine, (b) projected canopy (m2) area per vine, (c) normalized difference vegetation index (NDVI) and (d) ground-based plant area index (PAI). Four different treatments applied were: leaf removal on the eastern side of the canopy (LR-E), leaf removal on both sides of the canopy (LR-B), bunch thinning (BT) and shoot trimming on the eastern side of the canopy (ST-E). Canopy treatments were applied at veraison (E-L stage 35). Measurements were performed at three timepoints: 1) before the treatment, 2) one week after the treatment and 3) at harvest (E-L stage 38) to track changes in canopy structure. In each graph, bars are grouped by treatment and within each group, measurements at different timepoints are shown. Significant differences (p < 0.05) between measurements in the same group are annotated by different letters and bars show the standard errors of the mean value.
Similar to canopy volume, projected canopy area and NDVI values were reduced in the treatment groups where leaf removal and shoot trimming were applied (LR-E, LR-B and ST-E), while the control and bunch thinning groups (C and BT) remained relatively unchanged. Ground-based PAI measurements recorded significant differences in all treatment groups, including the bunch thinning treatment, which was not detected by the UAV-based remote sensing approaches. PAI was calculated from upward-looking canopy cover imagery (De Bei et al., 2016) and was shown to be capable of detecting changes in the internal canopy structure under foliage cover, such as the removal of bunches (Figure 4). In addition, canopy porosity, an important indicator of the light conditions inside the canopy that is closely related to PAI, can also be generated from the canopy cover imagery (De Bei et al., 2016). Thus, PAI was more advantageous in detecting variations in both the internal and external dimensions of the canopy structure than the remote sensing approaches, which only detected external dimension and spectral value variations.
Figure 4. Examples of ground-based canopy cover images for (a) control and (b) bunch thinning.

Note the decrease in the bunch number in the bunch thinning group, compared with the control group.
Comparing different measurements for the same treatment group, canopy volume and PAI detected greater differences in canopy structure after canopy management practices were applied. When comparing measurements taken before and after treatment application (Figure 5), canopy volume and PAI demonstrated the largest percentage changes for the leaf removal and shoot trimming treatments. In contrast, very little change was observed in NDVI values, with the largest decline being 13 % when double-sided leaf removal was applied; the same treatment recorded reductions of 43 % and 30 % in the canopy volume and PAI measures, respectively. At harvest (Figure 5), canopy volume recorded the biggest percentage change across all measurements, followed by PAI; this may have been accentuated by leaf fall. Overall, canopy volume and PAI detected greater differences in canopy structure and may be more useful measures for making informed decisions and determining the effectiveness of canopy management practices.
Figure 5. Percentages of change in the measured parameters for the same treatment group between stages.

(a) between pre and post canopy treatments at veraison (both at E-L stage 35); (b): between post-treatment (E-L stage 35) and harvest (E-L stage 38). The data were grouped by four different treatments and within each group, the mean percentage change of four measured parameters are shown (canopy volume (m3), projected canopy area (m2), NDVI and PAI). Treatments applied were leaf removal on the eastern side of the canopy (LR-E), leaf removal on both sides of the canopy (LR-B), bunch thinning (BT) and shoot trimming on the eastern side of the canopy (ST-E).
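The percentage changes plotted in Figure 5 are simple relative differences between stages. For example, using the rounded canopy volume means reported in the text for double-sided leaf removal (1.42 m3 pre, 0.82 m3 post), the change works out to roughly -42 %, consistent with the reported 43 % reduction once rounding is accounted for:

```python
def percent_change(before, after):
    """Percentage change relative to the earlier measurement."""
    return (after - before) / before * 100.0

# Canopy volume for double-sided leaf removal (LR-B), using the
# rounded means reported in the text (1.42 m^3 pre, 0.82 m^3 post)
print(round(percent_change(1.42, 0.82), 1))  # -42.3
```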
The treatments used in this study showed the limitations of NDVI and canopy area measurements in detecting changes in canopy structure. As discussed previously, bunch thinning cannot be detected by remote sensing approaches due to foliage cover. In addition, in the shoot trimming group (ST-E), the leaves and portions of shoots that were trimmed and left on the ground were detected by the NDVI and canopy area measurements using unsupervised classification (Figure 6a and b). The portion of the canopy removed at trimming likely had similar spectral properties to the canopy that remained on the vine, so when unsupervised classification was applied, pixels both on the grapevine and on the ground (residual pixels) were extracted, and these residual pixels reduced the accuracy of the NDVI and canopy area measurements. For the canopy area, residual pixels increased the total canopy pixel count for the measured grapevine and, as a result, inflated the canopy area. By setting higher thresholds in the Lab colour space profile when extracting canopy pixels, some residual pixels in the RGB orthomosaic were filtered out (Figure 6c); however, strict thresholds also filtered out part of the canopy pixels and resulted in the underestimation of the projected canopy area. Given that residual material was still detected one week after the shoots were trimmed, these measurements are unlikely to provide accurate real-time assessments of the canopy structure immediately after shoot trimming. Therefore, it is suggested that NDVI and canopy area measurements using UAV-based two-dimensional rasters (e.g., spectral indices and RGB rasters) should be avoided for the assessment of shoot trimming, to minimise potential errors.
Figure 6. A comparison of different remote sensing approaches for extracting the grapevine canopy.

Rasters of the canopy that received shoot trimming, captured one week after the application at veraison (E-L stage 35), are shown: (a) NDVI raster (overlaying RGB orthomosaic image), (b) canopy pixel raster for calculating projected canopy area, (c) canopy pixel raster with the higher extraction threshold for green pixels and (d) digital canopy model (DCM) for calculating canopy volume. Note the trimmed shoots and leaves left on the ground (residual pixels) were captured by NDVI (a) and canopy area measurements (b). Canopy pixel extraction with the higher threshold (c) filtered out most ground residual pixels but also reduced the actual projected canopy area in the canopy zone. In comparison, DCM (d) can effectively filter out ground interference.
In Figure 6d, the DCM was shown to contain only the grapevine canopy, as shoots left on the ground were filtered out during DSM normalization. Without the interference from ground residual pixels, the canopy volume calculated from the DCM reflected more accurately the volume of the grapevine canopy. Unlike the NDVI and projected canopy area measures, canopy volume measurements can also be obtained immediately after canopy management practices are applied, without interference from the trimmed shoots or leaves on the ground. In addition, although no ground vegetation (weed or cover crop) grew in the study vineyard, the DCM can also filter out interference from ground vegetation, as only the part of the DCM above the cordon height is included in the volume calculation (Vanegas et al., 2018; Weiss & Baret, 2017). With these advantages, canopy volume calculated from the DCM displayed potential as a suitable tool for monitoring canopy management outcomes.
Compared with the UAV, the ground-based PAI measurement has a simpler process and offers the convenience of collecting and analysing data quickly. It can assess the outcomes of a canopy management practice and report the results in the field, allowing growers to promptly adjust their management practices. Nonetheless, as the canopy cover imagery for the PAI calculation was collected discretely between vines, the sampling distance between images should be considered in conjunction with the target map resolution when using this method for mapping spatial variability. Integrating the ground imagery acquisition system with vineyard vehicles can potentially enable on-the-go canopy structure assessment, as explored in previous studies (Bramley et al., 2007; Liu et al., 2017; Rose et al., 2016). However, these systems often require regular calibration during operation to capture suitable imagery under the dynamic light conditions in the field, and more research is required to improve the robustness of such integrated systems.
Between the UAV-based remote sensing approaches, the processes for data collection and primary analysis were similar, both in terms of the UAV flight and the vineyard raster reconstruction. For canopy volume, the DSM needs to be normalized to create the DCM, which is more complicated than the canopy pixel classification and extraction required for the NDVI and projected canopy area calculations. However, the collection of multispectral imagery for the NDVI calculation required integrating an extra multispectral sensor, while the DCM was created from the RGB imagery captured by the UAV's original on-board RGB sensor. Therefore, the cost and labour required for the acquisition and integration of multispectral sensors can be avoided by using the DCM.

In addition, the DCM can also provide the surface area of the canopy, and knowing the volume and surface area of the canopy can be useful for guiding other canopy management operations. For example, it can be used to advise the chemical spray volume required, with the spray application rate adjusted during the operation according to the canopy size (Llorens et al., 2010; Llorens et al., 2011). The latter authors proposed the use of ultrasonic and LiDAR sensors for the proximal sensing of vineyard canopies and concluded that these sensors could provide valuable information on canopy volume and leaf area index, although they also noted that the post-processing of LiDAR-acquired information can be a limiting factor to its usability. Similarly, Siebers et al. (2018) demonstrated that proximal sensing in viticulture could provide useful information on canopy architecture; the authors developed a proximal sensing platform, the Grover, equipped with LiDAR but with the capability to host and test multiple sensors. Diago et al. (2019) proposed the use of an on-the-go system to collect RGB images for the assessment of grapevine canopy parameters, with successful results. Similar to our findings, such an on-the-go system can be used as a tool to assess the efficiency of canopy management operations.
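As a simple illustration of canopy-size-adjusted spraying, the spray volume per vine could be scaled with the measured canopy volume, in the spirit of tree-row-volume-style dosing; the function name and dose rate below are entirely hypothetical, not a method from the cited studies.

```python
def spray_volume_l_per_vine(canopy_volume_m3, dose_l_per_m3):
    """Scale spray volume linearly with measured canopy volume
    (hypothetical dose model for illustration only)."""
    return canopy_volume_m3 * dose_l_per_m3

# Hypothetical rate of 0.1 L of spray mix per m^3 of canopy, applied to
# pre- vs post-leaf-removal canopy volumes (m^3) from the results
for vol in (1.42, 0.82):
    print(round(spray_volume_l_per_vine(vol, 0.1), 3))
```

Under this sketch, a vine whose canopy volume was reduced by leaf removal would automatically receive proportionally less spray.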
UAV-based remote sensing can be used to provide assessments of the whole area before or after the application of canopy management practices to advise the optimal input level and assess outcomes. Various flight control software also helps maintain the course and speed of the aircraft automatically during the flight which greatly reduces the difficulties and complexities in capturing overlapped imagery suitable for the reconstruction of orthomosaic imagery using photogrammetry. The operational flexibility of UAV also allows the timely assessment of the canopy management outcomes, compared with manned aircraft and satellite remote sensing.
Conclusion
The UAV-based canopy volume assessment was able to detect the changes in canopy structure produced by canopy management. Canopy volume was more sensitive to changes in the canopy structure than NDVI and projected canopy area. It is also a more cost-effective measure than the multispectral index NDVI, which requires a multispectral sensor and therefore increases hardware costs and the analysis and interpretation requirements. The accuracies of the NDVI and projected canopy area measurements were negatively impacted by practices such as shoot trimming, which leave plant material on the ground. Due to the foliage cover, UAV-based measurements cannot be used to measure the impact of bunch thinning on canopy structure. However, UAV-based approaches can provide vineyard-scale measurements that cover all the grapevines in the vineyard and have the potential to be integrated with other vineyard management practices, such as targeted chemical spraying. PAI calculated from ground-based canopy cover imagery was able to measure all three types of canopy management practices assessed in this study (leaf removal, shoot trimming and bunch thinning) and was a convenient approach for in-field assessments.
Acknowledgements
The authors would like to thank Treasury Wine Estate for providing access to the experimental block for this study and Patrick Vaughan O’Brien, Massimiliano Cocco and Marco Zito for helping apply the canopy management practices. This research was supported by funding from the University of Adelaide and Wine Australia. Wine Australia invests in and manages research, development and extension on behalf of Australia’s grape growers and winemakers and the Australian Government.
References
- Albetis, J., Jacquin, A., Goulard, M., Poilvé, H., Rousseau, J., Clenet, H., Dedieu, G., & Duthoit, S. (2018). On the potentiality of UAV multispectral imagery to detect Flavescence dorée and grapevine trunk diseases. Remote Sensing, 11(1). https://doi.org/10.3390/rs11010023
- Andújar, D., Moreno, H., Bengochea-Guevara, J. M., de Castro, A., & Ribeiro, A. (2019). Aerial imagery or on-ground detection? An economic analysis for vineyard crops. Computers and Electronics in Agriculture, 157, 351–358. https://doi.org/10.1016/j.compag.2019.01.007
- Bramley, R. G. V., Gobbett, D., & Praat, J. (2007). A proximal canopy sensor - A tool for managing vineyard variability. Adelaide, South Australia, Australia: CSIRO Sustainable Ecosystems.
- Bréda, N. J. J. (2003). Ground‐based measurements of leaf area index: a review of methods, instruments and current controversies. Journal of Experimental Botany, 54(392), 2403–2417. https://doi.org/10.1093/jxb/erg263
- Caravia, L., Collins, C., Petrie, P. R., & Tyerman, S. D. (2016). Application of shade treatments during Shiraz berry ripening to reduce the impact of high temperature. Australian Journal of Grape and Wine Research, 22(3), 422–437. https://doi.org/10.1111/ajgw.12248
- Comba, L., Biglia, A., Ricauda Aimonino, D., & Gay, P. (2018). Unsupervised detection of vineyards by 3D point-cloud UAV photogrammetry for precision agriculture. Computers and Electronics in Agriculture, 155, 84–95. https://doi.org/10.1016/j.compag.2018.10.005
- De Bei, R., Fuentes, S., Gilliham, M., Tyerman, S., Edwards, E., Bianchini, N., Smith, J., & Collins, C. (2016). VitiCanopy: a free computer app to estimate canopy vigor and porosity for grapevine. Sensors, 16(4), 585. https://doi.org/10.3390/s16040585
- De Bei, R., Wang, X., Papagiannis, L., Cocco, M., O’Brien, P., Zito, M., Ouyang, J., Fuentes, S., Gilliham, M., Tyerman, S., & Collins, C. (2019). Postveraison leaf removal does not consistently delay ripening in Semillon and Shiraz in a hot Australian climate. American Journal of Enology and Viticulture, 70(4), 398–410. https://doi.org/10.5344/ajev.2019.18103
- de Castro, A., Jiménez-Brenes, F., Torres-Sánchez, J., Peña, J., Borra-Serrano, I., & López-Granados, F. (2018). 3-D characterization of vineyards using a novel UAV imagery-based OBIA procedure for precision viticulture applications. Remote Sensing, 10(4), 584. https://doi.org/10.3390/rs10040584
- Diago, M. P., Aquino, A., Millan, B., Palacios, F., & Tardáguila, J. (2019). On‐the‐go assessment of vineyard canopy porosity, bunch and leaf exposure by image analysis. Australian Journal of Grape and Wine Research, 25(3), 363–374.
- Dry, P. R. (2000). Canopy management for fruitfulness. Australian Journal of Grape and Wine Research, 6(2), 109–115. https://doi.org/10.1111/j.1755-0238.2000.tb00168.x
- Fuentes, S., Poblete-Echeverria, C., Ortega-Farias, S., Tyerman, S., & De Bei, R. (2014). Automated estimation of leaf area index from grapevine canopies using cover photography, video and computational analysis methods. Australian Journal of Grape and Wine Research, 20(3), 465–473. https://doi.org/10.1111/ajgw.12098
- Gower, S. T., Kucharik, C. J., & Norman, J. M. (1999). Direct and indirect estimation of leaf area index, fAPAR, and net primary production of terrestrial ecosystems. Remote Sensing of Environment, 70(1), 29–51. https://doi.org/10.1016/s0034-4257(99)00056-5
- Hall, A. (2018). Remote sensing applications for viticultural terroir analysis. Elements, 14(3), 185–190. https://doi.org/10.2138/gselements.14.3.185
- Jonckheere, I., Fleck, S., Nackaerts, K., Muys, B., Coppin, P., Weiss, M., & Baret, F. (2004). Review of methods for in situ leaf area index determination: Part I. Theories, sensors and hemispherical photography. Agricultural and Forest Meteorology, 121(1–2), 19–35. https://doi.org/10.1016/j.agrformet.2003.08.027
- Khaliq, A., Comba, L., Biglia, A., Ricauda Aimonino, D., Chiaberge, M., & Gay, P. (2019). Comparison of satellite and UAV-based multispectral imagery for vineyard variability assessment. Remote Sensing, 11(4), 436. https://doi.org/10.3390/rs11040436
- Liu, S., Cossell, S., Tang, J., Dunn, G., & Whitty, M. (2017). A computer vision system for early stage grape yield estimation based on shoot detection. Computers and Electronics in Agriculture, 137, 88–101. https://doi.org/10.1016/j.compag.2017.03.013
- Llorens, J., Gil, E., Llop, J., & Escolà, A. (2010). Variable rate dosing in precision viticulture: Use of electronic devices to improve application efficiency. Crop Protection, 29(3), 239–248. https://doi.org/10.1016/j.cropro.2009.12.022
- Llorens, J., Gil, E., Llop, J., & Escolà, A. (2011). Ultrasonic and LIDAR sensors for electronic canopy characterization in vineyards: Advances to improve pesticide application methods. Sensors, 11(2). https://doi.org/10.3390/s110202177
- Matese, A., & Di Gennaro, S. F. (2018). Practical applications of a multisensor UAV platform based on multispectral, thermal and RGB high resolution images in precision viticulture. Agriculture, 8(7), 116. https://doi.org/10.3390/agriculture8070116
- Matese, A., Toscano, P., Di Gennaro, S., Genesio, L., Vaccari, F., Primicerio, J., Belli, C., Zaldei, A., Bianconi, R., & Gioli, B. (2015). Intercomparison of UAV, aircraft and satellite remote sensing platforms for Precision Viticulture. Remote Sensing, 7(3), 2971. https://doi.org/10.3390/rs70302971
- Mathews, A., & Jensen, J. (2013). Visualizing and quantifying vineyard canopy LAI using an unmanned aerial vehicle (UAV) collected high density structure from motion point cloud. Remote Sensing, 5(5), 2164. https://doi.org/10.3390/rs5052164
- Mirás-Avalos, J. M., Buesa, I., Llacer, E., Jiménez-Bello, M. A., Risco, D., Castel, J. R., & Intrigliolo, D. S. (2017). Water versus source–sink relationships in a semiarid Tempranillo vineyard: vine performance and fruit composition. American Journal of Enology and Viticulture, 68(1), 11–22. https://doi.org/10.5344/ajev.2016.16026
- Pádua, L., Marques, P., Hruška, J., Adão, T., Peres, E., Morais, R., & Sousa, J. J. (2018). Multi-temporal vineyard monitoring through UAV-based RGB imagery. Remote Sensing, 10(12), 1907. https://doi.org/10.3390/rs10121907
- Poblete-Echeverria, C., Fuentes, S., Ortega-Farias, S., Gonzalez-Talice, J., & Yuri, J. A. (2015). Digital cover photography for estimating leaf area index (LAI) in apple trees using a variable light extinction coefficient. Sensors, 15(2), 2860–2872. https://doi.org/10.3390/s150202860
- Reynolds, A. G., Molek, T., & De Savigny, C. (2005). Timing of shoot thinning in Vitis vinifera: impacts on yield and fruit composition variables. American Journal of Enology and Viticulture, 56(4), 343–356.
- Reynolds, A. G., & Vanden Heuvel, J. E. (2009). Influence of grapevine training systems on vine growth and fruit composition: a review. American Journal of Enology and Viticulture, 60(3), 251–268.
- Romero, M., Luo, Y. C., Su, B. F., & Fuentes, S. (2018). Vineyard water status estimation using multispectral imagery from an UAV platform and machine learning algorithms for irrigation scheduling management. Computers and Electronics in Agriculture, 147, 109–117. https://doi.org/10.1016/j.compag.2018.02.013
- Rose, J. C., Kicherer, A., Wieland, M., Klingbeil, L., Töpfer, R., & Kuhlmann, H. (2016). Towards automated large-scale 3D phenotyping of vineyards under field conditions. Sensors (Switzerland), 16(12). https://doi.org/10.3390/s16122136
- Rouse, J. W., Haas, R. H., & Schell, J. A. (1974). Monitoring the vernal advancement and retrogradation (Greenwave Effect) of natural vegetation. NASA’s Goddard Space Flight Center: Greenbelt, MD, USA.
- Siebers, M. H., Edwards, E. J., Jimenez-Berni, J. A., Thomas, M. R., Salim, M., & Walker, R. R. (2018). Fast phenomics in vineyards: development of GRover, the grapevine rover, and LiDAR for assessing grapevine traits in the field. Sensors, 18(9), 2924. https://doi.org/10.3390/s18092924
- Smart, R., & Robinson, M. (1991). Sunlight into wine: a handbook for winegrape canopy management. Adelaide, South Australia, Australia: Winetitles.
- Su, B. F., Xue, J. R., Xie, C. Y., Fang, Y. L., Song, Y. Y., & Fuentes, S. (2016). Digital surface model applied to unmanned aerial vehicle based photogrammetry to assess potential biotic or abiotic effects on grapevine canopies. International Journal of Agricultural and Biological Engineering, 9(6), 119–130.
- Trought, M. C. T., Naylor, A. P., & Frampton, C. (2017). Effect of row orientation, trellis type, shoot and bunch position on the variability of Sauvignon Blanc (Vitis vinifera L.) juice composition. Australian Journal of Grape and Wine Research, 23(2), 240–250. https://doi.org/10.1111/ajgw.12275
- Vanegas, F., Bratanov, D., Powell, K., Weiss, J., & Gonzalez, F. (2018). A novel methodology for improving plant pest surveillance in vineyards and crops using UAV-based hyperspectral and spatial data. Sensors, 18(1), 260. https://doi.org/10.3390/s18010260
- Vasconcelos, M. C., & Castagnoli, S. (2000). Leaf canopy structure and vine performance. American Journal of Enology and Viticulture, 51(4), 390–396.
- Wang, X., De Bei, R., Fuentes, S., & Collins, C. (2019). Influence of canopy management practices on canopy architecture and reproductive performance of Semillon and Shiraz grapevines in a hot climate. American Journal of Enology and Viticulture, 70(4), 360–372. https://doi.org/10.5344/ajev.2019.19007
- Weiss, M., & Baret, F. (2017). Using 3D point clouds derived from UAV RGB Imagery to describe vineyard 3D macro-structure. Remote Sensing, 9(2), 111. https://doi.org/10.3390/rs9020111
- Wine Australia (2020). Discover Australian Wine: Regions and Varieties. Retrieved May 21, 2020, from https://www.wineaustralia.com/getmedia/f77af85a-95a4-44ab-97e8-7823b87b64ec/DIGI_Web_Discover_v2.pdf
- Wolf, T. K., Dry, P. R., Iland, P. G., Botting, D., Dick, J. O. Y., Kennedy, U., & Ristic, R. (2003). Response of Shiraz grapevines to five different training systems in the Barossa Valley, Australia. Australian Journal of Grape and Wine Research, 9(2), 82–95. https://doi.org/10.1111/j.1755-0238.2003.tb00257.x
- Xue, J., & Su, B. (2017). Significant remote sensing vegetation indices: A review of developments and applications. Journal of Sensors, 2017, 1353691. https://doi.org/10.1155/2017/1353691