High-throughput phenotyping in grapevine breeding research: technologies and applications

This article is a review article published in cooperation with Open GPB 2024
Abstract
In times of highly effective and cost-efficient genotyping technologies routinely applied in plant research and breeding, the need for comparably high-throughput (HT) and high-resolution phenotyping tools has increased substantially. As a perennial crop, grapevine has very specific requirements for HT phenotyping. Depending on the trait, it can be applied in laboratories or greenhouses, but it is also essential under field conditions to capture the full phenotypic variability of traits like yield or plant vigour throughout the season. For more than a decade, researchers have strived to improve grapevine phenotyping through sensors and automation to resolve the phenotyping bottleneck. The core goal of the present review is to illustrate promising and reliable opportunities for HT phenotyping in grapevine research and breeding. Therefore, different imaging sensor technologies and their data analysis, including artificial intelligence (AI), will be discussed, focusing on traits that are important for breeding new grapevine varieties. The expected outcome of any HT phenotyping approach is similar: the transfer of a low-throughput method into an approach that acquires objective, precise, and reliable data for plant evaluation with high spatial and temporal resolution. Furthermore, the collection of large phenotypic data sets and their linkage with environmental or genomic data will provide new or extended insights into the response of grapevines to biotic and abiotic stresses and will significantly support the evaluation of traits, the identification of new QTLs, and the implementation of breeding strategies like genomic prediction. These advancements improve precision and scalability in seedling selection and can additionally contribute to increased sustainability in viticulture.
__________
This article is published in cooperation with the Open Conference on Grapevine Physiology and Biotechnology 2024 (Open GPB 2024), 7-11 July 2024, Logroño, La Rioja, Spain.
Guest editor: Javier Tello.
How grapevine breeding programs will benefit from HT phenotyping
The number of plants to be phenotyped and the complexity of the traits are the two driving forces for improving phenotyping technologies, both in grapevine breeding and in the accompanying research. Put simply, the workload during the season and time-consuming procedures in the field mean that manual evaluations result in low phenotyping throughput (i.e., limits on the number of data points in time). Given the future challenges for viticulture due to climate change and sustainability, viticulture will have to plant new and better-adapted varieties in the long term, regardless of the market aspects that have prevented varietal change for a long time (Töpfer & Trapp, 2022). That requires more efficient breeding. Disease-resistant grapevine varieties (PIWIs) are one important option to face many of the worldwide challenges in viticulture, like the demand for increased sustainability or adaptation to a changing climate (Töpfer & Trapp, 2022). Many PIWI varieties show very good performance concerning fungal resistance (powdery mildew, downy mildew, black rot, and Botrytis) and save a considerable amount of fungicides (50 to 80 %). Concurrently, only a few of them are already well prepared for climate change, such as Calardis blanc (VIVC No. 22828), with late budburst (i.e., less endangered by late spring frost), medium sugar accumulation (i.e., moderate alcohol content in the wine), and tolerance to grape sunburn (Töpfer & Trapp, 2022). Because of that, grapevine breeding around the globe aims at developing new PIWI varieties with combined resistance against pests, fungal diseases, and bacteria, better adaptation to the changing climate, and high wine quality. Traditionally, breeding new grapevine varieties is a long endeavour, as it can easily take more than 20 years from the initial cross to market launch.
To speed up breeding, the use of trait-specific molecular markers for Marker-Assisted Selection (MAS) is now an established method to monitor the underlying genomic regions (Töpfer & Trapp, 2022). Thus, in today’s breeding programs, MAS works excellently to select seedlings shortly after germination for known loci providing resistance against Plasmopara viticola (downy mildew) or Erysiphe necator (powdery mildew), allowing efficient breeding of mildew-resistant grapevines (Töpfer & Trapp, 2022). However, many more traits need to be considered, like resistance to other biotic stresses (e.g., Botrytis bunch rot, black rot), plant vigour, yield, grape bunch health, phenology, or wine quality. These traits are often very complex, and their polygenic nature and the large impact of environmental factors result in a high variability of phenotypic expression (Herzog et al., 2021; Rist et al., 2018; Richter et al., 2019).
Additional trait-specific molecular markers for MAS or new approaches like genomic prediction and novel high-throughput (HT) phenotyping methods are needed to facilitate an (early) negative or positive selection of such quantitative traits. Low-throughput phenotyping is insufficient for dissecting the genetic nature of such complex traits. Phenotypic data are laborious to obtain by traditional methods, they are of low resolution, and can be subjective with hidden errors. Alongside other techniques, imaging sensor technologies can provide an objective, non-invasive view of the traits of interest. The application of suitable sensor techniques for HT phenotyping follows the central objective of increasing data collection (quantity) and improving data accuracy (quality).
Today, a variety of imaging sensors are available for applications in grapevine phenotyping. To realise the full potential of HT phenotyping, both automated acquisition as well as automated analysis of sensor data with high accuracy and reliability are required. In recent years, artificial intelligence (AI) has become the most powerful data analysis tool for HT phenotyping, as it can process large data sets from sensors to, for example, recognise disease symptoms, quantify severity, and predict disease progression. An established HT phenotyping offers:
- high degree of standardisation, independent from experts;
- temporal and spatial high-resolution data, which are objective, numeric, and reproducible;
- retrospective analysis;
- monitoring and kinetic studies;
- data-driven trait forecast by genomic prediction; and
- extended opportunities for targeted seedling selection by studying Genotype-Environment-Interactions (G × E), underlying molecular behaviours, and candidate genes.
One important area of application is the objective screening of breeding material throughout the evaluation process in the field (Vezzulli et al., 2019). Other major applications of HT phenotyping in grapevine breeding research are QTL (Quantitative Trait Locus) studies, fine mapping, and the development of trait-specific molecular markers for MAS (Vezzulli et al., 2019). In addition, HT phenotyping methods offer new opportunities for genomic prediction (Brault et al., 2024) and they can provide novel perspectives on the plant’s behaviour by phenotyping proxies usable for the selection of genotypes that are, for example, resilient to biotic and abiotic stresses.
This review summarises relevant information related to HT phenotyping and is divided into two main sections: 1) aspects that need to be considered when developing new HT phenotyping methods; and 2) remarkable applications in grapevine breeding research that show the promising potential of merging imaging sensors and AI. After a short definition of vocabulary and methods, selected studies will illustrate the different sensor- and AI-driven strategies in the laboratory and in the field.
Critical factors that need to be considered for the development of HT phenotyping
1. Traits of interest and their specific requirements for the experimental setup
Firstly, the scale and environment of phenotyping need to be defined. HT phenotyping can be done on individual grapevine organs or whole potted plants under controlled conditions in the lab, growth chambers, or greenhouse, as well as in the field (Figure 1). In general, five major categories of traits are often considered for phenotyping in grapevine breeding and research:
- 1. Morphology: dimension and shape of traits, for example bunches, shoot position (upright).
- 2. Disease: percentage of disease symptoms on leaves or anomalies of the canopy.
- 3. Phenology: time of bud break, flowering, or veraison.
- 4. Physiology: response to biotic and abiotic stresses, water-use-efficiency, or nutrition status.
- 5. Quality: especially constituents of wine.
Prior to the development of a new HT phenotyping method, several critical parameters need to be considered: the variability of the trait of interest, the environmental conditions of phenotyping (lab, greenhouse, or field), the minimum number of biological and technical repetitions per genotype, and the targeted precision of the resulting data. Among phenotyping environments, laboratory settings offer the highest level of control, while field-based approaches often allow for higher throughput (Figure 1). Regarding spatial resolution, HT phenotyping in the lab will give a detailed view of a trait for previously selected genotypes (e.g., Malagol et al., 2025). Trials in greenhouses will give information about disease distributions within one plant, and field phenotyping allows the evaluation of grapevines under natural growing conditions in a vineyard (e.g., Liu et al., 2022b). The number of repetitions typically depends on the phenotypic variability of the traits, the accuracy of recorded sensor data, as well as the number of genotypes targeted for phenotyping. For example, yield-related traits such as flower number or grape bunch development can be acquired via RGB images taken at different developmental stages (e.g., pre-flowering, pea-sized berries, or harvest maturity) under field conditions (Di Gennaro et al., 2019; Palacios et al., 2020). These traits often vary in colour, shape, and spatial arrangement between genotypes. Further, the external conditions during image capture can vary because of the weather (e.g., sunny, cloudy, or rainy conditions) or the time of day. Especially for training robust AI-based image analysis tools, a sufficiently large number of replicates, ideally at least 100 images per object class or genotype, is essential to ensure reliable prediction accuracy (Muthukumarana & Aponso, 2020). Depending on the phenotyping aim, one object class for yield can correspond to a single berry, inflorescence, grape cluster, or entire plant per image.
This recommendation is supported by studies in plant phenotyping and image classification, which indicate that fewer than 100 images per class can lead to reduced model performance unless extensive data augmentation or transfer learning is applied (Too et al., 2019). Alternatively, instead of relying solely on full-object images, some approaches focus on dividing high-resolution images into smaller sub-regions or image patches. Working at the patch level can substantially increase the number of training samples per image and improve model generalisability, especially when large annotated datasets (see digital reference data in section “3. Reference data as a benchmark for method validation”) are not available (Malagol et al., 2025). Patch-based classification is particularly useful for detecting fine-grained spatial features or localised stress symptoms and has been successfully applied in tasks such as disease detection, segmentation, or quantifying sunburn damage at the tissue level. Moreover, this strategy enables more efficient data handling and model training, especially when image resolution exceeds the capacity of standard neural networks.
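As an illustration of the patch-based strategy described above, the following minimal Python sketch (using numpy; the image dimensions, patch size, and stride are arbitrary example values, not taken from any cited study) divides a high-resolution image into smaller sub-regions, multiplying the number of training samples obtained from a single annotated image:

```python
import numpy as np

def extract_patches(image, patch_size, stride):
    """Slide a square window over the image and collect sub-regions.

    One annotated high-resolution image thus yields many training
    samples for patch-level classification.
    """
    h, w = image.shape[:2]
    patches = []
    for y in range(0, h - patch_size + 1, stride):
        for x in range(0, w - patch_size + 1, stride):
            patches.append(image[y:y + patch_size, x:x + patch_size])
    return np.stack(patches)

# A mock 512x512 RGB image of a leaf-disk plate (pixel values irrelevant here)
img = np.zeros((512, 512, 3), dtype=np.uint8)
patches = extract_patches(img, patch_size=64, stride=64)  # non-overlapping
print(patches.shape)  # → (64, 64, 64, 3): 8 x 8 patches of 64 x 64 x 3
```

Choosing a stride smaller than the patch size yields overlapping patches and further increases the sample count, at the cost of correlated training examples.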

2. What sensor technology is suitable for individual grapevine traits?
The targeted spatial resolution of HT phenotyping methods strongly differs between the stated traits of interest and, of course, the research question. The investigation of diseases, for instance, can be done at the cellular level (phenotyping of haustoria formation and mycelium growth) or at the field level (phenotyping the disease distribution within one plant or one vineyard). From a methodological point of view, sensor applications can be conducted by proximal and remote sensing approaches. While remote sensing is characterised by the use of satellite- and unmanned aerial vehicle (UAV)-based sensors (Di Gennaro et al., 2019; Shmuleviz et al., 2024), proximal sensing includes all non-invasive ground-based sensors. That involves both stationary (plant-to-sensor) and movable (sensor-to-plant) sensors, including handheld devices and sensors on moving phenotyping platforms in greenhouses or the field (Kicherer et al., 2017a). HT phenotyping in general is possible with imaging and non-imaging, i.e., point-based, sensors like spectrometers (Íñiguez et al., 2024). Spectral resolution ranges from monochrome (one band) and RGB (three bands of visible light: Red-Green-Blue) over multispectral imaging (MSI; several selected bands, e.g., one band spanning 520–600 nm, Table S1) to hyperspectral imaging (HSI; several hundred bands from, e.g., 400–2500 nm, Table S1). In addition, thermal cameras and laser techniques like LiDAR (Light Detection And Ranging) sensors are used for HT phenotyping (Table 1; Chedid et al., 2023). As visible in Table 1, single cameras recording RGB images are the most common sensors used for HT phenotyping studies. In combination with automated data acquisition (e.g., Bierman et al., 2019) and robust automated image analysis, this kind of HT phenotyping pipeline is feasible in grapevine breeding because thousands of leaf disks from different mapping populations or genetic repositories can be phenotyped for molecular marker development.
However, the characteristics of the desired traits have to be clarified accurately to select the most efficient sensor. 2D monochrome images will only be useful to extract morphological traits like size and shape, while RGB images provide additional information about colour. Thus, 2D RGB imaging is useful to investigate morphological traits, phenology, and disease resistance with the appearance of visual symptoms. The combination of 2D imaging with structured light, also named structured-illumination reflectance imaging (SIRI), enables the phenotyping of fruit quality but is mostly applied to fruit crops (Lu & Lu, 2019) and has rarely been applied to grapevines (Haucke et al., 2021).
Multi- and hyperspectral imaging (MSI, HSI) provide highly informative sensor data. Because of that, HSI is a promising technique validated, for example, for the pre-symptomatic detection or the distinction of different viruses in monitoring approaches (Sawyer et al., 2023). In contrast to 2D RGB images (three wavebands per pixel), MSI and HSI acquire additional spectral data per pixel (hyperspectral data cube or hypercube) containing spectral (λ) and spatial (x, y) information (Sarić et al., 2022), whereby HSI has a much higher number of spectral bands than MSI (Table S1). In general, each material is characterised by a specific spectral signature (spectral fingerprint) resulting from interactions of light (e.g., absorption, reflection, or transmission) with the object of interest (Sarić et al., 2022). Thus, water, green vegetation, or soil can be distinguished by their individual spectral signatures. Depending on the applied hyperspectral camera, spectral characteristics of visible (Vis, 400–700 nm), near-infrared (NIR, 700–1100 nm), and short-wave-infrared (SWIR, 1100–2500 nm) wavebands (Sarić et al., 2022) can be captured, with spectral resolutions of typically 1 nm for Vis-NIR or 2 nm for SWIR. Biochemical components, biotic and abiotic stresses, and structural and physiological changes result in alterations of spectral signatures that can be detected using HSI. Vis-NIR cameras are the most cost-efficient imaging sensors suitable for experiments related to visible and even non-visible traits; they have been used to characterise, for instance, grapevine physiological traits (Sharma et al., 2024) or to detect endophytic diseases like grapevine yellows (Bendel et al., 2020). However, HSI typically produces big data that requires large storage facilities and high computing power for analysis (Bendel et al., 2020).
Thus, if spectral information is used for HT phenotyping in breeding, HSI is important to detect relevant wavelengths, for example for disease detection (Bendel et al., 2020), which can then be transferred to specific MSI bands or indices.
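To make the ideas of spectral signatures and band transfer concrete, the following numpy sketch uses a synthetic random hypercube standing in for real HSI data (all dimensions, wavelengths, and band choices are illustrative assumptions). It extracts a mean spectral signature over a pixel mask and collapses the hypercube into a single MSI-style index (NDVI, the Normalised Difference Vegetation Index) from two selected bands:

```python
import numpy as np

# Synthetic hypercube: 50 x 50 pixels, 100 bands spanning 400-900 nm
wavelengths = np.linspace(400, 900, 100)
cube = np.random.default_rng(0).random((50, 50, 100))

def mean_signature(cube, mask):
    """Average spectrum (spectral fingerprint) over the masked pixels."""
    return cube[mask].mean(axis=0)

def band_index(wavelengths, target_nm):
    """Index of the band closest to a target wavelength."""
    return int(np.argmin(np.abs(wavelengths - target_nm)))

mask = np.ones(cube.shape[:2], dtype=bool)      # here: use every pixel
signature = mean_signature(cube, mask)           # one mean value per band

# Collapse HSI to an MSI-style index: NDVI from red (~670 nm) and NIR (~800 nm)
red = cube[..., band_index(wavelengths, 670)]
nir = cube[..., band_index(wavelengths, 800)]
ndvi = (nir - red) / (nir + red + 1e-9)          # one index value per pixel
```

In practice, the mask would come from a segmentation step (e.g., leaf vs. background), and the selected bands from a wavelength-relevance analysis such as the one described above.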
In addition to 2D data and spectral data, further information can be gained from 2.5D and 3D phenotyping techniques that describe object geometry (Moreno & Andújar, 2023) and are thus suitable for any research on canopy architecture, yield parameters, or morphology. Especially in field phenotyping approaches, 3D information can further be used for depth map calculations to detect grapevines in the foreground and delete any background pixels (Engler et al., 2023; Klodt et al., 2015). 3D point clouds can be generated directly by using, for example, LiDAR. Another option is the reconstruction of 3D point clouds from 2D images using stereo imaging or multi-view triangulation (Rist et al., 2018; Rose et al., 2016). Reconstructed 3D point clouds offer the advantage of providing colour information in addition to volume and geometry. Moreover, 3D data of the grapevine canopy or individual organs can be used to predict vigour-related traits like leaf area index, pruning weight, and yield.
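The depth-map background removal mentioned above can be sketched as a simple threshold mask. The sketch below (numpy, with simulated depth and RGB data; the 1.5 m cut-off and all array sizes are arbitrary example values) keeps only pixels closer than a chosen distance and zeroes out everything behind them, e.g., neighbouring vine rows or soil:

```python
import numpy as np

def foreground_mask(depth_map, max_distance):
    """Boolean mask of pixels closer than max_distance (metres)."""
    return depth_map < max_distance

rng = np.random.default_rng(1)
depth = rng.uniform(0.5, 5.0, size=(120, 160))                # simulated depth map
rgb = rng.integers(0, 256, size=(120, 160, 3), dtype=np.uint8)  # matching RGB image

# Keep the canopy row in front (here: closer than 1.5 m); delete the rest
mask = foreground_mask(depth, max_distance=1.5)
segmented = rgb.copy()
segmented[~mask] = 0
```

Real pipelines would derive the depth map from stereo matching or LiDAR rather than random values, but the masking step itself is exactly this cheap.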
3. Reference data as a benchmark for method validation
One fundamental step in the development of any sensor-based phenotyping method is the collection of corresponding high-quality reference data. Reference data represent the observed reality of phenotypic traits that need to be characterised using objective, digital methods. In general, two types of references can be distinguished: digital reference data and ground truth. Digital reference data can be obtained in a non-invasive manner from the acquired sensor data. For this purpose, human experts have to label the phenotypic trait of interest by image annotation, which means labelling, measuring, or counting the trait of interest (Bierman et al., 2019; Engler et al., 2023; Rudolph et al., 2019). As soon as the accuracy of an algorithm is sufficient, the method needs to be validated with in-situ reference data, i.e., ground truth data. Ground truth data should be collected by human experts applying established protocols. This in situ phenotyping mostly involves invasive methods like weight measurements of, for example, pruning wood and grapes (Kicherer et al., 2017b), analytical quantification of leaf chlorophyll or nitrogen (Bendel et al., 2024; Bodor-Pesti et al., 2023), or analytical investigation of grape must (Gebauer et al., 2021). Further, labour-intensive and low-throughput devices applied in grapevine physiology studies to determine stem water potential, chlorophyll fluorescence, stomatal conductance, or gas exchange also represent tools to acquire objective and reliable ground truth data for any study on grapevine responses to, for example, abiotic stress (Fernandes de Oliveira et al., 2024). The valid sample size is in each case defined by the variability of the trait, its heritability, the number of environments, the quality and resolution of ground truth and sensor data, as well as the analysis algorithm (see examples given in Table 1).
4. Artificial intelligence: basic knowledge that needs to be taken into consideration
AI in general is embedded into the field of data science (= data-driven extraction of knowledge) and is subdivided into classical programming and machine learning (ML) (Choi et al., 2020). In contrast to classical programming, where the model is developed by human experts, ML determines the model based on input data (e.g., sensor data) and associated outputs (e.g., reference data), i.e., the computer learns a model describing the relationship between both (Choi et al., 2020). Choosing ML for HT phenotyping approaches mainly requires training of the algorithm by feeding the model with sensor data (input) and corresponding reference data.
Although different learning methods are available, supervised and unsupervised learning are the most commonly applied ones in grapevine studies (Gatou et al., 2024). Unsupervised learning does not require any training data, which is an advantage in terms of labour and can be used to recognise and categorise patterns in data sets (Choi et al., 2020). Principal component analysis (PCA) is an unsupervised method that can be used to study patterns of associated traits to detect trade-offs or to evaluate the structure of mapping populations within genetic studies (Xavier et al., 2017).
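As a minimal illustration of PCA as an unsupervised method, the following numpy sketch (with a synthetic trait matrix of 50 hypothetical genotypes × 4 correlated traits; no real data are used) computes principal component scores and explained-variance ratios via singular value decomposition:

```python
import numpy as np

def pca(X, n_components):
    """Minimal PCA via SVD: centre the data, project onto the top
    principal components, and report explained-variance ratios."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T                 # sample coordinates on PCs
    var_ratio = (S**2 / (S**2).sum())[:n_components]  # variance explained per PC
    return scores, var_ratio

# Hypothetical trait matrix: 50 genotypes x 4 strongly correlated traits
rng = np.random.default_rng(42)
shared = rng.normal(size=(50, 1))                 # common underlying factor
X = np.hstack([shared + 0.1 * rng.normal(size=(50, 1)) for _ in range(4)])

scores, var_ratio = pca(X, n_components=2)        # PC1 captures the shared factor
```

Because the four synthetic traits share one underlying factor, the first component dominates; in a real trait matrix, such a pattern would hint at a trade-off or a shared physiological driver.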
In the field of plant research, grapevine and beyond, supervised ML is one of the most used branches of AI (Gatou et al., 2024), applied for regression analysis and modelling (Damásio et al., 2023; Rist et al., 2019), classification (Gutiérrez et al., 2018), as well as computer-assisted evaluations (Fernandez et al., 2024). For many simple applications within HT phenotyping of grapevine traits (Table 1), for instance, support vector machines (SVM), t-distributed stochastic neighbour embedding (t-SNE), random forests (RF), or k-nearest neighbour (k-NN) could successfully be used for feature extraction and classification (Gatou et al., 2024). Due to the increased availability of computing power in recent years, the application of Deep Learning (DL, a sub-branch of ML) and artificial neural networks (ANNs, especially convolutional neural networks (CNNs) for image analysis) has proven to be a very promising tool for sophisticated applications in plant breeding and research. DL algorithms, particularly ANNs, can achieve generally high accuracy, often exceeding 80–90 %, specifically in disease detection and classification tasks, which is important within the establishment of new sensor-based methods (see Table 1). In general, different CNN architectures are available and have been applied for HT phenotyping approaches in grapevine, such as ResNet, AlexNet, or VGGNet, to identify grapevine varieties for ampelography from leaf images (Magalhães et al., 2023) or to detect traits like berry number or pedicel length (Grimm et al., 2019; Zabawa et al., 2020). The detection of berries to estimate grape bunch architecture, for instance, was also conducted by vision-transformer-based methods such as the Segment Anything Model (SAM) (Torres-Lomas et al., 2024).
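The core operation of the CNN layers named above is the 2D convolution of an image with a small (normally learned) kernel. The following numpy sketch uses a hand-set Sobel-type edge kernel instead of learned weights and a synthetic binary image (both illustrative assumptions) to show how such a filter responds to local structure, e.g., the boundary of a bright lesion against healthy tissue:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D convolution (implemented as cross-correlation,
    as CNN frameworks do): slide the kernel and sum the products."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = (image[y:y + kh, x:x + kw] * kernel).sum()
    return out

# Synthetic image: bright region on the right half (a mock "lesion")
img = np.zeros((8, 8))
img[:, 4:] = 1.0

# Sobel-type kernel responding to vertical intensity edges
sobel_x = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
response = conv2d(img, sobel_x)   # strongest response along the boundary
```

A trained CNN stacks many such filters, learning their weights from labelled patches instead of using fixed edge detectors.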
The term computer vision is often used in the context of HT phenotyping. Computer vision is a cross-sectional field of AI that uses the methods explained above to process visual information. As reviewed by Li et al. (2020), it has revolutionised plant phenotyping, for instance regarding trait prediction from imaging sensor data. In general, different types of tasks (e.g., regression or classification) and modalities (e.g., point measurements or images) can be distinguished. With regard to that, and based on several laboratory and in-field studies (Macia et al., 2024; Rudolph et al., 2019; Underhill et al., 2020; Zendler et al., 2021), the development of an AI-driven HT phenotyping framework using ML or DL involves six steps:
(1) Acquisition of sensor and ground truth data; (2) Division of the whole data set into training (70 %) and test data; (3) Training and performance test; (4) Model selection; (5) Result validation; and (6) Application on new data, for example, in another season or from other genotypes.
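A toy version of steps 1–5 above, assuming synthetic two-class feature data and a deliberately simple nearest-centroid classifier standing in for a real ML or DL model, might look like this (numpy only; all data and parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 1 (mock): feature vectors per image, with class labels 0/1,
# e.g. "healthy" vs "diseased" patches described by 5 extracted features
X = np.vstack([rng.normal(0, 1, (100, 5)), rng.normal(3, 1, (100, 5))])
y = np.array([0] * 100 + [1] * 100)

# Step 2: shuffle and split 70 % training / 30 % test data
idx = rng.permutation(len(X))
cut = int(0.7 * len(X))
train, test = idx[:cut], idx[cut:]

# Steps 3-4: "train" a nearest-centroid classifier on the training set
centroids = np.stack([X[train][y[train] == c].mean(axis=0) for c in (0, 1)])

# Step 5: validate on the held-out test set
dists = np.linalg.norm(X[test][:, None, :] - centroids[None], axis=2)
accuracy = (dists.argmin(axis=1) == y[test]).mean()
print(f"test accuracy: {accuracy:.2f}")
```

Step 6, application to new data, would simply reuse the fitted centroids (or trained network weights) on images from another season or other genotypes, which is also where models typically degrade and need re-validation.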
Trait of interest | Environment | Object of interest | Data type/sensor1,2,3 | Number of sensor data used in the study | Investigated plant material | Accuracy/F1-Score of the model | Throughput | Feasibility in Breeding | Reference |
Downy Mildew | lab | leaf disk | RGB/single camera1 | 57,836 image patches (5,763 original plate images) | Vitis species, breeding populations | OIV 452–1 prediction: 82–97 % | high | + | Macia et al., 2024 |
Downy Mildew | lab | leaf disk | RGB/single camera2 | 5,271 image patches (10 original leaf disk images) | 813 genotypes | Intensity of sporangia development: 82–100 % | high | +++ | Zendler et al., 2021 |
Downy Mildew | field | grapevine canopy | RGB/stereo camera2 | 12,432 sub-images (2,072 original image pairs of in-field plants) | Chardonnay | Disease severity estimation: 73–87 % | high | + | Liu et al., 2022b |
Powdery Mildew | lab | leaf disk | RGB/single camera2 | 14,180 image patches (19 original leaf disk images) | Chardonnay, different genotypes (Run/Ren carriers) | Disease severity estimation: 94 % | high | +++ | Bierman et al., 2019 |
Powdery Mildew | lab | leaf disk | RGB/single camera1 | 21,162 image patches (10,760 original leaf disk images) | 267 genotypes | Disease severity estimation: 92–99 % | high | +++ | Qiu et al., 2022 |
Powdery Mildew | lab | grape bunch | HSI (900–1700 nm)1 | 30 grape bunch images | Carignan noir | Detection of infected bunches: 81–86 % | low | – | Pérez-Roncal et al., 2020 |
Grapevine yellows | lab | shoot | HSI (400–2500 nm)2 | 526 shoot images | Riesling, Scheurebe | Detection of symptomatic shoots: 62–92 % (lab) and 75–99 % (field) | medium | + | Bendel et al., 2020 |
Grapevine viruses | lab | detached leaf | HSI (510–710 nm)1 | 500 images of individual leaves | Cabernet franc, Cabernet-Sauvignon | Prediction of virus status: 82.8–85.6 % (pre-symptomatic); 82.4–87 % (symptomatic) | medium | + | Sawyer et al., 2023 |
Leaf hair | lab | leaf disk | RGB/single camera2 | 12,650 image patches (25 original leaf disk images) | 496 genotypes + six varieties (e.g., Riesling, Pinot meunier) | Leaf hair quantification: 95.4 % | high | +++ | Malagol et al., 2025 |
Yield (bunch number) | field | cluster zone | RGB/single camera1 | 1,704 images | Tempranillo | Bunch detection: 71–86 % | high | + | Íñiguez et al., 2024 |
Yield (berry number) | field | grapevine canopy | RGB/camera system2 | 38 images | Riesling, Regent, Felicia | Berry counting: 84.5–92 % | high | ++ | Zabawa et al., 2020 |
Yield (berry number) | field | grapevine canopy | RGB/single camera2 | 1,646 image patches (18 original images), 60 original images (validation) | Six varieties (e.g., Syrah, Tempranillo) | Berry counting: 64.8–76.3 % | high | ++ | Palacios et al., 2020 |
Yield (shoot number) | field | cane with shoots | RGB – video/single camera3 | 29,046 shoot patches (1,281 original images (80 videos)) | Chardonnay, Shiraz | Shoot detection: 80–89.8 % | high | ++ | Liu et al., 2017 |
Flower number | field | cluster zone | RGB/single camera1 | 200 images of in-field plants | Riesling, Chardonnay | Single flower detection: 72.1–78.3 % | high | + | Rudolph et al., 2019 |
Flower number | field | cluster zone | RGB/single camera2 | 800 image patches (100 original images of in-field plants) | Merlot, Zinfandel, Sauvignon blanc | Inflorescence detection: 95 %; single flower detection: 87.1–98.5 % | high | ++ | Rahim et al., 2022 |
Flower number | field | inflorescence | RGB/single camera1 | 539 images of individual inflorescences | Six varieties (e.g., Cabernet-Sauvignon, Barroca) | Inflorescence detection: 98.6 %; single flower detection: 88.1–91.6 % | medium | + | Moreira et al., 2025 |
Berry size | field | cluster zone | RGB/camera system2 | 271 images | Four varieties (e.g., Pinot noir, Riesling), seven breeding lines | N. A. | high | ++ | Engler et al., 2023 |
Bunch architecture | lab | grape bunch | 3D/3D Scanner3 | 296 full 3D scans | Nine varieties (e.g., Riesling, Pinot noir), 41 genotypes | N. A. | high | + | Rist et al., 2018 |
Bunch architecture | lab | grape bunch | RGB/single camera1 | 3,431 images | 139 genotypes | N. A. | high | + | Torres-Lomas et al., 2024 |
Bunch architecture | field | grape bunch | 3D/3D Scanner3 | 1,907 partial/full 3D scans | 16 varieties (e.g., Riesling, Regent), 218 genotypes | N. A. | high | – | Rist et al., 2019 |
Distribution of berry waxes | lab | single berry | RGB/single camera3 | 495 image patches (33 original plate images) | Eight varieties (e.g., Riesling, Calardis blanc) | Epicuticular wax detection: 98.6 % | medium | – | Haucke et al., 2021 |
Root architecture | greenhouse | potted plant | 3D/µX-ray CT3 | 18 full 3D scans | Three varieties, two soils each | N. A. | low | – | Schmitz et al., 2021 |
Budburst | field | cane with shoots | RGB/single camera1 | 322 images | 16 varieties (e.g., Merlot), 150 genotypes | N. A. | low | – | Herzog et al., 2015 |
Canopy traits to predict phenology | field | grapevine canopy | 3D/LiDAR3 | 648 full 3D scans | Chardonnay | N. A. | low | – | Rinaldi et al., 2013 |
Ripening | field | cluster zone | HSI (400–1000 nm)3 | 36 HSI images | Tempranillo | N. A. | low | + | Fernández-Novales et al., 2021 |
Vigour as a proxy for yield | field (UAV) | grapevine canopy | MSI (R, G, NIR), RGB/camera system2 | 3 whole vineyard (1.4 ha) images | Sangiovese | N. A. | high | ++ | Di Gennaro et al., 2019 |
Vigour | field | grapevine canopy | RGB/single camera1 | 90 images | Riesling, Villard blanc, two genotypes | Leaf classification: 87 % | low | – | Klodt et al., 2015 |
Pruning wood | field | grapevine canopy | RGB/camera system3 | 117 images | 39 genotypes | N. A. | medium | – | Kicherer et al., 2017b |
Pruning wood | field | grapevine canopy | 3D/LiDAR2 | 1,760 plot scans containing 3 plants each | Chardonnay, 209 genotypes | N. A. | high | + | Chedid et al., 2023 |
Photosynthesis | field | grapevine canopy | HSI (Vis-NIR, 397–1000 nm)3 | 192 HSI images | Marquette grafted on five rootstocks | N. A. | high | – | Sharma et al., 2024 |
High-throughput phenotyping under controlled conditions
The most damaging fungal diseases in viticulture are powdery and downy mildew. Thus, resistance (or susceptibility) to both diseases is of utmost interest in breeding programs and grapevine research (Vezzulli et al., 2019). The development of automated phenotyping pipelines is supported by a high degree of standardisation. This requirement can easily be fulfilled by sensor data acquisition under controlled conditions in the laboratory, growth chambers, and the greenhouse with standardised illumination settings or artificial backgrounds. In these applications, potted vines or clearly defined individual organs, like detached leaves or grapes sampled in the field, are used. HT phenotyping under controlled conditions focuses on: (i) the development of objective tools for the accurate measurement of phenotypic variation with improved precision and throughput, and (ii) the use of that information to identify new QTLs and linked molecular markers of interest for MAS.
1. Evaluation of biotic stress resilience under controlled conditions
Laboratory-based phenotyping remains crucial for disease resistance in general, both to characterise novel resistances and to understand resistance mechanisms. It not only eliminates environmental dependencies but also enables high-resolution analysis of traits such as disease severity and pathogen dynamics (Macia et al., 2024). As visible in Table 1, leaf disc assays are often applied as a reliable method for phenotyping resistance against the two most harmful grapevine pathogens (powdery and downy mildew). In the past, the throughput and consistency of these tests were limited by low-throughput phenotyping methods such as visual scoring or microscopic quantification, which are labour-intensive, time-consuming, and rely heavily on the availability of experienced staff.
Since automated imaging systems like the Blackbird Imaging Robot (Moblanc Robotics, Binéfar, Spain) and AI-driven algorithms have found their way into grapevine HT phenotyping, they have revolutionised both precision and scalability (Reisch et al., 2023). Significant advances have been achieved for leaf disk assays using RGB imaging to detect and quantify powdery (Bierman et al., 2019; Qiu et al., 2022) or downy mildew (Macia et al., 2024; Zendler et al., 2021). The biggest advantage of phenotyping leaf disk assays with automated image capture and analysis is the throughput, as thousands of individual leaf disks can be screened in a fully comparable, standardised environment (Table 1). Although leaf disk assays work reliably for both mildew fungi, diagnostic screenings for black rot, grapevine yellows, or viruses require phenotyping of detached leaves or potted vines (Table 1). In HT phenotyping, either healthy potted plants inoculated with a single pathogen are used, or detached leaves from naturally infected grapevines in the field are examined, as shown by Elsherbiny et al. (2024). RGB imaging is a cost-efficient way to clearly distinguish different disease symptoms visually, for example, the “shot-hole” appearance of anthracnose (Li et al., 2021) versus the necrotic lesions with a darker ring of black rot, both on the adaxial side of leaves. In that regard, Nagi and Tripathy (2022) used the open-source RGB image data set “Grapevine Disease Images” (www.kaggle.com) to develop CNN-based diagnosis algorithms for black rot, black measles, and leaf blight. However, the reliable usage of RGB images is sometimes limited, especially when symptoms of one disease are very similar to those of another (e.g., sporulation of downy and powdery mildew) or to nutrient deficiencies (e.g., grapevine leafroll virus and magnesium deficiency).
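The core of such automated leaf disc analysis is turning an image into a severity score. A minimal sketch is given below; the brightness threshold stands in for the trained segmentation models of the cited studies, and all values are invented for illustration:

```python
import numpy as np

def leaf_disc_severity(rgb, disc_mask, spore_threshold=180):
    """Illustrative severity estimate for a leaf disc assay.

    Mildew sporulation appears as bright, whitish patches on the green
    disc, so a simple brightness threshold on the per-pixel mean of the
    R, G, B channels serves as a stand-in for a trained classifier.
    """
    brightness = rgb.mean(axis=2)                      # per-pixel mean intensity
    sporulating = (brightness > spore_threshold) & disc_mask
    return sporulating.sum() / disc_mask.sum()         # fraction of disc area infected

# Synthetic example: a 100x100 disc with a bright 20x20 sporulating patch
rgb = np.full((100, 100, 3), 90, dtype=np.uint8)       # dark green leaf tissue
rgb[40:60, 40:60] = 220                                # whitish mycelium patch
mask = np.ones((100, 100), dtype=bool)
print(f"severity: {leaf_disc_severity(rgb, mask):.2f}")  # severity: 0.04
```

Real pipelines replace the threshold with CNN-based segmentation, but the output, an infected-area fraction per disc, is the same quantity that visual scoring approximates.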
In such ambiguous cases, hyperspectral imaging (HSI) or hyperspectral spectroscopy (point measurement), capturing the whole spectral information of the Vis-NIR and SWIR wavelength bands, can be used. Recent studies have shown that combining ML and HSI offers innovative solutions for early disease detection (Table 1). In these studies, the spectral signatures of infected detached leaves, grapes, or plants were compared with healthy controls under standardised laboratory conditions, as shown for the early detection of downy mildew on leaf disks (Hernández et al., 2024), powdery mildew infections on grapes (Pérez-Roncal et al., 2020), the detection of different viruses on detached leaves (Sawyer et al., 2023), or grapevine yellows on potted plants (Bendel et al., 2020). All of these developments represent the beginning of a shift in grapevine pathology, leveraging ML and DL models to enhance the accuracy and efficiency of disease detection.
2. Evaluation of abiotic stress resilience under controlled conditions
Although HT phenotyping of the grapevine response to heat waves and/or drought is mainly focused on field applications (Carvalho et al., 2021), a few environment-independent approaches have recently been published. As roots are essential for water and nutrient uptake, they also mediate resilience to biotic and abiotic stresses, like drought, salt, or chemicals (Ollat et al., 2023). HT phenotyping of the grapevine root is comparably rare, but the application of rhizotrons, RGB imaging, or µX-ray computed tomography (CT) has proven promising under greenhouse conditions for phenotyping root architecture or morphological characteristics (Fichtl et al., 2023; Krzyzaniak et al., 2021; Schmitz et al., 2021). Schmitz et al. (2021), for instance, showed differences in root growth between sandy and clay soils, which is important information for rootstock selection. Besides root architecture, Ollat et al. (2023) mention further important traits and strategies that need to be considered for the selection of climate-adapted genotypes, for instance, the plant response to water deficit or the response of rootstocks to heat and salinity treatments, investigated using greenhouse phenotyping platforms or hyperspectral imaging (Ollat et al., 2023). Considering leaf pigments (especially leaf chlorophyll content), which can be measured using RGB or spectral sensors, facilitates phenotyping the plant response to applied abiotic stress, like cold or heat, as a decrease in chlorophyll content and activity is a valid indicator of plant stress in general (Bodor-Pesti et al., 2023).
3. Objective description of fruit characteristics
The density and morphological architecture of the grape bunch are among the most important traits in wine grape breeding. In breeding research, the grape bunch therefore represents an important proxy for resilience to Botrytis bunch rot in wine grapes and, of course, for fruit set and yield. For decades, it has been known that medium-sized berries and a loose grape bunch architecture are linked to a reduced risk of Botrytis bunch rot infection, and breeders worldwide visually select for these traits (Tello & Ibáñez, 2018). Although several QTL studies have been published for grape bunch architecture-related traits, trait-specific molecular markers reliable within highly variable breeding material for MAS are not yet available (Tello & Ibáñez, 2018). One reason is probably the highly variable nature of this quantitative trait, which is characterised by a compilation of several sub-traits, like the length of rachis and pedicels, the number of berries per rachis length unit, or berry size (Richter et al., 2019; Tello & Ibáñez, 2018). The usage of consumer cameras for RGB image capture under standardised settings is easy to handle and cost-effective, and thus often applied for objective phenotyping of grape bunch traits (Grimm et al., 2019; Lopes & Cadima, 2021; Torres-Lomas et al., 2024; Underhill et al., 2020). Because of the three-dimensional nature of grape bunches, RGB images only enable the phenotyping of visible traits; hidden berries or traits like rachis length cannot be determined within an intact bunch. Destemming of berries and image acquisition of individual traits is one way to overcome that limitation, but it is labour-intensive and consequently restricts the sample size (Richter et al., 2019). Furthermore, it is possible to estimate hidden parts using predictive regression models to derive all phenotypic information from partial sensor data (Rist et al., 2019).
Xin and Whitty (2022) went another step further by using just one RGB image to reconstruct visible and non-visible berries using the mathematical tools of constraint-based optimisation and reconstruction grammar. This method seems to be feasible for selected varieties but might be challenging for highly variable breeding material.
Alternatively, the application of 3D sensors capturing the whole 3D structure is a popular approach to phenotyping these out-of-sight traits. Different 3D sensor techniques are available on the market, ranging from low-cost, lower-resolution RGB-D (Red-Green-Blue-Depth) cameras to sensors providing high-resolution 3D point clouds. Liu et al. (2022a), for instance, generated full 3D reconstructions of individual grapes using 3D point clouds captured with the RGB-D camera Kinect®v2 (Microsoft, Redmond, USA; scanning accuracy of 2 mm), and Parr et al. (2022) investigated the precision of grape phenotyping by comparing different RGB-D cameras with high-resolution LiDAR, a typical sensor to measure distances and depth. Further options are a near-range laser scanner with a scanning accuracy of up to 0.024 mm or the optical 3D scanner Artec® (Artec 3D, Sennigerberg, Luxembourg) using multi-view triangulation with an accuracy of 0.05 mm (Rist et al., 2018). One 3D scanner operated by a trained person enables 3D data acquisition from approximately 30 grape bunches per hour. An automated pipeline for data analysis generates precise, numeric phenotypic data, like berry number, berry sizes, and grape bunch volume, and facilitates phenotyping of 4,500 individual grape bunches per season (Rist et al., 2022). These high-resolution 3D point clouds of grape bunches (captured with a laser scanner or optical 3D scanner) can also be applied for constraint-based reconstruction of hidden traits related to grape bunch density, like rachis and pedicel length (Mack et al., 2020).
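One elementary step in such point-cloud pipelines, deriving a bunch volume from scanned 3D points, can be sketched with a convex hull. This is a simplified stand-in (the hull is only an upper bound; the published pipelines fit individual berries), and the cube point cloud is invented for illustration:

```python
import numpy as np
from scipy.spatial import ConvexHull

def bunch_volume(points):
    """Convex-hull volume of a 3D point cloud, an upper-bound proxy for
    grape bunch volume (real pipelines segment and fit berries instead)."""
    return ConvexHull(points).volume

# Synthetic example: the eight corners of a unit cube stand in for a scan
cube = np.array([[x, y, z] for x in (0, 1) for y in (0, 1) for z in (0, 1)], float)
print(bunch_volume(cube))  # 1.0
```

The same `ConvexHull` object also exposes the hull surface area, which can serve as a second coarse shape descriptor.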
High-throughput phenotyping of grapevines under field conditions
Grapevines are perennial plants that set their first fruits around three years after planting in the field. Consequently, evaluation studies of fruit and canopy characteristics, phenology, as well as studies about yield or abiotic stress resilience, need plants growing under natural conditions in the field. In grapevine breeding, field phenotyping begins when vines are planted in the field. In the following years and stages (pre-test, intermediate, and main test), seedlings are selected depending on their overall appearance regarding yield, ripening time, general vitality (disease, vigour, nutritional status), viticultural traits, such as wood maturity or growth habit, as well as wine quality (Töpfer & Trapp, 2022). Especially within the advanced breeding stages of intermediate and main testing, HT field phenotyping would improve the evaluation process through objective and reliable screenings. In addition, HT phenotyping opens up the possibility of a retrospective view of the genotypes throughout the previous years with differing environmental conditions before deciding to take a genotype to the next breeding stage. In grapevine breeding research, largely the same traits are of interest. As with controlled screenings, e.g. for disease resistance in the lab, field screenings aiming at phenology, yield, or plant vigour within genetic resources, breeding material, or bi-parental mapping populations also generate phenotypic data that are useful for genome-wide association studies (GWAS), QTL mapping, or genomic prediction (Brault et al., 2024). For example, LiDAR measurements have been used to calculate canopy characteristics from which genetically associated regions could be detected (Chedid et al., 2023). In contrast to precision viticulture, in grapevine breeding and breeding research the detection and quantification of phenotypic traits from individual plants in the field is absolutely required.
The assignment of sensor data to individual grapevines is usually done by parallel recording of Global Navigation Satellite System (GNSS) data. Since grapevines often develop a continuous canopy due to the way they are trained, the partial acquisition of plant parts from neighbouring plants in images cannot be completely ruled out. Although remote sensing techniques are frequently favoured in precision viticulture, in grapevine breeding and its research most traits need to be examined from a side view of individual grapevines growing side by side in the field. Thus, field phenotyping applications need to be carried out either using proximal sensing techniques or, if remote sensing is used, with the help of a 3D reconstruction of the canopy. Fast and inexpensive field phenotyping platforms are an advantage, but pipelines with simple single-lens reflex (SLR) cameras can already provide added value. It does not always have to be a fully automated process, which is often very expensive, even if automation reduces the frequency of errors and increases throughput. Analysis of images acquired in the field by proximal sensing (side view) faces different challenges depending on the feature being evaluated and the time of capture in the vegetation period, for example, lighting conditions, background (ground, sky, other vines), variation of the feature (e.g., berry shape and colour), and occlusion by leaves and grapes (Kicherer et al., 2017a). The background can be problematic in early vegetation stages and is overcome, for example, by photogrammetric methods such as stereo vision (Klodt et al., 2015).
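The GNSS-based assignment of sensor data to individual vines reduces, in its simplest form, to a nearest-neighbour lookup against known planting positions. A minimal sketch under that assumption (coordinates already projected to a local planar grid; all positions invented for illustration):

```python
import numpy as np

def assign_to_vine(image_xy, vine_xy):
    """Assign each geotagged image to the nearest vine position.

    Plain planar nearest neighbour; real pipelines first project GNSS
    coordinates to a local metric grid and may add row constraints.
    """
    # Pairwise distances via broadcasting: (n_images, n_vines)
    d = np.linalg.norm(image_xy[:, None, :] - vine_xy[None, :, :], axis=2)
    return d.argmin(axis=1)  # index of the closest vine per image

# Vines planted 2 m apart along a row; two geotagged images
vines = np.array([[0.0, 0.0], [2.0, 0.0], [4.0, 0.0]])
images = np.array([[0.3, 0.4], [3.6, -0.2]])
print(assign_to_vine(images, vines))  # [0 2]
```

With a continuous canopy, a distance cut-off (e.g. half the planting distance) would additionally flag images that straddle two vines.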
1. Yield and other viticulture traits
Besides studies on disease resistance, yield parameters are by far the most studied quantitative traits, simply because yield is a key trait in viticulture and, as a morphological trait, it can be detected using RGB imaging (Table 1). Yield parameters that have been investigated non-destructively directly in the field include the number of shoots (Liu et al., 2017), the number of inflorescences (Moreira et al., 2025; Palacios et al., 2020; Rahim et al., 2022; Rudolph et al., 2019), the number of bunches (Íñiguez et al., 2024), bunch features like berry number or bunch length (Liu et al., 2022a; Rist et al., 2019), and berries (Zabawa et al., 2020) as well as their size (Engler et al., 2023). Besides berry counting, it has been shown that flowers can be detected and quantified prior to opening using RGB images from a consumer camera (Moreira et al., 2025; Rudolph et al., 2019) or mobile platforms (Palacios et al., 2020; Rahim et al., 2022). In those studies, the detection and quantification of flowers is applied for yield prediction and could further be used to quantify flower abscission by comparing the flower number with the berry number at fruit set. However, for the selection of grapevines in an early breeding step, it is sufficient to simply divide the genotypes into different yield classes according to the pixel area of the berries per vine. Thresholds for under- and over-yield, like the “grape bunch-pruning wood ratio”, could be determined and used as selection criteria for very simple and fast image analysis using simple RGB sensors (Kicherer et al., 2017b). Visible berries have been shown to be good predictors of yield, but the occlusion of grapes and leaves needs to be taken into account (Kierdorf et al., 2022). However, the best results are obtained when the bunch zone is defoliated (Kierdorf et al., 2022).
Overall, defoliation of the grape bunch zone can improve image data acquisition in high-throughput phenotyping by increasing the visibility of grapes and berries and reducing shadowing, as shown by Íñiguez et al. (2024) and Kierdorf et al. (2022), thus improving image quality and enabling a more effective capture of traits such as yield. However, the potential disadvantages, particularly the risk of sunburn damage and overexposure, must be carefully considered. A balanced defoliation strategy that both leverages the benefits of high-throughput phenotyping and minimises the risks of altering the grapevine’s growth conditions is crucial for successful application in phenotyping.
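The simple classification of genotypes into yield classes by berry pixel area described above might be sketched as follows; the thresholds are illustrative placeholders, not values from Kicherer et al. (2017b):

```python
import numpy as np

def yield_class(berry_pixel_area, low=5_000, high=20_000):
    """Assign a coarse yield class from the segmented berry pixel area
    per vine. Thresholds are invented for illustration; in practice they
    would be calibrated per camera setup and vintage."""
    if berry_pixel_area < low:
        return "under-yield"
    if berry_pixel_area > high:
        return "over-yield"
    return "target"

# Berry mask as produced by any upstream RGB segmentation step
mask = np.zeros((300, 300), dtype=bool)
mask[100:200, 100:200] = True          # 10,000 berry pixels
print(yield_class(mask.sum()))         # target
```

Such a three-class scheme mirrors the breeder's early selection decision: discard extremes, keep the rest for the next testing stage.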
Viticultural traits like growth habit, canopy structure, and wood maturation also need to be evaluated during the breeding process (Töpfer & Trapp, 2022). Klodt et al. (2015) used RGB image pairs to compare the visible leaf areas of two breeding lines with traditional cultivars and showed that phenotypic applications to monitor plant growth and the fruit-to-leaf ratio are possible. Kicherer et al. (2017b) presented an image-based approach to evaluate pruning mass as a highly valuable tool for grapevine research and breeding that facilitates an objective evaluation of the yield:wood ratio. Furthermore, LiDAR technology offers high spatial and temporal resolution and thus facilitates investigations of canopy characteristics and vine growth (Chedid et al., 2023). The throughput and accuracy are high enough that Chedid et al. (2023) successfully applied it for QTL mapping. Moreover, the tool might be useful for industry and grapevine breeding programs to monitor vegetative and generative biomass and evaluate grapevine balance, as done by Kicherer et al. (2017b). Even more specific recommendations for action, such as the selection of well-lignified grapevines for propagation, could be derived from this.
2. Phenology
Due to climate change, extreme weather conditions such as late spring frosts shortly after bud burst, as well as earlier ripening periods under warm and wet conditions, are becoming a challenge for viticulture in many wine-growing regions across the world and are therefore also relevant for breeding programmes. The detection of phenological stages such as bud burst, flowering, veraison, and ripening is becoming increasingly important. To detect the date of these stages, a combination of temperature sums and sensors could be used, as shown for the detection of bud burst using SLR images by Herzog et al. (2015).
Detecting the time of flowering, defined as BBCH 65 (50 % of flowers per inflorescence open = OIV descriptor 302), is more difficult because of the fine structure of the flowers and the comparably low spatial resolution in images. To the best of our knowledge, no study using HT phenotyping for flowering detection has been published yet. However, as it has been shown that closed flowers can be quantified using RGB images (Moreira et al., 2025; Rudolph et al., 2019) for yield prediction, RGB images could also be used to detect flowering time. The visible changes in colour as well as geometry between closed flower buds (clearly defined, green spheres/circles) and open flowers (fine structures of anthers with yellowish-white colour) could be used for image-based detection of flowering. Monitoring flowering time with the mentioned proximal sensing would require multiple screenings, as the time of flowering varies depending on the genetic background of the breeding material and the environmental growing conditions. In addition, the spatial resolution of the sensor system has to be considered, as individual flowers are very small, at just a few millimetres. Moreover, this “discolouration” is very variety-dependent and does not always appear equally, which could be a further challenge. The areal coverage of data acquisition could be increased to meet the breeder’s scaling requirements by using remote sensing, for example drones; however, the spatial resolution of the sensor would also have to be considered here.
Veraison is another very relevant phenological stage, described by OIV 303 as the softening of the berry and the onset of berry ripening. Almost simultaneously, red varieties change their berry colour. Detection of colour is possible using RGB imaging, although detecting the change of colour (veraison) will be more challenging for white-berried grapevine genotypes than for black-berried ones. Detecting the right time point of colour change will require multiple measurements of the same set of genotypes over several days and weeks. The environment, and especially the changing light conditions in the field, will be the most challenging part of correctly detecting the time of ripening. The non-destructive detection of ripening will be feasible using handheld spectral sensors (Gebauer et al., 2021), fluorescence sensing (Tomada et al., 2022) in the field, or on-the-go HSI (Fernández-Novales et al., 2021), but, comparable to low-throughput phenotyping, repeated measurements will be required. Rinaldi et al. (2013) took a slightly different approach: to determine the grapevine’s BBCH stage, they characterised the canopy geometry using a LiDAR sensor and linked the relationship between tree row volume (TRV) and leaf wall area (LWA) to each growth stage, showing that the relationship between the estimated values of TRV and LWA and the growth stage was statistically significant. It remains to be seen whether the detection accuracy of this method is capable of resolving the exact veraison time point to distinguish individual genotypes and thus be of effective use in breeding.
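The repeated-measurement logic for pinpointing the colour change could be sketched as a simple threshold crossing over survey dates; dates and pixel fractions here are invented for illustration:

```python
def veraison_date(dates, coloured_fraction, threshold=0.5):
    """Return the first survey date on which the fraction of colour-changed
    berry pixels reaches the threshold, a coarse stand-in for OIV 303
    scoring. Returns None if the threshold is never reached."""
    for date, fraction in zip(dates, coloured_fraction):
        if fraction >= threshold:
            return date
    return None

dates = ["07-20", "07-27", "08-03", "08-10"]
frac = [0.05, 0.30, 0.62, 0.95]     # per-date share of coloured berry pixels
print(veraison_date(dates, frac))   # 08-03
```

Interpolating between the two surveys bracketing the crossing would refine the estimate beyond the survey interval.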
3. Vitality and vigour of the grapevine canopy
Grapevine vitality describes the health, functionality, and performance of the plant, meaning how “fit” it is, independent of pure vigour. Characteristics that describe vitality include, for example, water and nutrient supply, photosynthetic activity, existing disease pressure or abiotic stress, and even development. Vigour, on the other hand, describes a plant’s growth potential, specifically how strongly and quickly it grows (e.g., shoot length, number of leaves, foliage density). Plant performance under various stress conditions, disease resistance, a balanced fruit:leaf ratio, optimal yield, and high wine quality are key parameters for both vitality and health. Grapevine vigour is often determined by flying sensors, especially UAVs, and used to estimate vigour-related traits, like yield or wine quality (Table 1). Regarding grapevine vitality, it is essential to distinguish the demands of grapevine breeding from those of grapevine breeding research.
4. Breeding
In the earliest stage of grapevine breeding, seedling selection is based on the evaluation of the overall behaviour of grapevines. The simplest strategy for HT field phenotyping is thus screening seedlings and classifying them into “healthy” or “not healthy” genotypes without differentiation or quantification of individual symptoms, as shown exemplarily by Miranda et al. (2022). This will, for example, give sufficient initial information on how well disease resistance QTLs work in a natural environment over several years with different conditions and disease pressure.
5. Breeding research
In grapevine research, however, it is necessary to differentiate the symptoms of non-healthy plants, quantify different diseases, and evaluate different plant organs. The individual detection of biotic and abiotic stresses or the nutritional status of grapevines has been done using multi- or hyperspectral sensors. MSI and HSI capture changes in spectral reflectance caused by chemical changes in leaves and berries, their pigment content, organ structure, leaf nitrogen, water status, and stress markers. As the most important diseases in viticulture are powdery and downy mildew, their investigation is done in lab experiments rather than in field trials (see section above). However, regarding the identification of novel resistance donors, HT field screenings of germplasm are promising. RGB imaging proved useful for diagnostic approaches and symptom detection, as shown for downy mildew (Liu et al., 2022b), but hyperspectral imaging also enables the assessment of reflectance variation caused by infections with, for example, powdery mildew (Vélez et al., 2024). Another important example of disease detection in the field is the detection of endogenous diseases such as grapevine yellows and viruses. They are transmissible through vegetative propagation and grafting, no natural resistances are known, and to date no curative treatment is possible in the field. The management of these diseases is therefore based on prophylaxis to produce healthy plant material during propagation, and is thus also of great interest for grapevine breeding. Screening is done by visual monitoring, random molecular checks, and clearing of infected vines. The latency periods and asymptomatic plants at the time of visual monitoring are often challenging. Bendel et al. (2020) developed HSI-based HT phenotyping approaches for the detection of endogenous diseases, demonstrating that detection is possible in some cases even in pre-symptomatic grapevines.
In these studies, specific wavelength bands could be identified as relevant for disease detection and thus support distinguishing different diseases. Reliable wavelength bands can be used to derive specific plant indices describing the phenotype (for review, see Xue and Su (2017)). The most commonly used index in plant science and precision viticulture is the “Normalized Difference Vegetation Index”, abbreviated as NDVI ((NIR − RED)/(NIR + RED)). It is very useful to characterise vegetation vitality by measuring the difference between the reflection of near-infrared light (strongly reflected by healthy plants) and red light, which is absorbed by them. In general, NDVI ranges from −1 (pure water) to +1, but for plant research, only values between 0 and +1 are relevant. Values closer to 1 indicate a larger surface of green, healthy vegetation, and values closer to 0 are observed capturing, for instance, senescent leaves or non-vegetated surfaces like soil; that means the higher the values, the healthier the plants. Correlations of NDVI and diseases have been shown, for example, for viruses (Sinha et al., 2019), and NDVI was shown to be a promising indicator for grape berry maturation, vigour, and subsequent grape yield (Shmuleviz et al., 2024). However, most of the published NDVI applications belong to precision viticulture, and their spatial resolution is often too low for breeding and breeding research. In conclusion, extended research is needed on the differentiation and quantification of single symptoms when they occur mixed on single grapevines in the field.
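The per-pixel NDVI computation is short enough to state directly; the reflectance values below are invented for illustration:

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index, (NIR - RED) / (NIR + RED),
    computed per pixel; eps guards against division by zero on dark pixels."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)

# Healthy canopy reflects NIR strongly and absorbs red; bare soil does not
canopy = ndvi(np.array([0.50]), np.array([0.05]))  # high NDVI, ~0.82
soil = ndvi(np.array([0.30]), np.array([0.25]))    # low NDVI, ~0.09
print(float(canopy[0].round(2)), float(soil[0].round(2)))
```

Applied band-wise to a registered MSI image, the same function yields an NDVI map whose per-vine mean can be compared across genotypes.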
Finally, the detection of abiotic stress symptoms using HT phenotyping has become increasingly important for grapevine breeding and viticulture. Methods combining sensors and machine learning have significantly enhanced precision and shown promising results in water stress detection and monitoring (Abbatantuono et al., 2024). HSI and MSI, in particular, have emerged as powerful tools for detecting and quantifying abiotic stress in viticulture (Carvalho et al., 2021) and have further been used successfully to detect water stress, nutrient deficiencies, grape maturity, and heat stress in grapevines (Diago et al., 2022; Kang et al., 2024).
Conclusion
Nowadays, different sensor technologies and promising AI tools are available for the HT phenotyping of grapevines. New sensor-based pipelines have been developed for more and more traits analysed in the lab, greenhouses, and fields, improving QTL mapping, knowledge about grapevine’s resistance, and the evaluation of breeding material and new PIWI varieties. From a technical point of view, new sensor technologies have been validated for their application in non-invasive and fast phenotyping of challenging traits like physiological responses to abiotic stressors. Additionally, new, mostly ANN-based algorithms have been developed aiming at an automated, reliable analysis of sensor data. Both facilitate technical solutions to: (1) overcome the phenotyping bottleneck in grapevine breeding research and pre-breeding; (2) provide objective, precise phenotypic information with high spatial and temporal resolution; and (3) speed up grapevine breeding due to a faster development of new molecular markers for MAS as well as an improved evaluation and selection of seedlings.
In the future, grapevine breeding could, for instance, use HT field phenotyping that collects and stores sensor data at the speed of agricultural machinery with high spatial resolution. We can be sure that advancements and further developments are an ongoing process, and that computing power and technical infrastructure, be it hardware, software, or data transfer, will be continuously improved over the years to meet that demand. This has been demonstrated over the last decade by the published advances in promising proofs-of-concept for a wide range of imaging and non-imaging sensors, platforms, and phenotypic characteristics. In addition, phenotypic data as well as the collection and storage of data need to be FAIR (findable, accessible, interoperable, and reusable) and robust throughout the pipeline. To achieve this, it would be beneficial to engage companies to develop phenotyping pipelines and provide technical support and adjustment of the system over time. Further, the exchange of knowledge, data, and AI models is of great importance, as challenges might limit the use and effectiveness of HT phenotyping. AI models especially depend heavily on high-quality training data representing trait diversity. To the best of our knowledge, no standardised datasets (covering different varieties, vineyards, and environments) on, for example, different grapevine diseases are available yet, which makes it difficult to use these tools widely across different vineyards. To overcome these limitations, future research needs to focus on creating affordable, reliable tools tailored to various vineyard conditions. Open-source datasets and collaborative platforms could speed up AI improvements and make these technologies more accessible.
All of the stated points can be addressed by building a dedicated, collaborative phenotyping network within the community of grapevine (breeding) research in order to facilitate the exchange of data, i.e., providing sensor data and corresponding ground truth.
From a breeding point of view, there is also room for several improvements regarding traits that need to be evaluated objectively using HT phenotyping. That involves:
- New proxies for (i) greenhouse plants to predict their later performance in the field (e.g., canopy architecture) or (ii) complex, quantitative traits that mediate resilience to biotic and abiotic stress (e.g., physical-mechanical traits of berry skins).
- Sensor-based applications to predict wine quality by screening grapevine plants, or rather the grapes on the vine, directly in the field.
- Extended research to distinguish between disease infections and nutrition status based on spectral imaging or improved AI-driven algorithms. Furthermore, it needs to be considered that the relevance of spectra may vary extremely between different years, making several years and locations of field observation necessary for reliable conclusions.
Advanced HT phenotyping using sensors and AI, combined with omics data like genomics and metabolomics, will finally provide deep insights into the grapevine response to biotic and abiotic stresses, as well as support predictive breeding and decision support systems. These advancements would not only enhance the precision and scalability of seedling selection in viticulture but also contribute to more sustainable and resilient grapevine cultivation practices.
Acknowledgements
We, the authors, acknowledge Ribana Roscher (Research Centre Jülich and Bonn University, Germany) for proofreading the AI section.
Further, we would like to express our gratitude to all the members, guests, and interns of our research group who have contributed to the progress and success of our work with their tireless effort over the past decade. We particularly thank every PhD student and post-doc who enhanced our third-party projects with their perspective, innovative ideas, and enthusiasm. Their contributions have been instrumental in advancing our understanding and addressing key challenges in grapevine breeding research. We further deeply acknowledge our technical team members, especially Sarina Elser, Ann-Kathrin Leonard, Katrin Portugall, and Patrick Römer, for their valuable support in maintaining the operation of our experiments, fieldwork, and laboratory analyses over the years. Their meticulous work and problem-solving abilities have been the backbone of our research efforts.
Finally, the authors gratefully acknowledge each of you, inside and outside of our institute, who has been part of our journey through the world of sensor-based phenotyping and precision viticulture; thank you for your commitment, collaboration, and passion for applied science. It is a privilege to work alongside such inspiring, talented, and dedicated scientists and breeders.
References
- Abbatantuono, F., Lopriore, G., Tallou, A., Brillante, L., Ali, S. A., Camposeo, S., & Vivaldi, G.A. (2024). Recent progress on grapevine water status assessment through remote and proximal sensing: A review. Scientia Horticulturae, 338, 113658. https://doi.org/10.1016/j.scienta.2024.113658
- Bendel, N., Backhaus, A., Kicherer, A., Köckerling, J., Maixner, M., Jarausch, B., Biancu, S., Klück, H. C., Seiffert, U., Voegele, R. T., & Töpfer, R. (2020). Detection of Two Different Grapevine Yellows in Vitis vinifera Using Hyperspectral Imaging. Remote Sensing, 12(24), 4151. https://doi.org/10.3390/rs12244151
- Bendel, N., Jäger, J., Töpfer, R., & Herzog, K. (2024). Climate protection based on increased humus amounts in vineyard soils: sensor-based analysis of grapevine vitality. Acta Horticulturae, (1390), 241–248. https://doi.org/10.17660/ActaHortic.2024.1390.29
- Bierman, A., LaPlumm, T., Cadle-Davidson, L., Gadoury, D., Martinez, D., Sapkota, S., & Rea, M. (2019). A High-Throughput Phenotyping System Using Machine Vision to Quantify Severity of Grapevine Powdery Mildew. Plant Phenomics, 2019, 9209727. https://doi.org/10.34133/2019/9209727
- Bodor-Pesti, P., Taranyi, D., Nyitrainé Sárdy, D. Á., Le Phuong Nguyen, L., & Baranyai, L. (2023). Correlation of the Grapevine (Vitis vinifera L.) Leaf Chlorophyll Concentration with RGB Color Indices. Horticulturae, 9(8), 899. https://doi.org/10.3390/horticulturae9080899
- Brault, C., Segura, V., Roques, M., Lamblin, P., Bouckenooghe, V., Pouzalgues, N., Cunty, C., Breil, M., Frouin, M., Garcin, L., Camps, L., Ducasse, M. A., Romieu, C., Masson, G., Julliard, S., Flutre, T., & Le Cunff, L. (2024). Enhancing grapevine breeding efficiency through genomic prediction and selection index. G3 (Bethesda, Md.), 14(4). https://doi.org/10.1093/g3journal/jkae038
- Carvalho, L. C., Gonçalves, E. F., Marques da Silva, J., & Costa, J. M. (2021). Potential Phenotyping Methodologies to Assess Inter- and Intravarietal Variability and to Select Grapevine Genotypes Tolerant to Abiotic Stress. Frontiers in Plant Science, 12, 718202. https://doi.org/10.3389/fpls.2021.718202
- Chedid, E., Avia, K., Dumas, V., Ley, L., Reibel, N., Butterlin, G., Soma, M., Lopez-Lozano, R., Baret, F., Merdinoglu, D., & Duchêne, É. (2023). Lidar Is Effective in Characterizing Vine Growth and Detecting Associated Genetic Loci. Plant Phenomics (Washington, D.C.), 5, 116. https://doi.org/10.34133/plantphenomics.0116
- Choi, R. Y., Coyner, A. S., Kalpathy-Cramer, J., Chiang, M. F., & Campbell, J. P. (2020). Introduction to Machine Learning, Neural Networks, and Deep Learning. Translational Vision Science & Technology, 9(2), 14.
- Damásio, M., Barbosa, M., Deus, J., Fernandes, E., Leitão, A., Albino, L., Fonseca, F., & Silvestre, J. (2023). Can Grapevine Leaf Water Potential Be Modelled from Physiological and Meteorological Variables? A Machine Learning Approach. Plants (Basel, Switzerland), 12(24). https://doi.org/10.3390/plants12244142
- Diago, M. P., Tardaguila, J., Barrio, I., & Fernández-Novales, J. (2022). Combination of multispectral imagery, environmental data and thermography for on-the-go monitoring of the grapevine water status in commercial vineyards. European Journal of Agronomy, 140, 126586. https://doi.org/10.1016/j.eja.2022.126586
- Di Gennaro, S. F., Toscano, P., Cinat, P., Berton, A., & Matese, A. (2019). A Low-Cost and Unsupervised Image Recognition Methodology for Yield Estimation in a Vineyard. Frontiers in Plant Science, 10, 559. https://doi.org/10.3389/fpls.2019.00559
- Elsherbiny, O., Elaraby, A., Alahmadi, M., Hamdan, M., & Gao, J. (2024). Rapid Grapevine Health Diagnosis Based on Digital Imaging and Deep Learning. Plants (Basel, Switzerland), 13(1). https://doi.org/10.3390/plants13010135
- Engler, H., Gauweiler, P., Huber, F., Krause, J., Fischer, B., Hoffmann, B., Schumacher, P., Yushchenko, A., Gruna, R., Steinhage, V., Herzog, K., Töpfer, R., & Kicherer, A. (2023). Phenoquad: A new multi sensor platform for field phenotyping and screening of yield relevant characteristics within grapevine breeding research. VITIS - Journal of Grapevine Research, 62 (Special Issue), 41–48. https://doi.org/10.5073/VITIS.2023.62.SPECIAL-ISSUE.41-48
- Fernandes de Oliveira, A., Piga, G. K., Najoui, S., Becca, G., Marceddu, S., Rigoldi, M. P., Satta, D., Bagella, S., & Nieddu, G. (2024). UV light and adaptive divergence of leaf physiology, anatomy, and ultrastructure drive heat stress tolerance in genetically distant grapevines. Frontiers in Plant Science, 15, 1399840. https://doi.org/10.3389/fpls.2024.1399840
- Fernandez, R., Le Cunff, L., Mérigeaud, S., Verdeil, J. L., Perry, J., Larignon, P., Spilmont, A. S., Chatelet, P., Cardoso, M., Goze-Bac, C., & Moisy, C. (2024). End-to-end multimodal 3D imaging and machine learning workflow for non-destructive phenotyping of grapevine trunk internal structure. Scientific Reports, 14(1), 5033. https://doi.org/10.1038/s41598-024-55186-3
- Fernández-Novales, J., Barrio, I., & Diago, M. P. (2021). Non-Invasive Monitoring of Berry Ripening Using On-the-Go Hyperspectral Imaging in the Vineyard. Agronomy, 11(12), 2534. https://doi.org/10.3390/agronomy11122534
- Fichtl, L., Hofmann, M., Kahlen, K., Voss-Fels, K. P., Cast, C. S., Ollat, N., Vivin, P., Loose, S., Nsibi, M., Schmid, J., Strack, T., Schultz, H. R., Smith, J., & Friedel, M. (2023). Towards grapevine root architectural models to adapt viticulture to drought. Frontiers in Plant Science, 14, 1162506. https://doi.org/10.3389/fpls.2023.1162506
- Gatou, P., Tsiara, X., Spitalas, A., Sioutas, S., & Vonitsanos, G. (2024). Artificial Intelligence Techniques in Grapevine Research: A Comparative Study with an Extensive Review of Datasets, Diseases, and Techniques Evaluation. Sensors (Basel, Switzerland), 24(19). https://doi.org/10.3390/s24196211
- Gebauer, L., Krause, J., Zheng, X., Kronenwett, F., Gruna, R., Töpfer, R., & Kicherer, A. (2021). Developing a handheld NIR sensor for the detection of ripening in grapevine. Proceedings OCM 2021 - 5th International Conference on Optical Characterization of Materials, March 17th - 18th, 2021, Karlsruhe, Germany: Conference, 70–81. https://doi.org/10.58895/ksp/1000128686-7
- Grimm, J., Herzog, K., Rist, F., Kicherer, A., Töpfer, R., & Steinhage, V. (2019). An adaptable approach to automated visual detection of plant organs with applications in grapevine breeding. Biosystems Engineering, 183, 170–183. https://doi.org/10.1016/j.biosystemseng.2019.04.018
- Gutiérrez, S., Fernández-Novales, J., Diago, M. P., & Tardaguila, J. (2018). On-The-Go Hyperspectral Imaging Under Field Conditions and Machine Learning for the Classification of Grapevine Varieties. Frontiers in Plant Science, 9, 1102. https://doi.org/10.3389/fpls.2018.01102
- Haucke, T., Herzog, K., Barré, P., Höfle, R., Töpfer, R., & Steinhage, V. (2021). Improved optical phenotyping of the grape berry surface using light-separation and automated RGB image analysis. VITIS - Journal of Grapevine Research, 60(1), 1–10. https://doi.org/10.5073/VITIS.2021.60.1-10
- Hernández, I., Gutiérrez, S., & Tardaguila, J. (2024). Image analysis with deep learning for early detection of downy mildew in grapevine. Scientia Horticulturae, 331, 113155. https://doi.org/10.1016/j.scienta.2024.113155
- Herzog, K., Kicherer, A., & Töpfer, R. (2015). Objective phenotyping the time of bud burst by analyzing grapevine field images. Acta Horticulturae, (1082), 379–385. https://doi.org/10.17660/ActaHortic.2015.1082.53
- Herzog, K., Schwander, F., Kassemeyer, H. H., Bieler, E., Dürrenberger, M., Trapp, O., & Töpfer, R. (2021). Towards Sensor-Based Phenotyping of Physical Barriers of Grapes to Improve Resilience to Botrytis Bunch Rot. Frontiers in Plant Science, 12, 808365. https://doi.org/10.3389/fpls.2021.808365
- Íñiguez, R., Gutiérrez, S., Poblete-Echeverría, C., Hernández, I., Barrio, I., & Tardáguila, J. (2024). Deep learning modelling for non-invasive grape bunch detection under diverse occlusion conditions. Computers and Electronics in Agriculture, 226, 109421. https://doi.org/10.1016/j.compag.2024.109421
- Kang, C., Diverres, G., Karkee, M., Zhang, Q., & Keller, M. (2024). Assessing grapevine water status through fusion of hyperspectral imaging and 3D point clouds. Computers and Electronics in Agriculture, 226, 109488. https://doi.org/10.1016/j.compag.2024.109488
- Kicherer, A., Herzog, K., Bendel, N., Klück, H. C., Backhaus, A., Wieland, M., Rose, J. C., Klingbeil, L., Läbe, T., Hohl, C., Petry, W., Kuhlmann, H., Seiffert, U., & Töpfer, R. (2017a). Phenoliner: A New Field Phenotyping Platform for Grapevine Research. Sensors (Basel, Switzerland), 17(7). https://doi.org/10.3390/s17071625
- Kicherer, A., Klodt, M., Sharifzadeh, S., Cremers, D., Töpfer, R., & Herzog, K. (2017b). Automatic image-based determination of pruning mass as a determinant for yield potential in grapevine management and breeding. Australian Journal of Grape and Wine Research, 23(1), 120–124. https://doi.org/10.1111/ajgw.12243
- Kierdorf, J., Weber, I., Kicherer, A., Zabawa, L., Drees, L., & Roscher, R. (2022). Behind the Leaves: Estimation of Occluded Grapevine Berries With Conditional Generative Adversarial Networks. Frontiers in Artificial Intelligence, 5, 830026. https://doi.org/10.3389/frai.2022.830026
- Klodt, M., Herzog, K., Töpfer, R., & Cremers, D. (2015). Field phenotyping of grapevine growth using dense stereo reconstruction. BMC Bioinformatics, 16(1), 143. https://doi.org/10.1186/s12859-015-0560-x
- Krzyzaniak, Y., Cointault, F., Loupiac, C., Bernaud, E., Ott, F., Salon, C., Laybros, A., Han, S., Héloir, M. C., Adrian, M., & Trouvelot, S. (2021). In situ Phenotyping of Grapevine Root System Architecture by 2D or 3D Imaging: Advantages and Limits of Three Cultivation Methods. Frontiers in Plant Science, 12, 638688. https://doi.org/10.3389/fpls.2021.638688
- Li, Z., Guo, R., Li, M., Chen, Y., & Li, G. (2020). A review of computer vision technologies for plant phenotyping. Computers and Electronics in Agriculture, 176, 105672. https://doi.org/10.1016/j.compag.2020.105672
- Li, Z., Dos Santos, R. F., Gao, L., Chang, P., & Wang, X. (2021). Current status and future prospects of grapevine anthracnose caused by Elsinoe ampelina: An important disease in humid grape-growing regions. Molecular Plant Pathology, 22(8), 899–910. https://doi.org/10.1111/mpp.13076
- Liu, S., Cossell, S., Tang, J., Dunn, G., & Whitty, M. (2017). A computer vision system for early-stage grape yield estimation based on shoot detection. Computers and Electronics in Agriculture, 137, 88–101. https://doi.org/10.1016/j.compag.2017.03.013
- Liu, W., Wang, C., Yan, D., Chen, W., & Luo, L. (2022a). Estimation of Characteristic Parameters of Grape Clusters Based on Point Cloud Data. Frontiers in Plant Science, 13, 885167. https://doi.org/10.3389/fpls.2022.885167
- Liu, E., Gold, K. M., Combs, D., Cadle-Davidson, L., & Jiang, Y. (2022b). Deep semantic segmentation for the quantification of grape foliar diseases in the vineyard. Frontiers in Plant Science, 13, 978761. https://doi.org/10.3389/fpls.2022.978761
- Lopes, C., & Cadima, J. (2021). Grapevine bunch weight estimation using image-based features: comparing the predictive performance of the number of visible berries and bunch area. OENO One, 55(4), 209–226. https://doi.org/10.20870/oeno-one.2021.55.4.4741
- Lu, Y., & Lu, R. (2019). Structured-illumination reflectance imaging for the detection of defects in fruit: Analysis of resolution, contrast and depth-resolving features. Biosystems Engineering, 180, 1–15. https://doi.org/10.1016/j.biosystemseng.2019.01.014
- Macia, F. M., Possamai, T., Dorne, M. A., Lacombe, M. C., Duchêne, E., Merdinoglu, D., Peeters, N., Rousseau, D., & Wiedemann-Merdinoglu, S. (2024). Phenotyping grapevine resistance to downy mildew: Deep learning as a promising tool to assess sporulation and necrosis. Plant Methods, 20(1), 90. https://doi.org/10.1186/s13007-024-01220-4
- Mack, J., Rist, F., Herzog, K., Töpfer, R., & Steinhage, V. (2020). Constraint-based automated reconstruction of grape bunches from 3D range data for high-throughput phenotyping. Biosystems Engineering, 197, 285–305. https://doi.org/10.1016/j.biosystemseng.2020.07.004
- Magalhães, S. C., Castro, L., Rodrigues, L., Padilha, T. C., Carvalho, F. de, Neves dos Santos, F., Pinho, T., Moreira, G., Cunha, J., Cunha, M., Silva, P., & Moreira, A. P. (2023). Toward Grapevine Digital Ampelometry Through Vision Deep Learning Models. IEEE Sensors Journal, 23(9), 10132–10139. https://doi.org/10.1109/JSEN.2023.3261544
- Malagol, N., Rao, T., Werner, A., Töpfer, R., & Hausmann, L. (2025). A high-throughput ResNet CNN approach for automated grapevine leaf hair quantification. Scientific Reports, 15(1), 1590. https://doi.org/10.1038/s41598-025-85336-0
- Miranda, M., Zabawa, L., Kicherer, A., Strothmann, L., Rascher, U., & Roscher, R. (2022). Detection of Anomalous Grapevine Berries Using Variational Autoencoders. Frontiers in Plant Science, 13, 729097. https://doi.org/10.3389/fpls.2022.729097
- Moreira, G., Neves dos Santos, F., & Cunha, M. (2025). Grapevine inflorescence segmentation and flower estimation based on Computer Vision techniques for early yield assessment. Smart Agricultural Technology, 10, 100690. https://doi.org/10.1016/j.atech.2024.100690
- Moreno, H., & Andújar, D. (2023). Proximal sensing for geometric characterization of vines: A review of the latest advances. Computers and Electronics in Agriculture, 210, 107901. https://doi.org/10.1016/j.compag.2023.107901
- Muthukumarana, P. S., & Aponso, A. C. (2020). A Review on Deep Learning Based Image Classification of Plant Diseases. International Journal of Computer Theory and Engineering, 12(5), 118–122. https://doi.org/10.7763/IJCTE.2020.V12.1275
- Nagi, R., & Tripathy, S. S. (2022). Deep convolutional neural network based disease identification in grapevine leaf images. Multimedia Tools and Applications, 81(18), 24995–25006. https://doi.org/10.1007/s11042-022-12662-0
- Ollat, N., Marguerit, E., Miguel, M. de, Coupel-Ledru, A., Cookson, S. J., van Leeuwen, C., Vivin, P., Gallusci, P., Segura, V., & Duchêne, E. (2023). Moving towards grapevine genotypes better adapted to abiotic constraints. VITIS - Journal of Grapevine Research, 62 (Special Issue), 67–76. https://doi.org/10.5073/vitis.2023.62.special-issue.67-76
- Palacios, F., Bueno, G., Salido, J., Diago, M. P., Hernández, I., & Tardaguila, J. (2020). Automated grapevine flower detection and quantification method based on computer vision and deep learning from on-the-go imaging using a mobile sensing platform under field conditions. Computers and Electronics in Agriculture, 178, 105796. https://doi.org/10.1016/j.compag.2020.105796
- Parr, B., Legg, M., & Alam, F. (2022). Analysis of Depth Cameras for Proximal Sensing of Grapes. Sensors (Basel, Switzerland), 22(11). https://doi.org/10.3390/s22114179
- Pérez-Roncal, C., López-Maestresalas, A., Lopez-Molina, C., Jarén, C., Urrestarazu, J., Santesteban, L. G., & Arazuri, S. (2020). Hyperspectral Imaging to Assess the Presence of Powdery Mildew (Erysiphe necator) in cv. Carignan Noir Grapevine Bunches. Agronomy, 10(1), 88. https://doi.org/10.3390/agronomy10010088
- Qiu, T., Underhill, A., Sapkota, S., Cadle-Davidson, L., & Jiang, Y. (2022). High throughput saliency-based quantification of grape powdery mildew at the microscopic level for disease resistance breeding. Horticulture Research, 9, uhac187. https://doi.org/10.1093/hr/uhac187
- Rahim, U. F., Utsumi, T., & Mineno, H. (2022). Deep learning-based accurate grapevine inflorescence and flower quantification in unstructured vineyard images acquired using a mobile sensing platform. Computers and Electronics in Agriculture, 198, 107088. https://doi.org/10.1016/j.compag.2022.107088
- Reisch, B. I., Cadle-Davidson, L., Ikeogu, U., Sacks, G. L., Londo, J. P., & Martinson, T. E. (2023). Contributions of the VitisGen2 project to grapevine breeding and genetics. VITIS - Journal of Grapevine Research, 62 (Special Issue), 88–91. https://doi.org/10.5073/vitis.2023.62.special-issue.88-91
- Richter, R., Gabriel, D., Rist, F., Töpfer, R., & Zyprian, E. (2019). Identification of co-located QTLs and genomic regions affecting grapevine cluster architecture. Theoretical and Applied Genetics, 132(4), 1159–1177. https://doi.org/10.1007/s00122-018-3269-1
- Rinaldi, M., Llorens, J., & Gil, E. (2013). Electronic characterization of the phenological stages of grapevine using a LIDAR sensor. Wageningen Academic Publishers. https://doi.org/10.3920/9789086867783_076
- Rist, F., Herzog, K., Mack, J., Richter, R., Steinhage, V., & Töpfer, R. (2018). High-Precision Phenotyping of Grape Bunch Architecture Using Fast 3D Sensor and Automation. Sensors (Basel, Switzerland), 18(3). https://doi.org/10.3390/s18030763
- Rist, F., Gabriel, D., Mack, J., Steinhage, V., Töpfer, R., & Herzog, K. (2019). Combination of an Automated 3D Field Phenotyping Workflow and Predictive Modelling for High-Throughput and Non-Invasive Phenotyping of Grape Bunches. Remote Sensing, 11(24), 2953. https://doi.org/10.3390/rs11242953
- Rist, F., Schwander, F., Richter, R., Mack, J., Schwandner, A., Hausmann, L., Steinhage, V., Töpfer, R., & Herzog, K. (2022). Relieving the Phenotyping Bottleneck for Grape Bunch Architecture in Grapevine Breeding Research: Implementation of a 3D-Based Phenotyping Approach for Quantitative Trait Locus Mapping. Horticulturae, 8(10), 907. https://doi.org/10.3390/horticulturae8100907
- Rose, J. C., Kicherer, A., Wieland, M., Klingbeil, L., Töpfer, R., & Kuhlmann, H. (2016). Towards Automated Large-Scale 3D Phenotyping of Vineyards under Field Conditions. Sensors (Basel, Switzerland), 16(12). https://doi.org/10.3390/s16122136
- Rudolph, R., Herzog, K., Töpfer, R., & Steinhage, V. (2019). Efficient identification, localization and quantification of grapevine inflorescences and flowers in unprepared field images using Fully Convolutional Networks. VITIS - Journal of Grapevine Research, 58(3), 95–104. https://doi.org/10.5073/VITIS.2019.58.95-104
- Sarić, R., Nguyen, V. D., Burge, T., Berkowitz, O., Trtílek, M., Whelan, J., Lewsey, M. G., & Čustović, E. (2022). Applications of hyperspectral imaging in plant phenotyping. Trends in Plant Science, 27(3), 301–315. https://doi.org/10.1016/j.tplants.2021.12.003
- Sawyer, E., Laroche-Pinel, E., Flasco, M., Cooper, M. L., Corrales, B., Fuchs, M., & Brillante, L. (2023). Phenotyping grapevine red blotch virus and grapevine leafroll-associated viruses before and after symptom expression through machine-learning analysis of hyperspectral images. Frontiers in Plant Science, 14, 1117869. https://doi.org/10.3389/fpls.2023.1117869
- Schmitz, R., Atkinson, B. S., Sturrock, C. J., Hausmann, L., Töpfer, R., & Herzog, K. (2021). High-resolution 3D phenotyping of the grapevine root system using X-ray Computed Tomography. VITIS - Journal of Grapevine Research, 60(1), 21–27. https://doi.org/10.5073/vitis.2021.60.21-27
- Sharma, P., Thilakarathna, I., & Fennell, A. (2024). Hyperspectral imaging and artificial intelligence enhance remote phenotyping of grapevine rootstock influence on whole vine photosynthesis. Frontiers in Plant Science, 15, 1409821. https://doi.org/10.3389/fpls.2024.1409821
- Shmuleviz, R., Amato, A., Previtali, P., Green, E., Sanchez, L., Alsina, M. M., Dokoozlian, N., Tornielli, G. B., & Fasoli, M. (2024). Spatial Variability of Grape Berry Maturation Program at the Molecular Level. Horticulturae, 10(3), 238. https://doi.org/10.3390/horticulturae10030238
- Sinha, R., Khot, L. R., Rathnayake, A. P., Gao, Z., & Naidu, R. A. (2019). Visible-near infrared spectroradiometry-based detection of grapevine leafroll-associated virus 3 in a red-fruited wine grape cultivar. Computers and Electronics in Agriculture, 162, 165–173. https://doi.org/10.1016/j.compag.2019.04.008
- Tello, J., & Ibáñez, J. (2018). What do we know about grapevine bunch compactness? A state-of-the-art review. Australian Journal of Grape and Wine Research, 24(1), 6–23. https://doi.org/10.1111/ajgw.12310
- Tomada, S., Agati, G., Serni, E., Michelini, S., Lazazzara, V., Pedri, U., Sanoll, C., Matteazzi, A., Robatscher, P., & Haas, F. (2022). Non-destructive fluorescence sensing for assessing microclimate, site and defoliation effects on flavonol dynamics and sugar prediction in Pinot blanc grapes. PloS One, 17(8), e0273166. https://doi.org/10.1371/journal.pone.0273166
- Too, E. C., Yujian, L., Njuki, S., & Yingchun, L. (2019). A comparative study of fine-tuning deep learning models for plant disease identification. Computers and Electronics in Agriculture, 161, 272–279. https://doi.org/10.1016/j.compag.2018.03.032
- Töpfer, R., Trapp, O. (2022). A cool climate perspective on grapevine breeding: climate change and sustainability are driving forces for changing varieties in a traditional market. Theoretical and Applied Genetics, 135, 3947–3960. https://doi.org/10.1007/s00122-022-04077-0
- Torres-Lomas, E., Lado-Bega, J., Garcia-Zamora, G., & Diaz-Garcia, L. (2024). Segment Anything for Comprehensive Analysis of Grapevine Cluster Architecture and Berry Properties. Plant Phenomics (Washington, D.C.), 6, 202. https://doi.org/10.34133/plantphenomics.0202
- Underhill, A., Hirsch, C., & Clark, M. (2020). Image-based Phenotyping Identifies Quantitative Trait Loci for Cluster Compactness in Grape. Journal of the American Society for Horticultural Science, 145(6), 363–373. https://doi.org/10.21273/JASHS04932-20
- Vélez, S., Barajas, E., Rubio, J. A., Pereira-Obaya, D., & Rodríguez-Pérez, J. R. (2024). Field-Deployed Spectroscopy from 350 to 2500 nm: A Promising Technique for Early Identification of Powdery Mildew Disease (Erysiphe necator) in Vineyards. Agronomy, 14(3), 634. https://doi.org/10.3390/agronomy14030634
- Vezzulli, S., Doligez, A., & Bellin, D. (2019). Molecular Mapping of Grapevine Genes. In D. Cantu & M. A. Walker (Eds.), Compendium of Plant Genomes. The grape genome (pp. 103–136). Springer. https://doi.org/10.1007/978-3-030-18601-2_7
- Xavier, A., Hall, B., Casteel, S., Muir, W., & Rainey, K. M. (2017). Using unsupervised learning techniques to assess interactions among complex traits in soybeans. Euphytica, 213(8). https://doi.org/10.1007/s10681-017-1975-4
- Xin, B., & Whitty, M. (2022). A 3D grape bunch reconstruction pipeline based on constraint-based optimisation and restricted reconstruction grammar. Computers and Electronics in Agriculture, 196, 106840. https://doi.org/10.1016/j.compag.2022.106840
- Xue, J., & Su, B. (2017). Significant Remote Sensing Vegetation Indices: A Review of Developments and Applications. Journal of Sensors, 2017, 1–17. https://doi.org/10.1155/2017/1353691
- Zabawa, L., Kicherer, A., Klingbeil, L., Töpfer, R., Kuhlmann, H., & Roscher, R. (2020). Counting of grapevine berries in images via semantic segmentation using convolutional neural networks. ISPRS Journal of Photogrammetry and Remote Sensing, 164, 73–83. https://doi.org/10.1016/j.isprsjprs.2020.04.002
- Zendler, D., Malagol, N., Schwandner, A., Töpfer, R., Hausmann, L., & Zyprian, E. (2021). High-Throughput Phenotyping of Leaf Discs Infected with Grapevine Downy Mildew Using Shallow Convolutional Neural Networks. Agronomy, 11(9), 1768. https://doi.org/10.3390/agronomy11091768
