
Incremental Learning with Domain Adaptation for Tomato Plant Phenotyping

Angelo Cardellicchio; Vito Renò; Annalisa Milella
2025

Abstract

A comprehensive and continuous analysis of relevant phenotypic traits in widely cultivated crops such as tomato is essential to assess plant status, especially in the current scenario of extreme climate events, which can threaten the foundations of the food supply chain for large populations. However, these operations are extremely costly in terms of human effort; automating them is therefore of paramount relevance. To this end, this work first provides a systematic benchmark of different, well-established versions of the You Only Look Once (YOLO) object detector, identifying the most suitable baseline in terms of version iteration and model size. Afterwards, the effectiveness of attention-based mechanisms was evaluated, highlighting critical aspects that may undermine overall results and real-time applicability. To enhance the baseline performance, an Incremental Learning pipeline was then assessed, evaluating domain adaptation via fine-tuning. Specifically, a YOLOv11-based object detector was incrementally trained on specific subsets of a larger dataset, with the aim of providing the resulting model with subset-specific knowledge and thus improving its generalisation capabilities. This procedure resulted in an overall improvement of about 1.36% in mAP@0.5 and 1.1% in F1 score, while slightly lowering the computational burden at inference time, with a reduction of 19 ms. The effectiveness of these capabilities was also tested in a scenario with few available samples, yielding promising domain adaptation results even under adverse conditions and providing a practical path for evolving phenotypic evaluation in real-world agricultural scenarios.
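The incremental fine-tuning pipeline described above can be sketched as a simple sequential loop: the detector's weights from each training stage seed the next stage on a new data subset. The sketch below is illustrative only, not the paper's actual code; the `train_stage` callable, the subset names, and the Ultralytics-style usage shown in comments are all assumptions.

```python
# Minimal sketch of naive incremental transfer learning: fine-tune a
# detector on one subset at a time, carrying the resulting weights
# forward into the next stage.

def incremental_finetune(initial_weights, subsets, train_stage):
    """Sequentially fine-tune a model across dataset subsets.

    `train_stage(weights, subset)` must run one fine-tuning stage and
    return the path/handle of the resulting weights. Returns the final
    weights and the per-stage history.
    """
    weights = initial_weights
    history = []
    for subset in subsets:
        weights = train_stage(weights, subset)
        history.append((subset, weights))
    return weights, history

# With a real detector, a stage could look like this (assuming the
# Ultralytics Python API; not part of the paper's published code):
#   from ultralytics import YOLO
#   def train_stage(weights, subset):
#       model = YOLO(weights)
#       model.train(data=f"{subset}.yaml", epochs=50)
#       return str(model.trainer.best)  # best checkpoint of this stage

# Dry run with a mock trainer that just tags the weight name:
final, history = incremental_finetune(
    "yolo11n.pt",
    ["subset_a", "subset_b", "subset_c"],
    lambda w, s: f"{w}+{s}",
)
print(final)  # yolo11n.pt+subset_a+subset_b+subset_c
```

Note that this naive weight-carrying scheme performs no rehearsal or regularisation against forgetting; the paper evaluates how far plain fine-tuning alone carries domain adaptation.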
Istituto di Sistemi e Tecnologie Industriali Intelligenti per il Manifatturiero Avanzato - STIIMA (ex ITIA) Sede Secondaria Bari
Keywords: Plant phenotyping; Incremental Transfer Learning; trait detection; YOLO
File: 1-s2.0-S2772375525005556-main.pdf (open access; type: publisher's version (PDF); licence: Creative Commons; size: 6.29 MB; format: Adobe PDF)
Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/553995