DeepPhenoTree

A simple plugin providing models developed for flowering, fruitlet, and fruit detection

  • Herearii Metuarea


Herearii Metuarea, Abdoul djalil Ousseni hamza, Walter Guerra†, Andrea Patocchi, Lidia Lozano, Shauny Van Hoye, Francois Laurens, Jeremy Labrosse, Pejman Rasti, David Rousseau†.

† project lead


DeepPhenoTree is designed as a tool for the automatic detection of phenological stages associated with flowering, fruitlets, and fruit at harvest time from images, using deep learning–based object detection models.

This napari plugin was generated with copier using the napari-plugin-template.

Contribution

Article (Draft)

DeepPhenoTree – Apple Edition: a Multi-site apple phenology RGB annotated dataset with deep learning baseline models. Herearii Metuarea, Abdoul djalil Ousseni hamza, Walter Guerra, Andrea Patocchi, Lidia Lozano, Shauny Van Hoye, Francois Laurens, Jeremy Labrosse, Pejman Rasti, David Rousseau.

Dataset

Herearii Metuarea; Abdoul djalil Ousseni hamza; Lou Decastro; Jade Marhadour; Oumaima Karia; Lorène Masson; Marie Kourkoumelis-Rodostamos; Walter Guerra; Francesca Zuffa; Francesco Panzeri; Andrea Patocchi; Lidia Lozano; Shauny Van Hoye; Marijn Rymenants; François Laurens; Jeremy Labrosse; Pejman Rasti; David Rousseau, 2026, "DeepPhenoTree - Apple Edition", https://doi.org/10.57745/NORPF1, Recherche Data Gouv, V5, UNF:6:FyJNuJx4BVZxWuG8hI4gEw== [fileUNF]


Installation

You can install deepphenotree via pip:

pip install deepphenotree

If napari is not already installed, you can install deepphenotree with napari and Qt via:

pip install "deepphenotree[all]"

To install the latest development version:

pip install git+https://github.com/hereariim/deepphenotree.git

A GPU is required to keep processing times reasonable when running the models (especially RT-DETR). Please visit the official PyTorch website to get the appropriate installation command: 👉 https://pytorch.org/get-started/locally

Example: GPU (CUDA 12.1)

pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu121

Getting started

Running from Python

1. Load sample image

from deepphenotree._sample_data import DeepPhenoTreeData

# Flowering data
data_flower = DeepPhenoTreeData('Flowering')
images = data_flower.data   # Shape: (5120, 5120, 3, 4)
country = data_flower.names # ['Belgium', 'Italy', 'Spain', 'Switzerland']

# Fruitlet data
data_fruitlet = DeepPhenoTreeData('Fruitlet')
# Fruit data
data_fruit = DeepPhenoTreeData('Fruit')
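Assuming the sample array stacks one RGB image per site along the last axis, as the (5120, 5120, 3, 4) shape and the four site names suggest, the stack can be split into per-site images like this (a minimal sketch with a smaller stand-in array instead of the real `data_flower.data`):

```python
import numpy as np

# Stand-in for data_flower.data: a (H, W, 3, n_sites) stack of RGB images,
# one slice per site along the last axis (smaller shape for illustration).
images = np.zeros((512, 512, 3, 4), dtype=np.uint8)
names = ["Belgium", "Italy", "Spain", "Switzerland"]

# Split the stack into one (H, W, 3) RGB image per site.
per_site = {name: images[..., i] for i, name in enumerate(names)}

print(per_site["Belgium"].shape)  # (512, 512, 3)
```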

2. Run inference

from deepphenotree.inference import YoloInferencer

image = ...  # your RGB image, e.g. a (H, W, 3) NumPy array

# Flowering task
infer = YoloInferencer("Flowering")
bbx = infer.predict_boxes(image)

# Fruitlet task
infer = YoloInferencer("Fruitlet")
bbx = infer.predict_boxes(image)

# Fruit task
infer = YoloInferencer("Fruit")
bbx = infer.predict_boxes(image)
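The exact format returned by `predict_boxes` is not documented here; assuming it yields `(x1, y1, x2, y2)` pixel boxes, a minimal sketch of converting them into the corner-point arrays that a napari Shapes layer expects (rows as `(y, x)` points) could look like this — `xyxy_to_napari_rects` is an illustrative helper, not part of the plugin API:

```python
import numpy as np

def xyxy_to_napari_rects(boxes):
    """Convert (x1, y1, x2, y2) boxes to napari rectangle corners.

    napari Shapes rectangles are (4, 2) arrays of (row, col) = (y, x)
    corner points, one array per box.
    """
    rects = []
    for x1, y1, x2, y2 in boxes:
        rects.append(np.array([[y1, x1],
                               [y1, x2],
                               [y2, x2],
                               [y2, x1]]))
    return rects

rects = xyxy_to_napari_rects([(10, 20, 110, 220)])
# In a napari session, the rectangles could then be shown with e.g.:
# viewer.add_shapes(rects, shape_type="rectangle", name="Flowering")
```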

Running from Napari

This plugin performs targeted image inference on user-provided images. Users can run three specific detection tasks via dedicated buttons: flowering, fruitlet, and fruit detection. The plugin returns the coordinates of bounding boxes around detected objects, and a message informs the user of the number of detected boxes. Several developments are ongoing; feel free to contact us if you have requests or suggestions.


Input

The user drags and drops an RGB image onto the napari window, or selects one of the sample images provided by the plugin:

File > Open Sample > DeepPhenoTree > images

Note: the images available under Open Sample > DeepPhenoTree correspond to the test data associated with the models provided in this plugin.

Process

The user clicks a button to run inference on the image:

  • Flowering: detects all objects from bud development to flowering (BBCH 00 to 69).
  • Fruitlet: detects developing fruit (BBCH 71 to 77).
  • Fruit: detects all fruit at harvest time (BBCH 81 to 89).
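The BBCH ranges above can be summarised in a small lookup table; `bbch_to_task` below is an illustrative helper, not part of the plugin API:

```python
# BBCH stage ranges covered by each detection task (inclusive),
# as listed above; this mapping is illustrative, not plugin API.
TASK_BBCH = {
    "Flowering": (0, 69),   # bud development to flowering
    "Fruitlet": (71, 77),   # fruit in development
    "Fruit": (81, 89),      # fruit at harvest time
}

def bbch_to_task(stage: int):
    """Return the detection task whose BBCH range contains `stage`, else None."""
    for task, (low, high) in TASK_BBCH.items():
        if low <= stage <= high:
            return task
    return None

print(bbch_to_task(65))  # Flowering
```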

Output

Bounding boxes are displayed in the Flowering layer for flowering, the Fruitlet layer for fruitlet, and the Fruit layer for fruit detection.

Model

DeepPhenoTree consists of RT-DETR models trained on the DeepPhenoTree dataset.

The trained models used in this project are not publicly available. They are part of ongoing research and collaborative projects, and therefore cannot be distributed at this time.
However, the codebase is provided to ensure reproducibility and transparency of the proposed methodology.

Images results

Standard deviation is computed over 5-fold cross-validation. Overall (4 sites) denotes the aggregated evaluation across the four experimental sites (Switzerland, Belgium, Spain, and Italy).

Dataset     Location            Precision     Recall        mAP@.5        mAP@.5:.95
---------   -----------------   -----------   -----------   -----------   -----------
Flowering   Overall (4 sites)   0.69 ± 0.01   0.58 ± 0.02   0.65 ± 0.02   0.37 ± 0.02
            Switzerland         0.73 ± 0.02   0.60 ± 0.04   0.68 ± 0.03   0.40 ± 0.04
            Belgium             0.72 ± 0.02   0.63 ± 0.03   0.69 ± 0.03   0.40 ± 0.03
            Spain               0.66 ± 0.01   0.53 ± 0.05   0.60 ± 0.03   0.30 ± 0.02
            Italy               0.69 ± 0.04   0.61 ± 0.03   0.67 ± 0.04   0.40 ± 0.04
---------   -----------------   -----------   -----------   -----------   -----------
Fruitlet    Overall (4 sites)   0.85 ± 0.02   0.73 ± 0.02   0.82 ± 0.02   0.53 ± 0.01
            Switzerland         0.86 ± 0.04   0.78 ± 0.04   0.84 ± 0.06   0.56 ± 0.04
            Belgium             0.83 ± 0.03   0.65 ± 0.04   0.77 ± 0.04   0.52 ± 0.14
            Spain               0.86 ± 0.02   0.72 ± 0.03   0.81 ± 0.03   0.52 ± 0.03
            Italy               0.88 ± 0.01   0.80 ± 0.01   0.88 ± 0.01   0.61 ± 0.01
---------   -----------------   -----------   -----------   -----------   -----------
Fruit       Overall (4 sites)   0.87 ± 0.01   0.79 ± 0.01   0.86 ± 0.01   0.57 ± 0.01
            Switzerland         0.86 ± 0.03   0.80 ± 0.02   0.87 ± 0.02   0.59 ± 0.01
            Belgium             0.90 ± 0.01   0.84 ± 0.01   0.90 ± 0.01   0.63 ± 0.02
            Spain               0.86 ± 0.02   0.75 ± 0.02   0.84 ± 0.02   0.51 ± 0.03
            Italy               0.88 ± 0.02   0.84 ± 0.03   0.90 ± 0.02   0.66 ± 0.02
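Each table entry is a mean ± standard deviation over the five cross-validation folds. With hypothetical per-fold mAP@.5 scores (illustrative numbers, not the actual DeepPhenoTree results), the computation looks like:

```python
import statistics

# Hypothetical per-fold mAP@.5 scores from a 5-fold cross-validation
# (illustrative numbers, not the actual DeepPhenoTree results).
fold_scores = [0.64, 0.66, 0.63, 0.67, 0.65]

mean = statistics.mean(fold_scores)
std = statistics.stdev(fold_scores)  # sample standard deviation over folds

print(f"{mean:.2f} ± {std:.2f}")  # 0.65 ± 0.02
```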

DeepPhenoTree Dataset

DeepPhenoTree – Apple Edition is a multi-site, multi-variety RGB image dataset dedicated to the classification of key apple tree phenological stages.

Acknowledgments

This work was supported by the PHENET project. The authors also acknowledge IDRIS for providing access to high-performance computing resources.

Contact

Imhorphen team, bioimaging research group, 42 rue George Morel, Angers, France

Contributing

Contributions are very welcome. Tests can be run with tox, please ensure the coverage at least stays the same before you submit a pull request.

License

Distributed under the terms of the GNU LGPL v3.0 license, "deepphenotree" is free and open source software.

Issues

If you encounter any problems, please file an issue along with a detailed description.

Citing

If you use the DeepPhenoTree plugin in your research, please use the following BibTeX entry:

Not available

Version:

  • 1.0.6

Last updated:

  • 2026-02-26

First released:

  • 2026-02-09


Requirements:

  • numpy
  • magicgui
  • qtpy
  • scikit-image
  • pillow
  • requests
  • tqdm
  • sahi==0.11.36
  • ultralytics==8.3.214
  • opencv-python<5,>=4.10
  • torch>=2.2
  • torchvision>=0.17
  • napari[all]; extra == "all"