# Harpy

Single-cell spatial proteomics analysis that makes you happy.

⭐ If you find Harpy useful, please give us a star! It helps others discover the project and supports continued development.
## Installation

Recommended for end-users:

```bash
uv venv --python=3.12                  # create a virtual environment with Python 3.12
source .venv/bin/activate              # activate the virtual environment
uv pip install "git+https://github.com/saeyslab/harpy.git#egg=harpy-analysis[extra]"  # install Harpy with the [extra] dependencies
python -c 'import harpy; print(harpy.__version__)'  # check that the package is installed
```
Only for developers: clone this repository locally, install the `.[dev]` extras instead of `[extra]`, and read the contribution guide.

```bash
# clone the repository from GitHub, then, inside the repository root:
uv venv --python=3.12                  # create a virtual environment with Python 3.12
source .venv/bin/activate              # activate the virtual environment
uv pip install -e '.[dev]'             # editable install with the [dev] dependencies
python -c 'import harpy; print(harpy.__version__)'  # check that the package is installed
# make changes
python -m pytest                       # run the tests
```

Check out the docs for installation instructions using conda.
## 🧭 Tutorials and Guides

Explore how to use Harpy for segmentation, shallow and deep feature extraction, clustering, and spatial analysis of gigapixel-scale multiplexed data with these step-by-step notebooks:
**📘 Basic Usage of Harpy**
Learn how to read in data, perform tiled segmentation using Cellpose and Dask-CUDA, extract features, and carry out clustering. 👉 Tutorial
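The tiled-segmentation idea behind this tutorial can be illustrated with a minimal NumPy sketch. This is a conceptual illustration only, not Harpy's API: it splits a large image into overlapping tiles, which is the shape of the problem a model such as Cellpose is run on per tile before the masks are stitched back together.

```python
import numpy as np

def make_tiles(image, tile=256, overlap=32):
    """Split a 2-D image into overlapping tiles.

    The overlap lets per-tile segmentation masks be stitched back
    together without cutting cells at tile borders. Conceptual
    sketch only; Harpy handles tiling and stitching internally.
    """
    h, w = image.shape
    step = tile - overlap
    tiles = []
    for y in range(0, h, step):
        for x in range(0, w, step):
            tiles.append(((y, x), image[y:y + tile, x:x + tile]))
    return tiles

# Example: a 512x512 image split into 256-pixel tiles with 32-pixel overlap.
img = np.zeros((512, 512))
tiles = make_tiles(img)
```

With these parameters the stride is 224 pixels, giving a 3x3 grid of nine tiles; border tiles are simply clipped at the image edge.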
**🔧 Technology-specific advice**
Learn which technologies Harpy supports. 👉 Notebook
**🧩 Pixel and Cell Clustering**
Learn how to perform unsupervised pixel- and cell-level clustering using Harpy together with FlowSOM. 👉 Tutorial
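As a rough illustration of what unsupervised intensity clustering does: the tutorial uses FlowSOM (a self-organising-map method), while the sketch below substitutes a minimal k-means on toy marker intensities, so treat it as a conceptual stand-in rather than Harpy's actual workflow.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: intensity profiles for 200 pixels over 3 markers,
# drawn from two well-separated populations.
pixels = np.vstack([
    rng.normal(0.0, 0.1, size=(100, 3)),
    rng.normal(1.0, 0.1, size=(100, 3)),
])

def kmeans(X, k=2, iters=20, seed=0):
    """Minimal k-means: a conceptual stand-in for FlowSOM-style
    unsupervised clustering of pixel/cell intensity profiles."""
    r = np.random.default_rng(seed)
    centers = X[r.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest centroid
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):  # update centroids; keep a centroid if its cluster emptied
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

labels = kmeans(pixels)
```

On this toy data the two populations are far apart relative to their spread, so the algorithm recovers them exactly; FlowSOM additionally arranges clusters on a grid and meta-clusters them.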
**✂️ Cell Segmentation**
Explore segmentation workflows in Harpy using different tools. 💡 Want us to add support for another segmentation method? Open an issue and let us know!
**🧪 Single-cell representations from highly multiplexed images and downstream use with PyTorch**
Learn how single-cell representations can be generated from highly multiplexed images and then used downstream to train classifiers in PyTorch. 👉 Tutorial
**🧠 Deep Feature Extraction**
Discover how Harpy enables fast, scalable extraction of deep, cell-level features from multiplex imaging data with the KRONOS foundation model for proteomics. 👉 Tutorial. 💡 Want us to add support for another deep feature extraction method? Open an issue and let us know!
**🔬 Shallow Feature Extraction**
Learn to extract shallow features, such as the mean, median, and standard deviation of intensities, from multiplex imaging data with Harpy. 👉 Tutorial
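The kind of shallow feature this tutorial computes can be sketched in a few lines of NumPy: per-cell mean intensity derived from an intensity image and a matching segmentation mask. This toy example illustrates the concept only, not Harpy's implementation, which runs lazily over large Dask-backed images.

```python
import numpy as np

# Toy data: a 4x4 single-channel intensity image and a matching
# segmentation mask (0 = background, 1..2 = cell ids).
intensity = np.array([
    [1.0, 1.0, 5.0, 5.0],
    [1.0, 1.0, 5.0, 5.0],
    [0.0, 0.0, 9.0, 9.0],
    [0.0, 0.0, 9.0, 9.0],
])
mask = np.array([
    [1, 1, 2, 2],
    [1, 1, 2, 2],
    [0, 0, 2, 2],
    [0, 0, 2, 2],
])

# Per-label mean = (sum of intensities per label) / (pixel count per label).
sums = np.bincount(mask.ravel(), weights=intensity.ravel())
counts = np.bincount(mask.ravel())
mean_per_cell = sums / counts  # index i holds the mean intensity of label i
```

Medians and standard deviations follow the same pattern of grouping pixel values by their label before reducing.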
**🧬 Spatial Transcriptomics**
Learn how to analyze spatial transcriptomics data with Harpy. For detailed information, refer to the SPArrOW documentation.
**🌍 Multiple samples and coordinate systems**
Learn how to work with multiple samples and with intrinsic (pixel) and micron coordinate systems. 👉 Tutorial
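Intrinsic (pixel) and micron coordinates are typically related by an affine transformation. The sketch below uses hypothetical numbers, a 0.5 µm pixel size and an arbitrary stage offset, purely to illustrate the mapping; the real values come from your instrument and dataset.

```python
import numpy as np

# Hypothetical instrument parameters (assumptions for illustration).
scale = 0.5               # microns per pixel
offset = (100.0, 200.0)   # stage offset in microns

# A 3x3 affine matrix maps homogeneous pixel coords -> micron coords.
affine = np.array([
    [scale, 0.0,   offset[0]],
    [0.0,   scale, offset[1]],
    [0.0,   0.0,   1.0],
])

# Map the pixel coordinate (x=10, y=20) into micron space.
px = np.array([10.0, 20.0, 1.0])
microns = affine @ px  # -> [105.0, 210.0, 1.0]
```

Keeping the transformation explicit like this is what lets the same segmentation mask be aligned across samples with different pixel sizes or offsets.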
**🔄 Rasterize and vectorize labels and shapes**
Learn how to convert a segmentation mask (array) into its vectorized form, and segmentation boundaries (polygons) into their rasterized equivalents. This conversion is useful, for example, when integrating annotations (e.g., from QuPath) into downstream spatial omics analysis. 👉 Tutorial

📚 For a complete list of tutorials, visit the Harpy documentation.
## Computational benchmark

Explore the benchmark performance of Harpy on a large MACSima tonsil proteomics dataset. 👉 Results
## Usage

Learn how Harpy can be integrated into your workflow.

## Contributing

See the contribution guide for information on how to contribute to Harpy.
## References

## License

Check the license. Harpy is free for academic usage. For commercial usage, please contact Saeyslab.

## Issues

If you encounter any problems, please file an issue along with a detailed description.
---

**Version:** 0.3.0 · **First released:** 2024-12-09 · **Last updated:** 2026-02-05 · **License:** Academic Non-commercial Softwa...