movement

A Python toolbox for analysing animal body movements across space and time

  • Nikoloz Sirmpilatze, Chang Huan Lo, Sofía Miñano



Quick install

Create and activate a conda environment with movement installed (including the GUI):

conda create -n movement-env -c conda-forge movement napari pyqt
conda activate movement-env
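
To confirm the install worked, you can import the package from within the activated environment. This is a minimal sanity check, assuming movement exposes a __version__ attribute (standard for released packages, but not shown above):

import movement
print(movement.__version__)  # should print the installed version, e.g. 0.13.0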

[!Note] Read the documentation for more information, including full installation instructions and examples.

Overview

Deep learning methods for motion tracking have revolutionised a range of scientific disciplines, from neuroscience and biomechanics to conservation and ethology. Tools such as DeepLabCut and SLEAP now allow researchers to track animal movements in videos with remarkable accuracy, without requiring physical markers. However, there is still a need for standardised, easy-to-use methods to process the tracks generated by these tools.

movement aims to provide a consistent, modular interface for analysing motion tracks, enabling steps such as data cleaning, visualisation, and motion quantification. We aim to support all popular animal tracking frameworks and file formats.
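
As a rough illustration of such a workflow, the sketch below loads predicted pose tracks and computes per-keypoint speed. It assumes SLEAP predictions and the loader and dataset layout described in the movement documentation (load_poses.from_sleap_file, a position variable with time and space dimensions); treat the exact names as assumptions and check the documentation for the current API.

# A minimal sketch, not a verbatim recipe: the function and dimension names
# (from_sleap_file, "position", "time", "space") are assumed from the docs.
import numpy as np
from movement.io import load_poses

# Load predicted pose tracks into an xarray Dataset
ds = load_poses.from_sleap_file("path/to/predictions.analysis.h5", fps=30)
position = ds.position  # keypoint positions over time

# Basic cleaning: fill gaps (NaNs) in the tracks by linear interpolation along time
position = position.interpolate_na(dim="time", method="linear")

# Motion quantification: velocity via finite differences, then speed magnitude
velocity = position.differentiate("time")
speed = np.sqrt((velocity ** 2).sum(dim="space"))

The heavy lifting here is done by xarray, which movement uses as its core data structure, so the same pattern extends to other derived quantities.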

Find out more in our mission and scope statement and our roadmap.

[!Tip] If you prefer analysing your data in R, we recommend checking out the animovement toolbox, which is similar in scope. We are working together with its developer to gradually converge on common data standards and workflows.

Join the movement

movement is made possible by the generous contributions of many people.

We welcome and encourage contributions in any form—whether it is fixing a bug, developing a new feature, or improving the documentation—as long as you follow our code of conduct.

Go to our community page to find out how to connect with us and get involved.

Citation

If you use movement in your work, please cite the following Zenodo DOI:

Nikoloz Sirmpilatze, Chang Huan Lo, Sofía Miñano, Brandon D. Peri, Dhruv Sharma, Laura Porta, Iván Varela & Adam L. Tyson (2024). neuroinformatics-unit/movement. Zenodo. https://zenodo.org/doi/10.5281/zenodo.12755724

License

⚖️ BSD 3-Clause

Package template

This package layout and configuration (including pre-commit hooks and GitHub actions) have been copied from the python-cookiecutter template.

  • Version: 0.13.0
  • Last updated: 2026-01-13
  • First released: 2025-03-10
  • License: BSD-3-Clause


Requirements:

  • numpy>=2.0.0
  • pandas
  • scipy<1.17.0
  • h5py
  • netCDF4<1.7.3
  • tables>=3.10.1
  • attrs
  • pooch
  • tqdm
  • shapely
  • sleap-io
  • xarray[accel,io,viz]
  • PyYAML
  • napari-video>=0.2.13
  • pyvideoreader>=0.5.3
  • qt-niu
  • loguru
  • pynwb
  • ndx-pose>=0.2.1
  • napari[all]>=0.6.0; extra == "napari"
  • pytest; extra == "dev"
  • pytest-cov; extra == "dev"
  • pytest-mock; extra == "dev"
  • coverage; extra == "dev"
  • tox; extra == "dev"
  • mypy; extra == "dev"
  • pre-commit; extra == "dev"
  • ruff; extra == "dev"
  • codespell; extra == "dev"
  • setuptools_scm; extra == "dev"
  • pandas-stubs; extra == "dev"
  • types-attrs; extra == "dev"
  • check-manifest; extra == "dev"
  • types-PyYAML; extra == "dev"
  • types-requests; extra == "dev"
  • pytest-qt; extra == "dev"
  • scipy-stubs; extra == "dev"
  • types-shapely; extra == "dev"
  • movement[napari]; extra == "dev"