movement

A Python toolbox for analysing animal body movements across space and time.

User guide: Installation, supported formats, and key concepts.

Examples: A gallery of examples using movement.

Join the movement: How to get in touch and contribute to the community.

Overview

Deep learning methods for motion tracking have revolutionised a range of scientific disciplines, from neuroscience and biomechanics to conservation and ethology. Tools such as DeepLabCut and SLEAP now allow researchers to track animal movements in videos with remarkable accuracy, without requiring physical markers. However, there is still a need for standardised, easy-to-use methods to process the tracks generated by these tools.

movement aims to provide a consistent, modular interface for analysing motion tracks, enabling steps such as data cleaning, visualisation, and motion quantification. We aim to support all popular animal tracking frameworks and file formats.
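To give a flavour of what "motion quantification" from tracked keypoints involves, here is a minimal, self-contained NumPy sketch. It is purely illustrative and does not use movement's API: the positions, frame rate, and computation shown are assumptions chosen for the example.

```python
# Illustrative sketch only: NOT movement's API, just a plain-NumPy example
# of the kind of motion quantification described above.
import numpy as np

# Hypothetical tracked keypoint: an (n_frames, 2) array of x, y positions
# in pixels, as a pose-tracking tool might output for one body part.
positions = np.array([[0.0, 0.0], [3.0, 4.0], [6.0, 8.0]])
fps = 30  # assumed video frame rate

# Frame-to-frame displacement vectors and the resulting speed per frame pair.
displacement = np.diff(positions, axis=0)      # shape (n_frames - 1, 2)
speed = np.linalg.norm(displacement, axis=1) * fps  # pixels per second

print(speed)  # each step covers 5 pixels per frame -> 150 px/s
```

A real workflow would additionally handle multiple individuals and keypoints, confidence scores, and missing data, which is exactly the kind of bookkeeping a dedicated toolbox standardises.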

Find out more on our mission and scope statement and our roadmap.

Warning

🏗️ The package is currently in early development and the interface is subject to change. Feel free to play around and provide feedback.

Tip

If you prefer analysing your data in R, we recommend checking out the animovement toolbox, which is similar in scope. We are working with its developer to gradually converge on common data standards and workflows.

Citation

If you use movement in your work, please cite the following Zenodo DOI:

Nikoloz Sirmpilatze, Chang Huan Lo, Sofía Miñano, Brandon D. Peri, Dhruv Sharma, Laura Porta, Iván Varela & Adam L. Tyson (2024). neuroinformatics-unit/movement. Zenodo. https://zenodo.org/doi/10.5281/zenodo.12755724