Roadmap

The roadmap outlines current development priorities and aims to guide core developers and to encourage community contributions. It is a living document and will be updated as the project evolves.

The roadmap is not meant to limit movement's features, as we are open to suggestions and contributions. Join our Zulip chat to share your ideas. We will take community demand and feedback into account when planning future releases.

Long-term vision

The following features are being considered for the first stable version, v1.0.

  • Import/Export pose tracks from/to diverse formats. We aim to interoperate with leading tools for animal pose estimation and behaviour classification, and to enable conversions between their formats.

  • Standardise the representation of pose tracks. We represent pose tracks as xarray data structures to allow for labelled dimensions and performant processing.

  • Interactively visualise pose tracks. We are considering napari as a visualisation and GUI framework.

  • Clean pose tracks, including, but not limited to, handling of missing values, filtering, smoothing, and resampling.

  • Derive kinematic variables like velocity, acceleration, joint angles, etc., focusing on those prevalent in neuroscience.

  • Integrate spatial data about the animal’s environment for combined analysis with pose tracks. This covers regions of interest (ROIs) such as the arena in which the animal is moving and the location of objects within it.

  • Define and transform coordinate systems. Coordinates can be relative to the camera, environment, or the animal itself (egocentric).
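To make the last point concrete, an egocentric transform can be sketched as a translation (centroid to origin) followed by a rotation (heading onto the positive x-axis). This is a minimal numpy illustration; the function name and its interface are hypothetical, not part of movement's API.

```python
import numpy as np

def to_egocentric(points, centroid, heading):
    """Map camera-frame 2D points into a hypothetical egocentric frame.

    Translates so the animal's centroid sits at the origin, then rotates
    so its heading direction lies along the positive x-axis.
    """
    c, s = np.cos(-heading), np.sin(-heading)
    rot = np.array([[c, -s], [s, c]])  # rotate by minus the heading angle
    return (points - centroid) @ rot.T

# A point one unit ahead of the animal, along its heading, maps to (1, 0).
centroid = np.array([2.0, 3.0])
heading = np.pi / 2            # animal faces "up" in camera coordinates
point = np.array([[2.0, 4.0]])  # one unit above the centroid
ego = to_egocentric(point, centroid, heading)
```

The same two-step recipe (translate, then rotate) generalises to arena-relative frames by substituting a fixed reference point and orientation for the animal's centroid and heading.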

Short-term milestone - v0.1

We plan to release version v0.1 of movement in early 2024, providing a minimal set of features to demonstrate the project’s potential and to gather feedback from users. At minimum, it should include:

  • Ability to import pose tracks from DeepLabCut, SLEAP and LightningPose into a common xarray.Dataset structure.

  • At least one function for cleaning the pose tracks.

  • Ability to compute velocity and acceleration from pose tracks.

  • Public website with documentation.

  • Package released on PyPI.

  • Package released on conda-forge.

  • Ability to visualise pose tracks using napari. We aim to represent pose tracks via napari’s Points and Tracks layers and overlay them on video frames.
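Several of these milestones can be sketched together with xarray: a pose-track Dataset with labelled dimensions, a simple cleaning step (linear interpolation over missing values), and velocity as the time derivative of position. The dimension names below follow the labelled-dimension idea described above, but the exact names and variables in movement's released Dataset structure may differ.

```python
import numpy as np
import xarray as xr

# Hypothetical pose track: 5 frames, 1 individual, 2 keypoints, 2D space.
# Position grows linearly with frame index, so velocity is constant.
position = np.arange(5, dtype=float)[:, None, None, None] * np.ones((5, 1, 2, 2))
position[2, 0, 0, :] = np.nan  # simulate a missed detection at frame 2

ds = xr.Dataset(
    {"position": (("time", "individuals", "keypoints", "space"), position)},
    coords={"time": np.arange(5) / 30.0,  # timestamps at 30 fps
            "space": ["x", "y"]},
)

# Cleaning: fill the gap by linear interpolation along the time dimension.
ds["position"] = ds["position"].interpolate_na(dim="time", method="linear")

# Kinematics: velocity as the gradient of position along the time coordinate.
ds["velocity"] = ds["position"].differentiate("time")
```

Because position increases by one unit per frame at 30 fps, the interpolated gap is filled exactly and the derived velocity is 30 units per second everywhere.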