
New Publication in eLife

The first publication from the Herd Hover project is out today in eLife! The paper introduces DeepPoseKit, a new, easy-to-use, open-source software toolkit for estimating animal posture from videos or images. The software was developed by Jake Graving and Daniel Chae, in collaboration with Hemal Naik, Liang Li, Benjamin Koger, Blair Costelloe and Iain Couzin.

A Grevy's zebra herd with posture keypoints generated with DeepPoseKit.


In addition to describing the software and its performance, the article provides an accessible overview of pose estimation technology and its applications to a wide range of fields, from neuroscience to ecology. This article is a great place to start if you’re new to pose estimation!

At Herd Hover, we are using DeepPoseKit to track the body posture of zebras and other ungulates in our drone footage. This allows us to determine the orientation of each animal’s body, the position of its head, and its behavioral state many times per second. Such detailed data are impossible to collect on entire groups of freely-moving wild animals using conventional observation methods, so posture tracking is opening some exciting doors for field studies of collective behavior. We believe that this combination of consumer drone technology and posture estimation also has the potential to be a valuable tool in wildlife management, health, and conservation, allowing practitioners to efficiently collect and objectively assess behavioral data with minimal disturbance to animals.
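As a rough illustration of the kind of post-processing this enables, here is a minimal sketch of computing each animal's body orientation from two posture keypoints (e.g. a tail-base and head landmark). The array layout and landmark names are purely illustrative assumptions, not DeepPoseKit's actual output format:

```python
import numpy as np

# Hypothetical keypoint output: one row per animal, with (x, y) image
# coordinates for two landmarks. This layout is an illustrative
# assumption, not DeepPoseKit's actual output format.
keypoints = np.array([
    [[0.0, 0.0], [2.0, 0.0]],   # animal 0: tail at origin, head along +x
    [[1.0, 1.0], [1.0, 3.0]],   # animal 1: head directly "above" the tail
])

tail, head = keypoints[:, 0], keypoints[:, 1]

# Body orientation: angle of the tail-to-head vector in degrees,
# measured counterclockwise from the +x axis.
dx, dy = (head - tail).T
orientation_deg = np.degrees(np.arctan2(dy, dx))
print(orientation_deg)  # [ 0. 90.]
```

Applied frame by frame, the same idea yields per-animal heading time series, which is the kind of many-times-per-second behavioral signal described above.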

Be sure to check out the paper, as well as the official press release. The code for DeepPoseKit is available on GitHub - give it a try!