Biologging tags are a key enabling tool for investigating cetacean behavior and locomotion in their natural habitat. Identifying and then parameterizing gait from movement sensor data is critical for these investigations, but how best to characterize gait from tag data remains an open question. Further, the location and orientation of a tag on an animal in the field are variable and can change multiple times during a deployment, so the relative orientation of the tag with respect to the animal must be determined before many further analyses can proceed. Currently, custom scripts built around manual, case-specific heuristics are typically used in the literature; these methods require a level of knowledge and experience that can affect the reliability and repeatability of the analysis. The authors argue that an animal's gait is composed of a sequence of body poses observed by the tag, producing a characteristic spatial pattern in the data that can be exploited for several purposes. This work presents an automated data processing pipeline (and software) that takes advantage of common characteristics of the animal's pose and gait to 1) identify time instances associated with relative motion between the tag and the animal; 2) identify the relative orientation of the tag with respect to the animal's body for a given data segment; and 3) extract gait parameters that are invariant to pose and tag orientation. Biologging tag data from bottlenose dolphins, humpback whales, and beluga whales were used to validate and demonstrate the approach. Results show that the average relative orientation error of the tag with respect to the dolphin's body after processing was within 11 degrees in roll, pitch, and yaw. The average precision and recall for identifying relative tag motion were 0.87 and 0.89, respectively. Examples of the resulting pose and gait analyses demonstrate the potential of this approach to enhance studies that use tag data to investigate movement and behavior. The MATLAB source code and data presented in the paper are publicly available (https://github.com/ding-z/cetacean-pose-gait-analysis.git), and suggestions related to tag data processing practices are provided in this paper. The proposed analysis approach will facilitate the use of biologging tags to study cetacean locomotion and behavior.
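To make the notion of an orientation-invariant gait parameter concrete, the following MATLAB sketch estimates a dominant stroking frequency from the magnitude of the dynamic acceleration; because the vector norm is unchanged by any fixed rotation of the tag on the body, this particular descriptor does not depend on knowing the tag's orientation. This is an illustrative example only, not the parameterization used in the pipeline described above; the function and variable names, the 3-s smoothing window, and the 0.1 Hz lower frequency bound are assumptions made for the sketch.

```matlab
% Hedged sketch: an orientation-invariant gait descriptor -- the dominant
% stroking frequency from the magnitude of the dynamic acceleration.
% NOT the authors' pipeline; names and tuning values below are assumed.
%   A  - n-by-3 accelerometer data in the tag frame [g]
%   fs - sampling rate [Hz]
function f_stroke = dominant_stroke_frequency(A, fs)
    win   = round(3 * fs);                 % ~3 s window (assumed) to remove posture/gravity
    Adyn  = A - movmean(A, win, 1);        % dynamic (stroking) component per axis
    m     = sqrt(sum(Adyn.^2, 2));         % vector magnitude: invariant to fixed tag rotation
    m     = m - mean(m);                   % remove DC offset before the FFT
    n     = numel(m);
    M     = abs(fft(m));                   % amplitude spectrum
    f     = (0:n-1)' * (fs / n);           % frequency axis [Hz]
    keep  = f > 0.1 & f < fs/2;            % ignore very slow drift; keep below Nyquist
    [~,k] = max(M(keep));
    fk    = f(keep);
    f_stroke = fk(k);                      % dominant stroke frequency [Hz]
end
```

A usage example might be `f = dominant_stroke_frequency(A, 50)` for a 50 Hz accelerometer record; in practice one would compute such a descriptor over short windows to track how stroking frequency changes across a deployment.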