Automated Object Tracking for Animal Behaviour Studies

Timmy Manning, Miguel Somarriba, Rainer Roehe, Simon Turner, Haiying Wang, Huiru Zheng, Brian Kelly, Jennifer Lynch, Paul Walsh

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

10 Citations (Scopus)

Abstract

Manually curating video data to track the movement of an object can be an arduous and time-consuming task. This is particularly challenging when tracking animals whose behaviour is sporadic and unpredictable, yet doing so can provide pertinent information when monitoring health, conditions and treatments. Here, we evaluate the use of machine learning, applied retroactively, to automatically track specific steers across hours of video with minimal manual interaction. This is approached using the Faster R-CNN object detection algorithm with VGG-16 acting as the feature extractor. Performance on a number of video segments is presented and discussed, and the issues encountered are outlined. This highlights a number of guidelines that should be considered when generating video data in order to improve object detection performance, and helps define the applicability of the approach to pre-existing data.
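The abstract describes a detector that pairs the Faster R-CNN architecture with VGG-16 as its feature extractor. As an illustration only, the sketch below builds such a detector with torchvision; the class count (background plus a single "steer" class), anchor sizes, frame resolution and pretrained weights are assumptions for the example, not details taken from the paper.

# Minimal sketch: Faster R-CNN with a VGG-16 backbone in torchvision.
# All numeric choices here are illustrative assumptions.
import torch
import torchvision
from torchvision.models.detection import FasterRCNN
from torchvision.models.detection.rpn import AnchorGenerator

# VGG-16 convolutional layers serve as the feature extractor (backbone).
vgg = torchvision.models.vgg16(weights="IMAGENET1K_V1")
backbone = vgg.features
backbone.out_channels = 512  # depth of the final VGG-16 conv feature map

# Region proposal anchors; sizes/ratios are assumed, not from the paper.
anchor_generator = AnchorGenerator(
    sizes=((32, 64, 128, 256, 512),),
    aspect_ratios=((0.5, 1.0, 2.0),),
)

# Two classes: background and "steer" (illustrative).
model = FasterRCNN(
    backbone,
    num_classes=2,
    rpn_anchor_generator=anchor_generator,
)
model.eval()

# Run detection on a single video frame (dummy tensor stands in for a frame).
frame = torch.rand(3, 720, 1280)
with torch.no_grad():
    detections = model([frame])[0]
print(detections["boxes"], detections["scores"])

In practice the model would be fine-tuned on frames annotated with bounding boxes for the animals of interest before the per-frame detections are linked into tracks.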
Original language: English
Title of host publication: 2019 IEEE International Conference on Bioinformatics and Biomedicine (BIBM 2019)
Publisher: IEEE
Pages: 1876-1883
ISBN (Electronic): 978-1-7281-1867-3
ISBN (Print): 978-1-7281-1868-0
DOIs
Publication status: Published (in print/issue) - 6 Feb 2020
Event: 2019 IEEE International Conference on Bioinformatics and Biomedicine, BIBM 2019 - San Diego, United States
Duration: 18 Nov 2019 - 21 Nov 2019

Conference

Conference: 2019 IEEE International Conference on Bioinformatics and Biomedicine, BIBM 2019
Country/Territory: United States
City: San Diego
Period: 18/11/19 - 21/11/19

Keywords

  • behaviour
  • object detection
  • tracking
  • cattle
