ROT-Harris: A Dynamic Approach to Asynchronous Interest Point Detection

Research output: Contribution to conference › Paper › peer-review



Event-based vision sensors represent a paradigm shift in the way visual information is obtained and processed. These devices transmit data representing the scene dynamics with low latency. Their low power consumption also makes them attractive in finite-power scenarios such as high-speed robotics, and in machine vision applications where the latency of visual information must be minimal. The core datatype of such vision sensors is the 'event': an asynchronous per-pixel signal indicating a change in light intensity at an instant in time at the corresponding spatial location on the sensor array. A popular approach to event-based processing is to map events onto a 2D plane over time, which is comparable with traditional imaging techniques. In contrast, this paper presents a disruptive approach to event data processing that uses a tree-based filter framework to process raw event data directly, extracting events corresponding to interest point features; this is then combined with a Harris interest point approach to isolate features. We hypothesise that, since the tree structure contains the same spatial information as a 2D surface mapping, Harris may be applied directly to the content of the tree, bypassing the need for transformation to the 2D plane. Results illustrate that the proposed approach outperforms other state-of-the-art approaches with limited compromise on run-time performance.
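To make the underlying interest point measure concrete, the following is a minimal sketch of the classic Harris corner response applied to a 2D surface of accumulated event counts. This is not the paper's ROT-Harris method (the paper applies Harris to spatial content held in a tree, bypassing the 2D mapping); it only illustrates the Harris response that both approaches ultimately compute. The event tuples, grid size, and smoothing choice here are illustrative assumptions.

```python
import numpy as np

def harris_response(surface, k=0.04):
    """Harris corner response R = det(M) - k * trace(M)^2 on a 2D
    event-count surface (H x W array of accumulated event counts)."""
    # Image gradients via central differences (rows -> y, cols -> x)
    Iy, Ix = np.gradient(surface.astype(float))

    # 3x3 box smoothing of the structure-tensor entries
    # (a Gaussian window is more common; a box keeps the sketch simple)
    def smooth(a, r=1):
        out = np.zeros_like(a)
        for dy in range(-r, r + 1):
            for dx in range(-r, r + 1):
                out += np.roll(np.roll(a, dy, axis=0), dx, axis=1)
        return out / (2 * r + 1) ** 2

    Sxx, Syy, Sxy = smooth(Ix * Ix), smooth(Iy * Iy), smooth(Ix * Iy)
    det = Sxx * Syy - Sxy ** 2
    trace = Sxx + Syy
    return det - k * trace ** 2  # R > 0 near corners, R < 0 along edges

# Illustrative events (x, y, t) tracing an L-shaped edge with a corner
events = [(x, 5, t) for t, x in enumerate(range(5, 12))] + \
         [(5, y, t) for t, y in enumerate(range(5, 12))]
surface = np.zeros((16, 16))
for x, y, _ in events:
    surface[y, x] += 1

R = harris_response(surface)
# The corner pixel (5, 5) yields a positive response, while a pixel in
# the interior of the straight edge, e.g. (5, 8), does not.
```

Event-based pipelines differ mainly in how this spatial neighbourhood information is gathered; the response computation itself is unchanged whether the counts come from a 2D surface or, as hypothesised in the paper, from a tree holding the same spatial content.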
Original language: English
Publication status: Published (in print/issue) - 25 Jul 2021
Event: Machine Vision and Applications 2021 (MVA 2021)
Duration: 25 Jul 2021 - 27 Jul 2021


Conference: Machine Vision and Applications 2021


  • Event-based vision
  • Harris interest point
  • Reduction-Over-Time

