Vision-Based Guidance for Object Tracking

Overview

Due to their agility, mobility, and compact form factor, unmanned aircraft systems (UASs) are being deployed in a steadily growing number of applications. Not only can UASs reach places that are impractical for humans to access, but they are also excellent platforms for dangerous and monotonous jobs, including traffic monitoring, search and rescue, military reconnaissance, and much more. Vision is increasingly being used as the primary source of perception, mainly because cameras have become cheaper, smaller, lighter, and higher in resolution. Likewise, as computing resources become more economical and powerful, there has been a heightened demand for UAS research and development. In particular, the problem of using a UAS, or a network of UASs, to detect and track moving targets (e.g., ground-based vehicles) under difficult operating conditions is one of active interest.

Contributions

  • We devised guidance laws, supported by a comprehensive set of computer vision algorithms, that allow a UAS to track moving targets via a monocular camera (an illustrative sketch of such a tracking loop follows this list)
  • We expanded upon our guidance laws and computer vision methods to enable a UAS to visually track multiple moving ground targets with a single camera
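
To give a concrete sense of the kind of monocular tracking loop such guidance laws build on, the following is a minimal sketch, not the authors' implementation: it detects a moving target by background subtraction with OpenCV and converts the target's image-plane offset into a simple proportional steering command. The video path, gains, and threshold values are placeholder assumptions.

```python
# Illustrative sketch only: monocular detection of a moving target and a
# proportional image-plane steering command. Gains, thresholds, and the
# input video path are placeholder assumptions.
import cv2
import numpy as np

K_YAW, K_PITCH = 0.002, 0.002  # hypothetical proportional gains (rad/s per pixel)

cap = cv2.VideoCapture("target_video.mp4")  # hypothetical input video
bg_sub = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=32)

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Detect moving regions via background subtraction, then keep the largest blob.
    mask = bg_sub.apply(frame)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        continue
    target = max(contours, key=cv2.contourArea)
    x, y, w, h = cv2.boundingRect(target)
    cx, cy = x + w / 2.0, y + h / 2.0

    # Guidance: steer so the target centroid moves toward the image center.
    h_img, w_img = frame.shape[:2]
    err_x, err_y = cx - w_img / 2.0, cy - h_img / 2.0
    yaw_rate_cmd = -K_YAW * err_x      # placeholder commands that would be
    pitch_rate_cmd = -K_PITCH * err_y  # sent to the autopilot

    print(f"yaw_rate_cmd={yaw_rate_cmd:.4f}, pitch_rate_cmd={pitch_rate_cmd:.4f}")

cap.release()
```

In a full pipeline, the raw detections would typically be fused with a filter (e.g., a Kalman filter) for temporal consistency, and the image-plane errors would feed the vehicle's guidance laws; the sketch above only shows the image-to-command flow.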

Publications

  1. P. Karmokar, K. Dhal, W.J. Beksi, and A. Chakravarthy, "Vision-Based Guidance for Tracking Dynamic Objects," International Conference on Unmanned Aircraft Systems (ICUAS), 2021.
    Paper • Preprint • Source Code • Citation
  2. K. Dhal, P. Karmokar, A. Chakravarthy, and W.J. Beksi, "Vision-Based Guidance for Tracking Multiple Dynamic Objects," Journal of Intelligent & Robotic Systems, 2022.
    Paper • Source Code • Citation