Extracting motion data from video using optical flow with physically-based constraints

David Frakes, Christine Zwart, William Singhose

Research output: Contribution to journal › Article › peer-review

1 Scopus citation


Motion analysis of video data is a powerful tool for studying dynamic behavior and determining sources of failures. In the case of failure analysis, the available video may be of poor quality, such as from surveillance cameras. It is also likely to have been taken from a bad angle, with poor lighting, and occlusions may be present. To address such cases, this paper presents an optical flow-based tracking algorithm incorporating physically-based constraints to extract motion data from video. The technique can accurately track a significant number of data points with a high degree of automation and efficiency. Many traditional methods of video data extraction from poor-quality video have proven tedious and time-consuming due to extensive user-input requirements. With this in mind, the proposed optical flow-based algorithm functions with a minimal degree of user involvement. Points identified at the outset of a video sequence, and within a small subset of frames spaced throughout, can be automatically tracked even when they become occluded or undergo translational, rotational, or deformational motion. The proposed algorithm improves upon previous optical flow-based tracking algorithms by providing greater flexibility and robustness. Example results are presented that show the method tracking machines with flexible components, Segway personal transporters, and athletes pole vaulting.
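At the heart of any optical-flow tracker is the brightness-constancy least-squares step that estimates a point's displacement between frames. The following NumPy sketch illustrates a basic, unconstrained single-window Lucas-Kanade estimate on synthetic data; it is a minimal illustration of the underlying principle only, not the physically constrained algorithm proposed in the paper, and the function name and window size are hypothetical choices.

```python
import numpy as np

def lucas_kanade_flow(prev, curr, x, y, win=7):
    """Estimate the displacement (u, v) of the point (x, y) between two
    frames using the classic Lucas-Kanade least-squares formulation.
    Hypothetical helper for illustration; assumes small motion."""
    h = win // 2
    # Spatial gradients of the first frame (central differences)
    # and the temporal gradient between the two frames.
    Iy, Ix = np.gradient(prev)
    It = curr - prev
    ys, xs = slice(y - h, y + h + 1), slice(x - h, x + h + 1)
    # Brightness constancy: Ix*u + Iy*v = -It, solved over the window.
    A = np.stack([Ix[ys, xs].ravel(), Iy[ys, xs].ravel()], axis=1)
    b = -It[ys, xs].ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v

# Synthetic check: a Gaussian blob translated by one pixel in x.
yy, xx = np.mgrid[0:64, 0:64].astype(float)
blob = lambda cx, cy: np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / 18.0)
frame0 = blob(32, 32)
frame1 = blob(33, 32)  # shifted +1 px in x
u, v = lucas_kanade_flow(frame0, frame1, 32, 32)
```

The recovered flow (u, v) should be close to (1, 0) for this small translation. The paper's contribution lies in constraining estimates like this one with physical models so that tracking survives occlusion, rotation, and deformation, which the plain least-squares step above does not handle.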

Original language: English (US)
Pages (from-to): 48-57
Number of pages: 10
Journal: International Journal of Control, Automation and Systems
Issue number: 1
State: Published - Feb 1 2013


Keywords

  • Dynamic motion
  • Failure analysis
  • Tracking
  • Video analysis

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Computer Science Applications


