Q. Diao, J. Lu, W. Hu, Y. Zhang (PRC), and G. Bradski (USA)
Key words: dynamic Bayesian networks, visual tracking, prediction
In a visual tracking task, the object may exhibit rich dynamic behavior in complex environments that corrupt target observations through background clutter and occlusion. Such dynamics and backgrounds induce non-linear, non-Gaussian, and multimodal observation densities. These densities are difficult to model with traditional methods such as Kalman Filter Models (KFMs) because of their Gaussian assumptions. Dynamic Bayesian Networks (DBNs) provide a more general framework in which to solve these problems. DBNs generalize KFMs by allowing arbitrary probability distributions, not just (unimodal) linear-Gaussian ones. Under the DBN umbrella, a broad class of learning and inference algorithms for time-series models can be applied to visual tracking. Furthermore, DBNs provide a natural way to combine multiple vision cues. In this paper, we describe several DBN models for tracking in non-linear, non-Gaussian, and multimodal situations, and present a prediction method that assists the feature extraction stage by hypothesizing the new observations.
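As a minimal sketch of the kind of DBN inference the abstract contrasts with KFMs (not the authors' specific models), a bootstrap particle filter handles non-linear dynamics and a multimodal observation density, and its one-step-ahead prediction can hypothesize the next observation for a feature extractor. All dynamics, likelihood, and parameter choices below are illustrative assumptions.

```python
# Hypothetical 1-D tracking example: a bootstrap particle filter as one
# instance of DBN inference. Unlike a KFM, it tolerates non-linear
# dynamics and a non-Gaussian, bimodal observation likelihood.
import numpy as np

rng = np.random.default_rng(0)
N = 500                                  # number of particles (assumed)

def transition(x):
    # Assumed non-linear dynamics with additive Gaussian process noise.
    return x + 0.5 * np.sin(x) + rng.normal(0.0, 0.2, size=x.shape)

def likelihood(z, x):
    # Assumed bimodal observation density: background clutter can mimic
    # the target, so two modes explain the same measurement z.
    return (0.7 * np.exp(-0.5 * ((z - x) / 0.3) ** 2)
            + 0.3 * np.exp(-0.5 * ((z + x) / 0.3) ** 2))

particles = rng.normal(0.0, 1.0, N)      # prior belief over target state
weights = np.full(N, 1.0 / N)

def step(z):
    """Advance one DBN time slice given a new observation z."""
    global particles, weights
    particles = transition(particles)            # predict
    weights = weights * likelihood(z, particles)  # update
    weights /= weights.sum()
    # Resample when the effective sample size collapses.
    if 1.0 / np.sum(weights ** 2) < N / 2:
        idx = rng.choice(N, size=N, p=weights)
        particles, weights = particles[idx], np.full(N, 1.0 / N)
    return float(np.sum(weights * particles))    # posterior-mean estimate

def predict_observation():
    # One-step-ahead hypothesis for the next observation; a feature
    # extractor could use it to restrict its search region.
    return float(np.sum(weights * transition(particles)))

for z in [0.1, 0.3, 0.2, 0.4]:
    est = step(z)
hyp = predict_observation()
```

The bimodal `likelihood` is exactly the situation where a KFM's single Gaussian fails: the particle weights keep probability mass on both modes until the data disambiguates them.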