An adaptive fusion architecture for target tracking

Abstract
A vision system is demonstrated that adaptively allocates computational resources over multiple cues to robustly track a target in 3D. The system uses a particle filter to maintain multiple hypotheses of the target's location. Bayesian probability theory provides the framework for sensor fusion, and resource scheduling is used to intelligently allocate the limited computational resources available across the suite of cues. The system is shown to track a person moving through a cluttered environment in 3D space.
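To make the core mechanism concrete, the following is a minimal sketch, not the authors' implementation, of a bootstrap particle filter that fuses independent cue likelihoods multiplicatively, as Bayes' rule prescribes under conditional independence. The state representation, random-walk motion model, Gaussian cue noise levels, and particle count are all illustrative assumptions.

```python
# Minimal sketch (assumed, not from the paper): a bootstrap particle
# filter maintaining multiple 3D location hypotheses, with Bayesian
# fusion of two cues of differing reliability.
import numpy as np

rng = np.random.default_rng(0)
N = 500                                    # number of particles (hypotheses)
particles = rng.normal(0.0, 1.0, (N, 3))   # 3D position hypotheses
weights = np.full(N, 1.0 / N)

def cue_likelihood(particles, measurement, sigma):
    """Gaussian likelihood of one cue's 3D measurement under each hypothesis."""
    d2 = np.sum((particles - measurement) ** 2, axis=1)
    return np.exp(-0.5 * d2 / sigma ** 2)

def step(particles, weights, measurements, sigmas, motion_sigma=0.1):
    # Predict: diffuse hypotheses with a random-walk motion model.
    particles = particles + rng.normal(0.0, motion_sigma, particles.shape)
    # Update: fuse cues by multiplying their likelihoods (Bayesian fusion
    # under an assumed conditional independence of cues given the state).
    for z, sigma in zip(measurements, sigmas):
        weights = weights * cue_likelihood(particles, z, sigma)
    weights = weights / weights.sum()
    # Resample when the effective sample size collapses.
    if 1.0 / np.sum(weights ** 2) < N / 2:
        idx = rng.choice(N, size=N, p=weights)
        particles, weights = particles[idx], np.full(N, 1.0 / N)
    return particles, weights

# Two hypothetical cues (e.g., color and depth) observe the true target
# at (1, 2, 0.5) with different noise levels; the more reliable cue
# naturally dominates the fused posterior.
truth = np.array([1.0, 2.0, 0.5])
for _ in range(30):
    zs = [truth + rng.normal(0.0, s, 3) for s in (0.05, 0.4)]
    particles, weights = step(particles, weights, zs, sigmas=(0.05, 0.4))
print("estimate:", np.average(particles, axis=0, weights=weights))
```

The paper's resource scheduling would decide, per frame, which cues to evaluate at all; in a sketch like this, that would amount to computing `cue_likelihood` only for the cues the scheduler selects.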
