Abstract
This paper tackles the problem of self-calibration of multiple cameras that are very far apart. Given a set of feature correspondences, one can determine the camera geometry. The key problem we address is finding such correspondences. Since the camera geometry (location and orientation) and photometric characteristics vary considerably between images, one cannot use brightness and/or proximity constraints. Instead, we propose a three-step approach: first we use moving objects in the scene to determine a rough planar alignment, next we use static features to improve the alignment, and finally we use off-plane features to determine the epipolar geometry and the horizon line. We do not assume synchronized cameras, and we show that enforcing the geometric constraints enables us to align the tracking data in time. We present results on challenging outdoor scenes using real-time tracking data.
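As an illustrative sketch only (not the authors' implementation), the two geometric estimation stages can be approximated with standard robust estimators, here OpenCV's RANSAC-based routines on synthetic correspondences; the homography `H_true`, the point sets, and the noise model below are all hypothetical stand-ins for tracked object positions.

```python
import numpy as np
import cv2

# Synthetic ground-plane correspondences: points in view A mapped into
# view B through a known homography H_true, plus pixel noise. In the
# paper these would come from tracked moving objects, not simulation.
rng = np.random.default_rng(0)
H_true = np.array([[1.10, 0.05,  20.0],
                   [0.02, 0.95, -10.0],
                   [1e-4, 2e-5,   1.0]])
pts_a = rng.uniform(0, 640, size=(100, 2))
proj = np.hstack([pts_a, np.ones((100, 1))]) @ H_true.T
pts_b = proj[:, :2] / proj[:, 2:3] + rng.normal(0, 0.5, size=(100, 2))

# Steps 1-2: estimate the plane-to-plane alignment robustly (RANSAC),
# so gross mismatches, e.g. from unsynchronized tracks, are rejected.
H, inliers = cv2.findHomography(pts_a.astype(np.float32),
                                pts_b.astype(np.float32),
                                cv2.RANSAC, 3.0)

# Step 3: off-plane correspondences (not simulated here) would then
# constrain the full epipolar geometry, e.g. via
#   F, mask = cv2.findFundamentalMat(off_a, off_b, cv2.FM_RANSAC)
print("recovered homography (up to scale):\n", H / H[2, 2])
```

The robust estimation step also hints at why unsynchronized cameras are tolerable: correspondences taken at mismatched times behave as outliers, and searching for the time offset that maximizes geometric consistency is one plausible reading of how the tracking data can be aligned in time.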
