Shankar, S., Lasenby, J. and Kokaram, A. (2013) Warping trajectories for video synchronization. In: ARTEMIS 2013 - Proceedings of the 4th ACM/IEEE International Workshop on Analysis and Retrieval of Tracked Events and Motion in Imagery Stream, pp. 41-48.
Temporal synchronization of multiple video recordings of the same dynamic event is a critical task in many computer vision applications, e.g. novel view synthesis and 3D reconstruction. Typically this information is implied, since recordings are made using the same timebase, or time-stamp information is embedded in the video streams. Recordings made with consumer-grade equipment do not contain this information; hence, there is a need to synchronize the signals temporally using the visual information itself. Previous work in this area has assumed either good-quality data with relatively simple dynamic content or the availability of precise camera geometry. In this paper, we propose a technique which exploits feature trajectories across views in a novel way, and specifically targets the kind of complex content found in consumer-generated sports recordings, without assuming precise knowledge of fundamental matrices or homographies. Our method automatically selects the moving feature points in the two unsynchronized videos whose 2D trajectories can be best related, thereby helping to infer the synchronization index. We evaluate performance using a number of real recordings and show that synchronization can be achieved to within 1 second, which is better than previous approaches. Copyright 2013 ACM.
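The core idea in the abstract, relating 2D feature trajectories from two unsynchronized videos to infer a temporal offset, can be sketched in a much simplified form. The snippet below is an illustration only, not the paper's method: the functions `speed_profile` and `estimate_offset`, and the use of normalized cross-correlation of frame-to-frame speed profiles as the matching criterion, are assumptions made for the sketch. It assumes the two trajectories track the same moving point, differ by a constant frame offset, and have similar speed signatures across views.

```python
import numpy as np

def speed_profile(traj):
    # traj: (T, 2) array of 2D feature positions, one row per frame.
    # The frame-to-frame displacement magnitude gives a simple 1D
    # motion signature that is less view-dependent than raw positions.
    return np.linalg.norm(np.diff(traj, axis=0), axis=1)

def estimate_offset(traj_a, traj_b, max_lag=50):
    # Returns the frame lag that best aligns the two speed profiles,
    # found by brute-force normalized cross-correlation over lags.
    a = speed_profile(traj_a)
    b = speed_profile(traj_b)
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    best_lag, best_score = 0, -np.inf
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            x, y = a[lag:], b[: len(a) - lag]
        else:
            x, y = a[: len(a) + lag], b[-lag:]
        n = min(len(x), len(y))
        if n < 2:
            continue
        score = float(np.dot(x[:n], y[:n]) / n)
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag
```

For example, shifting a synthetic trajectory by a known number of frames and feeding both copies to `estimate_offset` recovers that shift. The paper's contribution lies in selecting which feature points across views yield trajectories that can be related reliably; this sketch only covers the final offset-search step under the stated assumptions.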
|Uncontrolled Keywords:||Event detection; Optical flow; Synchronization|
|Divisions:||Div F > Signal Processing and Communications|