Thursday 18 July 2013
Up until now, synchronization of the video streams was usually set manually or by using a distinguishable sound. However, sound synchronization was not always possible, and it suffered from the fact that some cameras (like the GoPro 3) have a slight lag between their sound and video tracks. Kolor is developing an additional solution based on visual information.
The idea is simple: since the cameras are fixed on a rigid rig, they all undergo the same movement at the same time. We can determine the motion of each camera by analyzing the image displacement from one frame to the next. Our algorithm then measures the amount of movement of each camera over time and searches for the temporal shifts that produce the best match.
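To make the first step concrete, here is a minimal sketch of turning a video into a one-dimensional "motion signal". It uses mean absolute frame difference as a crude stand-in for the real image-displacement analysis the post describes; the `motion_signal` function and the toy 4-pixel frames are illustrative assumptions, not Kolor's actual implementation.

```python
# Hypothetical sketch: per-frame motion magnitude from mean absolute
# frame difference (a crude proxy for true image-displacement analysis).

def motion_signal(frames):
    """frames: list of equal-length grayscale pixel lists.
    Returns one motion value per transition between consecutive frames."""
    signal = []
    for prev, cur in zip(frames, frames[1:]):
        # Average absolute pixel change between the two frames.
        diff = sum(abs(p - c) for p, c in zip(prev, cur)) / len(cur)
        signal.append(diff)
    return signal

# Tiny example: a "video" of 4-pixel frames with a jump between frames 2 and 3.
frames = [[10, 10, 10, 10], [10, 10, 10, 10],
          [50, 50, 50, 50], [50, 50, 50, 50]]
print(motion_signal(frames))  # [0.0, 40.0, 0.0]
```

A real pipeline would estimate displacement with optical flow or feature tracking, but any measure that spikes when the rig moves yields a usable signal for the matching step.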
Although image-based, this synchronization requires no pre-stitching and can handle non-overlapping videos. The resulting synchronization is frame-accurate, provided the rig does not remain static for the entire shot.
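The matching step can be sketched as finding the frame offset that maximizes the cross-correlation between two cameras' motion signals. This is a simplified, assumed formulation (function names, the `max_shift` search window, and the example signals are all hypothetical), not Kolor's published algorithm.

```python
# Hypothetical sketch: align two motion signals by cross-correlation.

def correlate_at(a, b, shift):
    """Correlation of a[i] with b[i + shift], over the overlapping frames."""
    if shift >= 0:
        pairs = zip(a, b[shift:])
    else:
        pairs = zip(a[-shift:], b)
    return sum(x * y for x, y in pairs)

def best_shift(a, b, max_shift=30):
    """Frame offset of signal b relative to a with the highest correlation."""
    return max(range(-max_shift, max_shift + 1),
               key=lambda s: correlate_at(a, b, s))

# Example: camera B's signal is camera A's signal delayed by 3 frames.
sig_a = [0, 1, 5, 2, 0, 0, 4, 1, 0, 3, 0, 2, 6, 1, 0]
sig_b = [0, 0, 0] + sig_a[:-3]
print(best_shift(sig_a, sig_b))  # 3
```

This also illustrates why a static rig defeats the method: a flat motion signal has no correlation peak, so no shift stands out as the best match.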
Are you new to 360° video creation? Take a tour of our special 360° video website and discover how to record, stitch and publish 360° videos easily.