Autopano Video - Practical synchronization
- 1 DIFFERENT WAYS TO SYNCHRONIZE YOUR VIDEOS
- 2 IMPORTANCE OF SYNCHRONIZATION
- 3 SOUND SYNCHRONIZATION
- 4 MOTION SYNCHRONIZATION
- 5 MANUAL SYNCHRONIZATION
- 6 FINAL RENDER
- 7 MANUAL SYNCHRONIZATION USING A FLASH
DIFFERENT WAYS TO SYNCHRONIZE YOUR VIDEOS
| Sound | If your video and audio tracks are perfectly synchronized (very rare), this method provides millisecond-accurate synchronization. |
| Motion | Very easy to use in most cases and very reliable: frame-accurate. |
| Manual | Adapts to any situation. |
IMPORTANCE OF SYNCHRONIZATION
Video stitching is based on stitching frames taken at the same instant from the different videos. For the result to look as good as possible, we have to make sure that the input video streams are synchronized.
Stitching defines the relative positions of the cameras. If the streams are not synchronized, it is as if the cameras themselves were moving relative to one another (if one camera starts before another, their relative position is not stable).
Ideally, your equipment would guarantee this synchronization, but that is not always possible, which is why Autopano Video gives you the necessary tools to do it yourself.
Synchronization issues below one frame
Note that even frame-accurate synchronization does not entirely correct temporal artifacts.
For example, while stitching a ski sequence in HD at 25 fps, we noticed that the actual displacement between two consecutive frames of the same video (in other words, over 40 ms) could reach an amplitude of 20 px.
So with a residual desynchronization of less than one frame, there may still be temporal flaws in the final stitched video.
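The arithmetic behind this observation can be made explicit. As a small illustrative calculation (not part of the software), a residual half-frame offset in the ski sequence above translates into a spatial error of roughly 10 px at the seams:

```python
fps = 25
frame_interval_ms = 1000 / fps   # 40 ms between consecutive frames at 25 fps
motion_px_per_frame = 20         # displacement observed in the ski sequence

# Illustrative residual desynchronization of half a frame (20 ms).
residual_ms = frame_interval_ms / 2

# Assuming roughly linear motion, the spatial error scales with the offset.
error_px = motion_px_per_frame * residual_ms / frame_interval_ms
print(error_px)  # 10.0 -> about 10 px of spatial error despite frame-accurate sync
```

This is why sub-frame desynchronization can still produce visible seams even after the automatic tools have done their job.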
SOUND SYNCHRONIZATION
The automatic synchronization algorithm works by measuring the similarity between the audio signals.
For this to be effective, the audio stream has to be:
- clear (beware of wind noise in the microphone)
- audible (beware of thick housings)
- identifiable (do not hesitate to speak at the beginning of the sequence, or to tap on the rig a few times if the housing is thick)
By default, the audio synchronization search uses a 10 s sample around the current time.
Increasing the search range lets you verify the reliability of the detection, since the audio samples being compared are also longer.
For example, for a large offset (up to 15 s), do not hesitate to increase the search value to 30 s.
You can also move the current time well past the approximate offset, so that the samples from the different videos share more common content.
At the end of the automatic synchronization, a message indicates whether the result seems reliable: if Autopano Video reports the synchronization as precise, the values are very likely correct. Otherwise, start again after moving the current time and/or increasing the maximum search value.
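The underlying idea, finding the time lag that maximizes the similarity between two audio signals, can be sketched in a few lines. This is a toy cross-correlation over a bounded lag range, not Autopano Video's actual algorithm; the sample data is invented for illustration:

```python
def best_offset(a, b, max_lag):
    """Return the lag (in samples) of signal b relative to signal a
    that maximizes the average cross-correlation, searching +/- max_lag."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        # Correlate a[i] with b[i + lag] over the overlapping region.
        pairs = [(a[i], b[i + lag])
                 for i in range(len(a))
                 if 0 <= i + lag < len(b)]
        if not pairs:
            continue
        score = sum(x * y for x, y in pairs) / len(pairs)
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# Two recordings of the same "clap", the second delayed by 3 samples.
clap    = [0, 0, 1, 4, 1, 0, 0, 0, 0, 0]
delayed = [0, 0, 0, 0, 0, 1, 4, 1, 0, 0]
print(best_offset(clap, delayed, 5))  # 3
```

Widening `max_lag` corresponds to increasing the search value in the software: the search covers larger offsets, at the cost of comparing more candidate positions.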
In certain cases, the algorithm may question the result even though the synchronization is perfectly fine.
If the values are close to what you observed manually, there is a good chance they are correct despite the warning.
In this sequence we found good coherence among the detected offsets, except in two videos where the results were off by a whole frame, which tends to show that audio-to-video synchronization is not necessarily identical from one camera to another.
Still in this sequence, one of the videos has a time gap of nearly half a frame.
MANUAL SYNCHRONIZATION
This is why we offer some tips for manually correcting the synchronization of the videos.
Remember to rotate your rig like this while all your cameras are recording, to ensure the best synchronization:
In Autopano Video
Desynchronization creates stitching problems similar to parallax issues.
It is therefore worth building the reference panorama at a point in the sequence where movement is low, to minimize the stitching problems that temporal desynchronization can cause.
However, to properly visualize the remaining desynchronization, it is important to choose (on the timeline) a moment with ample movement.
Emphasizing the flaws
The render presets let you improve the visual result despite the remaining stitching errors.
Deactivate the “diamond” and “cutting” modes and use the linear blender to make the stitching defects stand out, then try to fix as many of them as possible by adjusting the video synchronization values.
In the synchronization editing window, you can preview your modifications by clicking the “Update” button. You can also work on only certain videos by hiding some images while editing in Autopano Pro or Giga.
Dynamic Visualization (pro version only)
The “Show a sample” button lets you check the resulting synchronization by playing a few seconds of the video.
Finally, create a new reference panorama that takes the improved synchronization into account.
MANUAL SYNCHRONIZATION USING A FLASH
The video sequence used here contains synchronization offsets of up to about one minute.
The first step consists in creating an approximate stitch using a template or an existing .pano file with the "Stitch as pano" option. Without this option, the desynchronization is too large for a panorama to be detected.
Then, find (approximately) the moment the flash fires in each individual video.
Use the "Show a sample" option to check that the approximate synchronization is right, i.e. the flash appears in every video.
The approximate panorama can also be created without any template when the videos are already synchronized to within a second.
Then, adjust each image's offset frame by frame and use the "Update" option to obtain the best possible synchronization, i.e. a panorama entirely lit by the flash.
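The flash method works because the flash produces a sharp brightness spike in every camera at the same real-world instant. As a toy illustration (not what the software does internally), the flash frame in each video can be located from its per-frame mean luminance, and the difference between the flash frames gives the offset to apply:

```python
def flash_frame(brightness):
    """Index of the frame whose mean brightness jumps the most
    relative to the previous frame, i.e. where the flash fires."""
    jumps = [brightness[i] - brightness[i - 1] for i in range(1, len(brightness))]
    return jumps.index(max(jumps)) + 1

# Hypothetical per-frame mean luminance for two cameras on the same rig.
cam_a = [40, 41, 40, 42, 250, 60, 41]   # flash at frame 4
cam_b = [38, 39, 252, 55, 40, 39, 38]   # flash at frame 2
offset = flash_frame(cam_a) - flash_frame(cam_b)
print(offset)  # 2 -> cam_b must be delayed by 2 frames to match cam_a
```

In practice this is exactly what you do by eye with the frame-by-frame offset: shift each video until the flash lands on the same frame in all of them.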
Here is the result of the flash synchronization:
Create a reference panorama that uses this synchronization at a more interesting moment of the sequence, in order to adjust the other required parameters (horizon, etc.).
GO TO THE NEXT STEP: Creating and editing a reference panorama