Abstract:
Real-time video stitching is difficult in scenes containing dynamic targets: when moving objects cross the stitching seam, artifacts and misalignment readily occur, and complex registration and fusion algorithms cannot meet real-time requirements. To address this problem, this paper proposes a fast video stitching algorithm that accounts for dynamic targets. For each pair of captured video frames, the ORB algorithm matches sparse feature points between frames, and YOLOv5 segments potential moving targets (cars and pedestrians) in the left and right frames; an optical-flow check then determines whether each detected region is truly a moving-target region, so that the influence of moving targets can be removed. The homography matrix H between frames is then estimated only from the feature-point matches in the static regions, yielding a highly accurate H. Finally, during image fusion along the optimal seam line, the seam avoids the previously detected moving-target regions, and the dynamic-target regions on the left and right sides are updated separately. Detailed experiments show that the new algorithm achieves an average stitching speed of 61 ms per frame pair at 1280×720 resolution, faster than the APAP and ELA algorithms, realizing fast video stitching that takes the influence of dynamic objects into account.
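The step of separating static-region matches from moving-target matches before homography estimation can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the median-based background-motion estimate, and the deviation threshold are all assumptions; the paper combines YOLOv5 detections with optical flow, whereas here per-match displacement simply stands in for sparse flow.

```python
# Sketch: drop feature matches whose displacement deviates strongly from
# the dominant (background) motion, keeping only static-region matches
# for homography estimation. Threshold and names are illustrative.
from statistics import median

def filter_static_matches(matches, deviation_thresh=3.0):
    """matches: list of ((x1, y1), (x2, y2)) point pairs between frames.
    Returns the subset consistent with the dominant background motion."""
    # Per-match displacement vectors (a stand-in for sparse optical flow).
    dx = [p2[0] - p1[0] for p1, p2 in matches]
    dy = [p2[1] - p1[1] for p1, p2 in matches]
    # Robust estimate of the global background motion.
    mx, my = median(dx), median(dy)
    return [
        m for m, ddx, ddy in zip(matches, dx, dy)
        if abs(ddx - mx) <= deviation_thresh
        and abs(ddy - my) <= deviation_thresh
    ]

# Example: eight background matches translating by (5, 0), plus two
# matches on a moving car displaced by (25, 0); the car matches are
# excluded, so only static matches would feed homography estimation.
background = [((x, y), (x + 5, y)) for x, y in
              [(10, 10), (40, 12), (80, 30), (120, 50),
               (200, 15), (300, 60), (350, 90), (400, 20)]]
car = [((150, 40), (175, 40)), ((160, 45), (185, 45))]
static = filter_static_matches(background + car)
```

In practice the surviving static matches would be passed to a RANSAC-based homography estimator (e.g. OpenCV's `cv2.findHomography`), which provides a second layer of outlier rejection.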