Proceedings of ISP RAS, 2017 Volume 29, Issue 4, Pages 73–86 (Mi tisp236)

Real-time digital video stabilization using MEMS-sensors

A. V. Kornilova, I. A. Kirilenko, N. I. Zabelina

Saint Petersburg State University

Abstract: This article describes our ongoing research on real-time digital video stabilization using MEMS sensors. The authors propose to use the described method for stabilizing video transmitted to the operator of a remotely controlled mobile robot, as well as for increasing the precision of video-based navigation for subminiature autonomous models. The article describes the general mathematical models needed to implement a video stabilization module based on MEMS sensor readings: the camera motion model, the frame transformation model, and the rolling-shutter model. Existing approaches to stabilization based on sensor data are analyzed and evaluated from the point of view of their applicability in real time. The article also considers the main problems that arose during the experiments and remained unresolved in previous research: calibration of the camera and sensors, synchronization of the camera and sensors, and improving the accuracy of estimating the camera position from sensor data. The authors offer possible solutions to these problems that would improve the quality of existing algorithms, such as a system for parallel synchronized recording of video and sensor data based on the Android operating system. As the main result, the authors present a framework for implementing video stabilization algorithms based on MEMS sensor readings.
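As a concrete illustration of the camera motion and frame transformation models mentioned above, the sketch below shows one common gyroscope-based formulation; it is an assumption for illustration, not necessarily the authors' exact models. Angular-velocity samples are integrated into a camera rotation R, and with an intrinsic matrix K the frame is warped by the homography H = K R^T K^{-1}. The intrinsic parameters, the sample layout, and the use of NumPy/OpenCV are illustrative choices, and the per-row correction required by the rolling-shutter model is omitted here.

```python
# Hedged sketch of gyroscope-based frame warping. The intrinsics and the
# sample format are assumptions made for illustration; they are not taken
# from the paper, and rolling-shutter correction is deliberately left out.
import numpy as np
import cv2


def rotation_from_gyro(gyro_samples, timestamps):
    """Integrate angular-velocity samples (rad/s) into a rotation matrix."""
    R = np.eye(3)
    for i in range(1, len(timestamps)):
        dt = timestamps[i] - timestamps[i - 1]
        omega = np.asarray(gyro_samples[i], dtype=np.float64)
        # Small-angle rotation over this interval via the Rodrigues formula.
        dR, _ = cv2.Rodrigues(omega * dt)
        R = dR @ R
    return R


def stabilize_frame(frame, R, K):
    """Warp the frame with H = K R^T K^{-1}, undoing the accumulated rotation."""
    H = K @ R.T @ np.linalg.inv(K)
    h, w = frame.shape[:2]
    return cv2.warpPerspective(frame, H, (w, h))


# Illustrative intrinsics (focal length and principal point are assumptions).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
```

In a real pipeline the gyroscope samples would first have to be time-aligned with the frame exposure, which is exactly the camera-sensor synchronization problem the abstract highlights.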

Keywords: video stabilization, MEMS sensors, real-time system, digital signal processing, computer vision, rolling shutter.

Language: English

DOI: 10.15514/ISPRAS-2017-29(4)-5


