
Dokl. RAN. Math. Inf. Proc. Upr., 2024 Volume 520, Number 2, Pages 182–192 (Mi danma599)


SPECIAL ISSUE: ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING TECHNOLOGIES

SwiftDepth++: an efficient and lightweight model for accurate depth estimation

Ya. Dayoub (a), I. A. Makarov (b, c)

(a) National Research University Higher School of Economics, Moscow, Russia
(b) Artificial Intelligence Research Institute (AIRI), Moscow, Russia
(c) ISP RAS Research Center for Trusted Artificial Intelligence, Moscow, Russia

Abstract: Depth estimation is a crucial task across various domains, but the high cost of collecting labeled depth data has led to growing interest in self-supervised monocular depth estimation methods. In this paper, we introduce SwiftDepth++, a lightweight depth estimation model that delivers competitive results while maintaining a low computational budget. The core innovation of SwiftDepth++ lies in its novel depth decoder, which enhances efficiency by rapidly compressing features while preserving essential information. Additionally, we incorporate a teacher-student knowledge distillation framework that guides the student model in refining its predictions. We evaluate SwiftDepth++ on the KITTI and NYU datasets, where it achieves an absolute relative error (Abs-rel) of 10.2% on the KITTI dataset and 22% on the NYU dataset without fine-tuning, all with approximately 6 million parameters. These results demonstrate that SwiftDepth++ not only meets the demands of modern depth estimation tasks but also significantly reduces computational complexity, making it a practical choice for real-world applications.
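The abstract reports results as absolute relative error (Abs-rel) and mentions a teacher-student distillation framework. As an illustrative sketch only (not the authors' implementation), the snippet below shows the standard Abs-rel metric and a minimal L1 distillation term of the kind used to pull a student's depth predictions toward a teacher's; the function names and toy values are hypothetical.

```python
import numpy as np

def abs_rel(pred, gt):
    """Absolute relative error: mean(|pred - gt| / gt), the standard
    monocular depth metric reported on KITTI and NYU."""
    pred = np.asarray(pred, dtype=float)
    gt = np.asarray(gt, dtype=float)
    return float(np.mean(np.abs(pred - gt) / gt))

def distill_l1(student_depth, teacher_depth):
    """Simple L1 distillation term: penalizes the student's deviation
    from the teacher's depth map (a generic sketch, not SwiftDepth++'s
    exact loss)."""
    s = np.asarray(student_depth, dtype=float)
    t = np.asarray(teacher_depth, dtype=float)
    return float(np.mean(np.abs(s - t)))

# Toy example: per-pixel ground-truth depths (meters) vs. predictions.
gt = np.array([2.0, 4.0])
pred = np.array([1.0, 5.0])
print(abs_rel(pred, gt))  # (0.5 + 0.25) / 2 = 0.375
```

In practice such a distillation term is weighted and added to the self-supervised photometric loss during training; here it only illustrates the mechanism.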

Keywords: 3D vision, knowledge distillation, lightweight depth model, monocular depth estimation, self-supervised training, hybrid models, unsupervised learning.

UDC: 004.8

Received: 27.09.2024
Accepted: 02.10.2024

DOI: 10.31857/S2686954324700553

English version: Doklady Mathematics, 2024, 110:suppl. 1, S162–S171
© Steklov Math. Inst. of RAS, 2025