
Computer Optics, 2021 Volume 45, Issue 4, Pages 608–614 (Mi co946)

This article is cited in 13 papers

IMAGE PROCESSING, PATTERN RECOGNITION

One-shot learning with triplet loss for vegetation classification tasks

A. V. Uzhinskiy (a,b), G. A. Ososkov (a), P. V. Goncharov (a), A. V. Nechaevskiy (a,b), A. A. Smetanin (c)

a Russian State Agrarian University - Moscow Timiryazev Agricultural Academy, Timiryazevskaya st., 49, Moscow, Russia
b Russian State Agrarian University - Moscow Agricultural Academy named after K.A. Timiryazev
c National Research University ITMO, Kronverkskiy pr., 49, 197101, Saint Petersburg, Russia

Abstract: The triplet loss function is one of the techniques that can significantly improve accuracy in one-shot learning tasks. Since 2015, many projects have used Siamese networks with this kind of loss for face recognition and object classification. In our research, we focused on two tasks related to vegetation. The first is plant disease detection over 25 classes of five crops (grape, cotton, wheat, cucumber, and corn). This task matters because harvest losses due to disease are a serious problem for both large farming enterprises and rural households. The second task is the identification of moss species (5 classes). Mosses are natural bioaccumulators of pollutants; therefore, they are used in environmental monitoring programs, and identifying the moss species is an important step in sample preprocessing. In both tasks, we used self-collected image databases. We evaluated several deep learning architectures and approaches. Our Siamese network architecture with a triplet loss function and MobileNetV2 as the base network showed the best results in both tasks: the average accuracy exceeded 97.8% for plant disease detection and 97.6% for moss species classification.
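The core idea described in the abstract can be sketched in a few lines: a Siamese network maps images to embedding vectors, the triplet loss pulls same-class embeddings together and pushes different-class embeddings apart, and one-shot classification then reduces to a nearest-neighbor lookup among reference embeddings. The sketch below uses NumPy and omits the embedding network itself (MobileNetV2 in the paper); the margin value 0.2 and the function names are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Standard triplet loss: L = max(0, ||a - p||^2 - ||a - n||^2 + margin).

    The margin (0.2 here) is an assumed hyperparameter, not taken from the paper.
    """
    d_pos = np.sum((anchor - positive) ** 2, axis=-1)  # distance to a same-class embedding
    d_neg = np.sum((anchor - negative) ** 2, axis=-1)  # distance to a different-class embedding
    return np.maximum(0.0, d_pos - d_neg + margin)

def one_shot_classify(query_emb, reference_embs, labels):
    """One-shot inference: assign the label of the nearest reference embedding."""
    dists = np.sum((reference_embs - query_emb) ** 2, axis=-1)
    return labels[int(np.argmin(dists))]
```

For example, with an anchor at the origin, a positive at the same point, and a negative at distance 1, the loss is max(0, 0 - 1 + 0.2) = 0, i.e. the triplet is already well separated; if the negative moves close to the anchor, the loss grows toward the margin, driving the embedding network to push it away.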

Keywords: deep neural networks; Siamese networks; triplet loss; plant disease detection; moss species classification.

Received: 28.12.2020
Accepted: 24.02.2021

Language: English

DOI: 10.18287/2412-6179-CO-856


