
Computer Optics, 2021, Volume 45, Issue 4, Pages 608–614 (Mi co946)

This publication is cited in 13 papers

IMAGE PROCESSING, PATTERN RECOGNITION

One-shot learning with triplet loss for vegetation classification tasks

A. V. Uzhinskiy (a,b), G. A. Ososkov (a), P. V. Goncharov (a), A. V. Nechaevskiy (a,b), A. A. Smetanin (c)

a Russian State Agrarian University - Moscow Timiryazev Agricultural Academy, Russia, Moscow, Timiryazevskaya st., 49
b Russian State Agrarian University - Moscow Agricultural Academy named after K.A. Timiryazev
c National Research University ITMO, 197101, Russia, Saint-Petersburg, Kronverkskiy pr., 49

Abstract: The triplet loss function is one of the options that can significantly improve the accuracy of one-shot learning tasks. Since 2015, many projects have used Siamese networks with this kind of loss for face recognition and object classification. In our research, we focus on two vegetation-related tasks. The first is plant disease detection across 25 classes of five crops (grape, cotton, wheat, cucumber, and corn). This task is motivated by the fact that harvest losses due to diseases are a serious problem for both large farming enterprises and rural families. The second task is the identification of moss species (5 classes). Mosses are natural bioaccumulators of pollutants and are therefore used in environmental monitoring programs. Identifying moss species is an important step in sample preprocessing. In both tasks, we used self-collected image databases. We tried several deep learning architectures and approaches. Our Siamese network architecture with a triplet loss function and MobileNetV2 as the base network showed the most impressive results in both tasks: the average accuracy amounted to over 97.8% for plant disease detection and 97.6% for moss species classification.
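The abstract refers to the standard triplet loss, which trains an embedding so that an anchor image lies closer to a same-class (positive) image than to a different-class (negative) one by at least a margin. A minimal NumPy sketch of that loss is shown below; the margin value and function names are illustrative assumptions, not taken from the paper, which does not state its implementation details here.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Standard triplet loss on embedding vectors.

    Pulls the anchor toward the positive embedding and pushes it away
    from the negative one until their squared distances differ by at
    least `margin`. The margin default of 0.2 is an illustrative choice.
    """
    d_pos = np.sum((anchor - positive) ** 2)  # squared distance anchor-positive
    d_neg = np.sum((anchor - negative) ** 2)  # squared distance anchor-negative
    return max(d_pos - d_neg + margin, 0.0)   # hinge: zero once the margin holds
```

In a full pipeline such as the one the abstract describes, `anchor`, `positive`, and `negative` would be embeddings produced by a shared base network (MobileNetV2 in the paper) applied to three images; the loss is zero whenever the negative is already sufficiently farther from the anchor than the positive.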

Keywords: deep neural networks; Siamese networks; triplet loss; plant disease detection; moss species classification.

Received: 28.12.2020
Accepted: 24.02.2021

Publication language: English

DOI: 10.18287/2412-6179-CO-856



© MIAN, 2024