
Computer Optics, 2019, Volume 43, Issue 2, Pages 264–269 (Mi co644)

This publication is cited in 2 papers

IMAGE PROCESSING, PATTERN RECOGNITION

Unsupervised color texture segmentation based on multi-scale region-level Markov random field models

X. Song (a, b, c), L. Wu (a), G. Liu (a, b, c)

a School of Computer and Information Engineering, Anyang Normal University, Anyang 455000, Henan, China
b Collaborative Innovation Center of International Dissemination of Chinese Language Henan Province, Anyang, Henan, China
c Henan Key Laboratory of Oracle Bone Inscriptions Information Processing, Anyang, Henan, China

Abstract: In the field of color texture segmentation, the region-level Markov random field (RMRF) model has attracted considerable attention because of its efficiency in modeling large-range spatial constraints. However, an RMRF defined on a single scale cannot describe the non-stationary nature of the image, which severely limits its robustness. Hence, by combining the wavelet transform with the RMRF model, in this paper we present a multi-scale RMRF (MsRMRF) model in the wavelet domain. Within the Bayesian framework, the proposed model seamlessly integrates multi-scale information stemming from both the original image and the region-level spatial constraints, and can therefore accurately describe the characteristics of different kinds of texture. Based on the MsRMRF model, an unsupervised segmentation algorithm is designed for color texture images. Both synthetic color texture images and remote sensing images are employed in the comparative experiments, and the experimental results show that the proposed method obtains more accurate segmentation results than competing methods.
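The two ingredients the abstract combines can be illustrated with a minimal, hypothetical sketch (not the authors' implementation, whose details are in the paper itself): a Haar approximation pyramid stands in for the wavelet-domain multi-scale features, and an ICM-style majority relabeling stands in for a Potts-like region-level MRF spatial prior. All function names and parameters below are illustrative assumptions.

```python
import numpy as np

def haar_decompose(img, levels=2):
    """Multi-scale Haar approximation pyramid for one channel:
    each level averages 2x2 blocks (the low-pass band), giving
    progressively coarser views of the texture."""
    approximations = []
    a = img.astype(float)
    for _ in range(levels):
        # Average 2x2 blocks -> approximation band at the next scale
        a = (a[0::2, 0::2] + a[1::2, 0::2] +
             a[0::2, 1::2] + a[1::2, 1::2]) / 4.0
        approximations.append(a)
    return approximations

def icm_region_smoothing(labels, iters=5):
    """Toy spatial regularization: relabel each site to the majority
    label of its 4-neighbourhood, an ICM-style update under a
    Potts-like MRF prior (a stand-in for the region-level model)."""
    lab = labels.copy()
    h, w = lab.shape
    for _ in range(iters):
        new = lab.copy()
        for i in range(h):
            for j in range(w):
                neigh = []
                if i > 0:     neigh.append(lab[i - 1, j])
                if i < h - 1: neigh.append(lab[i + 1, j])
                if j > 0:     neigh.append(lab[i, j - 1])
                if j < w - 1: neigh.append(lab[i, j + 1])
                vals, counts = np.unique(neigh, return_counts=True)
                new[i, j] = vals[np.argmax(counts)]
        lab = new
    return lab
```

In a full pipeline along the lines the abstract describes, per-scale features would drive a likelihood term and the spatial prior would be applied at the region level rather than per pixel; this sketch only shows why coarse scales and neighbourhood agreement both suppress isolated labeling noise.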

Keywords: region-level Markov random field model, color texture image, image segmentation, wavelet transformation, multi-scale.

Received: 24.07.2018
Accepted: 10.12.2018

Publication language: English

DOI: 10.18287/2412-6179-2019-43-2-264-269



© МИАН, 2024