
Zh. Vychisl. Mat. Mat. Fiz., 2021, Volume 61, Number 7, Pages 1149–1161 (Mi zvmmf11265)


Computer science

Prior distribution selection for a mixture of experts

A. V. Grabovoy^a, V. V. Strijov^{a,b}

a Moscow Institute of Physics and Technology, 141701, Dolgoprudny, Moscow oblast, Russia
b Dorodnicyn Computing Centre, Federal Research Center "Computer Science and Control" of the Russian Academy of Sciences, Moscow, Russia

Abstract: The paper investigates mixture-of-experts models. A mixture of experts is a combination of experts (local approximation models) and a gate function that weights the experts and forms their ensemble. In this work, each expert is a linear model, and the gate function is a neural network with a softmax on its last layer. The paper analyzes various prior distributions for each expert and proposes a method that takes into account the relationships between the prior distributions of different experts. An EM algorithm optimizes both the parameters of the local models and the parameters of the gate function. As an application, the paper solves the problem of shape recognition in images: each expert fits one circle in an image and recovers its parameters, namely the coordinates of the center and the radius. The computational experiment tests the proposed method on synthetic and real data; the real data are human eye images from the iris detection problem.
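To make the setup concrete, the sketch below implements a standard EM loop for a mixture of linear experts with a softmax gate; it is a minimal illustration of the general technique, not the authors' method. For simplicity the gate here is a single linear layer followed by a softmax (the paper uses a neural network), the noise variance is fixed, and the prior distributions over expert parameters, whose interdependence is the paper's contribution, are omitted. All function and variable names are illustrative. Note also that circle fitting can be cast as a linear problem: expanding (x - x0)^2 + (y - y0)^2 = r^2 gives x^2 + y^2 = 2*x0*x + 2*y0*y + (r^2 - x0^2 - y0^2), which is linear in the parameters (2*x0, 2*y0, r^2 - x0^2 - y0^2) with target x^2 + y^2, presumably why linear experts suffice for circle recovery.

import numpy as np

def softmax(Z):
    # Numerically stable row-wise softmax.
    Z = Z - Z.max(axis=1, keepdims=True)
    E = np.exp(Z)
    return E / E.sum(axis=1, keepdims=True)

def em_mixture_of_linear_experts(X, y, K, n_iter=100, sigma2=0.1, gate_lr=0.5, seed=0):
    # X: (n, d) design matrix (append a bias column if needed), y: (n,) targets.
    # Returns expert weights W of shape (d, K) and gate parameters V of shape (d, K).
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.normal(scale=0.1, size=(d, K))  # linear experts
    V = np.zeros((d, K))                    # linear-softmax gate (simplification)
    for _ in range(n_iter):
        # E-step: responsibility r[i, k] is proportional to pi_k(x_i) * N(y_i | x_i' w_k, sigma2).
        pi = softmax(X @ V)
        resid = y[:, None] - X @ W
        r = pi * np.exp(-0.5 * resid**2 / sigma2)
        r /= r.sum(axis=1, keepdims=True) + 1e-300
        # M-step, experts: responsibility-weighted least squares for each expert.
        for k in range(K):
            Xk = X * r[:, k:k+1]
            W[:, k] = np.linalg.solve(Xk.T @ X + 1e-8 * np.eye(d), Xk.T @ y)
        # M-step, gate: one gradient ascent step on the expected complete-data log-likelihood.
        V += gate_lr * X.T @ (r - softmax(X @ V)) / n
    return W, V

# Toy check: two linear regimes selected by the sign of the feature.
X = np.column_stack([np.linspace(-1.0, 1.0, 200), np.ones(200)])
y = np.where(X[:, 0] < 0.0, -2.0 * X[:, 0], 3.0 * X[:, 0])
W, V = em_mixture_of_linear_experts(X, y, K=2)

In this simplified version each EM iteration alternates closed-form weighted least squares for the experts with a single gradient step for the gate; the paper's method additionally places coupled prior distributions on the expert parameters, which would enter the M-step as regularization terms.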

Key words: mixture of experts, Bayesian model selection, prior distribution.

UDC: 519.72

Received: 26.11.2020
Revised: 26.11.2020
Accepted: 11.03.2021

DOI: 10.31857/S0044466921070073


English version: Computational Mathematics and Mathematical Physics, 2021, 61:7, 1140–1152


