Abstract:
In classification problems, an important property of the criteria used to select informative measurements and subspaces is their monotonicity with respect to the error probability. This property is investigated in the present paper. It is shown that a widely employed criterion, the Kullback divergence, is markedly inferior to other criteria in this respect.
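The kind of non-monotonicity at issue can be illustrated with a small numerical sketch (an illustration constructed here, not taken from the paper). For two classes with equal priors and discrete class-conditional distributions, we compare the symmetric Kullback divergence J = KL(p||q) + KL(q||p) with the Bayes error (1/2)Σ min(p_i, q_i) for two hypothetical measurements, A and B; the specific distributions are chosen for illustration only.

```python
import math

def j_divergence(p, q):
    """Symmetric Kullback divergence J(p, q) = KL(p||q) + KL(q||p)."""
    return sum((pi - qi) * math.log(pi / qi) for pi, qi in zip(p, q))

def bayes_error(p, q):
    """Bayes error for equal priors: (1/2) * sum_i min(p_i, q_i)."""
    return 0.5 * sum(min(pi, qi) for pi, qi in zip(p, q))

# Measurement A: class-conditional distributions over a binary outcome
pA, qA = [0.5, 0.5], [0.9999, 0.0001]
# Measurement B
pB, qB = [0.9, 0.1], [0.1, 0.9]

jA, eA = j_divergence(pA, qA), bayes_error(pA, qA)
jB, eB = j_divergence(pB, qB), bayes_error(pB, qB)

# Measurement A has the larger divergence yet also the larger error:
# ranking measurements by divergence does not track the error ranking.
print(jA > jB, eA > eB)  # → True True
```

Here J(A) ≈ 4.60 exceeds J(B) ≈ 3.52, yet the Bayes error of A (≈ 0.25) also exceeds that of B (0.10): a larger divergence does not guarantee a smaller error probability, which is the sense in which the divergence criterion fails to be monotone.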