Artificial Intelligence and Decision Making, 2020, Issue 4, Pages 55–65 (Mi iipr152)

Decision analysis

A review of methods for explaining and interpreting decisions of intelligent cancer diagnosis systems

L. V. Utkin^a, A. A. Meldo^b, M. S. Kovalev^a, E. M. Kasimov^a

a Peter the Great St. Petersburg Polytechnic University, St. Petersburg, Russia
b Saint-Petersburg Clinical Research Center of Specialized Types of Medical Care (Oncological), St. Petersburg, Russia

Abstract: The paper presents a review of methods for explaining and interpreting the classification results provided by various machine learning models. A general classification of interpretation and explanation methods is given, depending on the type of model being interpreted. The main approaches and examples of explanation methods in medicine, and in oncology in particular, are considered. A general scheme of an explainable intelligence subsystem is proposed, which makes it possible to provide explanations in natural language.
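
As an illustration of the model-agnostic explanation methods such a review covers, the sketch below fits a local linear surrogate around a single prediction (the idea behind LIME-style methods). The synthetic data, the black-box model, and the function explain_locally are assumptions introduced here for illustration; they are not material from the paper.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Stand-in "diagnostic" black-box model trained on synthetic data
# (purely illustrative; not one of the models reviewed in the paper).
X = rng.normal(size=(500, 4))
y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)
black_box = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

def explain_locally(model, x, n_samples=1000, kernel_width=0.75):
    """Fit a distance-weighted linear surrogate around instance x;
    its coefficients serve as local feature contributions."""
    Z = x + rng.normal(scale=0.5, size=(n_samples, x.shape[0]))   # perturb the instance
    p = model.predict_proba(Z)[:, 1]                              # query the black box
    d = np.linalg.norm(Z - x, axis=1)
    w = np.exp(-(d ** 2) / kernel_width ** 2)                     # proximity weights
    surrogate = Ridge(alpha=1.0).fit(Z, p, sample_weight=w)
    return surrogate.coef_

weights = explain_locally(black_box, X[0])
print("local feature contributions:", np.round(weights, 3))

The returned coefficients can then be rendered as short natural-language statements (e.g., which features pushed the prediction toward the positive class), which is the role the paper assigns to the explanation subsystem.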

Keywords: machine learning, explainable intelligence, interpretation, oncology, deep neural networks, intelligent diagnostic system.

DOI: 10.14357/20718594200406


English version: 2021, 48:5, 398–405
