
Model. Anal. Inform. Sist., 2023 Volume 30, Number 4, Pages 418–428 (Mi mais812)


Artificial intelligence

Keyphrase generation for the Russian-language scientific texts using mT5

A. V. Glazkova (a, b), D. A. Morozov (a, c), M. S. Vorobeva (b), A. A. Stupnikov (b)

a Institute for Information Transmission Problems (Kharkevich Institute), 19/1 Bol’shoj Karetnyj pereulok str., Moscow, 127051, Russia
b University of Tyumen, 6 Volodarskogo str., Tyumen, 625003, Russia
c Novosibirsk National Research State University, 1 Pirogova str., Novosibirsk, 630090, Russia

Abstract: In this work, we applied the multilingual text-to-text transformer (mT5) to the task of keyphrase generation for Russian scientific texts using the Keyphrases CS&Math Russian corpus. Automatic keyphrase selection is a relevant natural language processing task, since keyphrases help readers find an article easily and facilitate the systematization of scientific texts. In this paper, keyphrase selection is treated as a text summarization task. The mT5 model was fine-tuned on abstracts of Russian research papers: the abstracts served as the model input, and lists of keyphrases separated with commas as the output. The results of mT5 were compared with several baselines, including TopicRank, YAKE!, RuTermExtract, and KeyBERT, and are reported in terms of the full-match F1-score, ROUGE-1, and BERTScore. The best results on the test set were obtained by mT5 and RuTermExtract. The highest F1-score was demonstrated by mT5 (11.24%), exceeding RuTermExtract by 0.22%. RuTermExtract shows the highest ROUGE-1 score (15.12%). According to BERTScore, the best results were also obtained by these two methods: 76.89% for mT5 (BERTScore with mBERT) and 75.8% for RuTermExtract (BERTScore with ruSciBERT). Moreover, we evaluated the capability of mT5 to predict keyphrases that are absent from the source text. The main limitations of the proposed approach are the need for a training sample for fine-tuning and the possibly limited suitability of the fine-tuned model in cross-domain settings. The advantages of keyphrase generation with pre-trained mT5 are that there is no need to define the number and length of keyphrases or to normalize the produced keyphrases, which is important for flective languages, and that the model can generate keyphrases that are not explicitly present in the text.
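
To make the described setup concrete, below is a minimal Python sketch (using the Hugging Face transformers library) of fine-tuning mT5 on a single abstract-to-keyphrases pair and scoring the generated keyphrases with full-match F1. The checkpoint name (google/mt5-base), the learning rate, the beam size, and the toy training example are illustrative assumptions rather than the authors' exact configuration; a real run would iterate over the whole Keyphrases CS&Math Russian training split.

import torch
from transformers import MT5ForConditionalGeneration, MT5Tokenizer

model_name = "google/mt5-base"  # assumed base checkpoint
tokenizer = MT5Tokenizer.from_pretrained(model_name)
model = MT5ForConditionalGeneration.from_pretrained(model_name)

# One hypothetical training pair: abstract text -> keyphrases separated with commas.
abstract = "В статье рассматривается задача автоматического извлечения ключевых фраз из научных текстов."
keyphrases = "ключевые фразы, автоматическое реферирование, mT5"

inputs = tokenizer(abstract, return_tensors="pt", truncation=True, max_length=512)
labels = tokenizer(keyphrases, return_tensors="pt", truncation=True, max_length=64).input_ids

# Single fine-tuning step (illustrative hyperparameters).
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)
model.train()
loss = model(**inputs, labels=labels).loss
loss.backward()
optimizer.step()
optimizer.zero_grad()

# Inference: generate a comma-separated keyphrase string and split it into a set.
model.eval()
with torch.no_grad():
    generated = model.generate(**inputs, max_length=64, num_beams=4)
decoded = tokenizer.decode(generated[0], skip_special_tokens=True)
predicted = {p.strip().lower() for p in decoded.split(",") if p.strip()}
gold = {p.strip().lower() for p in keyphrases.split(",")}

# Full-match F1: a predicted keyphrase counts only if it exactly matches a gold keyphrase.
tp = len(predicted & gold)
precision = tp / len(predicted) if predicted else 0.0
recall = tp / len(gold) if gold else 0.0
f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
print(f"full-match F1: {f1:.4f}")
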

Keywords: automatic text summarization, keyphrase selection, mT5.

UDC: 004.912

MSC: 68T50

Received: 13.11.2023
Revised: 22.11.2023
Accepted: 29.11.2023

DOI: 10.18255/1818-1015-2023-4-418-428
