
Eurasian Math. J., 2022, Volume 13, Number 4, Pages 70–81 (Mi emj455)

Convergence of the partition function in the static word embedding model

K. Mynbaev$^a$, Zh. Assylbekov$^b$

$^a$ International School of Economics, Kazakh-British Technical University, 59 Tolebi St, 050000 Almaty, Kazakhstan
$^b$ Department of Mathematics, School of Sciences and Humanities, Nazarbayev University, 53 Kabanbay Batyr Ave, 010000 Astana, Kazakhstan

Abstract: We develop an asymptotic theory for the partition function of the word embedding model WORD2VEC. The proof involves a study of properties of matrices, their determinants, and distributions of random normal vectors as their dimension tends to infinity. The conditions imposed are mild enough to cover practically important situations. The implication is that for any word $i$ from a vocabulary $\mathcal{W}$, the context vector $\mathbf{c}_i$ is a reflection of the word vector $\mathbf{w}_i$ in approximately half of the dimensions. This allows us to halve the number of trainable parameters in static word embedding models.
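The practical consequence is that, rather than learning a separate context embedding table, one may tie each context vector to the corresponding word vector through a fixed reflection. Below is a minimal NumPy sketch of this idea, assuming for illustration that the reflection flips the sign of the first half of the coordinates; the names W, context_vector, and log_partition are illustrative and not the paper's notation.

import numpy as np

rng = np.random.default_rng(0)
vocab_size, dim = 10_000, 100

# Trainable word vectors w_i: the only embedding table that needs training.
W = rng.normal(scale=1.0 / np.sqrt(dim), size=(vocab_size, dim))

# Fixed reflection mask: -1 on roughly half of the coordinates, +1 elsewhere.
# Tying c_i to w_i via this mask replaces a separately trained context matrix,
# halving the number of trainable parameters (illustrative choice of mask).
mask = np.where(np.arange(dim) < dim // 2, -1.0, 1.0)

def context_vector(i: int) -> np.ndarray:
    """Context vector c_i obtained by reflecting the word vector w_i."""
    return W[i] * mask

def log_partition(i: int) -> float:
    """log Z_i = log sum_j exp(<w_i, c_j>), computed stably via log-sum-exp."""
    scores = (W * mask) @ W[i]   # inner products <w_i, c_j> for every j in the vocabulary
    m = scores.max()
    return float(m + np.log(np.exp(scores - m).sum()))

print(log_partition(0))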

Keywords and phrases: word embeddings, partition function, neural networks, WORD2VEC, asymptotic distribution.

MSC: 68T50

Received: 07.07.2022

Publication language: English

DOI: 10.32523/2077-9879-2022-13-4-70-81


