
Inform. Primen., 2015 Volume 9, Issue 4, Pages 78–84 (Mi ia394)

Performance improvement of Lempel–Ziv–Welch compression algorithm

S. Frenkel^{a,b}, M. Kopeetsky^{c}, R. Molotkovski^{c}, P. Borovsky^{c}

a Moscow State University of Information Technologies, Radioengineering, and Electronics, 78 Vernadskogo Ave., Moscow 119454, Russian Federation
b Institute of Informatics Problems, Federal Research Center “Computer Science and Control” of the Russian Academy of Sciences, 44-2 Vavilov Str., Moscow 119333, Russian Federation
c Department of Software Engineering, Shamoon College of Engineering, Basel/Bialik Sts, Beer-Sheva, Israel

Abstract: The paper proposes two novel schemes that improve the dictionary-based Lempel–Ziv–Welch (LZW) compression algorithm. The first scheme applies an exponential decay (ED) technique to manage the LZW dictionary and remove infrequently used entries. The presented results demonstrate that ED may be an efficient tool for managing and refreshing the LZW dictionary: the achieved compression ratio (CR) is higher than that of traditional methods such as Dictionary Reset (DR) and Least Recently Used (LRU). The second scheme uses the Distance from Last Use (DLU) method, in which the DLU values are compressed by Huffman coding based on the frequencies of the phrases. This compression scheme, called HCD (Huffman Coding of Distance), was tested on real-life data of different types, such as text, program code, audio, video, and image files, characterized by different Shannon entropy. The experimental results demonstrate that the ED and HCD schemes may provide a higher CR than the LZW algorithm.
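As a rough illustration of the ED idea summarized above, the following minimal Python sketch (not the authors' implementation; the function name lzw_ed_encode, the parameters alpha and max_size, and the eviction rule are assumptions made for illustration only) shows an LZW encoder whose dictionary entries carry an exponentially decaying usage score, with the lowest-scoring multi-byte phrase evicted once the dictionary is full.

def lzw_ed_encode(data: bytes, max_size: int = 4096, alpha: float = 0.99):
    # Illustrative sketch only: exponential-decay (ED) management of an LZW dictionary.
    assert max_size > 256, "dictionary must hold at least the single-byte phrases"
    # Dictionary maps phrases (bytes) to codes; start with all single-byte phrases.
    dictionary = {bytes([i]): i for i in range(256)}
    score = {i: 1.0 for i in range(256)}    # ED usage score per code
    next_code = 256
    output = []

    phrase = b""
    for byte in data:
        candidate = phrase + bytes([byte])
        if candidate in dictionary:
            phrase = candidate
            continue

        # Emit the code of the longest matched phrase and refresh its score.
        code = dictionary[phrase]
        output.append(code)
        for c in score:
            score[c] *= alpha               # all scores decay on every emitted code
        score[code] = 1.0                   # the phrase just used is refreshed

        if next_code < max_size:
            # Dictionary not full yet: add the new phrase under a fresh code.
            dictionary[candidate] = next_code
            score[next_code] = 1.0
            next_code += 1
        else:
            # Dictionary full: evict the multi-byte phrase with the lowest
            # ED score and reuse its code for the new phrase.
            victim_phrase, victim_code = min(
                ((p, c) for p, c in dictionary.items() if len(p) > 1),
                key=lambda pc: score[pc[1]],
            )
            del dictionary[victim_phrase]
            dictionary[candidate] = victim_code
            score[victim_code] = 1.0

        phrase = bytes([byte])

    if phrase:                              # flush the final matched phrase
        output.append(dictionary[phrase])
    return output

For example, lzw_ed_encode(b"abracadabra" * 100) returns a list of integer codes; comparing the input length with the number of emitted codes (times the code width in bytes) gives a rough estimate of the compression ratio.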

Keywords: Lempel–Ziv–Welch (LZW) dictionary compression; dynamic dictionary; dictionary reset (DR); least recently used (LRU); exponential decay (ED).

Received: 05.10.2015

Language: English

DOI: 10.14357/19922264150408


