FIle-Compression

This project implements text file compression in C# using the Huffman algorithm. Huffman coding is a data-compression technique that forms the basic idea behind many file-compression tools: the most frequent symbols are encoded with the fewest bits, so data is compressed according to its frequency of occurrence without losing any information. This README touches on fixed-length and variable-length encoding, uniquely decodable codes, the prefix rule, and construction of the Huffman tree.

Text compression has many real-world uses and can be applied whenever files need to be stored, sent, or transferred. The core of the method is a binary tree, called the Huffman tree, which is used to generate the bit string for each symbol. For every symbol in the text, Huffman coding assigns a unique sequence of bits: each edge of the tree is labelled 0 or 1, each leaf node represents a byte (symbol), and the tree is built from a priority queue that orders the symbols by frequency.

File compression in general reduces the size of one or more files. When a file or a group of files is compressed, the resulting "archive" often takes up 50% to 90% less disk space than the original file(s). Common compression formats include Zip, RAR, and 7z, each of which uses its own algorithm. Compressing files saves disk space on both removable and non-removable drives: the compression process shrinks a file by removing data that is repeated or empty and leaving behind a short flag marking what was removed, and that flag takes up less space than the data it replaces. That is why we built this software: to make compression easy to use and to help with storage problems. The repository contains complete code for Huffman encoding and decoding.
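Below is a minimal C# sketch of the approach described above, not the repository's actual classes: the names `HuffmanNode`, `BuildTree`, and `BuildCodes` are illustrative. It counts symbol frequencies, builds the tree by repeatedly merging the two least-frequent nodes from a priority queue, assigns a bit string to each symbol, and round-trips a sample string through encoding and decoding. It assumes .NET 6 or later for `PriorityQueue<TElement, TPriority>`.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

class HuffmanNode
{
    public char? Symbol;              // null for internal nodes, set for leaves
    public int Frequency;
    public HuffmanNode Left, Right;
}

static class HuffmanSketch
{
    // Build the Huffman tree: start with one leaf per symbol, then repeatedly
    // merge the two least-frequent nodes until a single root remains.
    static HuffmanNode BuildTree(string text)
    {
        var queue = new PriorityQueue<HuffmanNode, int>();
        foreach (var group in text.GroupBy(c => c))
            queue.Enqueue(new HuffmanNode { Symbol = group.Key, Frequency = group.Count() },
                          group.Count());

        while (queue.Count > 1)
        {
            var left = queue.Dequeue();
            var right = queue.Dequeue();
            var parent = new HuffmanNode
            {
                Frequency = left.Frequency + right.Frequency,
                Left = left,
                Right = right
            };
            queue.Enqueue(parent, parent.Frequency);
        }
        return queue.Dequeue();
    }

    // Walk the tree to assign codes: left edge adds '0', right edge adds '1'.
    static void BuildCodes(HuffmanNode node, string prefix, Dictionary<char, string> codes)
    {
        if (node.Symbol.HasValue)
        {
            codes[node.Symbol.Value] = prefix.Length > 0 ? prefix : "0";
            return;
        }
        BuildCodes(node.Left, prefix + "0", codes);
        BuildCodes(node.Right, prefix + "1", codes);
    }

    static void Main()
    {
        const string text = "this is an example of huffman encoding";
        var root = BuildTree(text);
        var codes = new Dictionary<char, string>();
        BuildCodes(root, "", codes);

        // Encode: concatenate each character's bit string.
        string encoded = string.Concat(text.Select(c => codes[c]));

        // Decode: walk the tree from the root, emitting a symbol at each leaf.
        var decoded = new StringBuilder();
        var current = root;
        foreach (char bit in encoded)
        {
            current = bit == '0' ? current.Left : current.Right;
            if (current.Symbol.HasValue)
            {
                decoded.Append(current.Symbol.Value);
                current = root;
            }
        }

        Console.WriteLine($"Original : {text.Length * 8} bits");
        Console.WriteLine($"Encoded  : {encoded.Length} bits");
        Console.WriteLine($"Round-trip OK: {decoded.ToString() == text}");
    }
}
```

The greedy merge of the two least-frequent nodes is what makes the resulting codes optimal prefix codes: no symbol's bit string is a prefix of another's, so the encoded stream can be decoded unambiguously.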
