The term data compression refers to reducing the number of bits of information that has to be stored or transmitted. Compression can be done with or without losing information: in the first case only redundant data is removed, so when the data is uncompressed afterwards the information and its quality are identical to the original, whereas in the second case unneeded data is discarded and the quality of the restored content is lower. Different compression algorithms work better for different kinds of information. Compressing and uncompressing data generally takes a lot of processing time, so the server performing the operation needs ample resources to handle the data quickly enough. One example of how information can be compressed is to store how many consecutive positions in the binary code should contain 1 and how many should contain 0, instead of storing the actual 1s and 0s.
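The counting scheme described above is known as run-length encoding. A minimal sketch in Python might look like this; the function names are illustrative and not part of any particular product:

```python
# Run-length encoding sketch: store (bit, count) pairs for each run of
# identical bits instead of storing every individual 1 and 0.

from itertools import groupby


def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Collapse runs of identical bits into (bit, count) pairs."""
    return [(bit, len(list(run))) for bit, run in groupby(bits)]


def rle_decode(pairs: list[tuple[str, int]]) -> str:
    """Expand (bit, count) pairs back into the original bit string."""
    return "".join(bit * count for bit, count in pairs)


bits = "1111100000001111"
encoded = rle_encode(bits)
print(encoded)                       # [('1', 5), ('0', 7), ('1', 4)]
print(rle_decode(encoded) == bits)   # True
```

Instead of sixteen bits, only three pairs are stored, and decoding restores the data exactly, which is what makes this a lossless technique.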

Data Compression in Shared Web Hosting

The compression algorithm employed on the cloud hosting platform where your new shared web hosting account will be created is called LZ4, and it is used by the advanced ZFS file system that powers the platform. LZ4 surpasses the algorithms that other file systems use, as its compression ratio is much higher and it processes data considerably faster. The speed advantage is most noticeable when content is being uncompressed, since this happens faster than data can be read from a hard disk drive. As a result, LZ4 improves the performance of every website hosted on a server that uses the algorithm. We also take advantage of LZ4 in another way: its speed and compression ratio make it possible for us to generate a couple of daily backups of the entire content of all accounts and keep them for one month. Not only do these backups take up less space, but generating them does not slow the servers down, as can often happen with other file systems.
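On ZFS, LZ4 compression is enabled per dataset with a single property. The pool and dataset names below are placeholders, so treat this as a sketch of how such a setup might look rather than the exact configuration of any particular platform:

```shell
# Enable LZ4 compression on a ZFS dataset (pool/dataset names are examples).
zfs set compression=lz4 tank/hosting

# Inspect the compression ratio ZFS reports once data has been written.
zfs get compressratio tank/hosting
```

Because the property applies transparently at the file system level, applications read and write files normally while ZFS compresses and decompresses the blocks on the fly.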