The term data compression refers to reducing the number of bits needed to store or transmit data. Compression can be done with or without loss of information: in the first case only redundant data is removed, so when the data is later decompressed it is identical to the original; in the second case some information is discarded, so the quality of the decompressed data is lower. Different compression algorithms are more effective for different kinds of information. Compressing and decompressing data usually takes considerable processing time, which means the server performing the operation must have sufficient resources to process the data quickly enough. A simple example of how information can be compressed is to store how many consecutive positions in the binary code should contain 1 and how many should contain 0, instead of storing the actual 1s and 0s.
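The "count the consecutive 1s and 0s" idea described above is known as run-length encoding. A minimal sketch of it in Python (the function names are illustrative, not from any particular library):

```python
def rle_encode(bits):
    """Collapse a string of '0'/'1' characters into (bit, count) pairs."""
    runs = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            # Same bit as the previous run: extend that run's count.
            runs[-1] = (bit, runs[-1][1] + 1)
        else:
            # A new bit value starts a new run of length 1.
            runs.append((bit, 1))
    return runs

def rle_decode(runs):
    """Expand (bit, count) pairs back into the original bit string."""
    return "".join(bit * count for bit, count in runs)

data = "1111100000000111"
encoded = rle_encode(data)
print(encoded)  # [('1', 5), ('0', 8), ('1', 3)]
assert rle_decode(encoded) == data  # lossless: the original is fully recovered
```

Storing three (bit, count) pairs instead of sixteen individual bits is exactly the kind of saving a lossless scheme provides, and it works best on data with long uniform runs.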

Data Compression in Hosting

The compression algorithm we use on the cloud web hosting platform where your new hosting account will be created is called LZ4, and it is applied by the advanced ZFS file system that powers the platform. The algorithm outperforms those used by other file systems because its compression ratio is much higher and it processes data much faster. The speed is most noticeable when content is decompressed, as this happens faster than the data can be read from a hard disk. As a result, LZ4 improves the performance of every site hosted on a server that uses the algorithm. We take advantage of LZ4 in another way as well: its speed and compression ratio make it possible for us to generate a number of daily backups of the full content of all accounts and keep them for a month. Not only do these backups take up less space, but their generation also does not slow the servers down, as can often happen with other file systems.
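For readers curious about the mechanics: on ZFS, compression is a per-dataset property, so enabling LZ4 is a one-line administrative setting. A sketch of the relevant commands (the pool and dataset names here are hypothetical examples, not our actual configuration):

```shell
# Enable LZ4 compression on a dataset (pool/dataset names are examples only)
zfs set compression=lz4 tank/webhosting

# Confirm the setting
zfs get compression tank/webhosting

# Inspect the compression ratio ZFS is actually achieving on the dataset
zfs get compressratio tank/webhosting
```

Because the setting applies transparently at the file system level, neither websites nor backup jobs need any changes to benefit from it.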