The term data compression refers to reducing the number of bits of information that has to be stored or transmitted. Compression can be performed with or without losing data: in the lossless case only redundant data is removed, so after decompression the data and its quality are identical to the original, while in the lossy case some information is discarded permanently and the quality will be lower. Different compression algorithms are more efficient for different kinds of data. Compressing and uncompressing data takes a lot of processing time, so the server performing the operation needs adequate resources to process your information quickly enough. One simple example of how information can be compressed is run-length encoding of binary data: instead of storing every individual 1 and 0, you store how many consecutive positions contain a 1 and how many contain a 0.
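The run-length idea described above can be sketched in a few lines of Python. This is a minimal illustration of the principle, not the implementation used by any real compression library:

```python
def rle_encode(bits):
    """Run-length encode a string of '0'/'1' characters into (bit, count) pairs."""
    runs = []
    i = 0
    while i < len(bits):
        j = i
        # Advance j to the end of the current run of identical bits.
        while j < len(bits) and bits[j] == bits[i]:
            j += 1
        runs.append((bits[i], j - i))
        i = j
    return runs

def rle_decode(runs):
    """Reverse the encoding: expand each (bit, count) pair back into bits."""
    return "".join(bit * count for bit, count in runs)
```

For instance, `rle_encode("1111100000011")` produces `[('1', 5), ('0', 6), ('1', 2)]`, which decodes back to the original string. Because nothing is discarded, this is a lossless scheme: it pays off on data with long runs and can actually expand data whose bits alternate frequently.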

Data Compression in Shared Web Hosting

The ZFS file system that runs on our cloud web hosting platform employs a compression algorithm called LZ4. LZ4 is considerably faster than most other algorithms, particularly when compressing and uncompressing non-binary data such as web content. LZ4 can even uncompress data faster than it can be read from a hard disk drive, which improves the overall performance of sites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so very quickly, we are able to generate several backup copies of all the content kept in the shared web hosting accounts on our servers every day. Both your content and its backups take up less space, and since ZFS and LZ4 both work extremely fast, generating the backups does not affect the performance of the servers where your content is stored.
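For context, LZ4 compression on ZFS is controlled by a per-dataset property. The commands below are standard ZFS administration commands; the pool and dataset names are hypothetical examples, not the actual layout of any particular hosting platform:

```shell
# Enable LZ4 compression on a dataset (pool/dataset names are examples)
zfs set compression=lz4 tank/webhosting

# Verify the property and check the compression ratio achieved so far
zfs get compression tank/webhosting
zfs get compressratio tank/webhosting
```

The `compressratio` property reports how much space compression has saved on data already written; changing the `compression` property only affects blocks written after the change.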