Data compression is the process of reducing the number of bits needed to store or transmit information. Compressed data therefore takes up considerably less disk space than the original, so more content can be stored in the same amount of space. There are many compression algorithms and they work in different ways. Some remove only redundant bits, so when the data is uncompressed it is identical to the original and there is no loss of quality; this is known as lossless compression. Others discard bits considered unnecessary, and uncompressing such data results in lower quality than the original; this is lossy compression. Compressing and uncompressing content requires a significant amount of system resources, in particular CPU time, so any web hosting platform that compresses data in real time must have enough processing power to support the feature. A simple example of compression is run-length encoding: a binary sequence such as 111111 can be stored as 6x1, i.e. recording how many consecutive 1s or 0s there are instead of keeping the whole sequence.
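The run-length idea described above can be sketched in a few lines of Python. This is a minimal illustration of the "6x1" example, not a production compressor; the function names and the comma-separated output format are my own choices for the sketch.

```python
def rle_encode(bits: str) -> str:
    """Run-length encode a bit string, e.g. '111111' -> '6x1'."""
    runs = []
    i = 0
    while i < len(bits):
        j = i
        # Advance j to the end of the current run of identical bits.
        while j < len(bits) and bits[j] == bits[i]:
            j += 1
        runs.append(f"{j - i}x{bits[i]}")
        i = j
    return ",".join(runs)


def rle_decode(encoded: str) -> str:
    """Reverse the encoding, e.g. '4x1,4x0' -> '11110000'."""
    return "".join(
        int(count) * bit
        for count, _, bit in (run.partition("x") for run in encoded.split(","))
    )


print(rle_encode("111111"))                    # the example from the text
print(rle_decode(rle_encode("111100001111")))  # round-trip with no loss
```

Because decoding reproduces the input exactly, this is a (very simple) lossless scheme, matching the "no decrease in quality" case mentioned above.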
Data Compression in Shared Hosting
The ZFS file system that runs on our cloud hosting platform uses a compression algorithm called LZ4. LZ4 is considerably faster than most alternative algorithms, particularly for compressing and uncompressing non-binary data such as web content. In fact, LZ4 can uncompress data faster than the data can be read from a hard drive, which improves the performance of websites hosted on our ZFS-based platform. Because the algorithm compresses data well and does so very quickly, we can generate several daily backups of all the content stored in the shared hosting accounts on our servers. Both your content and its backups take up less space, and since both ZFS and LZ4 work very fast, generating the backups does not affect the performance of the web servers on which your content is kept.
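For readers curious how this looks at the file-system level: on a standard ZFS installation, LZ4 compression is enabled per dataset via the `compression` property, and the achieved savings can be inspected with `compressratio`. This is a generic ZFS configuration sketch; the pool and dataset names (`tank/web`) are placeholders, not the actual layout of any hosting platform.

```shell
# Enable LZ4 compression on a dataset (placeholder name tank/web).
# New writes are compressed transparently; existing data is unchanged.
zfs set compression=lz4 tank/web

# Check the active compression setting and the achieved ratio.
zfs get compression,compressratio tank/web
```

Since compression is transparent to applications, websites and scripts running on such a dataset need no changes to benefit from it.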