Data compression is the encoding of information using fewer bits than the original representation, so it needs less disk space to store or less bandwidth to transmit, and more content can fit in the same amount of space. Different compression algorithms work in different ways. Lossless algorithms remove only redundant bits, so the data is restored exactly when it is uncompressed. Lossy algorithms discard additional bits, so uncompressing the data later yields lower quality than the original. Compressing and uncompressing content consumes considerable system resources, in particular CPU processing time, so any web hosting platform that applies compression in real time must have enough processing power to support the feature. A simple example of how information can be compressed is to replace a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the entire sequence.
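The 6x1 example above is the idea behind run-length encoding, one of the simplest lossless schemes. Below is a minimal illustrative sketch in Python; the function names are our own, not part of any particular compression library:

```python
def rle_encode(bits: str) -> list[tuple[int, str]]:
    """Encode a string of symbols as (count, symbol) run pairs."""
    runs = []
    prev = bits[0]
    count = 1
    for b in bits[1:]:
        if b == prev:
            count += 1          # extend the current run
        else:
            runs.append((count, prev))
            prev, count = b, 1  # start a new run
    runs.append((count, prev))
    return runs

def rle_decode(runs: list[tuple[int, str]]) -> str:
    """Expand (count, symbol) pairs back into the original string."""
    return "".join(sym * count for count, sym in runs)

# "111111" collapses to a single run of six 1s: [(6, '1')]
encoded = rle_encode("111111")
assert rle_decode(encoded) == "111111"  # lossless round trip
```

Because no information is discarded, decoding always reproduces the input exactly, which is what distinguishes lossless from lossy compression.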
Data Compression in Shared Hosting
The compression algorithm used by the ZFS file system that runs on our cloud hosting platform is called LZ4. It can improve the performance of any website hosted in a shared hosting account with us: not only does it compress data more effectively than the algorithms employed by other file systems, it also uncompresses data faster than a hard drive can read it. This comes at the cost of significant CPU processing time, which is not a problem for our platform because it uses clusters of powerful servers working together. A further advantage of LZ4 is that it lets us generate backups faster and store them in less disk space, so we keep several daily backups of your files and databases, and generating them does not affect server performance. That way, we can always recover content that you may have deleted by accident.
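For readers who administer their own ZFS systems, LZ4 compression is enabled per dataset with the standard `zfs` tools. This is an illustrative configuration sketch, not the setup of our platform; the dataset name `tank/web` is a placeholder:

```shell
# Enable LZ4 compression on a dataset (dataset name is a placeholder)
zfs set compression=lz4 tank/web

# Verify the property and see the achieved compression ratio
zfs get compression tank/web
zfs get compressratio tank/web
```

Compression applies to newly written blocks, so the `compressratio` property reflects data written after the setting is enabled.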