Data compression is the process of encoding information using fewer bits than the original representation. The compressed data requires less disk space than the original, so more content can be stored in the same amount of space. There are many compression algorithms that work in different ways. With some of them, only redundant bits are removed, so when the information is uncompressed there is no loss of quality (lossless compression). Others discard bits deemed unnecessary, and uncompressing the data later yields lower quality than the original (lossy compression). Compressing and uncompressing content requires a considerable amount of system resources, in particular CPU time, so any web hosting platform that employs real-time compression must have enough processing power to support the feature. A simple example of how data can be compressed is to substitute a binary string such as 111111 with 6x1, i.e. recording how many consecutive 1s or 0s there are instead of storing the entire sequence.
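The 111111 → 6x1 substitution described above is known as run-length encoding. A minimal sketch in Python might look like this (the function names and the "6x1" text format are illustrative choices, not a standard):

```python
def rle_encode(bits: str) -> str:
    """Collapse runs of identical characters into "<count>x<char>" pairs."""
    out = []
    i = 0
    while i < len(bits):
        j = i
        # Advance j to the end of the current run of identical characters.
        while j < len(bits) and bits[j] == bits[i]:
            j += 1
        out.append(f"{j - i}x{bits[i]}")
        i = j
    return ",".join(out)

def rle_decode(encoded: str) -> str:
    """Reverse the encoding: expand each "<count>x<char>" pair back into a run."""
    return "".join(
        char * int(count)
        for count, char in (pair.split("x") for pair in encoded.split(","))
    )

print(rle_encode("111111"))      # 6x1
print(rle_encode("0001111100"))  # 3x0,5x1,2x0
```

Because only redundancy is removed, decoding restores the input exactly, which is what makes this a lossless scheme.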

Data Compression in Cloud Website Hosting

The ZFS file system that runs on our cloud Internet hosting platform uses a compression algorithm named LZ4. It is significantly faster than other widely used algorithms, particularly at compressing and uncompressing non-binary data, i.e. web content. LZ4 can even uncompress data faster than it can be read from a hard disk drive, which improves the overall performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so very quickly, we can generate several backup copies of all the content stored in the cloud website hosting accounts on our servers every day. Both your content and its backups require less space, and since both ZFS and LZ4 work very fast, the backup generation will not affect the performance of the servers where your content is kept.
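To illustrate why text-based web content compresses so well, here is a small sketch. LZ4 bindings for Python are a third-party package, so this uses the standard library's zlib at a fast compression level as a stand-in; the sample HTML string is hypothetical:

```python
import zlib

# Hypothetical repetitive web content: HTML markup contains many repeated
# tags and attributes, which is exactly what LZ-family algorithms exploit.
html = b"<div class='item'><span>Hello</span></div>" * 500

# Level 1 favors speed over ratio, similar in spirit to LZ4's design goal.
compressed = zlib.compress(html, level=1)
restored = zlib.decompress(compressed)

# Lossless: the decompressed data is byte-for-byte identical to the original.
assert restored == html
print(f"original: {len(html)} bytes, compressed: {len(compressed)} bytes")
```

Running this shows the compressed form at a small fraction of the original size, which is why both live content and its daily backups occupy far less disk space on a compressing file system.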