Data compression is the encoding of information using fewer bits than the original representation, so the compressed data requires considerably less disk space to store or less bandwidth to transmit, and more content fits in the same amount of space. There are many compression algorithms that work in different ways. With lossless algorithms, only redundant bits are removed, so when the data is uncompressed there is no decrease in quality; with lossy algorithms, less significant bits are discarded, so uncompressing the data afterward yields lower quality than the original. Compressing and uncompressing content consumes a significant amount of system resources, especially CPU processing time, so any hosting platform that employs compression in real time must have enough power to support that feature. A simple example of how data can be compressed is run-length encoding: a binary sequence such as 111111 is replaced with 6x1, i.e. the number of consecutive 1s or 0s is recorded instead of the whole sequence.
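The 111111-to-6x1 idea above can be sketched in a few lines of Python. This is a toy illustration only; the "NxB" run notation is this example's own convention, not a standard compression format:

```python
# Minimal run-length encoding sketch: store the length of each run of
# identical bits instead of the bits themselves (e.g. "111111" -> "6x1").

def rle_encode(bits: str) -> str:
    """Encode a non-empty string of 0s and 1s as comma-separated "countxbit" runs."""
    runs = []
    count = 1
    for prev, cur in zip(bits, bits[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append(f"{count}x{prev}")
            count = 1
    runs.append(f"{count}x{bits[-1]}")  # flush the final run
    return ",".join(runs)

def rle_decode(encoded: str) -> str:
    """Reverse the encoding: expand each "countxbit" run back into bits."""
    return "".join(bit * int(count)
                   for count, bit in (run.split("x") for run in encoded.split(",")))
```

Because only run lengths are stored and the bits are expanded back exactly, this is a lossless scheme: `rle_decode(rle_encode(s))` always returns the original string.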
Data Compression in Hosting
The ZFS file system that runs on our cloud hosting platform uses a compression algorithm called LZ4. It is considerably faster than most other widely used algorithms, particularly for compressing and uncompressing non-binary data such as web content. LZ4 can even uncompress data quicker than it can be read from a hard disk drive, which improves the performance of sites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so quickly, we can generate several backups of all the content stored in the hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work very fast, generating the backups does not affect the performance of the hosting servers where your content is kept.
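The space savings on repetitive web content can be sketched with Python's standard-library zlib module, used here purely as a stand-in (LZ4 bindings come from the third-party lz4 package, not the standard library, and the exact ratio depends on the data):

```python
import zlib

# Text-like, repetitive data compresses very well; this HTML-ish sample
# stands in for typical web content stored in a hosting account.
page = b"<html><body>" + b"<p>Hello, world!</p>" * 500 + b"</body></html>"

compressed = zlib.compress(page)
restored = zlib.decompress(compressed)

ratio = len(page) / len(compressed)
print(f"original: {len(page)} bytes, "
      f"compressed: {len(compressed)} bytes, ratio: {ratio:.1f}x")

# Lossless: the original content is recovered bit for bit.
assert restored == page
```

The same principle is why compressed backups of mostly textual site content take up far less space than uncompressed copies.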
Data Compression in Semi-dedicated Hosting
The ZFS file system that runs on the cloud platform where your semi-dedicated hosting account will be created uses a powerful compression algorithm called LZ4. It is among the best algorithms available for compressing and uncompressing website content, as its compression ratio is high and it can uncompress data faster than the same data could be read from a hard disk drive in uncompressed form. In this way, LZ4 speeds up any kind of site that runs on a platform where the algorithm is used. This high performance requires plenty of CPU processing time, which is provided by the numerous clusters working together as part of our platform. What's more, LZ4 makes it possible for us to generate several backup copies of your content every day and keep them for one month, as they take up less space than regular backups and are generated much quicker, without loading the servers.