The term data compression refers to reducing the number of bits of information that need to be stored or transmitted. Compression can be lossless or lossy: in the first case only redundant data is removed, so when the data is uncompressed afterwards, the information and its quality are identical to the original; in the second case data deemed unnecessary is discarded as well, so the quality of the uncompressed result is lower. Different compression algorithms are more effective for different kinds of data. Compressing and uncompressing data usually takes a lot of processing time, so the server carrying out the operation must have sufficient resources to process the data quickly enough. A simple example of how information can be compressed is run-length encoding of binary data: instead of storing the actual 1s and 0s, you store only how many consecutive positions contain a 1 and how many contain a 0.
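As an illustration only, the run-length idea described above could be sketched in Python roughly as follows; the function names are hypothetical and are not part of any hosting platform or standard library:

```python
def rle_encode(bits):
    """Run-length encode a string of '0'/'1' characters into (bit, count) pairs."""
    runs = []
    i = 0
    while i < len(bits):
        j = i
        # Walk forward while the same bit repeats.
        while j < len(bits) and bits[j] == bits[i]:
            j += 1
        runs.append((bits[i], j - i))
        i = j
    return runs

def rle_decode(runs):
    """Rebuild the original bit string from (bit, count) pairs."""
    return "".join(bit * count for bit, count in runs)

original = "1111100000000111"
encoded = rle_encode(original)      # [('1', 5), ('0', 8), ('1', 3)]
assert rle_decode(encoded) == original  # lossless: nothing is lost
```

Storing three (bit, count) pairs takes fewer bits than storing the sixteen original positions, which is the whole point of the technique; real-world algorithms are far more sophisticated, but the principle is the same.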
Data Compression in Hosting
The compression algorithm that we employ on the cloud web hosting platform where your new hosting account will be created is called LZ4, and it is used by the leading-edge ZFS file system which powers the platform. The algorithm is superior to the ones other file systems work with, as its compression ratio is considerably higher and it processes data much faster. The speed is most noticeable when content is being uncompressed, since this happens faster than the data could be read from a hard disk drive. As a result, LZ4 improves the performance of every website hosted on a server that uses the algorithm. We take full advantage of LZ4 in an additional way: its speed and compression ratio allow us to make multiple daily backups of the entire content of all accounts and keep them for 30 days. Not only do the backups take up less space, but their generation also does not slow the servers down, as can often happen with other file systems.
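As a rough illustration of what LZ4 compression looks like outside the file system, here is a minimal sketch using the third-party python lz4 bindings; the sample data is an assumption made for the example, and on the hosting platform itself ZFS applies LZ4 transparently, so no such code is needed on your side:

```python
import lz4.frame  # third-party package: pip install lz4

# Highly repetitive data compresses very well with LZ4.
original = b"The quick brown fox jumps over the lazy dog. " * 1000

compressed = lz4.frame.compress(original)
restored = lz4.frame.decompress(compressed)

assert restored == original  # lossless: the data comes back exactly as it was
print(f"original: {len(original)} bytes, compressed: {len(compressed)} bytes")
```

On a ZFS system, enabling this behavior for a dataset is typically a single administrative command such as `zfs set compression=lz4 <pool>/<dataset>`, after which all new writes are compressed and decompressed transparently.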