Data compression is the process of reducing the number of bits needed to store or transmit information. Compressed data occupies less disk space than the original, so more content can fit in the same amount of storage. There are various compression algorithms that work in different ways. With many of them, only redundant bits are removed, so decompressing the data restores it exactly, with no loss of quality. Others discard bits deemed unnecessary, which means the data will be of lower quality than the original once it is decompressed. Compressing and decompressing content consumes a significant amount of system resources, in particular CPU time, so any hosting platform that uses real-time compression needs enough processing power to support the feature. One example of how data can be compressed is to substitute a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the actual sequence.
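
The substitution described above is the idea behind run-length encoding, one of the simplest lossless compression techniques. A minimal sketch in Python (the function names and the comma-separated output format are illustrative choices, not part of any particular standard):

```python
def rle_encode(bits):
    """Run-length encode a bit string, e.g. "111111" -> "6x1"."""
    runs = []
    i = 0
    while i < len(bits):
        # Find the end of the current run of identical bits.
        j = i
        while j < len(bits) and bits[j] == bits[i]:
            j += 1
        runs.append(f"{j - i}x{bits[i]}")
        i = j
    return ",".join(runs)

def rle_decode(encoded):
    """Reverse the encoding: "3x0,5x1" -> "00011111"."""
    return "".join(bit * int(count)
                   for count, bit in (run.split("x")
                                      for run in encoded.split(",")))

print(rle_encode("111111"))       # 6x1
print(rle_encode("0001111100"))   # 3x0,5x1,2x0
print(rle_decode("3x0,5x1,2x0"))  # 0001111100
```

Because decoding recreates the exact original bit string, no information is lost; this is what makes the technique lossless.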