@Carnildo said:
> @death said:
> > LOL. It should be smart enough not to try compressing data that already has high information density. I bet that data is already heavily compressed, i.e. backup dumps that are already zipped, etc. Negative compression ratios in this case are FULLY NATURAL. There is a simple and logical reason why magical compression algorithms (that is, ones where all data fed in is ALWAYS compacted) don't exist.
>
> Even if it's already compressed, a competent compression algorithm will only expand a file by one bit.

There is no lower limit. It could increase the size by a fraction of a bit. Don't expect "a fraction of a bit" to make sense unless you're familiar with compression coding mechanisms.
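To make "a fraction of a bit" concrete: by Shannon's formula, a symbol with probability p carries -log2(p) bits of information, which is rarely a whole number. Entropy coders like arithmetic coding can get arbitrarily close to that cost, so per-symbol costs (and per-file overheads, amortized) need not be whole bits. A minimal sketch of the arithmetic, not tied to any particular codec:

```python
import math

def bits(p: float) -> float:
    """Shannon information content of a symbol with probability p, in bits."""
    return -math.log2(p)

# A very likely symbol costs only a fraction of a bit...
print(f"{bits(0.9):.3f}")   # ~0.152 bits

# ...while an unlikely one costs more than three.
print(f"{bits(0.1):.3f}")   # ~3.322 bits

# Coding 100 symbols (90 likely, 10 unlikely) totals about 46.9 bits.
# Any code forced to spend a whole bit per symbol would need 100.
total = 90 * bits(0.9) + 10 * bits(0.1)
print(f"{total:.1f}")       # ~46.9 bits
```

The same reasoning applies to expansion: a codec that must signal "this block is stored raw" pays for that flag, but with an entropy coder the amortized cost of the flag can be well under one bit per block.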