Data compression

From Uncyclopedia, the content-free encyclopedia
For those without comedic tastes, the so-called experts at Wikipedia think they have an article about Data compression.
A graphic demonstration of a collection of datas.

Data compression began as a field of computing science dedicated to making data smaller while still preserving it, which, as is apparent to the reader, is patently impossible (except through homeopathy, as discussed below). Naturally, if you want to make data smaller, you have to take some out, so that is exactly what file compressors do. Only the most dishonest retailers that offer compressors for insecure download actually claim that all your data will come out the other end.

Conventional methods[edit]

A number of ways of "compressing," or eroding the content of, data, have, in the past, been, through research, discovered. They are all based on the Nobody cares principle, that one should leave out the least important data.

For example:

  • The JPEG algorithm compresses images by deleting all remnants of form and colour.
  • Diebold voting machines "compress" election data by stripping out all voter information and vote counts, to produce:
  • One of the many successful porn download methods "compresses" "Jennifer Lopez hotness" to " ". Burn!



The religious media discovered a more efficient compressor in 1927, based on prayer to God. Our Lord and Saviour packs your phone messages into His divine TAR file, at a shocking 5 bytes in size, then expands it to the righteous, so they may partake of the snitty digital information, and live. This is what Christ died for?


Based on the principle of, "Let like compress like," skilled homeopaths can rebound and tri-drumble bytes that in a "proving" have been shown similar, to produce a gushing fountain of water that drowns everyone in binary. There aren't many homeos around anymore.

Popular compressors[edit]


The most popular secular compressor for a time was OKZIP, whose icon depicted a large zipper. It promised to "zip up" any data you had, as well as wounds. The program quickly fell out of favor among young Internet users after things other than data began getting caught in the zippers of their trousers.



Meanwhile, during the 1990s and 19100s, Macintosh users were off hurly-burly in their own little bubble. When a forward-thinking suit approached Steve Jobs in 1995 to suggest that he create a competitor in the compression market, Jobs, evidently addled and furious, replied, "Stuff it!" The rest is history, and Stuff It! went on the shelves in the near past.

Theoretical fish[edit]

Out-of-touch mathematicians postulated a formula for the bound of compression that is agreed upon by dogma:

Compression constant: B = \frac{\text{geekiness} \cdot \text{spamminess} \cdot \text{gay}}{\text{military} \cdot \text{useful} \cdot \text{porn}}

The future of compression technology is growing as fast as data files are shrinking, which is to say, with a speed proportional to the entropy of the data, which is beyond the scope of this document.

A promising algorithm[edit]

This is how to compress any file to at most 8 bits:

  1. Empty files: no action necessary
  2. One-byte files: no action necessary
  3. Files of two or more bytes must be normalized and compressed

Normalization: append the byte 0x01 to the end of the file. This actually increases the size, but don't worry, the compression step will take care of that.

Compression step (done recursively until one byte is obtained):

  • interpret the whole file contents as one large integer stored in little-endian format, and subtract 1 from it
  • store the resulting integer, again in little-endian format, back to the file

Stop the compression when one byte is obtained.

To decompress the file, apply the decompression step (add 1 to the little-endian integer) exactly as many times as you applied the compression step. Don't forget to remove the trailing 0x01 byte after decompression.
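Taken at face value, the steps above can be sketched in Python. (The `compress`/`decompress` names and the returned pass count are inventions of this sketch; also, the subtract-1-at-a-time loop is collapsed into a single subtraction so it finishes before the heat death of the universe.)

```python
def compress(data: bytes) -> tuple[bytes, int]:
    # Steps 1-2: empty and one-byte files need no action.
    if len(data) <= 1:
        return data, 0
    # Normalization: append 0x01 (temporarily makes the file bigger).
    n = int.from_bytes(data + b"\x01", "little")
    # Compression step: subtract 1 until only one byte (a value <= 0xFF)
    # remains. Collapsed here into one subtraction; the result is always
    # the byte 0xFF, and the secret sauce is the number of passes.
    passes = n - 0xFF
    return b"\xff", passes

def decompress(data: bytes, passes: int) -> bytes:
    if passes == 0:
        return data
    # Decompression step: add 1 back as many times as we subtracted.
    n = int.from_bytes(data, "little") + passes
    out = n.to_bytes((n.bit_length() + 7) // 8, "little")
    # Remove the trailing 0x01 normalization byte.
    return out[:-1]
```

Note that the pass count is as large as the original integer, so the "compressed" output plus its password take rather more room than the input did. This is, of course, the point.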


Advantages: The number of compression passes works as a decent password, without which the data are quite hard to retrieve.