Using the pigeonhole principle, prove that no algorithm can losslessly compress every 1024-bit binary string to a string of at most 1023 bits.
Suppose, for contradiction, that some lossless compression algorithm maps every 1024-bit binary string to a binary string of length at most 1023.

Pigeons: the inputs. There are 2^1024 distinct 1024-bit binary strings.
Holes: the possible outputs, i.e. all binary strings of length 0 through 1023. There are 2^0 + 2^1 + ... + 2^1023 = 2^1024 - 1 of them.

Since there are 2^1024 pigeons but only 2^1024 - 1 holes, by the pigeonhole principle at least two distinct 1024-bit strings must be compressed to the same output string. Decompression cannot recover both originals from that single output, so the algorithm is not lossless. This contradicts our assumption, hence no such algorithm exists.
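The counting step can be checked directly for a small word size. The sketch below (an illustration I am adding, using n = 8 in place of 1024) counts the pigeons and holes and confirms there is exactly one fewer hole than pigeons, which is all the pigeonhole argument needs:

```python
# Pigeonhole counting check for small n (n = 8 stands in for 1024).
# Pigeons: all binary strings of length exactly n.
# Holes: all binary strings of length at most n - 1.

n = 8
pigeons = 2 ** n                        # number of n-bit inputs: 2^n
holes = sum(2 ** k for k in range(n))   # outputs of length 0..n-1: 2^0 + ... + 2^(n-1)

print(pigeons, holes)
assert holes == pigeons - 1  # geometric sum: 2^n - 1 holes, one fewer than pigeons
```

The same identity 2^0 + ... + 2^(n-1) = 2^n - 1 holds for n = 1024, so the shortfall of one hole persists at full size and forces a collision.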