kmguru
Staff member
WASHINGTON -- Physicists do not question the laws of thermodynamics. Chemistry researchers unwaveringly cite Boyle's Law to describe the relationship between gas pressure and volume.
Computer scientists also have their own fundamental laws, perhaps not as well known, but arguably even more solid. One of those laws says a perfect compression mechanism is impossible.
A slightly expanded version of that law says it is mathematically impossible to write a computer program that can compress all files by at least one bit. Sure, it's possible to write a program to compress typical data by far more than one bit -- that assignment is commonly handed to computer science sophomores, and the technique is used in .jpg and .zip files.
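The impossibility claim follows from a simple counting (pigeonhole) argument: there are more files of length n bits than there are files strictly shorter than n bits, so no program can map every n-bit file to a distinct shorter one. A minimal sketch of the count, with n chosen arbitrarily for illustration:

```python
# Pigeonhole argument: a compressor that shrinks EVERY file by at least
# one bit would need a distinct shorter output for each input -- but
# there aren't enough shorter strings to go around.
n = 8
files = 2 ** n                            # distinct n-bit files: 256
shorter = sum(2 ** k for k in range(n))   # strings of length 0..n-1: 255
print(files, shorter)                     # 256 outnumbers 255
```

Since `files > shorter`, at least two inputs would have to collide on the same compressed output, making lossless decompression impossible.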
But those general techniques, while useful, don't work on all files; otherwise, you could repeatedly compress a .zip, .gz or .sit file to nothingness. Put another way, compression techniques can't work on random data that follow no exploitable patterns.
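This is easy to observe with any real compressor. A quick demonstration using Python's standard-library zlib (chosen here purely as a convenient stand-in for the general-purpose techniques the article mentions): random bytes don't shrink, and recompressing compressed output only adds overhead.

```python
import os
import zlib

# Random data has no patterns for the compressor to exploit,
# so the output carries format overhead and comes out larger.
data = os.urandom(4096)
compressed = zlib.compress(data, 9)
print(len(data), len(compressed))  # compressed output exceeds the input

# Applying the compressor recursively, as the perpetual-compression
# claims require, only grows the data further with each pass.
out = data
for _ in range(3):
    out = zlib.compress(out, 9)
print(len(out))
```

Each pass wraps the (already incompressible) bytes in fresh container overhead, which is why "compress the compressed output again" schemes cannot work.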
So when a little-known company named ZeoSync announced last week it had achieved perfect compression -- a breakthrough that would be a bombshell roughly as big as E=mc² -- it was greeted with derision. Its press release was roundly mocked for having more trademarks than a Walt Disney store, not to mention the more serious sin of being devoid of any technical content or evidence of peer review.
A Reuters article was far more credulous, saying in the lead paragraph that "a Florida research startup working with a team of renowned mathematicians said on Monday it had achieved a breakthrough that overcomes the previously known limits of compression used to store and transmit data."
For compression buffs, responding to such assertions ranks somewhere between a teeth-gnashing migraine and a full-contact sport.
The comp.compression FAQ has an entire section devoted to debunking: "From time to time some people claim to have invented a new algorithm for (perfect compression). Such algorithms are claimed to compress random data and to be applicable recursively; that is, applying the compressor to the compressed output of the previous run, possibly multiple times."
Several comp.compression fans have even offered rewards of up to $5,000 for independently verifiable proof of perfect compression. None has ever been claimed.
Perfect compression, or even compression of a few hundred times -- what ZeoSync claims -- would revolutionize the storage, broadband and digital entertainment industries. It would mean that modems would be as fast as DSL, and DSL speeds would be blinding. A 40-GB hard drive would hold terabytes, and so on.
Link: http://www.wired.com/news/print/0,1294,49599,00.html