4 Replies - 1950 Views - Last Post: 21 February 2011 - 02:55 AM

#1 joshuabraham   User is offline

  • New D.I.C Head

Reputation: 2
  • Posts: 42
  • Joined: 18-November 08

how far can compression go

Posted 23 February 2009 - 11:41 AM

I'm just wondering: what factors determine that data cannot be compressed any further when it is compressed recursively?
Is This A Good Question/Topic? 0

Replies To: how far can compression go

#2 Lyle   User is offline

  • New D.I.C Head

Reputation: 0
  • Posts: 12
  • Joined: 23-February 09

Re: how far can compression go

Posted 23 February 2009 - 01:54 PM

I think it's heavily dependent on what you're trying to compress.
Was This Post Helpful? 0

#3 Dr. Fox   User is offline

  • New D.I.C Head

Reputation: 2
  • Posts: 12
  • Joined: 02-March 08

Re: how far can compression go

Posted 25 February 2009 - 11:23 PM

It depends on the type of compression, namely lossy or lossless. Lossy compression algorithms are used for images, video, and music, where some loss of information (i.e. loss of quality) is acceptable and the complete original can never be recovered from the compressed data. I would say the single factor that dictates how far these types of data can be compressed is simply the perception of the end user, and what they find acceptable.
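As a crude illustration of why lossy compression is one-way (a toy sketch in Python, not any real codec): quantizing 8-bit samples throws information away, so decompression can only approximate the original.

```python
# Toy lossy "compression": keep only the high 4 bits of each 8-bit sample.
samples = [3, 200, 145, 78, 255, 12]

quantized = [s // 16 for s in samples]      # 16 levels instead of 256
restored = [q * 16 + 8 for q in quantized]  # best-effort reconstruction

# The restored values are close, but the exact originals are gone for good.
print(samples)
print(restored)
```

Real codecs like JPEG or MP3 are far more sophisticated, but the principle is the same: how aggressively you quantize is bounded only by what the end user will tolerate.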

Lossless algorithms, on the other hand, compress data in a manner such that none of the original data is lost; for example, compressing text documents into a zip file. Statistical redundancy (patterns and repetition) is what decides how far you can go with this. The specific algorithm, e.g. Huffman coding, determines which patterns you exploit for the compression, and that's what decides how far you can go. You can use a combination of algorithms to compress files further than any single one would, and you can use algorithms that choose the optimal compression method for your specific set of data. But put simply, the single factor that sets the lower limit on lossless compression is statistical redundancy, however you choose to go about exploiting it.
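To the original question about recursive compression: a quick sketch (assuming Python's standard zlib module) shows the diminishing returns. The first pass removes the statistical redundancy; the output then looks nearly random, so later passes have nothing left to exploit and actually grow by a few bytes of header overhead.

```python
import zlib

# Highly redundant input: the same sentence repeated many times.
data = b"the quick brown fox jumps over the lazy dog " * 200

sizes = [len(data)]
for _ in range(4):
    data = zlib.compress(data, 9)  # recompress the previous round's output
    sizes.append(len(data))

# First pass shrinks dramatically; later passes stop helping.
print(sizes)
```

Running this, the size drops enormously on the first pass and then creeps back up, which is exactly why recursive compression hits a floor.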
Was This Post Helpful? 1

#4 Guest_Gerald Hume*



Re: how far can compression go

Posted 21 February 2011 - 01:24 AM

Lossless compression keeps on compressing even if it is a dumb one
like statistical-mode compression. Lossless compression doesn't
stop compressing because the quality starts becoming poor.
Was This Post Helpful? 0

#5 Guest_Gerald Hume*



Re: how far can compression go

Posted 21 February 2011 - 02:55 AM

Dr. Fox, on 25 February 2009 - 11:23 PM, said:

It depends on the type of compression, namely lossy or lossless. Lossy compression algorithms are used for images, video, and music, where some loss of information (i.e. loss of quality) is acceptable and the complete original can never be recovered from the compressed data. I would say the single factor that dictates how far these types of data can be compressed is simply the perception of the end user, and what they find acceptable.

Lossless algorithms, on the other hand, compress data in a manner such that none of the original data is lost; for example, compressing text documents into a zip file. Statistical redundancy (patterns and repetition) is what decides how far you can go with this. The specific algorithm, e.g. Huffman coding, determines which patterns you exploit for the compression, and that's what decides how far you can go. You can use a combination of algorithms to compress files further than any single one would, and you can use algorithms that choose the optimal compression method for your specific set of data. But put simply, the single factor that sets the lower limit on lossless compression is statistical redundancy, however you choose to go about exploiting it.

Huffman died because he was wrong. You should be able to apply a procedure
like his repeatedly to a file to make it smaller and smaller, instead of
only once. Prime number compression is probably the best; that is
what they wanted to use with MPEG instead of the lossy method. Those 64 golden,
juicy, delicious MPEG primes, with "getting even" or something, at a lossless
compression ratio of about 48:1.
Was This Post Helpful? 0
