Re: Fwd: Compression, encoding, entropy

From: David Mertz <voting-project_at_gnosis_dot_cx>
Date: Mon May 03 2004 - 15:58:24 CDT

>Let n be the highest number being encoded [DQM: "Vote Space"]
>Let d be the number of bits in a digit
>Let b be the base (2**n)
>Let m be one-half of the base
>The number of digits, g = floor (log_m(n) +1)
>The number of bits is d*g

David Mertz wrote:
|>The term 'g' very quickly converges to 1, i.e. for n >= 3.

Arthur Keller <arthur@kellers.org> wrote:
|g doesn't [converge] to 1. It grows with the log of n.

I don't understand, then. Looking at the prior description above:

  (1) b = 2**n
  (2) m = b/2
  (3) or: m = (2**n)/2
  (4) g = floor(log_m(n)+1)

Python's math.log computes the natural log, so I use the change-of-base
formula log_x(y) = log_z(y)/log_z(x) to get a log in the appropriate base.

  (5) g = floor(log(n)/log((2**n)/2)+1)

So in code, in an interactive session:

>>> from math import log, floor
>>> g = lambda n: floor(log(n)/log((2**n)/2)+1)
>>> g(2)
2.0
>>> g(3)
1.0
>>> g(4)
1.0
>>> g(100)
1.0
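
Incidentally (just an aside on the library, not part of the question):
math.log also accepts an explicit base as a second argument, available
since Python 2.3, so the change-of-base division can be pushed into the
library itself:

```python
from math import log, floor

# Same g as in the session above, but letting math.log do the change
# of base via its optional second (base) argument.
g = lambda n: floor(log(n, (2**n)/2) + 1)

print(g(2))    # 2 (floor returns a float, 2.0, under older Pythons)
print(g(3))    # 1
print(g(100))  # 1
```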

Where did I go wrong in these steps? Do you maybe mean 'log_n(m)'
instead? Or something else?
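
In case it helps, here is the same session with the logs swapped -- my
guess at the 'log_n(m)' reading, so treat both the reading and the
g_alt name as assumptions on my part:

```python
from math import log, floor

# Hypothetical alternative reading: g = floor(log_n(m) + 1), with
# m = (2**n)/2 as before; base and argument of the log are swapped.
g_alt = lambda n: floor(log((2**n)/2) / log(n) + 1)

print(g_alt(2))    # 2
print(g_alt(3))    # 2
print(g_alt(100))  # 15 -- keeps growing with n instead of pinning at 1
```

Under that reading g does keep growing as n grows (roughly like
n/log(n), in fact, which is faster than log(n)) -- so it still doesn't
quite match "grows with the log of n", but at least it doesn't collapse
to 1.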

--
---[ to our friends at TLAs (spread the word) ]--------------------------
Echelon North Korea Nazi cracking spy smuggle Columbia fissionable Stego
White Water strategic Clinton Delta Force militia TEMPEST Libya Mossad
---[ Postmodern Enterprises <mertz@gnosis.cx> ]--------------------------
==================================================================
= The content of this message, with the exception of any external 
= quotations under fair use, is released to the Public Domain     
==================================================================
Received on Mon May 31 23:17:05 2004
