Date:      Wed, 26 Jul 2000 01:33:12 -0700 (PDT)
From:      Kris Kennaway <kris@FreeBSD.org>
To:        "Jeroen C. van Gelderen" <jeroen@vangelderen.org>, markm@freebsd.org
Cc:        arch@freebsd.org
Subject:   Re: Estimating entropy
Message-ID:  <Pine.BSF.4.21.0007260130220.64909-100000@freefall.freebsd.org>
In-Reply-To: <Pine.BSF.4.21.0007252346200.58758-100000@freefall.freebsd.org>

On Tue, 25 Jul 2000, Kris Kennaway wrote:

> 2) Keep a frequency table and calculate or estimate the Shannon entropy
> periodically. This may be feasible if we treat the samples as 8-bit
> sources, as you only have to loop over 256 values and calculate the
> log_2 of each probability (although the lack of FP in the kernel would
> complicate this)
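
To make (2) concrete, here is a rough userland sketch (illustrative
only, and untested; it leans on libm's log2(), which is exactly the
FP the kernel lacks):

#include <math.h>
#include <stdio.h>

/*
 * Estimate the Shannon entropy of a stream of 8-bit samples from a
 * 256-bin frequency table.  Userland sketch: a kernel version would
 * need a fixed-point or table-driven log2.
 */
static double
shannon_entropy(const unsigned long freq[256], unsigned long total)
{
    double h, p;
    int i;

    h = 0.0;
    for (i = 0; i < 256; i++) {
        if (freq[i] == 0)
            continue;
        p = (double)freq[i] / (double)total;
        h -= p * log2(p);       /* accumulate -p * log2(p), in bits */
    }
    return (h);
}

int
main(void)
{
    unsigned long freq[256] = { 0 };
    unsigned long total = 0;
    int c;

    while ((c = getchar()) != EOF) {
        freq[(unsigned char)c]++;
        total++;
    }
    if (total > 0)
        printf("%.4f bits/byte over %lu samples\n",
            shannon_entropy(freq, total), total);
    return (0);
}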

I was thinking about this on the way home, and we can make a big
optimisation here if we assume that the measurements have a Gaussian
distribution (which is probably fairly reasonable for most sources, at
least to a first approximation). In that case we only need to know the
mean and variance of the last n samples (e.g. stored in a circular
buffer), both of which can be updated incrementally without doing a
full pass over all n samples each time, and the entropy then has a
simple closed-form solution.
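
Roughly like this (again an untested userland sketch; the window size
and the struct/function names are made up for illustration):

#include <math.h>

#define WINDOW  512                 /* arbitrary sample window size */

/*
 * Keep the last WINDOW samples in a circular buffer and maintain a
 * running sum and sum-of-squares, so the mean and variance update in
 * O(1) per sample.  The entropy then falls out of the closed form
 * for a Gaussian:
 *
 *      h = 0.5 * log2(2 * pi * e * variance)   bits
 */
struct gstat {
    double  buf[WINDOW];            /* circular buffer of samples */
    int     pos;                    /* next slot to overwrite */
    int     count;                  /* samples seen, capped at WINDOW */
    double  sum;                    /* running sum over the window */
    double  sumsq;                  /* running sum of squares */
};

static void
gstat_update(struct gstat *g, double x)
{
    double old = g->buf[g->pos];

    if (g->count == WINDOW) {       /* evict the oldest sample */
        g->sum -= old;
        g->sumsq -= old * old;
    } else
        g->count++;
    g->buf[g->pos] = x;
    g->pos = (g->pos + 1) % WINDOW;
    g->sum += x;
    g->sumsq += x * x;
}

static double
gstat_entropy(const struct gstat *g)
{
    double mean, var;

    if (g->count < 2)
        return (0.0);
    mean = g->sum / g->count;
    var = g->sumsq / g->count - mean * mean;
    if (var <= 0.0)                 /* constant input: no entropy */
        return (0.0);
    return (0.5 * log2(2.0 * M_PI * M_E * var));
}

Strictly, that h is the differential entropy; for samples quantized
to unit-width bins it approximates the discrete Shannon entropy, and
it should be clamped at zero for very low-variance sources.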

Kris

--
In God we Trust -- all others must submit an X.509 certificate.
    -- Charles Forsythe <forsythe@alum.mit.edu>


