Date:      Mon, 31 Jul 2000 18:57:33 -0400 (EDT)
From:      Brian Dean <bsd@bsdhome.com>
To:        Kris Kennaway <kris@FreeBSD.ORG>
Cc:        "Jeroen C. van Gelderen" <jeroen@vangelderen.org>, markm@FreeBSD.ORG, arch@FreeBSD.ORG
Subject:   Re: Estimating entropy
Message-ID:  <Pine.BSF.4.21.0007311819100.7494-100000@vger.bsdhome.com>
In-Reply-To: <Pine.BSF.4.21.0007252346200.58758-100000@freefall.freebsd.org>

On Tue, 25 Jul 2000, Kris Kennaway wrote:

> I've been looking for some good entropy estimation algorithms which are
> suitable for running "online", i.e. at runtime to estimate the content of
> a set of samples. I haven't had much success so far - the only two things
> I can think of are to keep a pool of the last n samples from a source, and
> then either:

[snip]

> Any thoughts?

I'm not sure if this is quite what you're after or applicable to
your particular application, but have you looked at "correlation
dimension"?  Correlation dimension (a Google search should turn up
some information) is an algorithmic technique that can be used to
"measure" the randomness of a set of data and to differentiate
between deterministic chaos and pure noise (noise being highly
random, whereas deterministic chaos, while it can appear random, is
not).
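
In case it helps, here is a minimal sketch (in C) of the
Grassberger-Procaccia correlation sum that a correlation-dimension
estimate is built on.  The constants (N, EMBED_DIM, DELAY) and names
are just illustrative assumptions; x[] would be filled with the
samples under test:

/*
 * Sketch of the Grassberger-Procaccia correlation sum, the core of a
 * correlation-dimension estimate.  Parameters are illustrative only.
 */
#include <math.h>
#include <stdio.h>

#define N          2000	/* number of samples; 10^4.5..10^5.5 is better */
#define EMBED_DIM  4	/* delay-embedding dimension */
#define DELAY      1	/* delay between embedded coordinates */

static double x[N];	/* raw scalar samples, filled in elsewhere */

/* Fraction of embedded point pairs closer together than r. */
static double
corr_sum(double r)
{
	int i, j, k, npts = N - (EMBED_DIM - 1) * DELAY;
	long nclose = 0, pairs = 0;

	for (i = 0; i < npts; i++) {
		for (j = i + 1; j < npts; j++) {
			double d2 = 0.0;
			for (k = 0; k < EMBED_DIM; k++) {
				double d = x[i + k * DELAY] -
				    x[j + k * DELAY];
				d2 += d * d;
			}
			pairs++;
			if (d2 < r * r)
				nclose++;
		}
	}
	return ((double)nclose / (double)pairs);
}

int
main(void)
{
	double r;

	/* ... fill x[] with the samples to be tested ... */

	/* Print C(r) over a range of radii for the log-log fit. */
	for (r = 0.01; r <= 1.0; r *= 2.0)
		printf("%g %g\n", r, corr_sum(r));
	return (0);
}

The dimension estimate is the slope of log C(r) versus log r in the
small-r scaling region.  For pure noise that slope keeps growing as
you raise the embedding dimension, while for deterministic chaos it
levels off at the dimension of the underlying attractor.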

The only problem with this, keeping in mind your requirements, is that
it can take quite a bit of data to reach confidence in the result.
You can get pretty good results with about 10^4.5 data points, and
much better results with 10^5.5 data points.

I can send you a few references if you think it might be useful.

-Brian
-- 
Brian Dean					bsd@bsdhome.com






