Date:      Thu, 16 Jan 2003 11:33:20 -0300 (ART)
From:      Fernando Gleiser <fgleiser@cactus.fi.uba.ar>
To:        Alex.Wilkinson@dsto.defence.gov.au
Cc:        freebsd-questions@FreeBSD.ORG
Subject:   Re: entropy
Message-ID:  <20030116112515.C8104-100000@cactus.fi.uba.ar>
In-Reply-To: <441y3ehwpb.fsf@be-well.ilk.org>

On 15 Jan 2003, Lowell Gilbert wrote:

> "Wilkinson,Alex" <Alex.Wilkinson@dsto.defence.gov.au> writes:
>
> > Can someone recommend to me where I can read up on
> > entropy.
> >
> > ie what it is ? Why we have it ? etc etc
>
> The term "entropy" is often used (in rough analogy to its technical
> meaning in thermodynamics) in computer systems to describe the
> "amount" of "randomness" available to random-number functionality.

Entropy is the mean information per symbol: for an alphabet whose symbols
appear with probabilities p_1 ... p_n, it is
H = -(p_1*log2(p_1) + ... + p_n*log2(p_n)) bits per symbol. The higher the
entropy, the more random the language is.

It can be shown that the entropy is maximized when all the symbols in
the alphabet have the same probability 1/n, where n is the number of
symbols in the alphabet. In that case each of the n terms contributes
-(1/n)*log2(1/n), so the entropy is log2(n) bits per symbol. Also, in
that case the alphabet is truly random.
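
If it helps to see that in code, here is a quick sketch of the formula
(just my own illustration, nothing FreeBSD-specific): a uniform
distribution over n symbols comes out to exactly log2(n) bits per
symbol, and any skewed distribution comes out lower.

    #!/usr/bin/env python
    # Sketch: Shannon entropy H = -sum(p * log2(p)), in bits per symbol.
    import math

    def entropy(probs):
        """Mean information per symbol, in bits."""
        return -sum(p * math.log(p, 2) for p in probs if p > 0)

    n = 256                        # e.g. an alphabet of all byte values
    uniform = [1.0 / n] * n        # every symbol equally likely
    print(entropy(uniform))        # 8.0, i.e. log2(256)

    skewed = [0.5] + [0.5 / (n - 1)] * (n - 1)
    print(entropy(skewed))         # strictly less than 8.0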

>
> What to read depends on why you need to know about it, but you could
> always refer to some manual pages, particularly rndcontrol(8).

If you want a more solid background, get an introductory book on
information/coding theory. But don't attempt one unless you already know
calculus, probability and algebra.


			Fer






