Date:      Thu, 21 Feb 2002 14:19:26 +1100
From:      Michael Wardle <michael.wardle@adacel.com>
To:        Tom Rhodes <darklogik@pittgoth.com>
Cc:        Wouter Van Hemel <wouter@pair.com>, Giorgos Keramidas <keramida@ceid.upatras.gr>, doc@FreeBSD.ORG
Subject:   Re: inconsistent use of data units
Message-ID:  <3C74673E.8010905@adacel.com>
References:  <3C743707.3080505@adacel.com>	<20020221003116.GA11893@hades.hell.gr>  <3C744D39.1020308@adacel.com> <1014256250.304.66.camel@cocaine> <3C745639.8080509@adacel.com> <3C7463A5.5060204@pittgoth.com>

Tom Rhodes wrote:
> This is confusing me... Let me throw my "vision" in here...
> 
> 1000 is very easy for a human to work with, mainly newbies, but I like 
> the 1024.  The reason I think that is because 1024 is more "realistic" 
> because there are 1024 numbers from 0 to 1023, and 1023 is 10 
> bits in binary: 11 1111 1111, which is a very convenient binary value.  So, 
> whilst 1000 may be a very easy decimal value for a human to work with 
> (1111101000), I don't feel that it looks "nice" for a binary machine.
> 
> Yes, a bit of thought went into this, and I understand that standards 
> are standards, but I am trying to understand this from, what I feel, 
> is a "logical" viewpoint...  Although my view alone

The Système International (SI) (a.k.a. "metric") is not necessarily 
designed to correspond exactly to real-life phenomena.  It is designed 
to be an arbitrary, accurate, unambiguous system.

I feel that "inch" is a far nicer unit than "millimeter" for measuring 
small distances, and "mile" may well be nicer than "kilometer" for 
measuring long distances, but the reality is that more and more 
countries, and more and more disciplines, are choosing to adopt SI for 
clarity.

If computer scientists had wanted to define their own units for 
computing, then so be it.  In adopting the SI prefixes (K, M, ...), 
however, there was an implied decision to make computing units de facto 
SI units.  At first, because sizes were so small, the distinction 
between the kilobyte (1000 bytes) and the binary kilobyte (1024 bytes) 
was fairly unimportant; they were so close that many regarded them as 
the same.  Unfortunately, when you extend the units to larger sizes, the 
factor of error becomes larger, and a distinction between the two must 
be made.  Modifying these prefixes for any one field substantially 
weakens the SI standard.
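
For example, a short Python sketch (my own illustration, using nothing 
beyond the standard powers of 10 and 2) shows how quickly the 
discrepancy grows:

    # Relative error when binary multiples (powers of 1024) are labelled
    # with SI prefixes (powers of 1000): the gap widens at each step.
    prefixes = ["kilo", "mega", "giga", "tera", "peta"]
    for i, name in enumerate(prefixes, start=1):
        si = 1000 ** i        # what the SI prefix actually means
        binary = 1024 ** i    # what computing usage often intends
        error = (binary - si) / si * 100
        print(f"{name}: {error:.1f}% larger than the SI value")

At the kilo level the difference is only about 2.4%, but by tera it is 
already about 10%.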

I am not aware of *any* standard that prescribes
1 kilobyte = 1024 bytes, as it is clearly incorrect.

The *only* official statement on this matter that I am aware of is the 
one the IEEE, IEC, and CIPM were involved in, which clearly states:
1000 bytes = 1 kilobyte (symbol "kB")
1024 bytes = 1 kibibyte (symbol "KiB")
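
A tiny sketch of what that distinction looks like in practice (the 
helper name here is my own, purely illustrative):

    # Report the same byte count under both conventions.
    def human_readable(nbytes):
        kb  = nbytes / 1000    # SI kilobyte (kB)
        kib = nbytes / 1024    # IEC kibibyte (KiB)
        return f"{nbytes} bytes = {kb:g} kB = {kib:g} KiB"

    print(human_readable(65536))   # 65536 bytes = 65.536 kB = 64 KiB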

By continuing the current practice (which I must say is far from 
uniform), we are perpetuating inaccuracy and ambiguity.

-- 
MICHAEL WARDLE                |  WORK   +61-2-6024-2699
SGI Desktop & Admin Software  |  MOBILE +61-415-439-838
Adacel Technologies Limited   |  WEB    http://www.adacel.com/





