Re: [Haskell-cafe] Endianess

2008-05-14 Thread Andrew Coppin

Brandon S. Allbery KF8NH wrote:


On 2008 May 13, at 17:12, Andrew Coppin wrote:


[Oh GOD I hope I didn't just start a Holy War...]



Er, I'd say it's already well in progress.  :/



Oh dear.

Apologies to everybody who doesn't actually _care_ about which endian 
mode their computer uses...


___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Endianess

2008-05-13 Thread Ketil Malde
Aaron Denney [EMAIL PROTECTED] writes:

 I used to be a big-endian advocate, on the principle that it doesn't
 really matter, and it was standard network byte order.  Now I'm
 convinced that little endian is the way to go

I guess it depends a lot on what you grew up with.  The names
(little/big endian) are incredibly apt.

The only argument I can come up with is that big endian seems to make
more sense for 'od':

  % echo foobar > foo
  % od -x foo
  0000000 6f66 626f 7261 000a
  0000007

Since this is little endian, the output corresponds to "of" "bo" "ra"
"\0\n".

So I guess the argument is that for big-endian, the concatenation of
hex numbers is invariant with respect to word sizes?
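The word-size point can be made concrete in Haskell. A little sketch (the names `leBytes`/`beBytes` are my own, not from the thread): with little-endian order, byte n carries value 256^n, and the big-endian rendering is simply the reverse.

```haskell
import Data.Bits (shiftR)
import Data.Word (Word32, Word8)
import Numeric (showHex)

-- Little-endian byte order: byte n has value 256^n.
-- fromIntegral truncates Word32 -> Word8, keeping the low byte.
leBytes :: Word32 -> [Word8]
leBytes w = [fromIntegral (w `shiftR` (8 * n)) | n <- [0 .. 3]]

-- Big-endian is just the little-endian sequence reversed.
beBytes :: Word32 -> [Word8]
beBytes = reverse . leBytes

main :: IO ()
main = do
  let w = 0x12345678 :: Word32
  putStrLn $ unwords (map (`showHex` "") (beBytes w))  -- 12 34 56 78
  putStrLn $ unwords (map (`showHex` "") (leBytes w))  -- 78 56 34 12
```

The big-endian line reads the same however you group the bytes into words, which is exactly the 'od' observation above.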

-k
-- 
If I haven't seen further, it is by standing in the footprints of giants


Re: [Haskell-cafe] Endianess

2008-05-13 Thread Jed Brown
On Tue 2008-05-13 20:46, Ketil Malde wrote:
 Aaron Denney [EMAIL PROTECTED] writes:
 
 I guess it depends a lot on what you grew up with.  The names
 (little/big endian) are incredibly apt.
 
 The only argument I can come up with is that big endian seems to make
 more sense for 'od':
 
   % echo foobar > foo
   % od -x foo
   0000000 6f66 626f 7261 000a
   0000007

This, of course, is because `od -x' regards the input as 16-bit integers.  We
can get saner output if we regard it as 8-bit integers.

  $ od -t x1 foo
  0000000 66 6f 6f 62 61 72 0a
  0000007

  Now I'm convinced that little endian is the way to go, as bit number n
  should have value 2^n, byte number n should have value 256^n, and so forth.

It's not that simple with bits.  They lack consistency just like the usual US
date format and the way Germans read numbers.
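For what it's worth, GHC's Data.Bits already commits to the little-endian bit numbering Aaron describes: `testBit` and `bit` count from the least significant end, so bit n has value 2^n regardless of the machine's byte order. A small illustration (my own, not from the thread):

```haskell
import Data.Bits (bit, testBit)
import Data.Word (Word8)

main :: IO ()
main = do
  let b = 0x0a :: Word8                    -- binary 00001010
  -- Which bit positions are set, counting from the LSB as bit 0?
  print [n | n <- [0 .. 7], testBit b n]   -- [1,3]
  -- bit n constructs the value 2^n.
  print (bit 3 :: Word8)                   -- 8
```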

Jed




Re: [Haskell-cafe] Endianess

2008-05-13 Thread Ketil Malde
Jed Brown [EMAIL PROTECTED] writes:

 This, of course, is because `od -x' regards the input as 16-bit integers.  We
 can get saner output if we regard it as 8-bit integers.

Yes, of course. The point was that for big-endian, the word size
won't matter.  Little-endian words will be reversed with respect to
the normal (left-to-right, most significant first) way we print
numbers.

-k
-- 
If I haven't seen further, it is by standing in the footprints of giants


Re: [Haskell-cafe] Endianess

2008-05-13 Thread Andrew Coppin

Aaron Denney wrote:

On 2008-05-12, Andrew Coppin [EMAIL PROTECTED] wrote:

(Stupid little-endian nonsense... mutter mutter...)



I used to be a big-endian advocate, on the principle that it doesn't
really matter, and it was standard network byte order.  Now I'm
convinced that little endian is the way to go, as bit number n should
have value 2^n, byte number n should have value 256^n, and so forth.

Yes, in human to human communication there is value in having the most
significant bit first.  Not really true for computer-to-computer
communication.


It just annoys me that the number 0x12345678 has to be transmuted into 
0x78563412 just because Intel says so. Why make everything so complicated?
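The transmutation Andrew describes is easy to write down explicitly. A sketch of a manual 32-bit byte swap (later versions of base also ship a `byteSwap32` in Data.Word, which didn't exist when this thread was written):

```haskell
import Data.Bits (shiftL, shiftR, (.&.), (.|.))
import Data.Word (Word32)
import Numeric (showHex)

-- Reverse the four bytes of a Word32: 0x12345678 <-> 0x78563412.
swap32 :: Word32 -> Word32
swap32 w =   (w `shiftR` 24)                   -- top byte to the bottom
         .|. ((w `shiftR` 8) .&. 0x0000ff00)   -- byte 2 to byte 1
         .|. ((w `shiftL` 8) .&. 0x00ff0000)   -- byte 1 to byte 2
         .|. (w `shiftL` 24)                   -- bottom byte to the top

main :: IO ()
main = putStrLn $ showHex (swap32 0x12345678) ""  -- 78563412
```

Swapping twice is the identity, which is why the conversion is merely annoying rather than lossy.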


[Oh GOD I hope I didn't just start a Holy War...]



Re: [Haskell-cafe] Endianess (was Re: GHC predictability)

2008-05-13 Thread Lennart Augustsson
Also, the way we write numbers is little endian when writing in
Arabic; we just forgot to reverse the digits when we borrowed the
notation.

Little endian is more logical unless you also number your bits with
MSB as bit 0.

On Tue, May 13, 2008 at 7:35 PM, Aaron Denney [EMAIL PROTECTED] wrote:
 On 2008-05-12, Andrew Coppin [EMAIL PROTECTED] wrote:
 (Stupid little-endian nonsense... mutter mutter...)

 I used to be a big-endian advocate, on the principle that it doesn't
 really matter, and it was standard network byte order.  Now I'm
 convinced that little endian is the way to go, as bit number n should
 have value 2^n, byte number n should have value 256^n, and so forth.

 Yes, in human to human communication there is value in having the most
 significant bit first.  Not really true for computer-to-computer
 communication.

 --
 Aaron Denney
 --



Re: [Haskell-cafe] Endianess

2008-05-13 Thread Brandon S. Allbery KF8NH


On 2008 May 13, at 17:12, Andrew Coppin wrote:


[Oh GOD I hope I didn't just start a Holy War...]



Er, I'd say it's already well in progress.  :/

--
brandon s. allbery [solaris,freebsd,perl,pugs,haskell] [EMAIL PROTECTED]
system administrator [openafs,heimdal,too many hats] [EMAIL PROTECTED]
electrical and computer engineering, carnegie mellon university    KF8NH

