> If an implementation "uses" 8 bits for ASCII text (as opposed to
> hardware storage which is never less than 8 bits for a single C char,
> AFAIK), then it is not a valid ASCII implementation, i.e. does not
> interpret ASCII according to its definition. The whole point of
> specifying a format as 7 bits is that the 8th bit is ignored, or
> perhaps used in an implementation-defined manner, regardless of whether
> the 8th bit in a char is available or not.
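
Right -- in C terms, a conforming ASCII reader just ignores the top bit, e.g.
by masking it off.  A minimal sketch (the helper name is mine, not from any
standard):

    #include <stdio.h>

    /* Interpret a raw byte as 7-bit ASCII by ignoring the 8th bit. */
    static unsigned char ascii7(unsigned char c)
    {
        return c & 0x7F;  /* mask off the high bit */
    }

    int main(void)
    {
        unsigned char raw = 0xC1;         /* 'A' (0x41) with the 8th bit set */
        printf("%c\n", ascii7(raw));      /* prints: A */
        return 0;
    }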

ASCII was designed back in the days of low-reliability serial communications --
you know, back when data was sent as 7 data bits + 1 parity bit + 2 stop bits,
the extra bits being there to catch transmission errors.  A "byte" was also 9
bits: 8 data bits plus a parity bit.
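
For illustration, here's roughly how the parity bit in that 7+1 framing would
be computed and checked -- a sketch in C assuming even parity, with function
names of my own invention:

    /* Frame 7 data bits plus an even-parity bit in bit 7, the way an
       old 7E1 serial link would. */
    static unsigned char frame_7e1(unsigned char data)
    {
        unsigned char c = data & 0x7F;  /* keep only the 7 data bits */
        unsigned char p = c;
        p ^= p >> 4;                    /* fold the bits together so that  */
        p ^= p >> 2;                    /* bit 0 ends up holding the XOR   */
        p ^= p >> 1;                    /* of all seven data bits          */
        return c | ((p & 1u) << 7);     /* set bit 7 so the 1-count is even */
    }

    /* Receiver's check: the XOR of all 8 bits must be 0 for even parity. */
    static int parity_ok(unsigned char framed)
    {
        unsigned char p = framed;
        p ^= p >> 4;
        p ^= p >> 2;
        p ^= p >> 1;
        return (p & 1) == 0;
    }

Flip any single bit in transit and parity_ok() returns 0 -- exactly the
detection we no longer get, per the next paragraph.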

Nowadays we use 8 bits for data with no parity, no error correction, and no
timing bits.  Because when things screw up we want them to REALLY screw up ...
and remain undetectable.
