yeah cool. thanks heaps for the info adam. here's my real question: if i'm
storing standard utf-8, where a character can take up to 4 bytes, is
varchar2(2000) with 'char semantics' larger than varchar2(2000) in bytes?
(since 4 * 2000 is like, way bigger than 2000 bytes...)

or am i just confused, way off base and a little bit tooo drunk to be asking
questions like this?
G
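
For the record (and sobriety aside), UTF-8 never uses 8 bytes per character: it's a variable-width encoding of 1 to 4 bytes. A quick sketch in Python, just because it's handy for poking at encodings:

```python
# UTF-8 is variable-width: each character takes 1 to 4 bytes.
samples = {
    "a": 1,    # plain ASCII
    "é": 2,    # Latin letter with accent
    "€": 3,    # Basic Multilingual Plane symbol
    "😀": 4,   # supplementary plane (emoji)
}
for ch, expected in samples.items():
    nbytes = len(ch.encode("utf-8"))
    print(f"U+{ord(ch):04X} -> {nbytes} byte(s)")
    assert nbytes == expected
```

So with char semantics, varchar2(2000) can in the worst case need 4 * 2000 = 8000 bytes of storage, versus a hard 2000 bytes with byte semantics. If I remember the Oracle docs right, older releases still cap a VARCHAR2 column at 4000 bytes either way, so char semantics doesn't buy you the full worst case.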


On 12/6/06, Adam Cameron <[EMAIL PROTECTED]> wrote:
>
>
> A byte is a standard length - almost always eight bits - which can
> store 256 possible values.  I suspect that's the part you already knew
> ;-)
>
> A character is often one byte in size (ASCII, for example... well:
> strictly speaking that's only seven bits, but you get my drift); but
> depending on the encoding scheme in use, could be two, three or more
> bytes long.  In Unicode's UTF naming standard, the number at the end
> (UTF-8, UTF-16, etc) denotes how many BITS each CODE UNIT uses; a
> single character may span several code units (up to four bytes in
> UTF-8).
>
> Make sense?
>
> --
> Adam
>
>
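
One nitpick on the quoted explanation, sketched in Python: the number in UTF-8/UTF-16 is the size of a code unit in bits, not of a character, and one character can span several code units:

```python
# The "8" in UTF-8 and "16" in UTF-16 are CODE UNIT sizes in bits,
# not character sizes. One character can span several code units.
euro = "€"                                    # one character
assert len(euro.encode("utf-8")) == 3         # three 8-bit code units
assert len(euro.encode("utf-16-le")) == 2     # one 16-bit code unit
emoji = "😀"                                  # one character, outside the BMP
assert len(emoji.encode("utf-8")) == 4        # four 8-bit code units
assert len(emoji.encode("utf-16-le")) == 4    # two 16-bit units (surrogate pair)
```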


--~--~---------~--~----~------------~-------~--~----~
 You received this message because you are subscribed to the Google Groups 
"cfaussie" group.
To post to this group, send email to cfaussie@googlegroups.com
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at 
http://groups.google.com/group/cfaussie?hl=en
-~----------~----~----~----~------~----~------~--~---