On Sun, May 13, 2012 at 1:18 PM, Simon Slavin <[email protected]> wrote:

>
> On 13 May 2012, at 4:49pm, Roger Binns <[email protected]> wrote:
>
> > You should be accessing things via SQL and the C API.  In that case the
> > encoding in the database is not relevant as the strings have their
> > encoding converted as appropriate.
> >
> > sqlite3_column_bytes and sqlite3_column_bytes16 tell you the length of
> the
> > utf8/16 string in bytes while sqlite3_column_text and
> > sqlite3_column_text16 get you the utf8/16 data.
> >
> >  http://www.sqlite.org/c3ref/column_blob.html
>
> Rather than this shilly-shallying, I'll just tell you that if you
> want to know how many Unicode characters there are in a SQLite string,
> you'll have to use a Unicode library that isn't part of SQLite.  SQLite
> just stores Unicode strings.  It doesn't understand them.
>

SQLite understands Unicode well enough to tell you how many characters
there are in a string.

   SQLite version 3.7.12 2012-05-12 18:29:53
   NDSeV devkit 1.1.0 2012-05-12 22:34:14 d995e28ccc6299a2
   Enter ".help" for instructions
   Enter SQL statements terminated with a ";"
   sqlite> select length('Gödel'), length(CAST('Gödel' AS blob));
   5|6

SQLite does not know how to convert Unicode to upper or lower case because
case conversion is locale dependent.  But the number of characters in a
string is not locale dependent, nor does it require massive tables, so
SQLite does that just fine.
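For anyone who wants to reproduce the shell session above programmatically, here is a
small sketch using Python's standard-library sqlite3 module (not something from the
original message, just an illustration): length() on a TEXT value counts Unicode
characters, while length() on the same value CAST to a BLOB counts UTF-8 bytes.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    # length() on TEXT counts characters; on a BLOB it counts bytes.
    # 'Gödel' is 5 characters but 6 bytes in UTF-8 (ö encodes as 2 bytes).
    chars, nbytes = conn.execute(
        "SELECT length('Gödel'), length(CAST('Gödel' AS blob))"
    ).fetchone()
    print(chars, nbytes)  # 5 6

The same distinction shows up in the C API: sqlite3_column_bytes() reports the
UTF-8 byte count, which for non-ASCII text will exceed the character count that
length() reports.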



>
> Simon.
> _______________________________________________
> sqlite-users mailing list
> [email protected]
> http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
>



-- 
D. Richard Hipp
[email protected]