A quick insight I gleaned from this list some time ago: if you only need to
be able to search and sort, blobs can be used.

If you can live with a fixed size (for instance 256 bits), then just use the
fixed-size big-endian representation as the blob content.
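A minimal sketch of the fixed-size variant in Python (the function name and the 256-bit width are just illustrative choices): padding every value to the same big-endian width means SQLite's default byte-wise BLOB comparison orders the values numerically, for non-negative integers.

```python
def to_fixed_blob(n: int, size: int = 32) -> bytes:
    """Encode a non-negative integer as a fixed-size (here 256-bit)
    big-endian blob; byte-wise comparison then matches numeric order."""
    return n.to_bytes(size, byteorder="big")

print(to_fixed_blob(123456).hex())
```

Note that negative values would need an extra trick (e.g. an offset or sign byte) to keep the sort order correct.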

For variable-length big integers, encode them as variable-length
big-endian, then prepend the byte length encoded as a fixed-size
big-endian integer (for instance a dword), so encoding 123456 would give
the blob x'0000000301E240'.  (Of course, if your big integers are never
going to grow too large, you could use a word or a byte to encode the
length.)
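The same scheme sketched in Python (function name is just illustrative): a minimal big-endian body prefixed with its byte length as a 4-byte dword. Because the length is compared first, larger (longer) non-negative integers still sort after smaller ones.

```python
def to_var_blob(n: int) -> bytes:
    """Encode a non-negative integer as a 4-byte big-endian length
    prefix followed by the minimal big-endian representation."""
    body = n.to_bytes((n.bit_length() + 7) // 8, byteorder="big")
    return len(body).to_bytes(4, byteorder="big") + body

print(to_var_blob(123456).hex())  # 0000000301e240
```

This reproduces the x'0000000301E240' example from the mail: 123456 is 0x01E240, three bytes, so the prefix is 00000003.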

SQLite will not be able to do arithmetic on those or convert them to
strings or anything, but you can add custom functions to handle that
format should you need them.
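For instance, a custom function can decode the blob back to decimal text. A sketch using Python's stdlib sqlite3 module (the function name bigint_text is hypothetical; from C you would register it with sqlite3_create_function instead):

```python
import sqlite3

def bigint_to_text(blob: bytes) -> str:
    # Skip the 4-byte length prefix, interpret the rest as big-endian.
    return str(int.from_bytes(blob[4:], byteorder="big"))

conn = sqlite3.connect(":memory:")
conn.create_function("bigint_text", 1, bigint_to_text)
row = conn.execute("SELECT bigint_text(x'0000000301E240')").fetchone()
print(row[0])  # 123456
```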

Eric


On Thu, May 3, 2018 at 1:54 PM, Simon Slavin <slav...@bigfraud.org> wrote:

> On 2 May 2018, at 6:08pm, Thomas Kurz <sqlite-users@mailinglists.
> sqlite.org> wrote:
>
> > Are there any plans for supporting a true BigInt/HugeInt data type (i.e.
> without any length restriction) in the near future?
>
> The largest integer storable as an integer is currently 2^63-1, which is
> the same range as a signed BigInt in many libraries.  In other words,
> SQLite already does BigInt, just the same as SQL Server, MySQL, Postgres
> and DB2.
>
> I have not seen any plans for anything bigger than that.
>
> Simon.
> _______________________________________________
> sqlite-users mailing list
> sqlite-users@mailinglists.sqlite.org
> http://mailinglists.sqlite.org/cgi-bin/mailman/listinfo/sqlite-users
>
