A compression program like gzip is not a "library"; it is a free-standing,
open-source program. It has no secrets.
vlema...@ausy.org wrote:
> Hello, thank you and others for your answers.
>
> We are not allowed to use external libraries because of industrial
> certification constraints. We can
Hello!
In a message of Tuesday 20 January 2009 12:24:41, vlema...@ausy.org wrote:
> For now we are just wondering how to use SQLite facilities, and if they are
> not sufficient, maybe we would consider developing a tiny
> compression algorithm ourselves, or not... There is no re
Hello, thank you and others for your answers.
We are not allowed to use external libraries because of industrial
certification constraints. We can use SQLite because we cannot do without a
database, but it is a big job to validate it according to those
constraints, so we have to reduce as much as
Hello!
In a message of Monday 19 January 2009 20:22:33, vlema...@ausy.org wrote:
> It is a requirement that the size of those copies be as small as
> possible, without having to perform external compression.
You can use internal data compression. By compressing a few big fields you
ca
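The truncated suggestion above, compressing a few big fields before storing them, might look like this minimal sketch using only the Python standard library (the table and column names here are made up for illustration):

```python
import sqlite3
import zlib

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs (id INTEGER PRIMARY KEY, body BLOB)")

# A large, repetitive field compresses very well with zlib.
text = "some very repetitive log text " * 1000
compressed = zlib.compress(text.encode("utf-8"))

# Store the compressed bytes as a BLOB instead of the raw text.
conn.execute("INSERT INTO docs (id, body) VALUES (?, ?)", (1, compressed))

# Reading it back is the inverse operation.
row = conn.execute("SELECT body FROM docs WHERE id = 1").fetchone()
restored = zlib.decompress(row[0]).decode("utf-8")

print(len(text.encode("utf-8")), "->", len(compressed), "bytes")
```

The trade-off is that the compressed column can no longer be searched or indexed with plain SQL, so this only suits fields that are written and read whole.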
On Mon, Jan 19, 2009 at 06:22:33PM +0100, vlema...@ausy.org wrote:
> Hello,
>
> We need to produce copies of our databases for archive.
> It is a requirement that the size of those copies be as small as
> possible, without having to perform external compression.
> vacuum doesn't seem to perf
Just use something like gzip to make a compressed version of the
database for storage. You would most likely save up to 80% of the
space; .gz files are an industry standard for compression.
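A minimal sketch of that suggestion, using only the Python standard library (the file names app.db and app.db.gz are made up; close the database connection first so all pages are flushed to disk):

```python
import gzip
import shutil
import sqlite3

# Create a small throwaway database so the example is self-contained.
conn = sqlite3.connect("app.db")
conn.execute("CREATE TABLE IF NOT EXISTS t (x TEXT)")
conn.executemany("INSERT INTO t VALUES (?)", [("row",)] * 1000)
conn.commit()
conn.close()

# Archive: stream the database file into a .gz copy.
with open("app.db", "rb") as src, gzip.open("app.db.gz", "wb") as dst:
    shutil.copyfileobj(src, dst)

# Restore: the reverse stream produces a byte-identical database.
with gzip.open("app.db.gz", "rb") as src, open("restored.db", "wb") as dst:
    shutil.copyfileobj(src, dst)
```

Because gzip round-trips the file byte for byte, the restored copy opens exactly like the original.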
vlema...@ausy.org wrote:
> Hello,
>
> We need to produce copies of our databases for archive.
> It i
> We need to produce copies of our databases for archive.
> It is a requirement that the size of those copies be as small as
> possible, without having to perform external compression.
> vacuum doesn't seem to perform a compression (it works on fragmented
> data), is there any other way to do
Hello,
We need to produce copies of our databases for archive.
It is a requirement that the size of those copies be as small as
possible, without having to perform external compression.
vacuum doesn't seem to perform a compression (it works on fragmented
data), is there any other way to do t
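The VACUUM behavior described above can be checked with a short script (Python standard library only; the file name demo.db is made up): VACUUM rebuilds the file and returns free pages left behind by a big DELETE, but it does not compress the row data itself.

```python
import os
import sqlite3

db = "demo.db"
if os.path.exists(db):
    os.remove(db)

conn = sqlite3.connect(db)
conn.execute("CREATE TABLE t (x TEXT)")
conn.executemany("INSERT INTO t VALUES (?)", [("padding " * 100,)] * 2000)
conn.commit()
size_full = os.path.getsize(db)

# Deleting rows leaves free pages inside the file; its size does not shrink.
conn.execute("DELETE FROM t")
conn.commit()
size_after_delete = os.path.getsize(db)

# VACUUM rebuilds the database, dropping the free pages.
conn.execute("VACUUM")
size_after_vacuum = os.path.getsize(db)
conn.close()

print(size_full, size_after_delete, size_after_vacuum)
```

So VACUUM helps when a database contains dead space from deletes and updates, but an archive of live data still needs a real compressor on top.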