> -Original Message-
> From: Christos Vasilakis [mailto:[EMAIL PROTECTED]
> Sent: Friday, August 29, 2008 12:46 AM
> To: Derby Discussion
> Subject: Re: RE : Maximum amount of data suitable for Derby
>
> Benoît Chaluleau wrote:
> > Now there's a challenge! To be the first to crash Derby through sheer data volume, and not because of disk space or the environment...
Hi Dmitri,
If you are bulk-loading this data, then the following may help:
1) You can use JDBC statement batching. Please see the javadoc for
Statement.addBatch(), PreparedStatement.addBatch() and
Statement.executeBatch() as well as this link:
http://java.sun.com/j2se/1.3/docs/guide/jdbc/spec2
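
A rough sketch of how that batching advice could be applied to a bulk load follows; the database name, the MEASUREMENTS table, its columns, and the batch size are placeholders for illustration, not taken from the original posts.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.Statement;

public class BulkLoad {
    public static void main(String[] args) throws SQLException {
        // Embedded Derby connection; the database name is made up for this sketch.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:derby:sampleDB;create=true")) {
            conn.setAutoCommit(false); // commit in chunks, not per row

            try (Statement s = conn.createStatement()) {
                // Hypothetical table used only for this example.
                s.executeUpdate("CREATE TABLE MEASUREMENTS (TS BIGINT, VALUE DOUBLE)");
            }
            conn.commit();

            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO MEASUREMENTS (TS, VALUE) VALUES (?, ?)")) {
                final int batchSize = 1000;       // tune for your row size
                for (int i = 1; i <= 100000; i++) {
                    ps.setLong(1, System.currentTimeMillis());
                    ps.setDouble(2, Math.random());
                    ps.addBatch();                // queue the row
                    if (i % batchSize == 0) {
                        ps.executeBatch();        // run the queued INSERTs together
                        conn.commit();
                    }
                }
                ps.executeBatch();                // flush any remainder
                conn.commit();
            }
        }
    }
}

Committing once per batch rather than once per row keeps the transaction log manageable and avoids a log flush for every single INSERT.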
Hello!
The main problem (at the moment) is inserting this data.
What can I do to improve the speed at which the data is inserted,
apart from using prepared statements instead of normal ones and
setting "durability=test"?
Thanks in advance
Dmitri Pissarenko
--
http://www.xing.com/profile/Dmitri_Pissarenko
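
A minimal sketch of how the two settings mentioned above are typically applied; the database name and the surrounding flow are assumptions, and derby.system.durability=test can also be placed in derby.properties instead of being set as a JVM property.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class FastInsertSettings {
    public static void main(String[] args) throws SQLException {
        // Relaxes Derby's log/data syncing. Data can be lost or corrupted if the
        // JVM crashes, so use it only for tests or re-creatable bulk loads.
        // It must be set before the Derby engine boots.
        System.setProperty("derby.system.durability", "test");

        try (Connection conn = DriverManager.getConnection(
                "jdbc:derby:sampleDB;create=true")) {
            // Committing every few thousand rows instead of once per INSERT
            // avoids a log flush per statement, which is usually the biggest win.
            conn.setAutoCommit(false);
            // ... batched PreparedStatement inserts go here (see sketch above) ...
            conn.commit();
        }
    }
}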
Derby has problems when you do simultaneous INSERTs and SELECTs on the
same table while that table is growing fast. See
https://issues.apache.org/jira/browse/DERBY-2991
for details. If this is the case for your application, make sure
that no SELECTs run while you are INSERTing data.
On Friday, 29 A
Narayanan <[EMAIL PROTECTED]> writes:
> Not sure about a document on optimizing for large data, but this link
> points to the performance tuning manual:
>
> http://db.apache.org/derby/docs/dev/tuning/
You may also find some of the papers and presentations on this page
interesting: http://db.
Benoît Chaluleau wrote:
Now there's a challenge! To be the first to crash Derby through sheer data
volume, and not because of disk space or the environment...
I'm sure it can be done.
Not so sure Airbus would fund the study, though...
-Original Message-
From: [EMAIL PROTECTED] [
Not sure about a document on optimizing for large data, but this link
points to the performance tuning manual:
http://db.apache.org/derby/docs/dev/tuning/
Narayanan
Dmitri Pissarenko wrote:
Hello!
Many thanks for your answer, John!
Could you tell me what documents explain how to use
(
Hello!
Many thanks for your answer, John!
Could you tell me what documents explain how to use
(performance-optimize) Derby with large amounts of data?
Thanks in advance
Dmitri Pissarenko
--
http://www.xing.com/profile/Dmitri_Pissarenko
Now there's a challenge! To be the first to crash Derby through sheer data
volume, and not because of disk space or the environment...
I'm sure it can be done.
Not so sure Airbus would fund the study, though...
-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
Dmitri Pissarenko wrote:
> Hello!
>
> I have an application which has to store the following amounts of data:
>
> 1) There are approx. 20 MB of data per day.
> 2) The database should be able to store at least one week of data,
> i.e. 7 * 20 = 140 MB, and at most a quarter's worth, i.e. 30 * 20 * 3
Hello!
I have an application which has to store the following amounts of data:
1) There are approx. 20 MB of data per day.
2) The database should be able to store at least one week of data,
i.e. 7 * 20 = 140 MB, and at most a quarter's worth, i.e. 30 * 20 * 3
= 1800 MB.
The data is imported into