Thanks for your suggestion. Will try it out and get back to you soon.

On Jan 9, 12:26 pm, Christoph Läubrich <[email protected]> wrote:
> I see, you don't put the server itself in shm, but the database file:
> For example:
> jdbc:h2:tcp://<ipaddress>:8001/dev/shm/
>
> Of course you can create your own folder for the db; just use shm like
> a regular hard disk, but keep in mind that the content is lost whenever
> you reboot or power off your PC.
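>
> A rough Java sketch of the client side (the db name "testdb" and the
> credentials are just examples, adjust host and port to your setup):
>
>      import java.sql.Connection;
>      import java.sql.DriverManager;
>
>      public class ShmConnect {
>          public static void main(String[] args) throws Exception {
>              // replace localhost with the server's ip address;
>              // the db file ends up under /dev/shm on the server machine
>              Connection conn = DriverManager.getConnection(
>                      "jdbc:h2:tcp://localhost:8001/dev/shm/testdb", "sa", "");
>              System.out.println("connected: " + !conn.isClosed());
>              conn.close();
>          }
>      }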
>
> On 09.01.2012 08:14, Karun wrote:
>
> > Hi Christoph,
>
> > Connection string is: jdbc:h2:tcp://<ipaddress>:8001/mem:testh2memdb
> > Tried to load only 2 million rows.
>
> > Created two .csv files of 1 million rows each, then inserted the data
> > by reading these 2 csv files.
> > Between the two insertions there is a gap of 5 minutes.
>
> > On Jan 9, 12:05 pm, Christoph Läubrich <[email protected]> wrote:
>
> >> Hi Karun,
>
> >> what was your connect string? Do you try to insert all data in one
> >> transaction, or do you flush the data from time to time? I remember it
> >> was mentioned somewhere else that h2 itself holds some data in memory
> >> while a transaction is running and might exceed the GC limit when
> >> inserting very large datasets.
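> >>
> >> For example, committing in chunks instead of one huge transaction would
> >> look roughly like this (only a sketch - the TEST table, its columns and
> >> the csv layout are invented):
> >>
> >>      import java.io.BufferedReader;
> >>      import java.io.FileReader;
> >>      import java.sql.Connection;
> >>      import java.sql.DriverManager;
> >>      import java.sql.PreparedStatement;
> >>
> >>      public class BatchLoad {
> >>          public static void main(String[] args) throws Exception {
> >>              Connection conn = DriverManager.getConnection(
> >>                      "jdbc:h2:tcp://localhost:8001/dev/shm/testdb", "sa", "");
> >>              conn.setAutoCommit(false);
> >>              PreparedStatement ps = conn.prepareStatement(
> >>                      "INSERT INTO TEST(ID, NAME) VALUES(?, ?)");
> >>              // assumes a simple two-column csv without a header line
> >>              BufferedReader in = new BufferedReader(
> >>                      new FileReader("/tmp/data.csv"));
> >>              String line;
> >>              int count = 0;
> >>              while ((line = in.readLine()) != null) {
> >>                  String[] cols = line.split(",");
> >>                  ps.setInt(1, Integer.parseInt(cols[0]));
> >>                  ps.setString(2, cols[1]);
> >>                  ps.addBatch();
> >>                  if (++count % 10000 == 0) {
> >>                      ps.executeBatch();
> >>                      // commit so the server does not have to keep millions
> >>                      // of rows in one open transaction
> >>                      conn.commit();
> >>                  }
> >>              }
> >>              ps.executeBatch();
> >>              conn.commit();
> >>              in.close();
> >>              conn.close();
> >>          }
> >>      }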
>
> >> On 09.01.2012 07:48, Karun wrote:
>
> >>> Hi Christoph,
>
> >>> Thanks for your suggestion.
>
> >>> I tried to use /dev/shm on a linux machine to populate more data using
> >>> the in-memory database.
>
> >>> But I was able to load only 1 million rows; after that I get a
> >>> JDBCException "OutOfMemoryError: GC overhead limit exceeded".
>
> >>> I tried the steps below:
> >>> 1) installed the db under /dev/shm
> >>> 2) started the server from this directory in one process (roughly as
> >>> sketched below)
> >>> 3) loaded the data (using .csv file read) from the same machine with
> >>> another process.
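> >>>
> >>> For reference, step 2 is roughly equivalent to this (just a sketch, I
> >>> actually start the server from the command line; the port is the one
> >>> from my connection string):
> >>>
> >>>      import org.h2.tools.Server;
> >>>
> >>>      public class StartShmServer {
> >>>          public static void main(String[] args) throws Exception {
> >>>              // TCP server with its base directory on the memory disk
> >>>              Server server = Server.createTcpServer(
> >>>                      "-tcpPort", "8001", "-baseDir", "/dev/shm");
> >>>              server.start();
> >>>              System.out.println("H2 TCP server listening on port 8001");
> >>>          }
> >>>      }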
>
> >>> Can you please help me out if there is anything I am missing when I
> >>> use the /dev/shm directory?
> >>> Or kindly tell me whether the in-memory db can hold more data (approx
> >>> 160 million rows).
>
> >>> Thanks in advance.
>
> >>> Regards,
> >>> Karun
>
> >>> On Jan 5, 5:32 pm, Karun <[email protected]> wrote:
>
> >>>> Hi
>
> >>>> Thanks for quick reply.
>
> >>>> Regards,
> >>>> Mohanty
>
> >>>> On Jan 4, 11:17 pm, Christoph Läubrich <[email protected]> wrote:
>
> >>>>> If you are running on Linux you can "store" the db in the special
> >>>>> /dev/shm device, which is in fact a (shared) memory disk; access to
> >>>>> this device is really fast and you don't even waste any java heap
> >>>>> space. One nice thing is that when you run out of memory it
> >>>>> automatically uses the swap space, if I remember right.
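> >>>>>
> >>>>> The simplest form is just to point the database URL at a file below
> >>>>> /dev/shm, e.g. (a sketch, the db name "test" is an arbitrary example):
> >>>>>
> >>>>>      import java.sql.Connection;
> >>>>>      import java.sql.DriverManager;
> >>>>>
> >>>>>      public class ShmEmbedded {
> >>>>>          public static void main(String[] args) throws Exception {
> >>>>>              // the db file lives on the memory disk instead of the heap
> >>>>>              Connection conn = DriverManager.getConnection(
> >>>>>                      "jdbc:h2:file:/dev/shm/test", "sa", "");
> >>>>>              conn.close();
> >>>>>          }
> >>>>>      }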
>
> >>>>> On 04.01.2012 18:42, [email protected] wrote:
>
> >>>>>> There was a suggestion some time ago about using a RAM disk as db
> >>>>>> storage as an alternative to an in-memory database. An advantage would
> >>>>>> be that you need not worry about GC, but it will still be slower than
> >>>>>> a plain in-memory db. Maybe the nio stuff in H2 can alleviate that.
>
> >>>>>> Otherwise... the problem with hard questions is that they have no
> >>>>>> obvious answers.
>
> >>>>>> --
> >>>>>>      Vasile Rotaru
