Hi Guys,

What I did was very simple, as Simon suggested.
My project will handle anywhere from 50k to millions of record lines, spread
across 2, 3 or 4 text files, each having the same number of records (i.e. each
will have 50k to millions of record lines). There is one common field, such as
Doc-id, which works as the primary key here. I will generate an output file
containing only the fields required in the output.
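
To make the join concrete, here is a rough sketch of that logic in Python,
purely for illustration: the file names, tab delimiter and column positions
below are assumptions, not my actual layout.

import csv

def load_by_docid(path, delimiter="\t"):
    """Read a delimited file into a dict keyed on Doc-id (assumed column 0)."""
    with open(path, newline="") as f:
        return {row[0]: row for row in csv.reader(f, delimiter=delimiter) if row}

file_a = load_by_docid("file_a.txt")   # e.g. the 20-column file (hypothetical name)
file_b = load_by_docid("file_b.txt")   # e.g. the 11-column file (hypothetical name)

# Join on Doc-id and keep only the fields required in the output file.
with open("output.txt", "w", newline="") as out:
    writer = csv.writer(out, delimiter="\t")
    for doc_id, row_a in file_a.items():
        row_b = file_b.get(doc_id)
        if row_b is None:
            continue                   # Doc-id missing from the second file
        writer.writerow([doc_id, row_a[1], row_b[1]])   # pick whichever columns are needed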

So for my test: 10.5k row lines in both text files, one with 20 columns and
one with 11 columns. I read the first text file into a 1-D array, did the same
with the second file, concatenated them into one string, and ran a simple
INSERT command. Right now all of that takes close to 30 seconds, because of
some internal verification and manipulation.
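
Roughly, that step looks like the following in Python (again only a sketch:
the database name, table and the single "payload" column are simplifications,
not my real schema).

import sqlite3

# Read each file into a 1-D array of lines.
with open("file_a.txt") as f:
    lines_a = f.read().splitlines()
with open("file_b.txt") as f:
    lines_b = f.read().splitlines()

con = sqlite3.connect("import.db")
con.execute("CREATE TABLE IF NOT EXISTS docs (doc_id TEXT PRIMARY KEY, payload TEXT)")

with con:                                   # one transaction around all inserts
    for line_a, line_b in zip(lines_a, lines_b):
        # Concatenate the two rows, dropping the duplicate Doc-id from file 2.
        fields = line_a.split("\t") + line_b.split("\t")[1:]
        con.execute("INSERT INTO docs VALUES (?, ?)",
                    (fields[0], "\t".join(fields[1:])))
con.close()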

I will do performance tuning on my project in a few days and will update this
email thread. If we look only at inserting/importing the 10.5k row records
from the 2 files with 20+ columns, it takes hardly a second; it is the extra
manipulation on top of that where I am losing a few seconds.
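
For what it is worth, that sub-second figure depends mostly on wrapping all
the INSERTs in one transaction rather than committing per row; a small Python
illustration with dummy data (not my real table):

import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (doc_id TEXT PRIMARY KEY, a TEXT, b TEXT)")

rows = [(str(i), "a%d" % i, "b%d" % i) for i in range(10500)]   # dummy 10.5k rows

# Slow pattern: an implicit commit after every single INSERT.
# for r in rows:
#     con.execute("INSERT INTO t VALUES (?, ?, ?)", r)
#     con.commit()

# Fast pattern: one transaction around a batched insert -- this is what
# keeps the raw 10.5k-row import well under a second.
with con:
    con.executemany("INSERT INTO t VALUES (?, ?, ?)", rows)
con.close()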

Thanks Simon, Olaf and Garry.
Please update this email thread if anyone finds a faster process than this.

Thanks a lot, guys. :-)

Regards,
Alok


On 7 January 2011 01:48, GS <[email protected]> wrote:

> Olaf Schmidt wrote:
> > "Alok Singh" <[email protected]> schrieb
> > im Newsbeitrag
> > news:[email protected]...
> >
> >> yeah that's correct Simon, it's 0.6 sec to insert
> >> for 10.5K rows with 20 columns (2 files both
> >> having 10.5k rows)
> >
> > That's the timing I would expect if you'd used
> > Garry's recommendation (to read the whole file
> > into a String first, and then split the string into an
> > Array "InMemory", finally followed by an Insert-
> > Transaction, which makes use of this 2D-Array).
> >
> > That's memory-intensive - but "Ok" (and fast) for Testfiles
> > with that RowCount (filesize of  your 10.5K-Rows
> > testfiles around 4-6MB I'd guess).
>
> <FWIW>
> I just checked a 21,000 line x 30 column delimited file and it is
> 817KB. I draw the line (for performance and/or convenience working with
> the data) at about 50K lines before I'd use ADO to load the entire file
> into a recordset, OR criteria-specific recordsets depending on how I
> want to work with it.
>
> What I was hoping to learn here is whether we can dump an entire
> recordset into a SQLite table. Is that doable?
>
> >
> > Are you sure, that your replies address the person
> > you have in mind ... your previous reply was going to
> > Garry - and your last reply here was going to me,
> > and "both of us" are not Simon (who is a very helpful
> > person on this list, no doubt about that... :-).
> >
> > Olaf
> >
> >
> >
>
> --
> Garry
>
> Free usenet access at http://www.eternal-september.org
> ClassicVB Users Regroup! comp.lang.basic.visual.misc
>
>
>
>
_______________________________________________
sqlite-users mailing list
[email protected]
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
