What OS? What Python version?
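
In the meantime, one thing worth trying is committing after each chunk and forcing a garbage-collection pass, so DAL state does not pile up across iterations. A minimal sketch, assuming your chunk layout; the db.commit() and gc.collect() calls are my additions, and I have not verified they fully stop the growth on GAE:

    import gc
    import os

    def csv_import():
        for i in xrange(0, 1000):
            path = os.path.join(request.folder, 'private', 'geonames',
                                'chunk_' + str(i))
            f = open(path, 'r')
            try:
                db.geonames.import_from_csv_file(f)
            finally:
                f.close()
            db.commit()       # flush the rows inserted from this chunk
            db._timings = []  # clear web2py's per-query timing log
            gc.collect()      # reclaim objects the import left behind

If memory still climbs after that, it would point at something the adapter caches per insert rather than at uncollected garbage.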

On Sunday, 8 April 2012 16:08:00 UTC-5, Czeski wrote:
>
> Hi,
>
> I am a new web2py user and I have a performance problem with the 
> import_from_csv_file method. I have a big collection of data that I want 
> to upload to Google App Engine. I split the data into 1000 parts, each 
> containing CSV-serialized rows - about 1367 rows per file. I loop over 
> the files and import each one into the database using:
>
> import os
>
> def csv_import():
>     for i in xrange(0, 1000):
>         f = open(os.path.join(request.folder, 'private', 'geonames',
>                               'chunk_' + str(i)), 'r')
>         db.geonames.import_from_csv_file(f)
>         db._timings = []
>         f.close()
>
> As you can see, it is a rather simple way to achieve this. The main 
> problem is that every loop iteration increases the application's overall 
> memory usage. The growth never stops, and within about 10 iterations it 
> exhausts all system resources and the app is terminated.
>
> I think that with every iteration some DAL-related objects stay in 
> memory and are not collected by the garbage collector.
>
> Please advise how I can import all 1000 parts with constant memory usage.
>
> Best Regards
> Lucas
>
