Re: [web2py] Need a little help in code review (a function eating up all memory)

2012-05-11 Thread szimszon
OK. Finally I have it :)

It was only the amount of memory consumed on each request. I tried it with 
a smaller but still significant number of records and finally figured out 
that there is a limit, beyond which memory consumption is OK. I only had to 
reach that count of browser requests :-D

Sorry for taking your time, and thanks for the help. =-<>-=

Re: [web2py] Need a little help in code review (a function eating up all memory)

2012-05-11 Thread szimszon
The problem is not that the controller function consumes memory when I call 
it. My problem is that something still references something after execution 
has finished, so the consumed memory never gets released / reused...

Re: [web2py] Need a little help in code review (a function eating up all memory)

2012-05-10 Thread Martín Mulone
The problem is fetchall(): it pulls all the records into memory. If you want
to iterate over a large dataset, or one that is not that heavy but whose
table has many fields, you can do it in blocks.

For example, using raw SQL with the DAL, this fetches 1000 records at a time:

queryraw = "select * from mytable %(limit_query)s"

rec = 0
stage = 0
block = 1000
while True:
    offset = stage * block
    # the dict key must match the %(limit_query)s placeholder above
    query = queryraw % {'limit_query': 'LIMIT %s, %s' % (offset, block)}
    rows = db.executesql(query)
    if not rows:
        break
    for row in rows:
        rec += 1
        # do something
    stage += 1

This is not fast, but it keeps memory low.
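An alternative to re-issuing LIMIT/OFFSET queries is the DB-API cursor's fetchmany(), which streams one query in fixed-size blocks. A minimal, self-contained sketch using the stdlib sqlite3 module (the table name mytable is illustrative; with web2py you would go through db.executesql() instead):

```python
import sqlite3

def iter_rows(conn, sql, block=1000):
    """Yield rows from a query, holding at most `block` rows in memory."""
    cur = conn.execute(sql)
    while True:
        rows = cur.fetchmany(block)
        if not rows:
            break
        for row in rows:
            yield row

# tiny demo against an in-memory database
conn = sqlite3.connect(":memory:")
conn.execute("create table mytable (id integer primary key)")
conn.executemany("insert into mytable (id) values (?)",
                 [(i,) for i in range(2500)])
count = sum(1 for _ in iter_rows(conn, "select * from mytable"))
```

Unlike the LIMIT/OFFSET loop, fetchmany() keeps one cursor open across blocks, so the query is only issued once.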


Re: [web2py] Need a little help in code review (a function eating up all memory)

2012-05-10 Thread szimszon
Could it be related?  
https://groups.google.com/forum/#!topic/web2py/hmsupVHdDHo/discussion (Memory 
leak in standalone DAL (issue #731), can you please help test?)

Re: [web2py] Need a little help in code review (a function eating up all memory)

2012-05-10 Thread szimszon
Okay. It's clear.

I'm only puzzled about why the memory doesn't get freed or reused after 
execution finishes. Even if I execute the controller function at a 1-2 minute 
interval, the memory is still not reused.

So I understand it can eat up the memory, but why is all that memory locked 
forever and never reused? In my (limited) understanding, in my case the GC 
does not free memory at the Python level at all, so some reference must still 
be intact after the function finishes.
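As a generic CPython illustration (nothing web2py-specific), objects stay alive as long as anything references them; even after all names are gone, a reference cycle keeps them in memory until the garbage collector's cycle detector runs:

```python
import gc

class Node(object):
    pass

a, b = Node(), Node()
a.ref, b.ref = b, a   # the two objects now reference each other
del a, b              # no names left, but the cycle keeps both alive

# only the cycle collector can reclaim them; collect() reports
# how many unreachable objects it found
found = gc.collect()
```

If a live reference into such a structure survives the request (for example, something cached at module level), not even gc.collect() can free it, which matches the "some reference is still intact" symptom.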


Re: [web2py] Need a little help in code review (a function eating up all memory)

2012-05-10 Thread Richard Vézina
Ok, you don't need it to work all the time.

Did you get it to update your records?

If not, and since as you said it is a one-time trip, you can just treat the
whole set of records batch by batch...

Look here:
http://web2py.com/books/default/chapter/29/14#Populating-database-with-dummy-data

for i in range(10):
    populate(db.mytable, 100)
    db.commit()


In this example only 100 records are populated at a time between db.commit()
calls... So maybe you just have to wrap your processing in a for loop that
calls db.commit() a couple of times along the way.

Richard
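The batch-commit pattern above can be sketched with the stdlib sqlite3 module standing in for the web2py DAL (populate() is web2py's dummy-data helper; plain inserts stand in for it here):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table mytable (id integer primary key, val text)")

batch = 100
for i in range(10):
    # insert one batch of 100 records...
    conn.executemany("insert into mytable (val) values (?)",
                     [("x",) for _ in range(batch)])
    conn.commit()  # ...then commit, so each transaction stays small

total = conn.execute("select count(*) from mytable").fetchone()[0]
```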


Re: [web2py] Need a little help in code review (a function eating up all memory)

2012-05-10 Thread szimszon
I had to store files and a lot of properties for them. It was in CSV. But 
after I processed it we figured out that not all the values in the CSV were 
correct, though it was somewhat redundant, so I can correct them by going 
through all the records row by row. So that was a one-time trip.

I just realized that after the process I had no memory left. So now I'm 
investigating what happened...

Re: [web2py] Need a little help in code review (a function eating up all memory)

2012-05-10 Thread Richard Vézina
Yes, but in this case it is not for the entire set of records...

Why would you return a full list of all the records?

I don't understand the purpose of the listar that you return to the view
under an HTML table; why do you need to return all the 10+ entries?

Richard

On Thu, May 10, 2012 at 2:56 PM, szimszon  wrote:

> In the book this is the recommended way to iterate over SQL results:
>
> http://web2py.com/books/default/chapter/29/6
>
> You can do all the steps in one statement:
>
> >>> for row in db(db.person.name=='Alex').select():
> ...     print row.name
> Alex
>
>
>
> On Thursday, May 10, 2012 at 8:42:22 PM UTC+2, Bruce Wade wrote:
>
>> Sorry, you really need to read more about how Python works. If you learn
>> how for loops and memory work, you will understand the problem.
>>
>> One solution: do the query before the for loop, then loop through the
>> objects. This may help a bit. Research xrange vs range
>>
>>
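To unpack the xrange-vs-range hint: in Python 2, range(n) builds the whole list in memory, while xrange(n) (like Python 3's range) yields items lazily. The same difference shows up with any generator versus a materialized list; a quick Python 3 sketch:

```python
import sys

materialized = list(range(100000))  # every element in memory at once
lazy = (i for i in range(100000))   # generator: one element at a time

# the generator object stays tiny however many items it will yield
generator_is_smaller = sys.getsizeof(lazy) < sys.getsizeof(materialized)
```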
>> On Thu, May 10, 2012 at 11:30 AM, szimszon  wrote:
>>
>>> Sorry, I don't understand. What do you mean by "achieve with join"?
>>>
>>> There is an empty for loop with db.executesql() without a join. And it is
>>> eating up the memory. :(
>>>
>>> On Thursday, May 10, 2012 at 19:12:30 UTC+2, Richard wrote:
>>>
 You can't manage what you want to achieve with join?

 Richard

 On Thu, May 10, 2012 at 10:48 AM, szimszon  wrote:

> Sorry for my dumbness but if something is wrong with my code please
> point me the right line. I'm not so good in English if it comes to "object
> instance count" and so. Yeah I know I should go and do some milkmaid job 
> :)
> but I'm curious.
>
> I'm just define some variable:
>
> lista = list()
> last_row = None
> next_page_number = 0
>
> Go in a for loop that just assign the db query result one-by-one to
> row variable.
>
> After that I assign the TABLE() helper to a list variable. That is it.
>
>
>
> On Thursday, May 10, 2012 at 16:34:49 UTC+2, Bruce Wade wrote:
>
>> Using direct sql query or DAL is going to cause the exact same
>> problem in this situation.
>>
>> On Thu, May 10, 2012 at 7:28 AM, szimszon  wrote:
>>
>>> It's postgres:// in a
>>> Version 1.99.7 (2012-04-23 11:26:23) dev of web2py, and
>>>
>>> Python 2.7.3 (default, Apr 20 2012, 22:44:07)
>>> [GCC 4.6.3] on linux2
>>>
>>> python-psycopg2 2.4.5-1
>>>
>>>
>>> On Thursday, May 10, 2012 at 15:40:36 UTC+2, rochacbruno wrote:
>>>
 Just for curiosity, what happens if you do it in pure sql?


 for row in db.executesql("select * from file_properties
 where id > 0"):

 # do something


 Does it have a lower memory usage?


 On Thu, May 10, 2012 at 4:14 AM, Bruce Wade <
 > wrote:

> for row in db( db.file_properties.id > 0 ).select(




 --

 Bruno Rocha
 [http://rochacbruno.com.br]


>>
>>
>> --
>> --
>> Regards,
>> Bruce Wade
>> http://ca.linkedin.com/in/brucelwade
>> http://www.wadecybertech.com
>> http://www.fittraineronline.com - 
>> Fitness Personal Trainers Online
>> http://www.warplydesigned.com
>>
>>

>>
>>
>> --
>> --
>> Regards,
>> Bruce Wade
>> http://ca.linkedin.com/in/brucelwade
>> http://www.wadecybertech.com
>> http://www.fittraineronline.com -
>> Fitness Personal Trainers Online
>> http://www.warplydesigned.com
>>
>>


Re: [web2py] Need a little help in code review (a function eating up all memory)

2012-05-10 Thread Richard Vézina
I didn't read your logic closely enough, but since another table was involved
I thought you just wanted a kind of computed field.

Also, what you seem to be doing is a kind of paging function; why can't
you achieve this with count()?

Richard
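The count()-based paging Richard suggests comes down to a little window arithmetic: get the total once, then fetch one LIMIT/OFFSET window per request instead of all rows. A minimal, framework-agnostic sketch (the page_window helper is illustrative, not from the thread):

```python
import math

def page_window(total_rows, page, page_size=1000):
    """Return (offset, limit) for the 1-based page number `page`,
    given a total row count (e.g. from db(query).count() in web2py)."""
    pages = max(1, math.ceil(total_rows / page_size))
    page = min(max(1, page), pages)       # clamp to a valid page
    offset = (page - 1) * page_size
    return offset, page_size

# e.g. the poster's 137314-row table, in pages of 1000:
offset, limit = page_window(137314, 3)
print(offset, limit)  # 2000 1000
```

The (offset, limit) pair would then feed a LIMIT/OFFSET clause in raw SQL, or limitby=(offset, offset + limit) with the web2py DAL.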

On Thu, May 10, 2012 at 2:30 PM, szimszon  wrote:

> Sorry I don't understand. What do you mean "achieve with join"?
>
> There is an empty for loop with db.executesql() without join. And it is
> eating up the memory. :(
>
> On Thursday, May 10, 2012 at 19:12:30 UTC+2, Richard wrote:
>
>> You can't manage what you want to achieve with join?
>>
>> Richard
>>
>> On Thu, May 10, 2012 at 10:48 AM, szimszon  wrote:
>>
>>> Sorry for my dumbness but if something is wrong with my code please
>>> point me the right line. I'm not so good in English if it comes to "object
>>> instance count" and so. Yeah I know I should go and do some milkmaid job :)
>>> but I'm curious.
>>>
>>> I'm just define some variable:
>>>
>>> lista = list()
>>> last_row = None
>>> next_page_number = 0
>>>
>>> Go in a for loop that just assign the db query result one-by-one to row
>>> variable.
>>>
>>> After that I assign the TABLE() helper to a list variable. That is it.
>>>
>>>
>>>
>>> On Thursday, May 10, 2012 at 16:34:49 UTC+2, Bruce Wade wrote:
>>>
 Using direct sql query or DAL is going to cause the exact same problem
 in this situation.

 On Thu, May 10, 2012 at 7:28 AM, szimszon  wrote:

> It's postgres:// in a
> Version 1.99.7 (2012-04-23 11:26:23) dev of web2py, and
>
> Python 2.7.3 (default, Apr 20 2012, 22:44:07)
> [GCC 4.6.3] on linux2
>
> python-psycopg2 2.4.5-1
>
>
> On Thursday, May 10, 2012 at 15:40:36 UTC+2, rochacbruno wrote:
>
>> Just for curiosity, what happens if you do it in pure sql?
>>
>>
>> for row in db.executesql("select * from file_properties where
>> id > 0"):
>>
>> # do something
>>
>>
>> Does it have a lower memory usage?
>>
>>
>> On Thu, May 10, 2012 at 4:14 AM, Bruce Wade < 
>> >wrote:
>>
>>> for row in db( db.file_properties.id > 0 ).select(
>>
>>
>>
>>
>> --
>>
>> Bruno Rocha
>> [http://rochacbruno.com.br]
>>
>>


 --
 --
 Regards,
 Bruce Wade
 http://ca.linkedin.com/in/brucelwade
 http://www.wadecybertech.com
 http://www.fittraineronline.com - 
 Fitness Personal Trainers Online
 http://www.warplydesigned.com


>>


Re: [web2py] Need a little help in code review (a function eating up all memory)

2012-05-10 Thread szimszon
In the book this is the recommended way to iterate over sql results:

http://web2py.com/books/default/chapter/29/6

You can do all the steps in one statement:


>>> for row in db(db.person.name=='Alex').select():
...     print row.name
Alex




On Thursday, May 10, 2012 at 20:42:22 UTC+2, Bruce Wade wrote:
>
> Sorry, you really need to read more about how python works. If you learn 
> how for loops work and memory you will understand the problem. 
>
> One solution do the query before the for loop then loop through the 
> objects. This may help at bit. Research xrange vs range
>
>
> On Thu, May 10, 2012 at 11:30 AM, szimszon  wrote:
>
>> Sorry I don't understand. What do you mean "achieve with join"?
>>
>> There is an empty for loop with db.executesql() without join. And it is 
>> eating up the memory. :(
>>
>> On Thursday, May 10, 2012 at 19:12:30 UTC+2, Richard wrote:
>>
>>> You can't manage what you want to achieve with join?
>>>
>>> Richard
>>>
>>> On Thu, May 10, 2012 at 10:48 AM, szimszon  wrote:
>>>
 Sorry for my dumbness but if something is wrong with my code please 
 point me the right line. I'm not so good in English if it comes to "object 
 instance count" and so. Yeah I know I should go and do some milkmaid job 
 :) 
 but I'm curious.

 I'm just define some variable:

 lista = list()
 last_row = None
 next_page_number = 0

 Go in a for loop that just assign the db query result one-by-one to row 
 variable.

 After that I assign the TABLE() helper to a list variable. That is it.



 On Thursday, May 10, 2012 at 16:34:49 UTC+2, Bruce Wade wrote:

> Using direct sql query or DAL is going to cause the exact same problem 
> in this situation.
>
> On Thu, May 10, 2012 at 7:28 AM, szimszon  wrote:
>
>> It's postgres:// in a 
>> Version 1.99.7 (2012-04-23 11:26:23) dev of web2py, and
>>
>> Python 2.7.3 (default, Apr 20 2012, 22:44:07) 
>> [GCC 4.6.3] on linux2
>>
>> python-psycopg2 2.4.5-1
>>
>>
>> On Thursday, May 10, 2012 at 15:40:36 UTC+2, rochacbruno wrote:
>>
>>> Just for curiosity, what happens if you do it in pure sql?
>>>
>>>
>> for row in db.executesql("select * from file_properties where 
>>> id > 0"):
>>>
>>> # do something
>>>
>>>
>>> Does it have a lower memory usage?
>>>
>>>
>>> On Thu, May 10, 2012 at 4:14 AM, Bruce Wade < 
>>> > wrote:
>>>
 for row in db( db.file_properties.id > 0 ).select(
>>>
>>>
>>>
>>>
>>> -- 
>>>
>>> Bruno Rocha
>>> [http://rochacbruno.com.br]
>>>
>>>
>
>
> -- 
> -- 
> Regards,
> Bruce Wade
> http://ca.linkedin.com/in/brucelwade
> http://www.wadecybertech.com
> http://www.fittraineronline.com - 
> Fitness Personal Trainers Online
> http://www.warplydesigned.com
>
>
>>>
>
>
> -- 
> -- 
> Regards,
> Bruce Wade
> http://ca.linkedin.com/in/brucelwade
> http://www.wadecybertech.com
> http://www.fittraineronline.com - Fitness Personal Trainers Online
> http://www.warplydesigned.com
>
> 

Re: [web2py] Need a little help in code review (a function eating up all memory)

2012-05-10 Thread szimszon
Is there a doc you already know of that explains this? Can you post a link?

On Thursday, May 10, 2012 at 20:42:22 UTC+2, Bruce Wade wrote:
>
> Sorry, you really need to read more about how python works. If you learn 
> how for loops work and memory you will understand the problem. 
>
> One solution do the query before the for loop then loop through the 
> objects. This may help at bit. Research xrange vs range
>
>
> On Thu, May 10, 2012 at 11:30 AM, szimszon  wrote:
>
>> Sorry I don't understand. What do you mean "achieve with join"?
>>
>> There is an empty for loop with db.executesql() without join. And it is 
>> eating up the memory. :(
>>
>> On Thursday, May 10, 2012 at 19:12:30 UTC+2, Richard wrote:
>>
>>> You can't manage what you want to achieve with join?
>>>
>>> Richard
>>>
>>> On Thu, May 10, 2012 at 10:48 AM, szimszon  wrote:
>>>
 Sorry for my dumbness but if something is wrong with my code please 
 point me the right line. I'm not so good in English if it comes to "object 
 instance count" and so. Yeah I know I should go and do some milkmaid job 
 :) 
 but I'm curious.

 I'm just define some variable:

 lista = list()
 last_row = None
 next_page_number = 0

 Go in a for loop that just assign the db query result one-by-one to row 
 variable.

 After that I assign the TABLE() helper to a list variable. That is it.



 On Thursday, May 10, 2012 at 16:34:49 UTC+2, Bruce Wade wrote:

> Using direct sql query or DAL is going to cause the exact same problem 
> in this situation.
>
> On Thu, May 10, 2012 at 7:28 AM, szimszon  wrote:
>
>> It's postgres:// in a 
>> Version 1.99.7 (2012-04-23 11:26:23) dev of web2py, and
>>
>> Python 2.7.3 (default, Apr 20 2012, 22:44:07) 
>> [GCC 4.6.3] on linux2
>>
>> python-psycopg2 2.4.5-1
>>
>>
>> On Thursday, May 10, 2012 at 15:40:36 UTC+2, rochacbruno wrote:
>>
>>> Just for curiosity, what happens if you do it in pure sql?
>>>
>>>
>> for row in db.executesql("select * from file_properties where 
>>> id > 0"):
>>>
>>> # do something
>>>
>>>
>>> Does it have a lower memory usage?
>>>
>>>
>>> On Thu, May 10, 2012 at 4:14 AM, Bruce Wade < 
>>> > wrote:
>>>
 for row in db( db.file_properties.id > 0 ).select(
>>>
>>>
>>>
>>>
>>> -- 
>>>
>>> Bruno Rocha
>>> [http://rochacbruno.com.br]
>>>
>>>
>
>
> -- 
> -- 
> Regards,
> Bruce Wade
> http://ca.linkedin.com/in/brucelwade
> http://www.wadecybertech.com
> http://www.fittraineronline.com - 
> Fitness Personal Trainers Online
> http://www.warplydesigned.com
>
>
>>>
>
>
> -- 
> -- 
> Regards,
> Bruce Wade
> http://ca.linkedin.com/in/brucelwade
> http://www.wadecybertech.com
> http://www.fittraineronline.com - Fitness Personal Trainers Online
> http://www.warplydesigned.com
>
> 

Re: [web2py] Need a little help in code review (a function eating up all memory)

2012-05-10 Thread Bruce Wade
Sorry, you really need to read more about how Python works. If you learn
how for loops and memory work, you will understand the problem.

One solution: do the query before the for loop, then loop through the
objects. This may help a bit. Research xrange vs range.
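For reference, the xrange-vs-range distinction is about eager versus lazy iteration: in Python 2, range(n) built the whole list in memory, while xrange(n) produced values one at a time. A small sketch (written for Python 3, where plain range() is already the lazy form):

```python
import sys

# Eager: materializes every element up front (what Python 2's range() did).
eager = list(range(100000))

# Lazy: a constant-size object that yields values on demand
# (what Python 2's xrange() did; plain range() in Python 3).
lazy = range(100000)

# The lazy form stays tiny no matter how many values it covers.
print(sys.getsizeof(lazy) < sys.getsizeof(eager))  # True
```

The same eager/lazy split is what separates fetchall() from iterating a database cursor row by row.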


On Thu, May 10, 2012 at 11:30 AM, szimszon  wrote:

> Sorry I don't understand. What do you mean "achieve with join"?
>
> There is an empty for loop with db.executesql() without join. And it is
> eating up the memory. :(
>
> On Thursday, May 10, 2012 at 19:12:30 UTC+2, Richard wrote:
>
>> You can't manage what you want to achieve with join?
>>
>> Richard
>>
>> On Thu, May 10, 2012 at 10:48 AM, szimszon  wrote:
>>
>>> Sorry for my dumbness but if something is wrong with my code please
>>> point me the right line. I'm not so good in English if it comes to "object
>>> instance count" and so. Yeah I know I should go and do some milkmaid job :)
>>> but I'm curious.
>>>
>>> I'm just define some variable:
>>>
>>> lista = list()
>>> last_row = None
>>> next_page_number = 0
>>>
>>> Go in a for loop that just assign the db query result one-by-one to row
>>> variable.
>>>
>>> After that I assign the TABLE() helper to a list variable. That is it.
>>>
>>>
>>>
>>> On Thursday, May 10, 2012 at 16:34:49 UTC+2, Bruce Wade wrote:
>>>
 Using direct sql query or DAL is going to cause the exact same problem
 in this situation.

 On Thu, May 10, 2012 at 7:28 AM, szimszon  wrote:

> It's postgres:// in a
> Version 1.99.7 (2012-04-23 11:26:23) dev of web2py, and
>
> Python 2.7.3 (default, Apr 20 2012, 22:44:07)
> [GCC 4.6.3] on linux2
>
> python-psycopg2 2.4.5-1
>
>
> On Thursday, May 10, 2012 at 15:40:36 UTC+2, rochacbruno wrote:
>
>> Just for curiosity, what happens if you do it in pure sql?
>>
>>
>> for row in db.executesql("select * from file_properties where
>> id > 0"):
>>
>> # do something
>>
>>
>> Does it have a lower memory usage?
>>
>>
>> On Thu, May 10, 2012 at 4:14 AM, Bruce Wade < 
>> >wrote:
>>
>>> for row in db( db.file_properties.id > 0 ).select(
>>
>>
>>
>>
>> --
>>
>> Bruno Rocha
>> [http://rochacbruno.com.br]
>>
>>


 --
 --
 Regards,
 Bruce Wade
 http://ca.linkedin.com/in/brucelwade
 http://www.wadecybertech.com
 http://www.fittraineronline.com - 
 Fitness Personal Trainers Online
 http://www.warplydesigned.com


>>


-- 
-- 
Regards,
Bruce Wade
http://ca.linkedin.com/in/brucelwade
http://www.wadecybertech.com
http://www.fittraineronline.com - Fitness Personal Trainers Online
http://www.warplydesigned.com


Re: [web2py] Need a little help in code review (a function eating up all memory)

2012-05-10 Thread szimszon
Sorry I don't understand. What do you mean "achieve with join"?

There is an empty for loop with db.executesql() without join. And it is 
eating up the memory. :(

On Thursday, May 10, 2012 at 19:12:30 UTC+2, Richard wrote:
>
> You can't manage what you want to achieve with join?
>
> Richard
>
> On Thu, May 10, 2012 at 10:48 AM, szimszon  wrote:
>
>> Sorry for my dumbness but if something is wrong with my code please point 
>> me the right line. I'm not so good in English if it comes to "object 
>> instance count" and so. Yeah I know I should go and do some milkmaid job :) 
>> but I'm curious.
>>
>> I'm just define some variable:
>>
>> lista = list()
>> last_row = None
>> next_page_number = 0
>>
>> Go in a for loop that just assign the db query result one-by-one to row 
>> variable.
>>
>> After that I assign the TABLE() helper to a list variable. That is it.
>>
>>
>>
>> On Thursday, May 10, 2012 at 16:34:49 UTC+2, Bruce Wade wrote:
>>
>>> Using direct sql query or DAL is going to cause the exact same problem 
>>> in this situation.
>>>
>>> On Thu, May 10, 2012 at 7:28 AM, szimszon  wrote:
>>>
 It's postgres:// in a 
 Version 1.99.7 (2012-04-23 11:26:23) dev of web2py, and

 Python 2.7.3 (default, Apr 20 2012, 22:44:07) 
 [GCC 4.6.3] on linux2

 python-psycopg2 2.4.5-1


 On Thursday, May 10, 2012 at 15:40:36 UTC+2, rochacbruno wrote:

> Just for curiosity, what happens if you do it in pure sql?
>
>
> for row in db.executesql("**sele**ct * from file_properties where id 
> > 0"):
>
> # do something
>
>
> Does it have a lower memory usage?
>
>
> On Thu, May 10, 2012 at 4:14 AM, Bruce Wade < 
> >wrote:
>
>> for row in db( db.file_properties.id > 0 ).select(
>
>
>
>
> -- 
>
> Bruno Rocha
> [http://rochacbruno.com.br]
>
>
>>>
>>>
>>> -- 
>>> -- 
>>> Regards,
>>> Bruce Wade
>>> http://ca.linkedin.com/in/brucelwade
>>> http://www.wadecybertech.com
>>> http://www.fittraineronline.com - 
>>> Fitness Personal Trainers Online
>>> http://www.warplydesigned.com
>>>
>>>
>

Re: [web2py] Need a little help in code review (a function eating up all memory)

2012-05-10 Thread Richard Vézina
Can't you manage what you want to achieve with a join?

Richard

On Thu, May 10, 2012 at 10:48 AM, szimszon  wrote:

> Sorry for my dumbness but if something is wrong with my code please point
> me the right line. I'm not so good in English if it comes to "object
> instance count" and so. Yeah I know I should go and do some milkmaid job :)
> but I'm curious.
>
> I'm just define some variable:
>
> lista = list()
> last_row = None
> next_page_number = 0
>
> Go in a for loop that just assign the db query result one-by-one to row
> variable.
>
> After that I assign the TABLE() helper to a list variable. That is it.
>
>
>
> On Thursday, May 10, 2012 at 16:34:49 UTC+2, Bruce Wade wrote:
>
>> Using direct sql query or DAL is going to cause the exact same problem in
>> this situation.
>>
>> On Thu, May 10, 2012 at 7:28 AM, szimszon  wrote:
>>
>>> It's postgres:// in a
>>> Version 1.99.7 (2012-04-23 11:26:23) dev of web2py, and
>>>
>>> Python 2.7.3 (default, Apr 20 2012, 22:44:07)
>>> [GCC 4.6.3] on linux2
>>>
>>> python-psycopg2 2.4.5-1
>>>
>>>
>>> On Thursday, May 10, 2012 at 15:40:36 UTC+2, rochacbruno wrote:
>>>
 Just for curiosity, what happens if you do it in pure sql?


 for row in db.executesql("select * from file_properties where id >
 0"):

 # do something


 Does it have a lower memory usage?


 On Thu, May 10, 2012 at 4:14 AM, Bruce Wade < >wrote:

> for row in db( db.file_properties.id > 0 ).select(




 --

 Bruno Rocha
 [http://rochacbruno.com.br]


>>
>>
>> --
>> --
>> Regards,
>> Bruce Wade
>> http://ca.linkedin.com/in/brucelwade
>> http://www.wadecybertech.com
>> http://www.fittraineronline.com -
>> Fitness Personal Trainers Online
>> http://www.warplydesigned.com
>>
>>


Re: [web2py] Need a little help in code review (a function eating up all memory)

2012-05-10 Thread szimszon
Sorry for my dumbness, but if something is wrong with my code please point 
me to the right line. I'm not so good in English when it comes to "object 
instance count" and such. Yeah, I know I should go and do some milkmaid job :) 
but I'm curious.

I just define some variables:
lista = list()
last_row = None
next_page_number = 0

Then I go into a for loop that just assigns the db query result one-by-one 
to the row variable.

After that I assign the TABLE() helper to a list variable. That is it.



On Thursday, May 10, 2012 at 16:34:49 UTC+2, Bruce Wade wrote:
>
> Using direct sql query or DAL is going to cause the exact same problem in 
> this situation.
>
> On Thu, May 10, 2012 at 7:28 AM, szimszon  wrote:
>
>> It's postgres:// in a 
>> Version 1.99.7 (2012-04-23 11:26:23) dev of web2py, and
>>
>> Python 2.7.3 (default, Apr 20 2012, 22:44:07) 
>> [GCC 4.6.3] on linux2
>>
>> python-psycopg2 2.4.5-1
>>
>>
>> On Thursday, May 10, 2012 at 15:40:36 UTC+2, rochacbruno wrote:
>>
>>> Just for curiosity, what happens if you do it in pure sql?
>>>
>>>
>>> for row in db.executesql("select * from file_properties where id > 
>>> 0"):
>>>
>>> # do something
>>>
>>>
>>> Does it have a lower memory usage?
>>>
>>>
>>> On Thu, May 10, 2012 at 4:14 AM, Bruce Wade < >wrote:
>>>
 for row in db( db.file_properties.id > 0 ).select(
>>>
>>>
>>>
>>>
>>> -- 
>>>
>>> Bruno Rocha
>>> [http://rochacbruno.com.br]
>>>
>>>
>
>
> -- 
> -- 
> Regards,
> Bruce Wade
> http://ca.linkedin.com/in/brucelwade
> http://www.wadecybertech.com
> http://www.fittraineronline.com - Fitness Personal Trainers Online
> http://www.warplydesigned.com
>
>

Re: [web2py] Need a little help in code review (a function eating up all memory)

2012-05-10 Thread Bruce Wade
Using a direct SQL query or the DAL is going to cause the exact same problem in
this situation.

On Thu, May 10, 2012 at 7:28 AM, szimszon  wrote:

> It's postgres:// in a
> Version 1.99.7 (2012-04-23 11:26:23) dev of web2py, and
>
> Python 2.7.3 (default, Apr 20 2012, 22:44:07)
> [GCC 4.6.3] on linux2
>
> python-psycopg2 2.4.5-1
>
>
> On Thursday, May 10, 2012 at 15:40:36 UTC+2, rochacbruno wrote:
>
>> Just for curiosity, what happens if you do it in pure sql?
>>
>>
>> for row in db.executesql("select * from file_properties where id > 0"):
>>
>> # do something
>>
>>
>> Does it have a lower memory usage?
>>
>>
>> On Thu, May 10, 2012 at 4:14 AM, Bruce Wade < >wrote:
>>
>>> for row in db( db.file_properties.id > 0 ).select(
>>
>>
>>
>>
>> --
>>
>> Bruno Rocha
>> [http://rochacbruno.com.br]
>>
>>


-- 
-- 
Regards,
Bruce Wade
http://ca.linkedin.com/in/brucelwade
http://www.wadecybertech.com
http://www.fittraineronline.com - Fitness Personal Trainers Online
http://www.warplydesigned.com


Re: [web2py] Need a little help in code review (a function eating up all memory)

2012-05-10 Thread szimszon
It's postgres:// in a 
Version 1.99.7 (2012-04-23 11:26:23) dev of web2py, and

Python 2.7.3 (default, Apr 20 2012, 22:44:07) 
[GCC 4.6.3] on linux2

python-psycopg2 2.4.5-1

On Thursday, May 10, 2012 at 15:40:36 UTC+2, rochacbruno wrote:
>
> Just for curiosity, what happens if you do it in pure sql?
>
> for row in db.executesql("select * from file_properties where id > 0"):
>
> # do something
>
>
> Does it have a lower memory usage?
>
>
> On Thu, May 10, 2012 at 4:14 AM, Bruce Wade < >wrote:
>
>> for row in db( db.file_properties.id > 0 ).select(
>
>
>
>
> -- 
>
> Bruno Rocha
> [http://rochacbruno.com.br]
>
>

Re: [web2py] Need a little help in code review (a function eating up all memory)

2012-05-10 Thread szimszon
I reduced the code in the controller to:

def autoadjust():
    lista = list()
    last_row = None
    next_page_number = 0
    for row in db.executesql( "select * from file_properties where id > 0" ):
        pass
    lista = TABLE( *lista )
    return dict( lista = lista )


And I still have a memleak :(

I start web2py:

USER   PID %CPU %MEMVSZ   RSS TTY  STAT START   TIME COMMAND
1008017340  4.6  0.3 156896 15372 pts/9Sl+  16:19   0:00 python ./
web2py.py

After first run:

USER   PID %CPU %MEMVSZ   RSS TTY  STAT START   TIME COMMAND
1008017340  0.7  2.1 237932 87304 pts/9Sl+  16:19   0:00 python ./
web2py.py

2nd run:
USER   PID %CPU %MEMVSZ   RSS TTY  STAT START   TIME COMMAND
1008017340  0.8  3.7 302328 151600 pts/9   Sl+  16:19   0:01 python ./
web2py.py

and so on... I'm a bit puzzled. I'm not an expert on Python memory 
problems :( So please be patient with me.
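Rather than eyeballing ps between requests, the process's peak RSS can be read from inside Python with the stdlib resource module (Unix-only; note that ru_maxrss is in KiB on Linux but in bytes on macOS). A sketch:

```python
import resource
import sys

def peak_rss_kib():
    """Peak resident set size of the current process, in KiB."""
    rss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    if sys.platform == "darwin":  # macOS reports bytes, Linux reports KiB
        rss //= 1024
    return rss

before = peak_rss_kib()
hoard = [object() for _ in range(200000)]  # allocate something noticeable
after = peak_rss_kib()

# Peak RSS is monotonic: it can only grow, which matches the
# "never released at the OS level" behaviour seen in the ps output.
print(after >= before)  # True
```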


On Thursday, May 10, 2012 at 15:40:36 UTC+2, rochacbruno wrote:
>
> Just for curiosity, what happens if you do it in pure sql?
>
> for row in db.executesql("select * from file_properties where id > 0"):
>
> # do something
>
>
> Does it have a lower memory usage?
>
>
> On Thu, May 10, 2012 at 4:14 AM, Bruce Wade <> wrote:
>
>> for row in db( db.file_properties.id > 0 ).select(
>
>
>
>
> -- 
>
> Bruno Rocha
> [http://rochacbruno.com.br]
>
>

Re: [web2py] Need a little help in code review (a function eating up all memory)

2012-05-10 Thread Bruno Rocha
Just out of curiosity, what happens if you do it in pure SQL?

for row in db.executesql("select * from file_properties where id > 0"):

# do something


Does it have a lower memory usage?


On Thu, May 10, 2012 at 4:14 AM, Bruce Wade  wrote:

> for row in db( db.file_properties.id > 0 ).select(




-- 

Bruno Rocha
[http://rochacbruno.com.br]


Re: [web2py] Need a little help in code review (a function eating up all memory)

2012-05-10 Thread Bruce Wade
In your for loop, every time you iterate, the object instance count is
increased. Now if you call that loop again before the Python garbage collector
has had time to release memory, your for loop will cause even more instances
to accumulate. Python will not release the memory for an object back to the OS
until every reference pointing to that object is deleted. This is what is known
as a memory leak, and yes, garbage-collected languages are at times the worst
for this situation if you don't watch how you code.

Now with a little research you will find you can make that for loop load
only one instance into memory instead of one instance per record in the
database. If you make it load only a single instance, your problem will go
away. I am pretty sure this is covered in the web2py book.
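The "one instance at a time" idea can be sketched with the stdlib sqlite3 module (the thread uses PostgreSQL/psycopg2, but the pattern is the same): iterating the cursor pulls rows from the driver incrementally, whereas fetchall() materializes every row as a Python object at once. The demo table below is a throwaway stand-in for the poster's file_properties:

```python
import sqlite3

# Build a small stand-in for the poster's file_properties table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE file_properties (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO file_properties (name) VALUES (?)",
                 [("file%06d" % i,) for i in range(10000)])

# Eager: every row becomes a live Python object at the same time.
all_rows = conn.execute("SELECT * FROM file_properties WHERE id > 0").fetchall()

# Lazy: only one row object needs to be alive per loop iteration.
count = 0
for row in conn.execute("SELECT * FROM file_properties WHERE id > 0"):
    count += 1  # ...do something with row...

print(count)  # 10000
```

With psycopg2 specifically, a named (server-side) cursor is what keeps the whole result set out of client memory; a default client-side cursor still transfers the entire result set before iteration begins.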

On Thu, May 10, 2012 at 6:14 AM, Bruce Wade  wrote:

> There is no guarantee that Python will reuse that exact same memory. It would
> also depend on how frequently you use that function.
>
> http://mg.pov.lt/blog/hunting-python-memleaks.html
>
>
> On Thu, May 10, 2012 at 12:57 AM, szimszon  wrote:
>
>> If I understand well then python release the mem. for reuse by itself but
>> not releasing at OS level. Fine.
>>
>> But then if I rerun the function I'll expect that the mem allocated for
>> python is not growing. Because python reuse it.
>>
>> I executed the controller function and my mem usage at OS level 2x as
>> after first run. It sounds not right.
>>
>> On Thursday, May 10, 2012 at 9:41:31 UTC+2, Bruce Wade wrote:
>>
>>> http://effbot.org/pyfaq/why-doesnt-python-release-the-memory-when-i-delete-a-large-object.htm
>>>
>>> That is a little more details
>>>
>>> On Thu, May 10, 2012 at 12:37 AM, szimszon  wrote:
>>>
 Should the garbage collector not free up the memory?

 On Thursday, May 10, 2012 at 9:28:48 UTC+2, Bruce Wade wrote:

> That is how python is. If you want something to clear the memory as
> soon as you are done with it you need C++ :D
>
> On Thu, May 10, 2012 at 12:27 AM, szimszon  wrote:
>
>> Yes I know but it happens over type the mem usage is linearly growing
>> and after the successful execution never released and that is why I ask 
>> :(
>>
>> On Thursday, May 10, 2012 at 9:14:14 UTC+2, Bruce Wade wrote:
>>
>>> WOW not a good idea:
>> for row in db( db.file_properties.id > 0 ).select(
>>>
>>> If you have a lot of records that is going to kill your memory.
>>>
>>> On Thu, May 10, 2012 at 12:10 AM, szimszon wrote:
>>>
 I wonder if somebody could help me.

 The following code has eaten up ~1,5GB ram and after ended
 successfully it is not released :(

 file_properties table has 137314 rows.

 Model:
 db.define_table( 'files_store',
 Field( 'name', 'string',
 label = 'Fájlnév' ,
 ),
 Field( 'content', 'upload',
 uploadseparate = True,
 autodelete = True,
 label = 'Aláírt fájl',
 ),
 Field( 'status', 'string',
 label = 'Állapot',
 requires = IS_IN_SET( [
 'empty', 'notsigned', 'senttosign', 'signed'],

 [ 'Üres', 'Aláiratlan', 'Aláírásra elküldve',
 'Aláírt'] ),
 represent = lambda r,row
 :T( r )
 ),
 Field( 'state', 'string',
 label = 'Státusz',
 requires = IS_IN_SET( [
 'ok', 'error'],

 [ 'Rendben', 'Hiba'] ),
 represent = lambda r,row
 :T( r )
 ),
 Field( 'data_count', 'integer',
 label = 'Adat
 bejegyzések',
 requires = IS_NOT_EMPTY
 (),
 default = 0,
 ),
 Field( 'dirname', 'string',
   

Re: [web2py] Need a little help in code review (a function eating up all memory)

2012-05-10 Thread Bruce Wade
There is no guarantee that Python will reuse that exact same memory. It would
also depend on how frequently you use that function.

http://mg.pov.lt/blog/hunting-python-memleaks.html
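The linked post hunts leaks with objgraph; the same before/after object-count diff can be done with nothing but the stdlib gc module. A minimal sketch (the Leaky class is a synthetic example, not from the thread):

```python
import gc
from collections import Counter

def live_type_counts():
    """Snapshot of live, gc-tracked objects, grouped by type name."""
    gc.collect()
    return Counter(type(o).__name__ for o in gc.get_objects())

class Leaky(object):
    pass

before = live_type_counts()
hoard = [Leaky() for _ in range(500)]   # simulate a structure that never dies
after = live_type_counts()

# Diffing the two snapshots points straight at what is accumulating.
growth = after["Leaky"] - before["Leaky"]
print(growth)  # 500
```

Taking one snapshot per request and diffing them is often enough to find which type keeps a controller's result set alive.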

On Thu, May 10, 2012 at 12:57 AM, szimszon  wrote:

> If I understand well then python release the mem. for reuse by itself but
> not releasing at OS level. Fine.
>
> But then if I rerun the function I'll expect that the mem allocated for
> python is not growing. Because python reuse it.
>
> I executed the controller function and my mem usage at OS level 2x as
> after first run. It sounds not right.
>
> On Thursday, May 10, 2012 at 9:41:31 UTC+2, Bruce Wade wrote:
>
>> http://effbot.org/pyfaq/why-doesnt-python-release-the-memory-when-i-delete-a-large-object.htm
>>
>> That is a little more details
>>
>> On Thu, May 10, 2012 at 12:37 AM, szimszon  wrote:
>>
>>> Should the garbage collector not free up the memory?
>>>
>>> On Thursday, May 10, 2012 at 9:28:48 UTC+2, Bruce Wade wrote:
>>>
 That is how python is. If you want something to clear the memory as
 soon as you are done with it you need C++ :D

 On Thu, May 10, 2012 at 12:27 AM, szimszon  wrote:

> Yes I know but it happens over type the mem usage is linearly growing
> and after the successful execution never released and that is why I ask :(
>
> On Thursday, May 10, 2012 at 9:14:14 UTC+2, Bruce Wade wrote:
>
>> WOW not a good idea:
>> for row in db( db.file_properties.id > 0 ).select(
>>
>> If you have a lot of records that is going to kill your memory.
>>
>> On Thu, May 10, 2012 at 12:10 AM, szimszon wrote:
>>
>>> I wonder if somebody could help me.
>>>
>>> The following code has eaten up ~1,5GB ram and after ended
>>> successfully it is not released :(
>>>
>>> file_properties table has 137314 rows.
>>>
>>> Model:
>>> db.define_table( 'files_store',
>>> Field( 'name', 'string',
>>> label = 'Fájlnév' ,
>>> ),
>>> Field( 'content', 'upload',
>>> uploadseparate = True,
>>> autodelete = True,
>>> label = 'Aláírt fájl',
>>> ),
>>> Field( 'status', 'string',
>>> label = 'Állapot',
>>> requires = IS_IN_SET( [
>>> 'empty', 'notsigned', 'senttosign', 'signed'],
>>>
>>> [ 'Üres', 'Aláiratlan', 'Aláírásra elküldve',
>>> 'Aláírt'] ),
>>> represent = lambda r,row
>>> :T( r )
>>> ),
>>> Field( 'state', 'string',
>>> label = 'Státusz',
>>> requires = IS_IN_SET( [
>>> 'ok', 'error'],
>>>
>>> [ 'Rendben', 'Hiba'] ),
>>> represent = lambda r,row
>>> :T( r )
>>> ),
>>> Field( 'data_count', 'integer',
>>> label = 'Adat
>>> bejegyzések',
>>> requires = IS_NOT_EMPTY
>>> (),
>>> default = 0,
>>> ),
>>> Field( 'dirname', 'string',
>>> label = 'Könyvtár név' ,
>>> requires = IS_NOT_EMPTY
>>> (),
>>> ),
>>> Field( 'md5sum', 'string',
>>> label = T( 'MD5 Sum' ),
>>> represent = lambda md5sum, junk: '[%s]' % md5sum,
>>> writable = False ),
>>> auth.signature,
>>> format = '%(dirname)s/%(name)s (%(id)s)'
>>>  )
>>> db.files_store.created_on.readable = True
>>> db.define_table( 'file_properties',
>>> Field( 'files_store_id', db.
>>> files_store,
>>> label = 'Aláírt fájl' ,
>>>

Re: [web2py] Need a little help in code review (a function eating up all memory)

2012-05-10 Thread szimszon
If I understand well, then Python releases the memory for reuse by itself but 
does not release it at the OS level. Fine.

But then if I rerun the function, I'd expect the memory allocated to Python 
not to grow, because Python reuses it.

I executed the controller function again and my memory usage at the OS level 
was 2x what it was after the first run. That does not sound right.

On Thursday, May 10, 2012 at 9:41:31 UTC+2, Bruce Wade wrote:
>
>
> http://effbot.org/pyfaq/why-doesnt-python-release-the-memory-when-i-delete-a-large-object.htm
>
> That is a little more details
>
> On Thu, May 10, 2012 at 12:37 AM, szimszon  wrote:
>
>> Should the garbage collector not free up the memory?
>>
>> On Thursday, May 10, 2012 at 9:28:48 UTC+2, Bruce Wade wrote:
>>
>>> That is how python is. If you want something to clear the memory as soon 
>>> as you are done with it you need C++ :D
>>>
>>> On Thu, May 10, 2012 at 12:27 AM, szimszon  wrote:
>>>
 Yes I know but it happens over type the mem usage is linearly growing 
 and after the successful execution never released and that is why I ask :(

 On Thursday, May 10, 2012 at 9:14:14 UTC+2, Bruce Wade wrote:

> WOW not a good idea: 
> for row in db( db.file_propert­ies.id > 0 ).select(
>
> If you have a lot of records that is going to kill your memory.
>
> On Thu, May 10, 2012 at 12:10 AM, szimszon  wrote:
>
>> I wonder if somebody could help me.
>>
>> The following code has eaten up ~1.5 GB of RAM, and after it ended
>> successfully the memory is not released :(
>>
>> file_properties table has 137314 rows.
>>
>> Model: [snip]

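The effbot article linked above is the crux of this exchange: CPython frees objects internally, but its allocator does not necessarily hand the pages back to the OS, so resident memory measured with `top` can stay high even when nothing is leaked. A minimal stand-alone sketch of that distinction (all names here are illustrative, not from the thread):

```python
import gc
import sys

def build_and_release(n=100000):
    # Allocate a large list of dicts, roughly what a big query result
    # holds, then drop the only reference and force a collection.
    rows = [{'id': i, 'name': 'row%d' % i} for i in range(n)]
    list_size = sys.getsizeof(rows)   # size of the list object itself
    del rows                          # objects become unreachable here
    unreachable = gc.collect()        # reclaims reference cycles, if any
    return list_size, unreachable

list_size, unreachable = build_and_release()
# The dicts are freed inside the interpreter, but CPython's allocator may
# keep the pages pooled instead of returning them to the OS, so `top` can
# still report a high resident size even though nothing leaked.
```

This is why a one-off jump in resident size after a heavy request is not, by itself, evidence of a leak.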
Re: [web2py] Need a little help in code review (a function eating up all memory)

2012-05-10 Thread Bruce Wade
http://effbot.org/pyfaq/why-doesnt-python-release-the-memory-when-i-delete-a-large-object.htm

That has a little more detail.

On Thu, May 10, 2012 at 12:37 AM, szimszon  wrote:

> Should the garbage collector not free up the memory?
>
> On Thursday, May 10, 2012 at 9:28:48 AM UTC+2, Bruce Wade wrote:
>
>> That is how Python is. If you want something that clears the memory as
>> soon as you are done with it, you need C++ :D
>>
>> On Thu, May 10, 2012 at 12:27 AM, szimszon  wrote:
>>
>>> Yes, I know, but it happens over time: the memory usage grows linearly
>>> and is never released after the execution finishes, which is why I ask :(
>>>
>>> On Thursday, May 10, 2012 at 9:14:14 AM UTC+2, Bruce Wade wrote:
>>>
 WOW not a good idea:
 for row in db( db.file_properties.id > 0 ).select(

 If you have a lot of records that is going to kill your memory.

 On Thu, May 10, 2012 at 12:10 AM, szimszon  wrote:

> I wonder if somebody could help me.
>
> The following code has eaten up ~1.5 GB of RAM, and after it ended
> successfully the memory is not released :(
>
> file_properties table has 137314 rows.
>
> Model: [snip]

Re: [web2py] Need a little help in code review (a function eating up all memory)

2012-05-10 Thread szimszon
Should the garbage collector not free up the memory?

On Thursday, May 10, 2012 at 9:28:48 AM UTC+2, Bruce Wade wrote:
>
> That is how Python is. If you want something that clears the memory as
> soon as you are done with it, you need C++ :D
>
> On Thu, May 10, 2012 at 12:27 AM, szimszon  wrote:
>
>> Yes, I know, but it happens over time: the memory usage grows linearly
>> and is never released after the execution finishes, which is why I ask :(
>>
>> On Thursday, May 10, 2012 at 9:14:14 AM UTC+2, Bruce Wade wrote:
>>
>>> WOW not a good idea: 
>>> for row in db( db.file_properties.id > 0 ).select(
>>>
>>> If you have a lot of records that is going to kill your memory.
>>>
>>> On Thu, May 10, 2012 at 12:10 AM, szimszon  wrote:
>>>
 I wonder if somebody could help me.

 The following code has eaten up ~1.5 GB of RAM, and after it ended
 successfully the memory is not released :(

 file_properties table has 137314 rows.

 Model: [snip]

Re: [web2py] Need a little help in code review (a function eating up all memory)

2012-05-10 Thread Bruce Wade
That is how Python is. If you want something that clears the memory as soon
as you are done with it, you need C++ :D

On Thu, May 10, 2012 at 12:27 AM, szimszon  wrote:

> Yes, I know, but it happens over time: the memory usage grows linearly
> and is never released after the execution finishes, which is why I ask :(
>
> On Thursday, May 10, 2012 at 9:14:14 AM UTC+2, Bruce Wade wrote:
>
>> WOW not a good idea:
>> for row in db( db.file_properties.id > 0 ).select(
>>
>> If you have a lot of records that is going to kill your memory.
>>
>> On Thu, May 10, 2012 at 12:10 AM, szimszon  wrote:
>>
>>> I wonder if somebody could help me.
>>>
>>> The following code has eaten up ~1.5 GB of RAM, and after it ended
>>> successfully the memory is not released :(
>>>
>>> file_properties table has 137314 rows.
>>>
>>> Model: [snip]

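One way to tell a genuine reference leak (the fear in this thread) from allocator pooling is to measure Python-level allocations instead of OS resident size. A sketch using the standard-library `tracemalloc` module (added in Python 3.4, so newer than this 2012 thread; the helper name is hypothetical):

```python
import tracemalloc

def traced_sizes(work, repeats=3):
    # Run `work` several times and record live traced bytes after each run.
    # If the numbers plateau instead of growing linearly, objects are being
    # reclaimed, and any RSS growth seen in `top` is allocator pooling
    # rather than a leak.
    tracemalloc.start()
    sizes = []
    for _ in range(repeats):
        work()
        current, peak = tracemalloc.get_traced_memory()
        sizes.append(current)
    tracemalloc.stop()
    return sizes

# Simulate a controller that builds and discards a large result per call.
sizes = traced_sizes(lambda: [i * i for i in range(50000)])
```

A linearly growing `sizes` list would point at something genuinely holding references across calls; a flat one matches the plateau szimszon eventually observed.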
Re: [web2py] Need a little help in code review (a function eating up all memory)

2012-05-10 Thread szimszon
Yes, I know, but it happens over time: the memory usage grows linearly and
is never released after the execution finishes, which is why I ask :(

On Thursday, May 10, 2012 at 9:14:14 AM UTC+2, Bruce Wade wrote:
>
> WOW not a good idea: 
> for row in db( db.file_properties.id > 0 ).select(
>
> If you have a lot of records that is going to kill your memory.
>
> On Thu, May 10, 2012 at 12:10 AM, szimszon  wrote:
>
>> I wonder if somebody could help me.
>>
>> The following code has eaten up ~1.5 GB of RAM, and after it ended
>> successfully the memory is not released :(
>>
>> file_properties table has 137314 rows.
>>
>> Model: [snip]

Re: [web2py] Need a little help in code review (a function eating up all memory)

2012-05-10 Thread Bruce Wade
WOW not a good idea:
for row in db( db.file_properties.id > 0 ).select(

If you have a lot of records that is going to kill your memory.

On Thu, May 10, 2012 at 12:10 AM, szimszon  wrote:

> I wonder if somebody could help me.
>
> The following code has eaten up ~1.5 GB of RAM, and after it ended
> successfully the memory is not released :(
>
> file_properties table has 137314 rows.
>
> Model:
> db.define_table( 'files_store',
>     Field( 'name', 'string',
>            label = 'Fájlnév' ),
>     Field( 'content', 'upload',
>            uploadseparate = True,
>            autodelete = True,
>            label = 'Aláírt fájl' ),
>     Field( 'status', 'string',
>            label = 'Állapot',
>            requires = IS_IN_SET( ['empty', 'notsigned', 'senttosign', 'signed'],
>                                  ['Üres', 'Aláiratlan', 'Aláírásra elküldve', 'Aláírt'] ),
>            represent = lambda r, row: T( r ) ),
>     Field( 'state', 'string',
>            label = 'Státusz',
>            requires = IS_IN_SET( ['ok', 'error'],
>                                  ['Rendben', 'Hiba'] ),
>            represent = lambda r, row: T( r ) ),
>     Field( 'data_count', 'integer',
>            label = 'Adat bejegyzések',
>            requires = IS_NOT_EMPTY(),
>            default = 0 ),
>     Field( 'dirname', 'string',
>            label = 'Könyvtár név',
>            requires = IS_NOT_EMPTY() ),
>     Field( 'md5sum', 'string',
>            label = T( 'MD5 Sum' ),
>            represent = lambda md5sum, junk: '[%s]' % md5sum,
>            writable = False ),
>     auth.signature,
>     format = '%(dirname)s/%(name)s (%(id)s)'
> )
> db.files_store.created_on.readable = True
> db.define_table( 'file_properties',
>     Field( 'files_store_id', db.files_store,
>            label = 'Aláírt fájl' ),
>     Field( 'kezdo_oldal', 'integer',
>            label = 'Számla kezdő oldal',
>            requires = IS_NOT_EMPTY() ),
>     Field( 'oldalszam', 'integer',
>            label = 'Számla oldalainak száma',
>            requires = IS_NOT_EMPTY() ),
>     Field( 'szamla_sorszama', 'string',
>            label = 'Számla sorszáma',
>            #requires = IS_NOT_IN_DB( db, 'file_properties' ),
>            unique = True ),
>     Field( 'ugyfel_azonosito', 'string',
>            label = 'Ügyfél azonosító',
>            requires = IS_NOT_EMPTY() ),
>     Field( 'folyoszamla_azonosito', 'string',
>            label = 'Folyószámla azonosító',
>            requires = IS_NOT_EMPTY() ),
>     Field( 'teljesites_idopontja', 'date',
>            label = 'Teljesítés időpontja',
>            requires = IS_DATE( '%Y.%m.%d.' ) ),
>     auth.signature,
> )
>
> Controller:
>
> def autoadjust():
>     lista = list()
>     last_row = None
>     next_page_number = 0
>     for row in db( db.file_properties.id > 0 ).select(
>         orderby = ( db.file_properties.files_sto
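The remedy suggested elsewhere in this thread (both the raw-SQL LIMIT loop and its DAL equivalent) is to page through the table so that only one chunk of rows is in memory at a time, instead of selecting all 137314 records at once. A generic sketch of that pattern; the web2py-specific call appears only as a comment, and the helper names are illustrative:

```python
def iter_in_chunks(fetch_page, chunk_size=1000):
    # Yield records page by page, so at most one chunk is resident at once.
    # With the web2py DAL, fetch_page could (hypothetically) be:
    #   lambda offset, count: db(db.file_properties.id > 0).select(
    #       orderby=db.file_properties.id,
    #       limitby=(offset, offset + count))
    offset = 0
    while True:
        page = fetch_page(offset, chunk_size)
        if not page:
            break
        for record in page:
            yield record
        offset += chunk_size

# Demo against an in-memory stand-in for a 2500-row table:
data = list(range(2500))
fetch = lambda offset, count: data[offset:offset + count]
total = sum(1 for _ in iter_in_chunks(fetch, chunk_size=1000))  # sees all 2500 rows
```

`limitby=(min, max)` is a standard argument to the DAL's `select`; a stable `orderby` is needed so consecutive pages do not skip or repeat rows.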