Re: [web2py] Python Performance Issue, Part 2

2014-03-22 Thread horridohobbyist
Something very strange is going on. After I've run the Welcome test where 
the results are consistently fast (i.e., ~1.6 seconds), if I wait an hour or 
so and run the test again, I get something like the following:

Begin...
Elapsed time: 97.1873888969
Percentage fill: 41.9664268585
Begin...
Elapsed time: 1.63321781158
Percentage fill: 41.9664268585
Begin...
Elapsed time: 13.2418119907
Percentage fill: 41.9664268585
Begin...
Elapsed time: 1.62313604355
Percentage fill: 41.9664268585
Begin...
Elapsed time: 13.3058979511
Percentage fill: 41.9664268585

The first run is ENORMOUSLY slow. Subsequent runs alternate 
between fast and slow (i.e., 1.6 seconds vs. 13 seconds).
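Output in that shape can be produced by a plain wall-clock harness like the sketch below (hypothetical; the actual Welcome-test code was never posted in this thread):

```python
import time

def timed_run(fn, *args):
    """Run fn(*args) once and report wall-clock time in the format quoted above."""
    print("Begin...")
    start = time.time()
    result = fn(*args)
    print("Elapsed time:", time.time() - start)
    return result

# Placeholder workload standing in for the real Welcome-test request.
total = timed_run(sum, range(1000000))
```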

To reiterate: this happens if I give the server lots of time before I 
resume testing. Please note that nothing much else is happening on the 
server; it gets very little traffic.

If I restart Apache, then I get back to the initial situation where the 
results are consistently fast. *This pattern is repeatable*.

FYI, I'm using processes=2 and threads=1.
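For context, a processes=2/threads=1 setup corresponds to a mod_wsgi daemon directive along these lines (a sketch only; the actual web2py vhost file isn't shown in this thread, and the user/group and paths are assumptions modelled on the Flask config quoted later in the thread):

```apache
# Hypothetical excerpt from the web2py virtual host:
# two daemon processes with one thread each.
WSGIDaemonProcess web2py user=www-data group=www-data processes=2 threads=1
WSGIProcessGroup web2py
WSGIScriptAlias / /home/www-data/web2py/wsgihandler.py
```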


On Thursday, 20 March 2014 11:34:03 UTC-4, horridohobbyist wrote:

 processes=1 and threads=30 also seems to solve the performance problem.

 BTW, I'm having a dickens of a time reproducing the problem in my servers 
 (either the actual server or the VM). I have not been able to discover how 
 to reset the state of my tests, so I have to blindly go around trying to 
 reproduce the problem. I thought it might be a caching problem in the 
 browser, but clearing the browser cache doesn't seem to reset the state. 
 Restarting Apache doesn't always reset the state, either. Restarting the 
 browser doesn't reset the state. In desperation, I've even tried rebooting 
 the systems. Nada.

 This is very frustrating. I shall have to continue my investigation before 
 coming to a definitive conclusion.


 On Wednesday, 19 March 2014 21:06:02 UTC-4, Tim Richardson wrote:

 Try threads = 30 or 50 or 100; that would be interesting. Every request 
 which is routed through web2py will try to start a new thread. Every web 
 page will potentially generate multiple requests (for assets like images, 
 scripts etc). So you can potentially need a lot of threads. When you 
 started two processes, you may not have specified threads which meant you 
 had a pool of 30 threads (and then you saw better performance). Using fewer 
 threads than that isn't going to let you conclude very much, I think.



-- 
Resources:
- http://web2py.com
- http://web2py.com/book (Documentation)
- http://github.com/web2py/web2py (Source code)
- https://code.google.com/p/web2py/issues/list (Report Issues)
--- 
You received this message because you are subscribed to the Google Groups 
web2py-users group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to web2py+unsubscr...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.


Re: [web2py] Python Performance Issue, Part 2

2014-03-22 Thread horridohobbyist
Scratch my solution. It's not correct. My test results are all over the 
place. You don't even have to wait an hour. Within the span of 15 minutes, 
I've gone from fast, fast, fast, fast, fast, fast to super-slow (90+ 
seconds), super-slow to slow, slow, slow, slow. The variability seems to be 
pseudo-random.

I should also mention that threads=30 doesn't always work. This is 
probably part of the pseudo-random nature of the problem.

I don't think the solution lies in configuring processes and threads in 
the Apache web2py configuration. At this point, I don't know what else to 
do or try.




Re: [web2py] Python Performance Issue, Part 2

2014-03-22 Thread Massimo Di Pierro
Have you checked memory consumption?



Re: [web2py] Python Performance Issue, Part 2

2014-03-22 Thread horridohobbyist
Well, according to the 'free' command, even when I'm getting these 
slowdowns, I'm nowhere close to the memory limits:

             total       used       free     shared    buffers     cached
Mem:       3925244     392900    3532344          0      23608     123856

Like I said, my Linux server doesn't do much. It doesn't get much traffic, 
either. So it has plenty of free memory.
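As a side note, the 'used' column of free includes buffers and page cache, so the memory genuinely available to applications is free + buffers + cached. A small sketch using the numbers above:

```python
# Compute memory actually available to applications from a 'free' (kB) Mem: row:
# 'used' counts buffers and page cache, so available = free + buffers + cached.
def available_kb(total, used, free, shared, buffers, cached):
    assert total == used + free  # 'free' reports total = used + free
    return free + buffers + cached

# Figures taken from the output quoted above (kB).
print(available_kb(3925244, 392900, 3532344, 0, 23608, 123856))  # 3679808 kB, ~3.7 GB
```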




Re: [web2py] Python Performance Issue, Part 2

2014-03-22 Thread horridohobbyist
I'm considering delving into DTrace to find out what's going on, but such 
instrumentation is apparently very problematic on Linux (e.g., poor 
support, poor documentation). Is there any other way to find out what 
the hell is going on?
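One lighter-weight alternative to DTrace (my suggestion, not something tried in the thread) is to profile inside the Python process with the standard-library cProfile, e.g. wrapped around the suspect controller call:

```python
import cProfile
import io
import pstats

def profile_call(fn, *args, **kwargs):
    """Run fn under cProfile and return (result, top-10 cumulative-time report)."""
    prof = cProfile.Profile()
    result = prof.runcall(fn, *args, **kwargs)
    buf = io.StringIO()
    pstats.Stats(prof, stream=buf).sort_stats("cumulative").print_stats(10)
    return result, buf.getvalue()

# Example: profile a placeholder standing in for the slow request handler.
value, report = profile_call(sum, range(100000))
```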




Re: [web2py] Python Performance Issue, Part 2

2014-03-22 Thread horridohobbyist
I don't understand why the Flask version of the Welcome test doesn't 
exhibit this slowdown under Apache. It's executing the same application 
code. It's configured with the same processes=1 and threads=1 WSGI 
parameters. It's running the same Python interpreter (and presumably using 
the same GIL).

I'm not sure it's entirely Apache's fault. I suspect it's in the 
*interaction* between Apache and web2py. The interaction between Apache and 
Flask seems to avoid this problem. However, I am ill-equipped to follow up 
on this.
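For what it's worth, the GIL's effect on CPU-bound work is easy to demonstrate in isolation: in CPython, two threads doing pure computation take roughly as long as running the same work serially (a standalone sketch, independent of web2py's internals):

```python
import threading

def burn(n, out, i):
    # CPU-bound loop; under the GIL, two threads of this effectively run serially.
    acc = 0
    for k in range(n):
        acc += k
    out[i] = acc

results = [None, None]
threads = [threading.Thread(target=burn, args=(200000, results, i))
           for i in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# Both threads compute the same sum; wall time is ~2x one thread's, not ~1x.
```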



Re: [web2py] Python Performance Issue, Part 2

2014-03-20 Thread horridohobbyist
processes=1 and threads=30 also seems to solve the performance problem.

BTW, I'm having a dickens of a time reproducing the problem in my servers 
(either the actual server or the VM). I have not been able to discover how 
to reset the state of my tests, so I have to blindly go around trying to 
reproduce the problem. I thought it might be a caching problem in the 
browser, but clearing the browser cache doesn't seem to reset the state. 
Restarting Apache doesn't always reset the state, either. Restarting the 
browser doesn't reset the state. In desperation, I've even tried rebooting 
the systems. Nada.

This is very frustrating. I shall have to continue my investigation before 
coming to a definitive conclusion.




Re: [web2py] Python Performance Issue, Part 2

2014-03-20 Thread 黄祥
I think it would be clearer for other users if you could also provide the 
configuration and the procedures for what you are doing.

best regards,
stifan



Re: [web2py] Python Performance Issue, Part 2

2014-03-19 Thread Michele Comitini
If threads=0 does not work, use threads=1 and make mod_wsgi happy. If
you omit the threads option, it defaults to 15.

2014-03-19 4:34 GMT+01:00 horridohobbyist horrido.hobb...@gmail.com:
 threads=0 is no good; Apache upchucks on restart with it.

 BTW, I haven't experimented with the threads value. Might this also improve
 performance (with respect to GIL)?

 Also, I was wondering. Is the processes= solution related to whether you
 are using the prefork MPM or the worker MPM? I know that Apache is
 normally compiled to use the prefork MPM.


 On Tuesday, 18 March 2014 16:26:24 UTC-4, Michele Comitini wrote:

  WSGIDaemonProcess hello user=www-data group=www-data threads=5

 with web2py try the following instead:
 WSGIDaemonProcess hello user=www-data group=www-data processes=number
 of cores + 1 threads=(0 or 1)

 If it's faster, then the GIL must be the cause.  flask by default has
 much less features active (session for instance)



 2014-03-18 21:04 GMT+01:00 horridohobbyist horrido...@gmail.com:
  I took the shipping code that I ran in Flask (without Apache) and
  adapted it
  to run under Apache as a Flask app. That way, I'm comparing apples to
  apples. I'm comparing the performance of the shipping code between Flask
  and
  web2py.
 
  Below, I've included the 'default' file from Apache2/sites-available for
  Flask.
 
  Basically, the code in Flask executes 10x faster than the same code in
  web2py. So my question is:  if Apache is at fault for the web2py app's
  slow
  performance, why doesn't Apache hurt the Flask app's performance? (This
  doesn't seem to be related to GIL or WSGI.)
 
 
  <VirtualHost *:80>
    ServerName 10.211.55.7
    WSGIDaemonProcess hello user=www-data group=www-data threads=5
    WSGIScriptAlias / /home/richard/welcome/hello.wsgi
 
    <Directory /home/richard/welcome>
      Order Allow,Deny
      Allow from all
    </Directory>
  </VirtualHost>
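(Editorial aside, not part of the original message: mod_wsgi requires the .wsgi file to expose a callable named application; with Flask that is simply the app object, e.g. from hello import app as application. The real hello.wsgi was never posted; a self-contained stand-in for what mod_wsgi invokes looks like this:)

```python
# Minimal WSGI callable, a hedged stand-in for the real hello.wsgi.
# mod_wsgi calls 'application' once per request.
def application(environ, start_response):
    body = b"hello from WSGI\n"
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]
```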
 


Re: [web2py] Python Performance Issue, Part 2

2014-03-19 Thread Tim Richardson
Did you explicitly set the number of threads as well? By default you get 15 
threads per process. The documentation implies that this is a hard limit, 
but I'm not sure.
Maybe you have simply found a bottleneck in threads. Did you also try 
increasing the number of threads instead of adding more processes? 
Multi-threaded Apache is supposed to be faster than multi-process Apache 
under real load (i.e., multiple users) because starting processes is 
expensive in time and memory.
So any conclusion that you need more processes is dubious, I think. I can't 
recall how many simultaneous users your benchmarking is testing. 
Bear in mind that the fastest servers, the greenlet or co-operative async 
servers, are not only limited to one process, but even to one thread. 

On Wednesday, 19 March 2014 14:25:47 UTC+11, horridohobbyist wrote:

 I shall do that. Thanks.

 With the knowledge about processes=, I've tuned my actual Linux server 
 to eliminate the 10x slowdown. As it turns out, for my 2.4GHz quad-core 
 Xeon with 4GB RAM, processes=2 works best. I found that any other value 
 (3, 4, 5) gave very inconsistent results: sometimes I would get 1x (the 
 ideal) and sometimes I would get 10x. Very bizarre.

 processes=2 is counter-intuitive. After all, I have 4 cores. Why 
 shouldn't processes=4 be good?

 Anyway, not only is the shipping code fast, but I find that my overall 
 web2py app feels a lot snappier. Is it just my imagination?

 If processes=2 is boosting the speed of Python in general, then you 
 would expect all of web2py to benefit. So maybe it's not my imagination.

 Anyway, the takeaway, I think, is that you must tune the Apache 
 configuration for the particular server hardware that you have. The default 
 processes=1 is not good enough.


 On Tuesday, 18 March 2014 22:37:58 UTC-4, Massimo Di Pierro wrote:

 Thank you for all your tests. You should write a summary of your results 
 with recommendations for Apache users.

 On Tuesday, 18 March 2014 19:44:29 UTC-5, horridohobbyist wrote:

 Done. With processes=3, the 10x discrepancy is eliminated! (And this is 
 in a Linux VM configured for 1 CPU.)


 On Tuesday, 18 March 2014 16:26:24 UTC-4, Michele Comitini wrote:

  WSGIDaemonProcess hello user=www-data group=www-data threads=5 

 with web2py try the following instead: 
 WSGIDaemonProcess hello user=www-data group=www-data processes=number 
 of cores + 1 threads=(0 or 1) 

 If it's faster, then the GIL must be the cause.  flask by default has 
 much less features active (session for instance) 



 2014-03-18 21:04 GMT+01:00 horridohobbyist horrido...@gmail.com: 
  I took the shipping code that I ran in Flask (without Apache) and 
 adapted it 
  to run under Apache as a Flask app. That way, I'm comparing apples to 
  apples. I'm comparing the performance of the shipping code between 
 Flask and 
  web2py. 
  
  Below, I've included the 'default' file from Apache2/sites-available 
 for 
  Flask. 
  
  Basically, the code in Flask executes 10x faster than the same code 
 in 
  web2py. So my question is:  if Apache is at fault for the web2py 
 app's slow 
  performance, why doesn't Apache hurt the Flask app's performance? 
 (This 
  doesn't seem to be related to GIL or WSGI.) 
  
  
  <VirtualHost *:80> 
    ServerName 10.211.55.7 
    WSGIDaemonProcess hello user=www-data group=www-data threads=5 
    WSGIScriptAlias / /home/richard/welcome/hello.wsgi 
  
    <Directory /home/richard/welcome> 
      Order Allow,Deny 
      Allow from all 
    </Directory> 
  </VirtualHost> 
  





Re: [web2py] Python Performance Issue, Part 2

2014-03-19 Thread Michele Comitini
 Multi-threaded apache is supposed to be faster than multi-process apache 
 under real load (i.e. multiple users) because starting processes is expensive 
 in time and memory.


IMHO under Linux the difference is really negligible. The popularity of
threads rose in the mid '90s because a very popular OS was not able to
fork properly. Java developed a threading API, instead of a
multiprocess and message-passing API, as a consequence of that flaw.
Today there is little need for threading in general concurrent programming,
unless one is stuck in Java.
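
Michele's point about the GIL can be illustrated with a small, self-contained experiment (my sketch, not code from this thread): for pure-Python CPU-bound work, adding threads buys nothing under CPython, which is consistent with the earlier finding that threads=3 did not help.

```python
import threading
import time

def cpu_work(n):
    # Pure-Python CPU-bound loop; CPython's GIL is held while this
    # bytecode runs, so two such loops in different threads cannot
    # actually execute in parallel.
    total = 0
    for i in range(n):
        total += i * i
    return total

def run_sequential(jobs, n):
    start = time.time()
    results = [cpu_work(n) for _ in range(jobs)]
    return results, time.time() - start

def run_threaded(jobs, n):
    results = [None] * jobs
    def worker(idx):
        results[idx] = cpu_work(n)
    threads = [threading.Thread(target=worker, args=(i,)) for i in range(jobs)]
    start = time.time()
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results, time.time() - start

if __name__ == '__main__':
    seq_results, seq_time = run_sequential(4, 200000)
    thr_results, thr_time = run_threaded(4, 200000)
    # The threaded run returns the same answers but, being CPU-bound,
    # takes about as long as (often longer than) the sequential run.
    print('sequential: %.3fs  threaded: %.3fs' % (seq_time, thr_time))
```

Multiple daemon processes sidestep this because each process has its own interpreter and its own GIL.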



2014-03-19 10:24 GMT+01:00 Tim Richardson t...@growthpath.com.au:
 Did you explicitly set the number of threads as well? By default you get 15
 threads per process. The documentation implies that this is a hard limit,
 but I'm not sure.
 Maybe you have simply found a bottleneck in threads. Did you also try
 increasing the number of threads instead of adding more processes?
 Multi-threaded apache is supposed to be faster than multi-process apache
 under real load (i.e. multiple users) because starting processes is
 expensive in time and memory.
 So any conclusion that you need more processes is dubious, I think. I can't
 recall how many simultaneous users your benchmarking is testing.
 Bear in mind that the fastest servers, the greenlet or co-operative async
 servers, are not only limited to one process, but even to one thread.











 On Wednesday, 19 March 2014 14:25:47 UTC+11, horridohobbyist wrote:

 I shall do that. Thanks.

 With the knowledge about processes=, I've tuned my actual Linux server
 to eliminate the 10x slowdown. As it turns out, for my 2.4GHz quad-core Xeon
 with 4GB RAM, processes=2 works best. I found that any other value (3, 4,
 5) gave very inconsistent results–sometimes I would get 1x (the ideal) and
 sometimes I would get 10x. Very bizarre.

 processes=2 is counter-intuitive. After all, I have 4 cores. Why
 shouldn't processes=4 be good?

 Anyway, not only is the shipping code fast, but I find that my overall
 web2py app feels a lot snappier. Is it just my imagination?

 If processes=2 is boosting the speed of Python in general, then you
 would expect all of web2py to benefit. So maybe it's not my imagination.

 Anyway, the takeaway, I think, is that you must tune the Apache
 configuration for the particular server hardware that you have. The default
 processes=1 is not good enough.


 On Tuesday, 18 March 2014 22:37:58 UTC-4, Massimo Di Pierro wrote:

 Thank you for all your tests. You should write a summary of your results
 with recommendations for Apache users.

 On Tuesday, 18 March 2014 19:44:29 UTC-5, horridohobbyist wrote:

 Done. With processes=3, the 10x discrepancy is eliminated! (And this is
 in a Linux VM configured for 1 CPU.)


 On Tuesday, 18 March 2014 16:26:24 UTC-4, Michele Comitini wrote:

  WSGIDaemonProcess hello user=www-data group=www-data threads=5

 with web2py try the following instead:
 WSGIDaemonProcess hello user=www-data group=www-data processes=number
 of cores + 1 threads=(0 or 1)

 If it's faster, then the GIL must be the cause.  flask by default has
 much less features active (session for instance)



 2014-03-18 21:04 GMT+01:00 horridohobbyist horrido...@gmail.com:
  I took the shipping code that I ran in Flask (without Apache) and
  adapted it
  to run under Apache as a Flask app. That way, I'm comparing apples to
  apples. I'm comparing the performance of the shipping code between
  Flask and
  web2py.
 
  Below, I've included the 'default' file from Apache2/sites-available
  for
  Flask.
 
  Basically, the code in Flask executes 10x faster than the same code
  in
  web2py. So my question is:  if Apache is at fault for the web2py
  app's slow
  performance, why doesn't Apache hurt the Flask app's performance?
  (This
  doesn't seem to be related to GIL or WSGI.)
 
 
  <VirtualHost *:80>
    ServerName 10.211.55.7
    WSGIDaemonProcess hello user=www-data group=www-data threads=5
    WSGIScriptAlias / /home/richard/welcome/hello.wsgi
 
    <Directory /home/richard/welcome>
      Order Allow,Deny
      Allow from all
    </Directory>
  </VirtualHost>
 

Re: [web2py] Python Performance Issue, Part 2

2014-03-19 Thread horridohobbyist
Yes, processes=3 and threads=1.

I tried processes=1 and threads=3, and performance was still 10x bad. 
So I guess that answers my question:  the threads parameter is not helpful.


On Wednesday, 19 March 2014 05:24:01 UTC-4, Tim Richardson wrote:

 Did you explicitly set the number of threads as well? By default you get 
 15 threads per process. The documentation implies that this is a hard 
 limit, but I'm not sure.
 Maybe you have simply found a bottleneck in threads. Did you also try 
 increasing the number of threads instead of adding more processes? 
 Multi-threaded apache is supposed to be faster than multi-process apache 
 under real load (i.e. multiple users) because starting (and switching) 
 processes is expensive in time and memory.*
 So any conclusion that you need more processes is dubious, I think. I 
 can't recall how many simultaneous users your benchmarking is testing. 
 Bear in mind that the fastest servers, the greenlet or co-operative async 
 servers, are not only limited to one process, but even to one thread. 

 http://nichol.as/benchmark-of-python-web-servers










 On Wednesday, 19 March 2014 14:25:47 UTC+11, horridohobbyist wrote:

 I shall do that. Thanks.

 With the knowledge about processes=, I've tuned my actual Linux server 
 to eliminate the 10x slowdown. As it turns out, for my 2.4GHz quad-core 
 Xeon with 4GB RAM, processes=2 works best. I found that any other value 
 (3, 4, 5) gave very inconsistent results–sometimes I would get 1x (the 
 ideal) and sometimes I would get 10x. Very bizarre.

 processes=2 is counter-intuitive. After all, I have 4 cores. Why 
 shouldn't processes=4 be good?

 Anyway, not only is the shipping code fast, but I find that my overall 
 web2py app feels a lot snappier. Is it just my imagination?

 If processes=2 is boosting the speed of Python in general, then you 
 would expect all of web2py to benefit. So maybe it's not my imagination.

 Anyway, the takeaway, I think, is that you must tune the Apache 
 configuration for the particular server hardware that you have. The default 
 processes=1 is not good enough.


 On Tuesday, 18 March 2014 22:37:58 UTC-4, Massimo Di Pierro wrote:

 Thank you for all your tests. You should write a summary of your results 
 with recommendations for Apache users.

 On Tuesday, 18 March 2014 19:44:29 UTC-5, horridohobbyist wrote:

 Done. With processes=3, the 10x discrepancy is eliminated! (And this is 
 in a Linux VM configured for 1 CPU.)


 On Tuesday, 18 March 2014 16:26:24 UTC-4, Michele Comitini wrote:

  WSGIDaemonProcess hello user=www-data group=www-data threads=5 

 with web2py try the following instead: 
 WSGIDaemonProcess hello user=www-data group=www-data processes=number 
 of cores + 1 threads=(0 or 1) 

 If it's faster, then the GIL must be the cause.  flask by default has 
 much less features active (session for instance) 



 2014-03-18 21:04 GMT+01:00 horridohobbyist horrido...@gmail.com: 
  I took the shipping code that I ran in Flask (without Apache) and 
 adapted it 
  to run under Apache as a Flask app. That way, I'm comparing apples 
 to 
  apples. I'm comparing the performance of the shipping code between 
 Flask and 
  web2py. 
  
  Below, I've included the 'default' file from Apache2/sites-available 
 for 
  Flask. 
  
  Basically, the code in Flask executes 10x faster than the same code 
 in 
  web2py. So my question is:  if Apache is at fault for the web2py 
 app's slow 
  performance, why doesn't Apache hurt the Flask app's performance? 
 (This 
  doesn't seem to be related to GIL or WSGI.) 
  
  
  <VirtualHost *:80> 
    ServerName 10.211.55.7 
    WSGIDaemonProcess hello user=www-data group=www-data threads=5 
    WSGIScriptAlias / /home/richard/welcome/hello.wsgi 
  
    <Directory /home/richard/welcome> 
      Order Allow,Deny 
      Allow from all 
    </Directory> 
  </VirtualHost> 
  





Re: [web2py] Python Performance Issue, Part 2

2014-03-19 Thread horridohobbyist
In 2007, I wrote my first web application using Smalltalk/Seaside. I chose 
Seaside because it was a very easy-to-learn, easy-to-program, 
easy-to-deploy, highly productive, self-contained all-in-one web framework. 
(It still is, today.) Unfortunately, web2py hadn't been born yet, but 
clearly the two frameworks had similar goals. (Born in 2002, Seaside has 5 
years over web2py.)

I deployed the Seaside app under Apache on the same hardware that I'm using 
today for web2py. Yes, my 2.4GHz quad-core Xeon Dell server with 4GB RAM is 
over 7 years old!!

Recently, I switched over to web2py because I had heard so many good things 
about it. I can now say that web2py is far superior to Seaside. It's even 
more easy-to-learn and easy-to-program; it's even more productive. And 
Seaside was pretty darn good in this respect!

I believe web2py is the best available web framework in the world today, 
regardless of language (ie, Java, Ruby, PHP, etc.). I am 100% in agreement 
with its philosophy and goals.

Please, keep up the good work!


On Wednesday, 19 March 2014 07:27:54 UTC-4, horridohobbyist wrote:

 Yes, processes=3 and threads=1.

 I tried processes=1 and threads=3, and performance was still 10x bad. 
 So I guess that answers my question:  the threads parameter is not helpful.


 On Wednesday, 19 March 2014 05:24:01 UTC-4, Tim Richardson wrote:

 Did you explicitly set the number of threads as well? By default you get 
 15 threads per process. The documentation implies that this is a hard 
 limit, but I'm not sure.
 Maybe you have simply found a bottleneck in threads. Did you also try 
 increasing the number of threads instead of adding more processes? 
 Multi-threaded apache is supposed to be faster than multi-process apache 
 under real load (i.e. multiple users) because starting (and switching) 
 processes is expensive in time and memory.*
 So any conclusion that you need more processes is dubious, I think. I 
 can't recall how many simultaneous users your benchmarking is testing. 
 Bear in mind that the fastest servers, the greenlet or co-operative async 
 servers, are not only limited to one process, but even to one thread. 

 http://nichol.as/benchmark-of-python-web-servers










 On Wednesday, 19 March 2014 14:25:47 UTC+11, horridohobbyist wrote:

 I shall do that. Thanks.

 With the knowledge about processes=, I've tuned my actual Linux server 
 to eliminate the 10x slowdown. As it turns out, for my 2.4GHz quad-core 
 Xeon with 4GB RAM, processes=2 works best. I found that any other value 
 (3, 4, 5) gave very inconsistent results–sometimes I would get 1x (the 
 ideal) and sometimes I would get 10x. Very bizarre.

 processes=2 is counter-intuitive. After all, I have 4 cores. Why 
 shouldn't processes=4 be good?

 Anyway, not only is the shipping code fast, but I find that my overall 
 web2py app feels a lot snappier. Is it just my imagination?

 If processes=2 is boosting the speed of Python in general, then you 
 would expect all of web2py to benefit. So maybe it's not my imagination.

 Anyway, the takeaway, I think, is that you must tune the Apache 
 configuration for the particular server hardware that you have. The default 
 processes=1 is not good enough.


 On Tuesday, 18 March 2014 22:37:58 UTC-4, Massimo Di Pierro wrote:

 Thank you for all your tests. You should write a summary of your 
 results with recommendations for Apache users.

 On Tuesday, 18 March 2014 19:44:29 UTC-5, horridohobbyist wrote:

 Done. With processes=3, the 10x discrepancy is eliminated! (And this 
 is in a Linux VM configured for 1 CPU.)


 On Tuesday, 18 March 2014 16:26:24 UTC-4, Michele Comitini wrote:

  WSGIDaemonProcess hello user=www-data group=www-data threads=5 

 with web2py try the following instead: 
 WSGIDaemonProcess hello user=www-data group=www-data 
 processes=number 
 of cores + 1 threads=(0 or 1) 

 If it's faster, then the GIL must be the cause.  flask by default has 
 much less features active (session for instance) 



 2014-03-18 21:04 GMT+01:00 horridohobbyist horrido...@gmail.com: 
  I took the shipping code that I ran in Flask (without Apache) and 
 adapted it 
  to run under Apache as a Flask app. That way, I'm comparing apples 
 to 
  apples. I'm comparing the performance of the shipping code between 
 Flask and 
  web2py. 
  
  Below, I've included the 'default' file from 
 Apache2/sites-available for 
  Flask. 
  
  Basically, the code in Flask executes 10x faster than the same code 
 in 
  web2py. So my question is:  if Apache is at fault for the web2py 
 app's slow 
  performance, why doesn't Apache hurt the Flask app's performance? 
 (This 
  doesn't seem to be related to GIL or WSGI.) 
  
  
  <VirtualHost *:80> 
    ServerName 10.211.55.7 
    WSGIDaemonProcess hello user=www-data group=www-data threads=5 
    WSGIScriptAlias / /home/richard/welcome/hello.wsgi 
  
    <Directory /home/richard/welcome> 
      Order Allow,Deny 
      Allow from all 
    </Directory> 
  </VirtualHost> 
  
  

Re: [web2py] Python Performance Issue, Part 2

2014-03-19 Thread Tim Richardson
Try threads = 30 or 50 or 100; that would be interesting.



[web2py] Python Performance Issue, Part 2

2014-03-18 Thread horridohobbyist
I took the shipping code that I ran in Flask (without Apache) and adapted 
it to run under Apache as a Flask app. That way, I'm comparing apples to 
apples. I'm comparing the performance of the shipping code between Flask 
and web2py.

Below, I've included the 'default' file from Apache2/sites-available for 
Flask.

Basically, the code in Flask executes 10x faster than the same code in 
web2py. So my question is:  if Apache is at fault for the web2py app's slow 
performance, why doesn't Apache hurt the Flask app's performance? (This 
doesn't seem to be related to GIL or WSGI.)


<VirtualHost *:80>
  ServerName 10.211.55.7
  WSGIDaemonProcess hello user=www-data group=www-data threads=5
  WSGIScriptAlias / /home/richard/welcome/hello.wsgi

  <Directory /home/richard/welcome>
    Order Allow,Deny
    Allow from all
  </Directory>
</VirtualHost>

430x300x200 430x300x200 400x370x330 390x285x140 585x285x200
430x300x200 400x370x330 553x261x152 290x210x160 390x285x140


from flask import Flask
app = Flask(__name__)

import time
import sys
import os
debug_path = '/home/richard/welcome/debug.out'
def debug(str):
f = open(debug_path,'a')
f.write(str+'\n')
f.close()
return

#
# pyShipping 1.8a
#
import time
import random
from shippackage import Package

def packstrip(bin, p):
    """Creates a Strip which fits into bin.

    Returns the Packages to be used in the strip, the dimensions of the strip as a 3-tuple
    and a list of left over packages.
    """
    # This code is somewhat optimized and somewhat unreadable
    s = []            # strip
    r = []            # rest
    ss = sw = sl = 0  # stripsize
    bs = bin.heigth   # binsize
    sapp = s.append   # speedup
    rapp = r.append   # speedup
    ppop = p.pop      # speedup
    while p and (ss <= bs):
        n = ppop(0)
        nh, nw, nl = n.size
        if ss + nh <= bs:
            ss += nh
            sapp(n)
            if nw > sw:
                sw = nw
            if nl > sl:
                sl = nl
        else:
            rapp(n)
    return s, (ss, sw, sl), r + p


def packlayer(bin, packages):
    strips = []
    layersize = 0
    layerx = 0
    layery = 0
    binsize = bin.width
    while packages:
        strip, (sizex, stripsize, sizez), rest = packstrip(bin, packages)
        if layersize + stripsize <= binsize:
            if not strip:
                # we were not able to pack anything
                break
            layersize += stripsize
            layerx = max([sizex, layerx])
            layery = max([sizez, layery])
            strips.extend(strip)
            packages = rest
        else:
            # Next Layer please
            packages = strip + rest
            break
    return strips, (layerx, layersize, layery), packages


def packbin(bin, packages):
    packages.sort()
    layers = []
    contentheigth = 0
    contentx = 0
    contenty = 0
    binsize = bin.length
    while packages:
        layer, (sizex, sizey, layersize), rest = packlayer(bin, packages)
        if contentheigth + layersize <= binsize:
            if not layer:
                # we were not able to pack anything
                break
            contentheigth += layersize
            contentx = max([contentx, sizex])
            contenty = max([contenty, sizey])
            layers.extend(layer)
            packages = rest
        else:
            # Next Bin please
            packages = layer + rest
            break
    return layers, (contentx, contenty, contentheigth), packages


def packit(bin, originalpackages):
packedbins = []
packages = sorted(originalpackages)
while packages:
packagesinbin, (binx, biny, binz), rest = packbin(bin, packages)
if not packagesinbin:
# we were not able to pack anything
break
packedbins.append(packagesinbin)
packages = rest
# we now have a result, try to get a better result by rotating some bins

return packedbins, rest


# In newer Python versions these can be imported:
# from itertools import permutations
def product(*args, **kwds):
    # product('ABCD', 'xy') --> Ax Ay Bx By Cx Cy Dx Dy
    # product(range(2), repeat=3) --> 000 001 010 011 100 101 110 111
    pools = map(tuple, args) * kwds.get('repeat', 1)
    result = [[]]
    for pool in pools:
        result = [x + [y] for x in result for y in pool]
    for prod in result:
        yield tuple(prod)


def permutations(iterable, r=None):
pool = tuple(iterable)
n = 

Re: [web2py] Python Performance Issue, Part 2

2014-03-18 Thread Michele Comitini
 WSGIDaemonProcess hello user=www-data group=www-data threads=5

with web2py try the following instead:
WSGIDaemonProcess hello user=www-data group=www-data processes=number
of cores + 1 threads=(0 or 1)

If it's faster, then the GIL must be the cause.  flask by default has
much less features active (session for instance)
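
Concretely, on a dual-core machine this suggestion would come out as something like the following (threads=1 rather than 0, since a later message in this thread reports that Apache refuses to restart with threads=0; the daemon name and paths are the ones used elsewhere in the thread):

```apache
WSGIDaemonProcess hello user=www-data group=www-data processes=3 threads=1
WSGIScriptAlias / /home/richard/welcome/hello.wsgi
```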



2014-03-18 21:04 GMT+01:00 horridohobbyist horrido.hobb...@gmail.com:
 I took the shipping code that I ran in Flask (without Apache) and adapted it
 to run under Apache as a Flask app. That way, I'm comparing apples to
 apples. I'm comparing the performance of the shipping code between Flask and
 web2py.

 Below, I've included the 'default' file from Apache2/sites-available for
 Flask.

 Basically, the code in Flask executes 10x faster than the same code in
 web2py. So my question is:  if Apache is at fault for the web2py app's slow
 performance, why doesn't Apache hurt the Flask app's performance? (This
 doesn't seem to be related to GIL or WSGI.)


 <VirtualHost *:80>
   ServerName 10.211.55.7
   WSGIDaemonProcess hello user=www-data group=www-data threads=5
   WSGIScriptAlias / /home/richard/welcome/hello.wsgi

   <Directory /home/richard/welcome>
     Order Allow,Deny
     Allow from all
   </Directory>
 </VirtualHost>




Re: [web2py] Python Performance Issue, Part 2

2014-03-18 Thread horridohobbyist
Done. With processes=3, the 10x discrepancy is eliminated! (And this is in 
a Linux VM configured for 1 CPU.)


On Tuesday, 18 March 2014 16:26:24 UTC-4, Michele Comitini wrote:

  WSGIDaemonProcess hello user=www-data group=www-data threads=5 

 with web2py try the following instead: 
 WSGIDaemonProcess hello user=www-data group=www-data processes=number 
 of cores + 1 threads=(0 or 1) 

 If it's faster, then the GIL must be the cause.  flask by default has 
 much less features active (session for instance) 



 2014-03-18 21:04 GMT+01:00 horridohobbyist 
 horrido...@gmail.comjavascript:: 

  I took the shipping code that I ran in Flask (without Apache) and 
 adapted it 
  to run under Apache as a Flask app. That way, I'm comparing apples to 
  apples. I'm comparing the performance of the shipping code between Flask 
 and 
  web2py. 
  
  Below, I've included the 'default' file from Apache2/sites-available for 
  Flask. 
  
  Basically, the code in Flask executes 10x faster than the same code in 
  web2py. So my question is:  if Apache is at fault for the web2py app's 
 slow 
  performance, why doesn't Apache hurt the Flask app's performance? (This 
  doesn't seem to be related to GIL or WSGI.) 
  
  
  <VirtualHost *:80> 
    ServerName 10.211.55.7 
    WSGIDaemonProcess hello user=www-data group=www-data threads=5 
    WSGIScriptAlias / /home/richard/welcome/hello.wsgi 
  
    <Directory /home/richard/welcome> 
      Order Allow,Deny 
      Allow from all 
    </Directory> 
  </VirtualHost> 
  



Re: [web2py] Python Performance Issue, Part 2

2014-03-18 Thread Massimo Di Pierro
Thank you for all your tests. You should write a summary of your results 
with recommendations for Apache users.

On Tuesday, 18 March 2014 19:44:29 UTC-5, horridohobbyist wrote:

 Done. With processes=3, the 10x discrepancy is eliminated! (And this is in 
 a Linux VM configured for 1 CPU.)


 On Tuesday, 18 March 2014 16:26:24 UTC-4, Michele Comitini wrote:

  WSGIDaemonProcess hello user=www-data group=www-data threads=5 

 with web2py try the following instead: 
 WSGIDaemonProcess hello user=www-data group=www-data processes=number 
 of cores + 1 threads=(0 or 1) 

 If it's faster, then the GIL must be the cause.  flask by default has 
 much less features active (session for instance) 



 2014-03-18 21:04 GMT+01:00 horridohobbyist horrido...@gmail.com: 
  I took the shipping code that I ran in Flask (without Apache) and 
 adapted it 
  to run under Apache as a Flask app. That way, I'm comparing apples to 
  apples. I'm comparing the performance of the shipping code between 
 Flask and 
  web2py. 
  
  Below, I've included the 'default' file from Apache2/sites-available 
 for 
  Flask. 
  
  Basically, the code in Flask executes 10x faster than the same code in 
  web2py. So my question is:  if Apache is at fault for the web2py app's 
 slow 
  performance, why doesn't Apache hurt the Flask app's performance? (This 
  doesn't seem to be related to GIL or WSGI.) 
  
  
  <VirtualHost *:80> 
    ServerName 10.211.55.7 
    WSGIDaemonProcess hello user=www-data group=www-data threads=5 
    WSGIScriptAlias / /home/richard/welcome/hello.wsgi 
  
    <Directory /home/richard/welcome> 
      Order Allow,Deny 
      Allow from all 
    </Directory> 
  </VirtualHost> 
  





Re: [web2py] Python Performance Issue, Part 2

2014-03-18 Thread horridohobbyist
I shall do that. Thanks.

With the knowledge about processes=, I've tuned my actual Linux server to 
eliminate the 10x slowdown. As it turns out, for my 2.4GHz quad-core Xeon 
with 4GB RAM, processes=2 works best. I found that any other value (3, 4, 
5) gave very inconsistent results–sometimes I would get 1x (the ideal) and 
sometimes I would get 10x. Very bizarre.

processes=2 is counter-intuitive. After all, I have 4 cores. Why 
shouldn't processes=4 be good?

Anyway, not only is the shipping code fast, but I find that my overall 
web2py app feels a lot snappier. Is it just my imagination?

If processes=2 is boosting the speed of Python in general, then you would 
expect all of web2py to benefit. So maybe it's not my imagination.

Anyway, the takeaway, I think, is that you must tune the Apache 
configuration for the particular server hardware that you have. The default 
processes=1 is not good enough.
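
For the quad-core Xeon described above, the winning configuration would read roughly as follows (a sketch: the WSGIProcessGroup line and the web2py paths are assumed for illustration, not quoted from this thread):

```apache
WSGIDaemonProcess web2py user=www-data group=www-data processes=2 threads=1
WSGIProcessGroup web2py
WSGIScriptAlias / /home/www-data/web2py/wsgihandler.py
```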


On Tuesday, 18 March 2014 22:37:58 UTC-4, Massimo Di Pierro wrote:

 Thank you for all your tests. You should write a summary of your results 
 with recommendations for Apache users.

 On Tuesday, 18 March 2014 19:44:29 UTC-5, horridohobbyist wrote:

 Done. With processes=3, the 10x discrepancy is eliminated! (And this is 
 in a Linux VM configured for 1 CPU.)


 On Tuesday, 18 March 2014 16:26:24 UTC-4, Michele Comitini wrote:

  WSGIDaemonProcess hello user=www-data group=www-data threads=5 

 with web2py try the following instead: 
 WSGIDaemonProcess hello user=www-data group=www-data processes=number 
 of cores + 1 threads=(0 or 1) 

 If it's faster, then the GIL must be the cause.  flask by default has 
 much less features active (session for instance) 



 2014-03-18 21:04 GMT+01:00 horridohobbyist horrido...@gmail.com: 
  I took the shipping code that I ran in Flask (without Apache) and 
 adapted it 
  to run under Apache as a Flask app. That way, I'm comparing apples to 
  apples. I'm comparing the performance of the shipping code between 
 Flask and 
  web2py. 
  
  Below, I've included the 'default' file from Apache2/sites-available 
 for 
  Flask. 
  
  Basically, the code in Flask executes 10x faster than the same code in 
  web2py. So my question is:  if Apache is at fault for the web2py app's 
 slow 
  performance, why doesn't Apache hurt the Flask app's performance? 
 (This 
  doesn't seem to be related to GIL or WSGI.) 
  
  
  <VirtualHost *:80> 
    ServerName 10.211.55.7 
    WSGIDaemonProcess hello user=www-data group=www-data threads=5 
    WSGIScriptAlias / /home/richard/welcome/hello.wsgi 
  
    <Directory /home/richard/welcome> 
      Order Allow,Deny 
      Allow from all 
    </Directory> 
  </VirtualHost> 
  





Re: [web2py] Python Performance Issue, Part 2

2014-03-18 Thread horridohobbyist
threads=0 is no good: Apache restart upchucks on this.

BTW, I haven't experimented with the threads value. Might this also improve 
performance (with respect to GIL)?

Also, I was wondering. Is the processes= solution related to whether you 
are using the prefork MPM or the worker MPM? I know that Apache is 
normally compiled to use the prefork MPM.
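The GIL effect being probed in this thread can be reproduced standalone. The sketch below (written for Python 3, though the thread predates it) times the same CPU-bound loop run twice serially and then in two concurrent threads; on CPython the threaded run is typically no faster, and often slower, because only one thread can hold the GIL at a time. The loop body mirrors the benchmark used elsewhere in this thread; the iteration count is my own choice.

```python
import threading
import time

def burn(n):
    # CPU-bound work: a thread running this holds the GIL the whole time.
    x = 0.0
    for i in range(1, n):
        x += (float(i + 10) * (i + 25) + 175.0) / 3.14
    return x

N = 500000

# Two batches back to back in one thread.
t0 = time.time()
burn(N)
burn(N)
serial = time.time() - t0

# The same two batches in two concurrent threads.
workers = [threading.Thread(target=burn, args=(N,)) for _ in range(2)]
t0 = time.time()
for w in workers:
    w.start()
for w in workers:
    w.join()
threaded = time.time() - t0

print("serial: %.3fs  threaded: %.3fs" % (serial, threaded))
```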


On Tuesday, 18 March 2014 16:26:24 UTC-4, Michele Comitini wrote:

  WSGIDaemonProcess hello user=www-data group=www-data threads=5 

 with web2py try the following instead: 
 WSGIDaemonProcess hello user=www-data group=www-data processes=<number 
 of cores + 1> threads=<0 or 1> 

 If it's faster, then the GIL must be the cause. Flask by default has 
 far fewer features active (sessions, for instance). 



 2014-03-18 21:04 GMT+01:00 horridohobbyist 
 horrido...@gmail.com: 

  I took the shipping code that I ran in Flask (without Apache) and 
 adapted it 
  to run under Apache as a Flask app. That way, I'm comparing apples to 
  apples. I'm comparing the performance of the shipping code between Flask 
 and 
  web2py. 
  
  Below, I've included the 'default' file from Apache2/sites-available for 
  Flask. 
  
  Basically, the code in Flask executes 10x faster than the same code in 
  web2py. So my question is:  if Apache is at fault for the web2py app's 
 slow 
  performance, why doesn't Apache hurt the Flask app's performance? (This 
  doesn't seem to be related to GIL or WSGI.) 
  
  
  <VirtualHost *:80> 
    ServerName 10.211.55.7 
    WSGIDaemonProcess hello user=www-data group=www-data threads=5 
    WSGIScriptAlias / /home/richard/welcome/hello.wsgi 
  
    <Directory /home/richard/welcome> 
      Order Allow,Deny 
      Allow from all 
    </Directory> 
  </VirtualHost> 
  




Re: [web2py] Python Performance Issue

2014-03-17 Thread Ricardo Pedroso
On Thu, Mar 13, 2014 at 7:48 PM, horridohobbyist
horrido.hobb...@gmail.com wrote:

I have a rather peculiar Python performance issue with web2py. I'm using
 pyShipping 1.8a (from http://pydoc.net/Python/pyShipping/1.8a/). The
 standalone program from the command line works quickly. However, after I've
 incorporated the code into my web2py application, the same pyShipping code
 takes orders of magnitude longer to execute!!! How can this be?!

 I presume in both instances that pre-compiled code is being run.



I don't know if you already did (this thread is already too long to
follow), but can you pack a simple app that shows the problem?



Re: [web2py] Python Performance Issue

2014-03-17 Thread Jonathan Lundell
On 17 Mar 2014, at 9:21 AM, horridohobbyist horrido.hobb...@gmail.com wrote:
 WTF. Now, both Apache and Gunicorn are slow. Equally slow!
 

I really think it'd simplify matters to reproduce this outside the context of a 
web server. If the problem is really the GIL, then all these environments are 
doing is using a web server to create some idle threads, which seems like 
overkill to me.



Re: [web2py] Python Performance Issue

2014-03-15 Thread Jonathan Lundell
On 15 Mar 2014, at 8:19 AM, Niphlod niph...@gmail.com wrote:
 @mcm: you got me worried. Your test function was clocking a hell of a lot lower 
 than the original script. But then I found out why: one order of magnitude less 
 (5000 vs 5). Once that was corrected, you got the exact same clock times 
 as my app (i.e. function directly in the controller). I also stripped out 
 the logging part making the app just return the result and no visible changes 
 to the timings happened.
 
 @hh: glad at least we got some grounds to hold on. 
 @mariano: compiled or not, it doesn't seem to change the mean. A compiled 
 app just has lower variance. 
 
 @all: jlundell definitely hit something. Times are much lower when 
 threads = 1.

In a normal web2py-wsgi installation, we'd expect all the threads to be idle, 
waiting for an incoming connection, right? So here's another test. IIRC, Rocket 
is built to run 10 threads minimum by default. So in the standalone test, fire 
off 10 threads that open a socket and wait for a connection, and then run the 
test loop. To make it easier, I'd first try having them just sleep until 
they're woken up and terminated by the test thread; it's not clear why waiting 
on a socket would be more overhead than being in any other idle state (OTOH 
it's not clear why idle threads should create overhead at all).

And that suggests yet another simple test: raise Rocket's minimum thread count 
from 10 to 100 and see what difference that makes.
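The experiment proposed above might be sketched like this (Python 3 here; the sleeping threads stand in for Rocket's idle worker pool, and the thread count and loop size are arbitrary choices of mine):

```python
import threading
import time

def timed_loop(n=500000):
    # The same kind of CPU-bound loop used in the other tests.
    start = time.time()
    x = 0.0
    for i in range(1, n):
        x += (float(i + 10) * (i + 25) + 175.0) / 3.14
    return time.time() - start

# Baseline: no extra threads running.
baseline = timed_loop()

# Now start 10 idle threads that just sleep on an Event, then re-run the loop.
stop = threading.Event()
idlers = [threading.Thread(target=stop.wait) for _ in range(10)]
for t in idlers:
    t.start()
with_idlers = timed_loop()

# Wake and terminate the idle threads.
stop.set()
for t in idlers:
    t.join()

print("baseline: %.3fs  with 10 idle threads: %.3fs" % (baseline, with_idlers))
```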

 
 BTW: if I change originalscript.py to 
 
 # -*- coding: utf-8 -*-
 import time
 import threading
 
 def test():
     start = time.time()
     x = 0.0
     for i in range(1,5):
         x += (float(i+10)*(i+25)+175.0)/3.14
     res = str(time.time()-start)
     print 'elapsed time: ' + res + '\n'
 
 if __name__ == '__main__':
     t = threading.Thread(target=test)
     t.start()
     t.join()
 
 I'm getting really close timings to wsgi environment, 1 thread only tests, 
 i.e. 
 0.23 min, 0.26 max, ~0.24 mean
 
 




[web2py] Python Performance Issue

2014-03-13 Thread horridohobbyist
I have a rather peculiar Python performance issue with web2py. I'm using 
pyShipping 1.8a (from http://pydoc.net/Python/pyShipping/1.8a/). The 
standalone program from the command line works quickly. However, after I've 
incorporated the code into my web2py application, the same pyShipping code 
takes orders of magnitude longer to execute!!! How can this be?!

I presume in both instances that pre-compiled code is being run.



Re: [web2py] Python Performance Issue

2014-03-13 Thread Jonathan Lundell
On 13 Mar 2014, at 12:48 PM, horridohobbyist horrido.hobb...@gmail.com wrote:
 I have a rather peculiar Python performance issue with web2py. I'm using 
 pyShipping 1.8a (from http://pydoc.net/Python/pyShipping/1.8a/). The 
 standalone program from the command line works quickly. However, after I've 
 incorporated the code into my web2py application, the same pyShipping code 
 takes orders of magnitude longer to execute!!! How can this be?!
 
 I presume in both instances that pre-compiled code is being run.
 
 

Same machine, same Python installation? If not, maybe C vs Python supporting 
libraries?



Re: [web2py] Python Performance Issue

2014-03-13 Thread horridohobbyist
Yes, same machine, same installation. All I did was move the module from my 
test directory to web2py's site-packages folder. Then I copied the main 
program into my default application controller. *The same code is executing*.

Just to be sure I'm not going out of mind, I printed out the elapsed time 
for each iteration in the main program (for both the command line execution 
and the web2py app execution). Lo and behold, the elapsed time for each 
iteration is much longer under web2py.

Note that pyShipping is a pure Python implementation. The Python supporting 
libraries **should** be the same in both instances.

I do note, however, that when I tried to incorporate the code into web2py, 
I found a namespace clash (class Package appears elsewhere in the web2py 
installation). I resolved this by renaming the module file. Otherwise, 
there should be no difference between command line execution and web2py 
execution.

Thanks.


On Thursday, 13 March 2014 15:54:37 UTC-4, Jonathan Lundell wrote:

 On 13 Mar 2014, at 12:48 PM, horridohobbyist 
 horrido...@gmail.com 
 wrote:

 I have a rather peculiar Python performance issue with web2py. I'm using 
 pyShipping 1.8a (from http://pydoc.net/Python/pyShipping/1.8a/). The 
 standalone program from the command line works quickly. However, after I've 
 incorporated the code into my web2py application, the same pyShipping code 
 takes orders of magnitude longer to execute!!! How can this be?!

 I presume in both instances that pre-compiled code is being run.



 Same machine, same Python installation? If not, maybe C vs Python 
 supporting libraries?




Re: [web2py] Python Performance Issue

2014-03-13 Thread Jonathan Lundell
On 13 Mar 2014, at 1:38 PM, horridohobbyist horrido.hobb...@gmail.com wrote:
 Yes, same machine, same installation. All I did was move the module from my 
 test directory to web2py's site-packages folder. Then I copied the main 
 program into my default application controller. The same code is executing.
 
 Just to be sure I'm not going out of mind, I printed out the elapsed time for 
 each iteration in the main program (for both the command line execution and 
 the web2py app execution). Lo and behold, the elapsed time for each iteration 
 is much longer under web2py.
 
 Note that pyShipping is a pure Python implementation. The Python supporting 
 libraries **should** be the same in both instances.

Probably, but not necessarily. It could be that, because of differences in 
sys.path, there's a difference in whether some basic libraries like pickle vs 
cPickle (to pick a really random example) are getting loaded.

Not a diagnosis; just a possibility, maybe a remote one. If you're doing 
elapsed-time measurements, you might build a list of timestamps of intermediate 
steps, and then print that.
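A timestamp list of the kind suggested could be as simple as the following sketch (the step bodies are placeholders of mine, not the actual pyShipping calls):

```python
import time

def run_with_checkpoints():
    # Record a named timestamp after each intermediate step.
    marks = [("start", time.time())]

    # step 1: stand-in for loading/preparing data
    total = sum(range(100000))
    marks.append(("step1", time.time()))

    # step 2: stand-in for the expensive call (e.g. the packing routine)
    squares = [i * i for i in range(100000)]
    marks.append(("step2", time.time()))

    # Convert absolute timestamps into per-step durations.
    return [(name, t - prev_t)
            for (name, t), (_, prev_t) in zip(marks[1:], marks[:-1])]

for name, dt in run_with_checkpoints():
    print("%s: %.6fs" % (name, dt))
```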

 
 I do note, however, that when I tried to incorporate the code into web2py, I 
 found a namespace clash (class Package appears elsewhere in the web2py 
 installation). I resolved this by renaming the module file. Otherwise, there 
 should be no difference between command line execution and web2py execution.
 
 Thanks.
 
 
 On Thursday, 13 March 2014 15:54:37 UTC-4, Jonathan Lundell wrote:
 On 13 Mar 2014, at 12:48 PM, horridohobbyist horrido...@gmail.com wrote:
 I have a rather peculiar Python performance issue with web2py. I'm using 
 pyShipping 1.8a (from http://pydoc.net/Python/pyShipping/1.8a/). The 
 standalone program from the command line works quickly. However, after I've 
 incorporated the code into my web2py application, the same pyShipping code 
 takes orders of magnitude longer to execute!!! How can this be?!
 
 I presume in both instances that pre-compiled code is being run.
 
 
 
 Same machine, same Python installation? If not, maybe C vs Python supporting 
 libraries?
 




Re: [web2py] Python Performance Issue

2014-03-13 Thread Michele Comitini
Other things to look for are related to concurrency:

- Table, record locks
- concurrent threads
- long db transactions

One thing you should try first is to stop web2py completely and see if
using a single process from the command line makes any difference.
I mean using a command line like:

$ python web2py.py -M -S <your app name here>

and then run your code.




2014-03-13 21:47 GMT+01:00 Jonathan Lundell jlund...@pobox.com:

 On 13 Mar 2014, at 1:38 PM, horridohobbyist horrido.hobb...@gmail.com
 wrote:

 Yes, same machine, same installation. All I did was move the module from
 my test directory to web2py's site-packages folder. Then I copied the main
 program into my default application controller. *The same code is
 executing*.

 Just to be sure I'm not going out of mind, I printed out the elapsed time
 for each iteration in the main program (for both the command line execution
 and the web2py app execution). Lo and behold, the elapsed time for each
 iteration is much longer under web2py.

 Note that pyShipping is a pure Python implementation. The Python
 supporting libraries **should** be the same in both instances.


 Probably, but not necessarily. It could be that because of differences in
 sys.path there's a difference in whether some basic libraries like pickle
 vs cPickle (to pick a really random example) are getting loaded.

 Not a diagnosis; just a possibility, maybe a remote one. If you're doing
 elapsed-time measurements, you might build a list of timestamps of
 intermediate steps, and then print that.


 I do note, however, that when I tried to incorporate the code into web2py,
 I found a namespace clash (class Package appears elsewhere in the web2py
 installation). I resolved this by renaming the module file. Otherwise,
 there should be no difference between command line execution and web2py
 execution.

 Thanks.


 On Thursday, 13 March 2014 15:54:37 UTC-4, Jonathan Lundell wrote:

 On 13 Mar 2014, at 12:48 PM, horridohobbyist horrido...@gmail.com
 wrote:

 I have a rather peculiar Python performance issue with web2py. I'm using
 pyShipping 1.8a (from http://pydoc.net/Python/pyShipping/1.8a/). The
 standalone program from the command line works quickly. However, after I've
 incorporated the code into my web2py application, the same pyShipping code
 takes orders of magnitude longer to execute!!! How can this be?!

 I presume in both instances that pre-compiled code is being run.



 Same machine, same Python installation? If not, maybe C vs Python
 supporting libraries?






