no, the problem here is definitely that you're generating a lot of
garbage that survives the young generation. you can always try playing
with the PYPY_GC_NURSERY environment var (defaults to 4M I think)
On Mon, Mar 17, 2014 at 9:04 PM, Martin Koch wrote:
> Well, it would appear that we have the pro
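A minimal sketch of the tuning suggested above. The `mem.py 1000` invocation is the test command used elsewhere in this thread; the 16M value is purely illustrative, not a recommendation from the thread:

```shell
# Enlarge the nursery from the ~4M default so more short-lived garbage
# can die young before a minor collection promotes it.
PYPY_GC_NURSERY=16M pypy mem.py 1000
```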
not sure how much more we can help without looking into the code
On Mon, Mar 17, 2014 at 6:05 PM, Martin Koch wrote:
> Thanks :)
>
> /Martin
>
>
>> On 17/03/2014, at 16.41, Maciej Fijalkowski wrote:
>>
>> ok.
>>
>> so as you can probably see, the max is not that big, which means the
>> GC is really i
Hi all,
My name is Rajul, and I am a final year undergraduate student at the Indian
Institute of Technology Kharagpur. I wish to participate in
Google Summer of Code 2014, and while going through the list of
organisations, I came across PyPy. I am proficient with programming
languages C/C++, Pytho
Ah. I had misunderstood. I'll get back to you on that :) thanks
/Martin
> On 17/03/2014, at 15.21, Maciej Fijalkowski wrote:
>
> eh, this is not what I need
>
> I need a max of TIME it took for a gc-minor and the TOTAL time it took
> for a gc-minor (per query) (ideally same for gc-walkroots a
ok.
so as you can probably see, the max is not that big, which means the
GC is really incremental. What happens is you get tons of garbage that
survives minor collection every now and then. I don't exactly know
why, but you should look what objects can potentially survive for too
long.
On Mon, Ma
eh, this is not what I need
I need a max of TIME it took for a gc-minor and the TOTAL time it took
for a gc-minor (per query) (ideally same for gc-walkroots and
gc-collect-step)
On Mon, Mar 17, 2014 at 4:19 PM, Martin Koch wrote:
> Here are the collated results of running each query. For each ru
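A sketch of the aggregation Maciej asks for here: the per-section max and total. It assumes the PYPYLOG format shown later in the thread, where a section opens with a line like `[2b99f1981b527e] {gc-minor` and closes with `[2b99f19890d750] gc-minor}`, and it reports durations in raw timestamp units:

```python
# Compute the max and total duration of gc-minor sections from a
# PYPYLOG trace. The bracketed hex numbers are raw timestamps
# (CPU timestamp-counter ticks, per the discussion below).
import re

def gc_section_times(lines, section="gc-minor"):
    """Return (max_duration, total_duration) in raw timestamp units."""
    open_pat = re.compile(r"\[([0-9a-f]+)\] \{" + re.escape(section) + r"$")
    close_pat = re.compile(r"\[([0-9a-f]+)\] " + re.escape(section) + r"\}$")
    durations = []
    start = None
    for line in lines:
        m = open_pat.search(line)
        if m:
            start = int(m.group(1), 16)
            continue
        m = close_pat.search(line)
        if m and start is not None:
            durations.append(int(m.group(1), 16) - start)
            start = None
    return (max(durations), sum(durations)) if durations else (0, 0)

# The four log lines quoted later in this thread:
sample = [
    "[2b99f1981b527e] {gc-minor",
    "[2b99f1981ba680] {gc-minor-walkroots",
    "[2b99f1981c2e02] gc-minor-walkroots}",
    "[2b99f19890d750] gc-minor}",
]
print(gc_section_times(sample))
```

Running the same function per request (one call per query's slice of the log) gives the per-query max and sum being asked for; nested sections like gc-minor-walkroots don't match the gc-minor patterns because both regexes are anchored at end of line.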
Here are the collated results of running each query. For each run, I count
how many of each of the pypy debug lines i get. I.e. there were 668 runs
that printed 58 loglines that contain "{gc-minor" which was eventually
followed by "gc-minor}". I have also counted if the query was slow;
interestingl
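The per-run tally described above can be sketched as a simple counter over section-open lines, assuming the log format shown in the next message ("[hex] {section" ... "[hex] section}"):

```python
# Count how many times each debug section opens in a PYPYLOG trace.
import re
from collections import Counter

def count_sections(lines):
    counts = Counter()
    for line in lines:
        # A section-open line ends with "{section-name".
        m = re.search(r"\{([\w-]+)$", line)
        if m:
            counts[m.group(1)] += 1
    return counts

sample = [
    "[2b99f1981b527e] {gc-minor",
    "[2b99f1981ba680] {gc-minor-walkroots",
    "[2b99f1981c2e02] gc-minor-walkroots}",
    "[2b99f19890d750] gc-minor}",
]
print(count_sections(sample))
```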
Based on Maciej's suggestion, I tried the following
PYPYLOG=- pypy mem.py 1000 > out
This generates a logfile which looks something like this
start-->
[2b99f1981b527e] {gc-minor
[2b99f1981ba680] {gc-minor-walkroots
[2b99f1981c2e02] gc-minor-walkroots}
[2b99f19890d750] gc-minor}
[snip]
...
<-
On Mon, Mar 17, 2014 at 3:20 PM, Maciej Fijalkowski wrote:
> are you *sure* it's the walkroots that take that long and not
> something else (like gc-minor)? More of those mean that you allocate a
> lot more surviving objects. Can you do two things:
>
> a) take a max of gc-minor (and gc-minor-stack
Well, then it works out to around 2.5GHz, which seems reasonable. But it
doesn't alter the conclusion from the previous email: The slow queries then
all have a duration around 34*10^9 units, 'normal' queries 1*10^9 units, or
.4 seconds at this conversion. Also, the log shows that a slow query
perfo
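The unit check behind the ~2.5GHz figure can be reproduced directly: treating the bracketed values as CPU timestamp-counter ticks, the two timestamps quoted in the previous message span roughly the 13 seconds a slow run takes:

```python
# Convert the two raw timestamps quoted earlier in the thread to
# seconds, assuming they are CPU ticks at the ~2.5 GHz inferred here.
start = 0x2b9944ab8c4f49
end = 0x2b994c9d31889c
ticks = end - start          # ~34 * 10^9, matching the "34*10^9 units" figure
seconds = ticks / 2.5e9
print(f"{ticks:,} ticks ~ {seconds:.1f} s")
```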
are you *sure* it's the walkroots that take that long and not
something else (like gc-minor)? More of those mean that you allocate a
lot more surviving objects. Can you do two things:
a) take a max of gc-minor (and gc-minor-stackwalk), per request
b) take the sum of those
and plot them
On Mon, M
What is the unit? Perhaps I'm being thick here, but I can't correlate it
with seconds (which the program does print out). Slow runs are around 13
seconds, but are around 34*10^9(dec), 0x8 timestamp units (e.g.
from 0x2b994c9d31889c to 0x2b9944ab8c4f49).
On Mon, Mar 17, 2014 at 12:09 PM,
I think it's the cycles of your CPU
On Mon, Mar 17, 2014 at 2:48 PM, Martin Koch wrote:
> What is the unit? Perhaps I'm being thick here, but I can't correlate it
> with seconds (which the program does print out). Slow runs are around 13
> seconds, but are around 34*10^9(dec), 0x8 timesta
The number of lines is nonsense. This is a timestamp in hex.
On Mon, Mar 17, 2014 at 12:46 PM, Martin Koch wrote:
> Based on Maciej's suggestion, I tried the following
>
> PYPYLOG=- pypy mem.py 1000 > out
>
> This generates a logfile which looks something like this
>
> start-->
> [2b99f1981b5
there is an environment variable PYPYLOG=gc:- (where - is stdout)
which will do that for you btw.
maybe you can find out what that is using profiling or valgrind?
On Sun, Mar 16, 2014 at 11:34 PM, Martin Koch wrote:
> I have tried getting the pypy source and building my own version of pypy. I
> h
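The gc-only logging Maciej mentions, spelled out (again assuming the `mem.py 1000` invocation from this thread):

```shell
# Log only the gc-* debug sections to stdout, then redirect to a file;
# same as the PYPYLOG=- form used earlier, but filtered to gc events.
PYPYLOG=gc:- pypy mem.py 1000 > out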