Roman added the comment:
I've checked it one more time, and you're right (sorry for the trouble). I left
an old pyconfig.h in one place, so my new Python build was not what I wanted.
Now I believe that everything with memory is OK.
Thank you very much for your help.
--
status:
Changes by STINNER Victor victor.stin...@gmail.com:
--
resolution: -> invalid
___
Python tracker rep...@bugs.python.org
http://bugs.python.org/issue19893
___
___
STINNER Victor added the comment:
Sorry, but I still don't understand this issue.
"Invalid read of size 4" is a known false positive. It can be worked around
using ./configure --with-valgrind and the suppression list, or using
./configure --without-pymalloc. If you still get the warning, you
messages: 205283
nosy: rstarostecki
priority: normal
severity: normal
status: open
title: Python cApi memory problem. Py_Initialize memory leak
type: resource usage
versions: Python 3.2, Python 3.3
Added file: http://bugs.python.org/file32983/pytest.tgz
___
Changes by Antoine Pitrou pit...@free.fr:
--
nosy: +haypo, skrah
___
Python-bugs-list
Stefan Krah added the comment:
Did you use --suppressions=Misc/valgrind-python.supp?
--
___
Stefan Krah added the comment:
Also, you have zero "definitely lost". "possibly lost" is not
particularly informative in the context of running Python:
these are almost certainly false positives.
--
___
Roman added the comment:
I didn't use --suppressions=Misc/valgrind-python.supp before (but I've done
it now). Nothing important has changed.
I understand that "possibly lost" is not particularly informative. I'm rather
worried about "Invalid read of size 4" etc. Isn't it potentially dangerous?
Stefan Krah added the comment:
Did you compile Python --with-valgrind?
--
___
Roman added the comment:
I've just done it. Python 3.3.3 --with-valgrind. I can't see the difference.
Output appended.
--
Added file: http://bugs.python.org/file32986/vgrind3.3.3vc.out
___
STINNER Victor added the comment:
You should configure --with-valgrind *and* use the suppression list, or
disable pymalloc using configure.
Victor
--
___
Roman added the comment:
I compiled Python --with-valgrind --without-pymalloc, and used valgrind with
suppressions:
valgrind --suppressions=../Misc/valgrind-python.supp --leak-check=full
--show-reachable=no --show-possibly-lost=no --track-origins=yes
--log-file=vgrindNext.out ./test
messages: 115702
nosy: Trigve.Siver
priority: normal
severity: normal
status: open
title: _PyUnicode_New(), throw and memory problem
type: crash
versions: Python 3.1
___
Python tracker rep...@bugs.python.org
http://bugs.python.org/issue9785
Trigve Siver trig...@gmail.com added the comment:
I've tried to examine it in more depth.
Setting the Python exception isn't necessary; only throwing the C++ exception
is needed to demonstrate the problem.
Also, sometimes the address is 6 bytes lower than it should be.
--
Trigve Siver trig...@gmail.com added the comment:
Never mind, I found out it was a problem with the reference count while using
PyTuple_SetItem().
--
resolution: -> invalid
status: open -> closed
___
Hello!
I came across a problem running a .exe file through IPython.
os.system(' ~.exe') was used.
The error message that popped up says:
Memory error: insufficient physical memory available
What should I try further?
--
http://mail.python.org/mailman/listinfo/python-list
On Thu, 15 Jul 2010 00:54:19 -0700, youngung wrote:
Hello!
I came across a problem running a .exe file through IPython.
os.system(' ~.exe') was used. The error message that popped up says:
Memory error: insufficient physical memory available
What should I try further?
More memory?
Does anybody know why this happens and if there's a way to avoid this
memory problem?
First the line(s) of Python code I executed.
Then the memory usage of the process:
Mem usage after creation/populating of big_list
sys.getsizeof(big_list)
Mem usage after deletion of big_list
big_list = [0.0] * 2700*3250
40
35
6
big_list = [0.0
Allard Warrink, 13.01.2010 15:24:
I found out that when I populated the lists with floats using a for ... in
range() loop, a lot of overhead memory is used.
Note that range() returns a list in Python 2.x. For iteration, use
xrange(), or switch to Python 3 where range() returns an iterable.
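The difference is easy to see. A minimal sketch (assuming Python 3, where range() behaves like the old xrange() and is a lazy sequence):

```python
import sys

n = 10_000_000

# In Python 3, range() is a lazy sequence object of small, constant size.
lazy = range(n)

# Materializing the same values as a list allocates a pointer per element.
eager = list(range(100_000))

print(sys.getsizeof(lazy))                 # a few dozen bytes, independent of n
print(sys.getsizeof(eager) > 100_000 * 8)  # True: at least one pointer per item
```

On Python 2, the same effect is achieved by iterating over xrange() instead of range().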
Allard Warrink, 13.01.2010 15:24:
so I did some investigation on the memory use of the script. I found
out that when I populated the lists with floats using a for ... in
range() loop, a lot of overhead memory is used and that this memory is
not freed after populating the list and is also not
or
xrange is responsible for the memory overhead.
Does anybody know why this happens and if there's a way to avoid this
memory problem?
First the line(s) of Python code I executed.
Then the memory usage of the process:
Mem usage after creation/populating of big_list
sys.getsizeof(big_list)
Mem usage after
million floats only wastes 35 MB or so. That's wasteful, but not
excessively so.
Does anybody know why this happens and if there's a way to avoid this
memory problem?
First the line(s) of Python code I executed. Then the memory usage of the
process: Mem usage after creation/populating of big_list
On Thu, 14 Jan 2010 02:03:52 +, Steven D'Aprano wrote:
Again, the technique you are using does a pointless amount of extra
work. The values in the xrange object are already floats, calling float
on them just wastes time.
Er what?
Sorry, please ignore that. This is completely untrue --
Hi!
I made a string parser program; it has a main function and a worker
thread class. When it runs 24h non-stop, the memory runs out.
I don't know why. Does anybody know some kind of debugger so I can see
what is eating the memory? Maybe there is a list or value or
dictionary that is
On Jan 16, 5:24 am, [EMAIL PROTECTED] [EMAIL PROTECTED] wrote:
Hi!
I made a string parser program; it has a main function and a worker
thread class. When it runs 24h non-stop, the memory runs out.
I don't know why. Does anybody know some kind of debugger so I can see
what is eating
Hi,
thank you for your comments and your hints (I probably deserve some kind
of subtle irony). I found the problem:
I thought a numpy array A had shape (n,), but actually it had shape
(n,1). In the loop I sampled a value from that array:
v.append(A[i])
So what happened was that I got a view
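The keep-alive effect of such a view can be sketched with the stdlib's memoryview, used here as a stand-in for a NumPy view (the buffer size and names are illustrative): each tiny slice holds a reference to the whole parent buffer, so appending views instead of scalars keeps the full array alive.

```python
buf = bytearray(10_000_000)           # ~10 MB parent buffer
v = []
for i in range(100):
    v.append(memoryview(buf)[i:i+1])  # each "sample" is a 1-byte *view*

del buf  # the name is gone, but every view still pins the 10 MB buffer
print(v[0].obj is v[1].obj)           # True: all views share one parent
print(len(v[0].obj))                  # 10000000
```

Appending `A[i, 0]` (a scalar) instead of `A[i]` (a view) avoids the problem, since a scalar copies the value out of the buffer.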
Hi,
I have a strange (for me) memory problem. When running a loop in a
Python program memory usage increases from about 4% up to 100%. I do a
gc.collect() every loop cycle but this doesn't help. There are about
67000 objects that are tracked by the garbage collector. This number
does vary
Rolf Wester wrote:
Hi,
I have a strange (for me) memory problem. When running a loop in a
Python program memory usage increases from about 4% up to 100%. I do a
gc.collect() every loop cycle but this doesn't help. There are about
67000 objects that are tracked by the garbage collector
hint on what I could try.
Thank you in advance
Rolf
Diez B. Roggisch wrote:
Rolf Wester wrote:
Hi,
I have a strange (for me) memory problem. When running a loop in a
Python program memory usage increases from about 4% up to 100%. I do a
gc.collect() every loop cycle but this doesn't help
Rolf Wester wrote:
Diez B. Roggisch wrote:
Rolf Wester wrote:
I have a strange (for me) memory problem. When running a loop in a
Python program memory usage increases from about 4% up to 100%. I do a
gc.collect() every loop cycle but this doesn't help. There are about
67000 objects
On Thu, 15 Nov 2007 10:10:06 -0300, Peter Otten [EMAIL PROTECTED]
wrote:
Rolf Wester wrote:
Sorry, of course you are right. I'm running Python 2.5 on Linux; my
program imports numpy, matplotlib, sys and a Python module of my own.
This module uses numpy and scipy.weave for embedded
On Tue, 18 Sep 2007 12:24:46 -0300, Christoph Scheit
[EMAIL PROTECTED] wrote:
# add row i and increment number of rows
self.rows.append(DBRow(self, self.nRows))
self.nRows += 1
This looks suspicious, and may indicate that your structure contains
cycles, and Python cannot
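The cyclic pattern the comment hints at can be sketched like this (class names modeled loosely on the snippet, not the poster's actual code): the row keeps a reference back to its table, and the table keeps a reference to the row.

```python
import gc

class DBRow:
    def __init__(self, table, index):
        self.table = table       # child -> parent reference ...
        self.index = index

class Table:
    def __init__(self):
        self.rows = []
        self.nRows = 0
    def add_row(self):
        # ... and parent -> child: together they form a reference cycle
        self.rows.append(DBRow(self, self.nRows))
        self.nRows += 1

gc.collect()                     # clear any pre-existing garbage first
t = Table()
for _ in range(3):
    t.add_row()
del t

# The table and rows are now unreachable but sit in reference cycles, so
# plain reference counting cannot free them; only the cyclic collector can.
print(gc.collect() > 0)          # True: the collector found and freed them
```

Objects in cycles are still reclaimed eventually (absent `__del__` complications on old Pythons), but they linger until a collection pass runs, which looks like growing memory in a long loop.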
Hi,
I have a short script/program to read binary files from a numerical
simulation. These binary files still need some post-processing: summing
up results from different CPUs, filtering out invalid entries, and
bringing the data into some special order.
Reading the binary data
On Tue, 18 Sep 2007 14:06:22 +0200, Christoph Scheit wrote:
Then the data is added to a table, which I use for the actual Post-Processing.
The table is actually a Class with several Columns, each column internally
being represented by array.
Array or list?
# create reader
breader =
On Tuesday 18 September 2007 15:10, Marc 'BlackJack' Rintsch wrote:
On Tue, 18 Sep 2007 14:06:22 +0200, Christoph Scheit wrote:
Then the data is added to a table, which I use for the actual
Post-Processing. The table is actually a Class with several Columns,
each column internally being
On Tue, 18 Sep 2007 10:58:42 -0300, Christoph Scheit
[EMAIL PROTECTED] wrote:
I have to deal with several millions of data items; actually I'm trying
an example with 360 grid points and 1 time steps, i.e. 3 600 000
entries (and each row consists of 4 ints and one float).
Of course, the
Christoph Scheit wrote:
On Tuesday 18 September 2007 15:10, Marc 'BlackJack' Rintsch wrote:
On Tue, 18 Sep 2007 14:06:22 +0200, Christoph Scheit wrote:
Then the data is added to a table, which I use for the actual
Post-Processing. The table is actually a Class with several Columns,
each
Hi, thank you all very much;
so I will consider using a database. Anyway, I would like to know
how to detect cycles, if there are any.
# add row i and increment number of rows
self.rows.append(DBRow(self, self.nRows))
self.nRows += 1
This looks suspicious, and may indicate that your structure
On Jun 17, 8:51 pm, Squzer Crawler [EMAIL PROTECTED] wrote:
I am developing a distributed environment at my college using Python. I
am using threads in the client for downloading webpages. Even though I am
reusing the thread, memory usage keeps increasing. I don't know why. I am
using BerkeleyDB for
On Jun 18, 11:06 am, [EMAIL PROTECTED] [EMAIL PROTECTED] wrote:
On Jun 17, 8:51 pm, Squzer Crawler [EMAIL PROTECTED] wrote:
I am developing a distributed environment at my college using Python. I
am using threads in the client for downloading webpages. Even though I am
reusing the thread, memory
Squzer Crawler wrote:
On Jun 18, 11:06 am, [EMAIL PROTECTED] [EMAIL PROTECTED] wrote:
On Jun 17, 8:51 pm, Squzer Crawler [EMAIL PROTECTED] wrote:
I am developing a distributed environment at my college using Python. I
am using threads in the client for downloading webpages. Even though I am
reusing
I am developing a distributed environment at my college using Python. I
am using threads in the client for downloading webpages. Even though I am
reusing the thread, memory usage keeps increasing. I don't know why. I am
using BerkeleyDB for the URLQueue, BeautifulSoup for parsing the webpages.
Any idea of
In [EMAIL PROTECTED], Yi Xing wrote:
Is there a way that I can define a two-dimensional array in
array.array()? Thanks.
If you need more than one dimension you really should take a look at
`numarray` or `numpy`. What are you going to do with the data once it's
loaded into memory?
Ciao,
I used the array module and loaded all the data into an array.
Everything works fine now.
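For the record, array.array itself is strictly one-dimensional; a two-dimensional layout can be sketched on top of it with row-major index arithmetic (`idx` is a hypothetical helper, not part of the array module):

```python
from array import array

rows, cols = 4, 5
grid = array('d', [0.0]) * (rows * cols)   # flat, typed, zero-filled storage

def idx(r, c):
    # row-major mapping: element (r, c) lives at offset r*cols + c
    return r * cols + c

grid[idx(2, 3)] = 7.5
print(grid[idx(2, 3)], len(grid))          # 7.5 20
```

This is essentially what numpy does internally, but numpy also gives you real multi-dimensional indexing, slicing, and vectorized arithmetic on top of the flat buffer.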
On Aug 14, 2006, at 4:01 PM, John Machin wrote:
Yi Xing wrote:
Thanks! I just found that I have no problem with
x=[[10.0]*2560*2560]*500, but x=range(1*2560*2560*30) doesn't work.
Hi,
I need to read a large amount of data into a list, so I am trying to
see if I'll have any memory problem. When I do
x=range(2700*2700*3) I get the following message:
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
MemoryError
Any way to get around this problem? I have
Yi Xing wrote:
I need to read a large amount of data into a list. So I am trying to
see if I'll have any memory problem. When I do
x=range(2700*2700*3) I got the following message:
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
MemoryError
Any way to get around
Yi Xing wrote:
Hi,
I need to read a large amount of data into a list. So I am trying to
see if I'll have any memory problem. When I do
x=range(2700*2700*3) I got the following message:
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
MemoryError
Any way to get around
[EMAIL PROTECTED] wrote:
If you know that you need floats only, then you can use a typed array
(an array.array) instead of an untyped array (a Python list):
import array
a = array.array('f')
Clarification: typecode 'f' stores a Python float (64-bits, equivalent
to a C double) as a 32-bit FP
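That clarification is easy to verify with a small sketch: typecode 'f' is a 32-bit C float, 'd' is a 64-bit C double, and only 'd' round-trips a Python float exactly.

```python
from array import array

x = 0.1                      # not exactly representable in binary anyway
as_f = array('f', [x])[0]    # stored as a 32-bit C float, read back
as_d = array('d', [x])[0]    # stored as a 64-bit C double, read back

print(array('f').itemsize, array('d').itemsize)  # 4 8
print(as_d == x)             # True: 'd' preserves the value exactly
print(as_f == x)             # False: 'f' silently loses precision
```

So 'f' halves the memory per element, at the cost of rounding every stored value to single precision.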
I need to read a large amount of data into a list. So I am trying to
see if I'll have any memory problem. When I do
x=range(2700*2700*3) I got the following message:
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
MemoryError
Any way to get around this problem? I have
I tried the following code:
i = 0
n = 2600*2600*30
a = array.array('f')
while (i <= n):
...     i = i + 1
...     a.append(float(i))
...
Traceback (most recent call last):
  File "<stdin>", line 3, in ?
MemoryError
To see the size of the array at the time of the MemoryError:
len(a)
8539248
I use Windows XP x64
Yi Xing wrote:
Hi,
I need to read a large amount of data into a list. So I am trying to see
if I'll have any memory problem. When I do
x=range(2700*2700*3) I got the following message:
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
MemoryError
Any way to get around
On a related question: how do I initialize a list or an array with a
pre-specified number of elements, something like
int p[100] in C? I can do append() for 100 times but this looks silly...
Thanks.
Yi Xing
--
Yi Xing wrote:
On a related question: how do I initialize a list or an array with a
pre-specified number of elements, something like
int p[100] in C? I can do append() for 100 times but this looks silly...
Thanks.
Yi Xing
You seldom need to do that in Python, but it's easy enough:
Yi Xing wrote:
On a related question: how do I initialize a list or an array with a
pre-specified number of elements, something like
int p[100] in C? I can do append() for 100 times but this looks silly...
Thanks.
Yi Xing
Use [0]*100 for a list.
THN
--
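A quick sketch of both flavors of preallocation (plain list and typed array), with the usual caveat that `*` fills every slot with the same object, which only matters when the fill value is mutable:

```python
from array import array

p = [0] * 100              # preallocated list, like int p[100] in C
p[42] = 7

q = array('i', [0]) * 100  # typed equivalent: one C int per slot
q[42] = 7

print(len(p), p[42])       # 100 7
print(len(q), q[42])       # 100 7
```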
Yi Xing wrote:
I tried the following code:
i = 0
n = 2600*2600*30
a = array.array('f')
while (i <= n):
...     i = i + 1
...     a.append(float(i))
Not a good idea. The array has to be resized, which may mean that a
realloc won't work because of fragmentation; you're out of luck, because
plan B is to
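One way around the repeated reallocations is to allocate the full block once and fill it in place; a sketch, scaled down from the thread's 2600*2600*30 so it runs quickly:

```python
from array import array

n = 100_000                 # scaled down from 2600*2600*30 in the thread

# append() grows the array piecewise, forcing repeated reallocations as
# it fills up. Allocating the whole block up front avoids that entirely:
a = array('f', [0.0]) * n
for i in range(n):
    a[i] = float(i)

print(len(a), a[0], a[1])   # 100000 0.0 1.0
```

The full-size case in the thread (~200 million 4-byte floats, ~800 MB) still needs a contiguous block that large, which the original 32-bit address space simply could not provide.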
Yi Xing wrote:
On a related question: how do I initialize a list or an array with a
pre-specified number of elements, something like
int p[100] in C? I can do append() for 100 times but this looks silly...
Thanks.
Yi Xing
Unlike other languages this is seldom done in Python. I think
Thanks! I just found that I have no problem with
x=[[10.0]*2560*2560]*500, but x=range(1*2560*2560*30) doesn't work.
-Yi
On Aug 14, 2006, at 3:08 PM, Larry Bates wrote:
Yi Xing wrote:
On a related question: how do I initialize a list or an array with a
pre-specified number of elements,
Is there a way that I can define a two-dimensional array in
array.array()? Thanks.
On Aug 14, 2006, at 2:28 PM, John Machin wrote:
Yi Xing wrote:
I tried the following code:
i = 0
n = 2600*2600*30
a = array.array('f')
while (i <= n):
...     i = i + 1
...     a.append(float(i))
Not a good idea. The
Yi Xing wrote:
On a related question: how do I initialize a list or an array with a
pre-specified number of elements, something like
int p[100] in C? I can do append() for 100 times but this looks silly...
Thanks.
Yi Xing
In the case of an array, you may wish to consider the fromfile()
John Machin wrote:
Incredible. That's only 34 MB. What is the size of your paging file?
What memory guzzlers were you running at the same time? What was the
Task Manager Performance pane showing while your test was running?
What version of Python?
He didn't say Windows (so far). AFAICT, his
Yi Xing wrote:
Thanks! I just found that I have no problem with
x=[[10.0]*2560*2560]*500, but x=range(1*2560*2560*30) doesn't work.
That's no surprise. In the first case, try
x[0][0] = 20.0
print x[1][0]
You have the very same (identical) list of 2560*2560 values in x,
500 times.
To
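A runnable version of that experiment, scaled down, with the list-comprehension variant that builds genuinely independent rows:

```python
n = 4
x = [[10.0] * n] * 3        # three references to ONE inner list
x[0][0] = 20.0
print(x[1][0])              # 20.0: every "row" is the same object
print(x[0] is x[1])         # True

y = [[10.0] * n for _ in range(3)]  # three independent inner lists
y[0][0] = 20.0
print(y[1][0])              # 10.0: the other rows are unaffected
```

This also explains why the 500-row version fits in memory: it holds one 2560*2560 list plus 500 references to it, not 500 copies.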
Yi Xing wrote:
Thanks! I just found that I have no problem with
x=[[10.0]*2560*2560]*500, but x=range(1*2560*2560*30) doesn't work.
range(1*2560*2560*30) is creating a list of 196M *unique* ints.
Assuming 32-bit ints and pointers: that's 4 bytes each for the value, 4
for the type pointer,
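The per-int overhead is even larger on today's 64-bit builds; a quick check (the exact byte counts are CPython-specific, assuming a 64-bit build):

```python
import sys

# A small Python int is a full heap object: refcount, type pointer, digits.
print(sys.getsizeof(1))    # 28 on 64-bit CPython, vs 4 bytes for a C int

xs = list(range(1000))     # plus one pointer per list slot
per_slot = (sys.getsizeof(xs) - sys.getsizeof([])) / len(xs)
print(per_slot >= 8)       # True: 8 bytes of list slot per element alone
```

So a list of ~196 million distinct ints needs several gigabytes of object headers and pointers, far beyond what the poster's machine could supply.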
Martin v. Löwis wrote:
John Machin wrote:
Incredible. That's only 34 MB. What is the size of your paging file?
What memory guzzlers were you running at the same time? What was the
Task Manager Performance pane showing while your test was running?
What version of Python?
He didn't say
Hi
last week I posted a problem with running out of memory when changing
values in NumPy arrays. Since then I have tried many different
approaches and work-arounds, but to no avail.
I was able to reduce the code (see below) to its smallest size and
still have the problem, albeit at a slower rate.
sonjaa wrote:
Hi
last week I posted a problem with running out of memory when changing
values in NumPy arrays. Since then I have tried many different
approaches and work-arounds, but to no avail.
I was able to reduce the code (see below) to its smallest size and
still have the problem,
sonjaa wrote:
Also, are there other Python methods/extensions that can create
multi-dimensional arrays?
if this example is typical for the code you're writing, you might as
well use nested Python lists:
def make_array(width, height, value):
    out = []
    for y in
sonjaa wrote:
Hi
last week I posted a problem with running out of memory when changing
values in NumPy arrays. Since then I have tried many different
approaches and work-arounds, but to no avail.
[...]
Based on the numpy-discussion this seems to be fixed in the SVN now(?).
Anyway, you can
I've been in contact with Travis O, and he said it was fixed in the
SVN.
thanks for the suggestions, I'll try them out now.
best
Sonja
Filip Wasilewski wrote:
sonjaa wrote:
Hi
last week I posted a problem with running out of memory when changing
values in NumPy arrays. Since then I
of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: Parser/Compiler
Group: Python 2.4
Status: Open
Resolution: None
Priority: 5
Submitted By: Darek Ostolski (ostolski)
Assigned to: Nobody/Anonymous (nobody)
Summary: Encoding memory
Category: Parser/Compiler
Group: Python 2.4
Status: Open
Resolution: None
Priority: 9
Submitted By: Darek Ostolski (ostolski)
Assigned to: Nobody/Anonymous (nobody)
Summary: Encoding memory problem.
Initial
Category: Parser/Compiler
Group: Python 2.4
Status: Closed
Resolution: Out of Date
Priority: 5
Submitted By: Darek Ostolski (ostolski)
Assigned to: Nobody/Anonymous (nobody)
Summary: Encoding memory