Hi Martin,
Hi,
I know you asked for memory profiling but I could not resist and did CPU
profiling on your testcase. I have attached some screenshots and, in words:
Thanks for these tips about profiling.
Stepan
Hi Chris,
I've used some hacky tricks to get around this, which mostly involve
downsampling the image on the fly based on screen resolution. One such
effort is at https://github.com/ChrisBeaumont/mpl-modest-image.
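For context, the kind of on-the-fly downsampling meant here can be sketched with plain striding. This is a crude stand-in for what mpl-modest-image does, not its actual implementation; the `downsample` helper and `max_dim` parameter are illustrative names:

```python
import numpy as np

def downsample(img, max_dim=1024):
    # Crude stride-based downsampling: keep every step-th pixel so that
    # neither dimension exceeds max_dim. A real implementation would
    # recompute this whenever the zoom level or figure size changes.
    step = max(1, int(np.ceil(max(img.shape[:2]) / max_dim)))
    return img[::step, ::step]

img = np.zeros((4096, 4096))
small = downsample(img)
print(small.shape)  # (1024, 1024)
```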
I tried your code
Hi,
I would like to plot multiple overlayed 4096x4096 images in one axes. If I
run this code the plot takes 300 MB of memory:
import numpy as np
import matplotlib.pyplot as plt

if __name__ == '__main__':
    img = np.zeros((4096, 4096))
    img[100:300, 100:1500] = 200
    plt.imshow(img)
    plt.show()
You could sum the different image arrays before plotting. Depending on
whether you are plotting RGB(A) images or greyscale images, you could take
the sum of the color channels or take a weighted average.
The right method depends strongly on the image type, but it will
reduce memory use.
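As a minimal sketch of the weighted-average idea for two greyscale layers (the layer contents below are made up for illustration):

```python
import numpy as np

# Two hypothetical greyscale layers of the same shape
layer1 = np.zeros((4096, 4096))
layer1[100:300, 100:1500] = 200
layer2 = np.zeros((4096, 4096))
layer2[500:700, 200:900] = 100

# One combined array to pass to a single imshow call,
# instead of keeping several full-resolution images alive
merged = 0.5 * layer1 + 0.5 * layer2
print(merged.max())  # 100.0
```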
Hi,
You could sum the different image arrays before plotting. Depending on
whether you are plotting RGB(A) images or greyscale images, you could take
the sum of the color channels or take a weighted average.
Yes, I will probably merge the images (RGBA) before plotting. I want to
Those numbers actually make a lot of sense.
For a 4k by 4k 2D array of 64-bit floats, you're using 128MiB of memory,
just to store them. Displaying such an array with mpl would take a copy of
that and add some objects for housekeeping (on my machine about 150MB to
display one such array together
You could look at whether or not you actually need 64-bit precision. Often
times, 8-bit precision per color channel is justifiable, even in grayscale.
My advice is to play with the dtype of your array or, as you mentioned,
resample.
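To put numbers on both points above (the 128 MiB figure and the dtype advice), a quick check with NumPy:

```python
import numpy as np

# 4096 x 4096 pixels at 8 bytes per 64-bit float
img64 = np.zeros((4096, 4096), dtype=np.float64)
print(img64.nbytes // 2**20)  # 128 (MiB just for the raw array)

# Rescale into 8-bit: one byte per pixel instead of eight
img8 = img64.astype(np.uint8)
print(img8.nbytes // 2**20)   # 16
```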
Thanks, this helped me significantly. uint8
I've been burned by this before as well. MPL stores some intermediate data
products (for example, scaled RGB copies) at full resolution, even though
the final rendered image is downsampled depending on screen resolution.
I've used some hacky tricks to get around this, which mostly involve
On 08/27/2013 09:49 AM, Chris Beaumont wrote:
I've been burned by this before as well. MPL stores some intermediate
data products (for example, scaled RGB copies) at full resolution,
even though the final rendered image is downsampled depending on
screen resolution.
I've used some hacky
I am trying to use imshow to plot some semi-large fits images.
Here is the code:
from math import *
import numpy as np
from matplotlib import pyplot as plt
from matplotlib import cm as cm
import pyfits
cat = '/Volumes/Sweden/PS1SC/Data/PS20090603-3/MD09/skycell.092/'
fname =
Because the image is so large, and matplotlib carries out various
operations on the image (scaling, resampling, etc), it uses a lot of
memory. This is one area where a specialized display program will be
more efficient. If you need to use matplotlib, decide whether you
really only want to
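One way to act on that advice is to slice out only the region you actually want to look at before handing it to imshow (the crop bounds below are hypothetical):

```python
import numpy as np

img = np.zeros((4096, 4096), dtype=np.float32)

# Hypothetical region of interest; slicing gives a view, and imshow
# then only has this smaller array to scale and resample
roi = img[1000:2000, 1000:2000]
print(roi.shape)  # (1000, 1000)
```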
That is what I was assuming, but it still seems a little odd that
matplotlib generates
that large of a memory footprint. Loading the fits file into the
program using pyfits,
with the code only uses 19MB of real memory and 600MB of virtual
memory (strangely
adding the line img = hdu[1].data,
On Jun 30, 2009, at 7:54 PM, Tommy Grav wrote:
That is what I was assuming, but it still seems a little odd that
matplotlib generates
that large of a memory footprint. Loading the fits file into the
program using pyfits,
with the code only uses 19MB of real memory and 600MB of virtual