Hi NumPy dev community,

I'm Keyvis, a statistical data scientist.

I'm currently using NumPy with Python 3.8.2 (64-bit) for a clustering problem,
on a machine with 1.9 TB of RAM. When I try to use np.zeros to create a 600,000
by 600,000 matrix with dtype=np.float32, it fails with:
"Unable to allocate 1.31 TiB for an array with shape (600000, 600000) and
data type float32"

I used psutil to check how much RAM Python thinks it has access to, and it
reports approximately 1.8 TB.
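
This is roughly how I checked (using psutil; the ~1.8 TB figure is what
virtual_memory() reports as available on this machine):

    import psutil

    vm = psutil.virtual_memory()
    print(f"total RAM:     {vm.total / 1e12:.2f} TB")
    print(f"available RAM: {vm.available / 1e12:.2f} TB")  # roughly 1.8 TB here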

Is there some way I can configure NumPy, or my system, so that it can create
arrays this large?
Thanks for your time and consideration.
