This is now trac #10262
(memory leak in scalar*vector multiplication)
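
A minimal sketch of how one might watch for the leak (a hypothetical
loop; assumes get_memory_usage() reports resident memory in MB, as in
the transcript quoted below):

sage: v = vector(RR, range(10000))
sage: before = get_memory_usage()
sage: for _ in range(1000):
....:     w = 2*v   # repeated scalar*vector products; w is rebound each time
sage: get_memory_usage() - before   # should stay near 0 if the products are freed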

Dima

On Nov 13, 5:10 pm, Jason Grout <jason-s...@creativetrax.com> wrote:
> On 11/12/10 10:30 PM, Robert Bradshaw wrote:
>
> > On Fri, Nov 12, 2010 at 7:10 PM, Jason Grout wrote:
> > It only has to construct an element if it can't figure out what to do
> > after consulting the Parents themselves.
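>
> > A sketch of watching that discovery step (hedged: explain() on the
> > coercion model prints whatever coercion or action it finds for the
> > two parents, with multiplication as the default operation):
>
> > sage: cm = sage.structure.element.get_coercion_model()
> > sage: cm.explain(RR, ZZ)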
>
> Ah, okay.
>
> >> And then there's the matter you talk about: why is an element so big?
>
> > The example above is quite strange. No idea why it should be so big.
> > (Note that these are arbitrary-precision integers, so the relative
> > overhead for small ones is still quite large).
>
> > As for the original example,
>
> > sage: type(vector(RR, range(100)))
> > <type 'sage.modules.free_module_element.FreeModuleElement_generic_dense'>
>
> > That means each element is a full Python RealNumber, quite a bit
> > more than a double. Still, that doesn't explain the memory usage. For almost
> > any kind of linear algebra, you're better off using RDF, or even numpy
> > directly.
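>
> > As a rough sketch of that route (the .numpy() call is assumed to
> > hand back the array backing an RDF vector):
>
> > sage: v = vector(RDF, range(100))   # numpy-backed double vector
> > sage: type(v)
> > sage: v.numpy()   # the underlying data as a numpy array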
>
> RDF also seems to have a big problem:
>
> sage: v=vector(RDF, range(10000))
> sage: get_memory_usage()
> 206.84765625
> sage: w=v.parent().an_element()
> sage: get_memory_usage()
> 983.60546875
>
> This is very strange, as RDF vectors are literally lightweight
> wrappers around numpy arrays.  I wonder how much Sage library code
> has to be loaded to do this conversion?
>
> Note that an_element() is cached in the parent.  Still, it seems odd
> that memory goes up by ~780 MB.
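>
> A quick sanity check of that caching (hedged: if an_element() is
> indeed cached on the parent, the identity test below returns True,
> so the ~780 MB would be a one-time cost):
>
> sage: v.parent().an_element() is v.parent().an_element()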
>
> Jason
