Re: [julia-users] Help needed with excessive memory allocation for Tuples
I created [1] to make specific note of this issue.

Bill.

[1] https://github.com/JuliaLang/julia/issues/18084
Re: [julia-users] Help needed with excessive memory allocation for Tuples
On Wednesday, 17 August 2016 20:22:56 UTC+2, Kristoffer Carlsson wrote:
> See pr 8134. I don't think you have any worse cache locality than with your
> original tuple.

You are right. I just have more memory usage; the heap will in fact have better cache locality. I'll just have exp appearing in every object in the "chained" heap, rather than per chain. I was trying hard to keep memory usage down, since some of the large benchmarks we want to do will barely fit on the machine otherwise.

I looked at the PR, and it seems relevant, but I'm not sure it directly deals with this issue. The problem indeed seems to be that immutables containing pointers cannot be stack allocated, precisely because they contain pointers. But the standard way of dealing with this is to have the GC conservatively scan the stack. Jeff seems to make this point. On the other hand, he seems to suggest that such a mechanism already exists, but earlier seems to suggest that you should only stack allocate something if it doesn't contain pointers. So I'm totally confused after reading that.

I suppose if the compiler is clever enough to elide the creation of tuples that are being written directly into heap allocated resources (such as arrays), then Jeff can get his way and problems such as the one I have won't occur. But I'm very sceptical that this would be a complete solution.

Bill.
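One manual way to get the effect of the "temporary tuple elision" discussed above is to split the would-be tuple into parallel arrays, so each component is written directly and no intermediate tuple object ever exists. This is a sketch with illustrative names (`Chain`, `store!`), not Nemo's actual code:

```julia
# Instead of `a[i] = (e, c)`, keep the bits component and the reference
# component in separate arrays and write each one directly.
mutable struct Chain
    next::Int
end

exps   = Vector{UInt}(undef, 8)     # bits component: plain in-place write
chains = Vector{Chain}(undef, 8)    # reference component: pointer write

function store!(exps::Vector{UInt}, chains::Vector{Chain},
                i::Int, e::UInt, c::Chain)
    exps[i]   = e
    chains[i] = c
    return nothing
end

store!(exps, chains, 1, UInt(5), Chain(0))
```

The trade-off is the same one discussed in this thread: the components of one logical entry are no longer adjacent in memory, so cache locality can suffer.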
Re: [julia-users] Help needed with excessive memory allocation for Tuples
See pr 8134. I don't think you have any worse cache locality than with your original tuple.
Re: [julia-users] Help needed with excessive memory allocation for Tuples
I basically extended the heap_s struct to also include the exp field. This uses twice the memory, but gets rid of the massive number of heap allocations. Unfortunately I lose the cache efficiency of the original solution, so it's still about 20-50% slower than it should be. It'll have to do for now.

I've been looking around for a specific ticket for this tuple/immutable issue, but haven't managed to find one. I see "temporary tuple elision" on Jeff's compiler optimization tracker marked as done, but I don't know what the details are. There are a bunch of tickets around the place, mostly about fixed sized arrays and so on, which seem to depend on efficient tuples. But this all went up before the tupocalypse (which I thought made tuples efficient).

I'd really like to make sure this issue has a ticket somewhere, since it is so fundamental. Do you by chance know where you saw this already? If not, I'll make another ticket for it so it can be tracked explicitly.

The best I can find is [1], but I don't see any links to open tickets there. There is a pull request, but it's not totally clear to me what it fixes (though it might explain why this hasn't been fixed already). I don't understand how the Julia GC can work at all without marking roots in the stack, so I'm not sure how to understand that PR.

Bill.

[1] https://groups.google.com/forum/#!topic/julia-users/LthfABeDN50

On 17 August 2016 at 19:11, Kristoffer Carlsson wrote:
> Just make a new type instead of the tuple and directly set the fields of
> it?
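The workaround described above can be sketched as follows. The field names here are guesses, not Nemo's actual heap_s layout: folding exp into the node type means an insert just rebinds fields of an existing object instead of building an (exp, node) tuple.

```julia
# exp is duplicated into every node: twice the memory, but no per-insert
# tuple allocation.
mutable struct HeapNode
    exp::UInt
    i::Int
    next::Int
end

# Reuse a preallocated node: only field writes, no tuple construction.
function setnode!(node::HeapNode, exp::UInt, i::Int)
    node.exp = exp
    node.i = i
    return node
end

n = HeapNode(UInt(0), 0, 0)
setnode!(n, UInt(7), 2)
```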
Re: [julia-users] Help needed with excessive memory allocation for Tuples
Just make a new type instead of the tuple and directly set the fields of it?
Re: [julia-users] Help needed with excessive memory allocation for Tuples
Wowsers! That's one hell of a serious issue.

On 17 August 2016 at 18:33, Kristoffer Carlsson wrote:
> Immutables and tuples that contain references to heap allocated objects
> are currently themselves allocated on the heap. There is an issue about it.
Re: [julia-users] Help needed with excessive memory allocation for Tuples
Immutables and tuples that contain references to heap allocated objects are currently themselves allocated on the heap. There is an issue about it.
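A minimal sketch of this (names are illustrative, not Nemo's): a mutable type is itself heap allocated, and a tuple that merely holds a reference to it is not a bits type, so it is not eligible for the stack-friendly path.

```julia
# HeapS stands in for any mutable, heap-allocated object.
mutable struct HeapS
    next::Int
end

h = HeapS(0)

t = (UInt(3), h)               # pairs plain data with a heap reference

isbitstype(typeof(t))          # false: the tuple type contains a reference
isbitstype(Tuple{UInt, UInt})  # true: pure bits
```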
Re: [julia-users] Help needed with excessive memory allocation for Tuples
Wait, I understand why heap_s would be allocated on the heap. But why would a tuple involving a heap_s need to be allocated on the heap, assuming the heap_s already exists? At the machine level, the tuple should just be the NTuple plus one pointer (to a heap_s). This is clearly the problem, but I don't understand why it would happen (or how to work around it).

Bill.

On 17 August 2016 at 18:21, Kristoffer Carlsson wrote:
> Unless heap_s is a bitstype then the tuple will be allocated on the heap.
> But you are saying it is getting allocated twice?
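The behaviour being debated here can be measured directly with `@allocated`. This is a stand-in reproduction (illustrative names, not Nemo's code); on the Julia versions discussed in this thread each store builds the tuple on the heap first, though whether a given compiler version still does so is version-dependent, so no allocation count is asserted:

```julia
mutable struct Node
    next::Int
end

function storeall!(a::Vector{Tuple{UInt, Node}}, x::Node, n::Int)
    for i in 1:n
        a[i] = (UInt(i), x)   # tuple built, then copied into the array slot
    end
    return a
end

x = Node(0)
a = Vector{Tuple{UInt, Node}}(undef, 100)
storeall!(a, x, 100)                     # warm up (compile), then measure:
bytes = @allocated storeall!(a, x, 100)
```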
Re: [julia-users] Help needed with excessive memory allocation for Tuples
Unless heap_s is a bitstype then the tuple will be allocated on the heap. But you are saying it is getting allocated twice?
Re: [julia-users] Help needed with excessive memory allocation for Tuples
It's very, very hard to produce working code that illustrates a problem when working on a very large system like ours. There are nearly 1000 lines of code just to set up the data! If you want to take a look, it's here, though I'm not sure how enlightening it will be:

https://github.com/wbhart/Nemo.jl/blob/mpoly/src/generic/MPoly.jl

Specifically, the issue is at lines 578 and 584. To run this in Nemo, you would need to check out the mpoly branch, build Nemo, and use the following lines of code:

using Nemo

R, x, y, z, t = PolynomialRing(ZZ, ["x", "y", "z", "t"])
f = 1 + x + y + z + t
p = f^20;
q = Nemo.mul(p, p + 1);

It is this final line that in turn invokes Nemo.mul_johnson, which in turn invokes heapinsert!. I am quite certain the line of code I pointed to is at issue here. Instead of simply writing the tuple directly into the array, it first allocates a tuple on the heap, then copies that into the array. That obviously shouldn't happen.

Bill.

On 17 August 2016 at 17:54, Kristoffer Carlsson wrote:
> As always, it is very helpful if there is some code that is actually
> runnable in the post that contains the problem. It is easier to run Julia
> code in Julia than in your brain :)
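The shape of the line at issue is roughly the following. This is a hypothetical sketch, not the real heapinsert! from Nemo's MPoly.jl (the type and function names are stand-ins): the tuple is constructed before the store, so the tuple itself is heap allocated and then copied into the array slot.

```julia
mutable struct HeapS
    next::Int
end

function heapstore!(heap::Vector{Tuple{NTuple{4, UInt}, HeapS}},
                    k::Int, exp::NTuple{4, UInt}, x::HeapS)
    heap[k] = (exp, x)   # tuple allocated on the heap, then copied into slot k
    return heap
end

heap = Vector{Tuple{NTuple{4, UInt}, HeapS}}(undef, 4)
heapstore!(heap, 1, (UInt(1), UInt(0), UInt(0), UInt(2)), HeapS(0))
```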
Re: [julia-users] Help needed with excessive memory allocation for Tuples
N = 1 in the case I'm profiling/timing. It's typically < 10 in applications, usually much less.

Bill.

On 17 August 2016 at 17:50, Kristoffer Carlsson wrote:
> What is N typically?