[julia-users] Parallel loop, what's wrong? Parallel is slower than normal
Parallel loop, what's wrong? Parallel is slower than normal.

julia> @time for i=1:l w[i]=var(D[:,i]) end
elapsed time: 4.443197509 seconds (14074576 bytes allocated)

julia> @time ww=@parallel (hcat) for i=1:l var(D[:,i]) end
elapsed time: 5.287007403 seconds (435449580 bytes allocated, 5.00% gc time)
1x1 Array{Float64,2}:

Paul

julia> @time for i=1:l w[i]=var(D[:,i]) end
elapsed time: 4.331569152 seconds (8637464 bytes allocated)

julia> @time ww=@parallel (hcat) for i=1:l var(D[:,i]) end
elapsed time: 4.908234336 seconds (422121448 bytes allocated, 4.85% gc time)
1x1 Array{Float64,2}:
 0.000703737 0.000731674 0.000582672 0.00080388 0.000759479 0.000402509 0.0007118 0.000989408

julia> size(D)
(1,1)
Re: [julia-users] Re: Parallel loop, what's wrong? Parallel is slower than normal
How many worker threads did you start? Can you make D a SharedArray or DArray? On Sat Jan 31 2015 at 10:55:51 AM Paul Analyst paul.anal...@mail.com wrote: Really? Is it possible to do something like: take the 1st column and compute it on the 1st core, without waiting for the end of the 1st operation; take the 2nd column and compute it on the 2nd core; etc.? Paul On 2015-01-31 at 16:32, Tim Holy wrote: Paul, until the threads branch gets merged, I recommend that you just accept the fact that you'll only have 1 core active for most operations. --Tim On Saturday, January 31, 2015 07:15:25 AM paul analyst wrote: Thx, but no. For a sparse matrix (10^5 x 10^4, density 0.002) it is the same. Time for both loops is about 48 sec, only 11% of the cores is used. I have 8 cores, 7 sleep :/ Paul On Saturday, 31 January 2015 at 15:50:02 UTC+1, Sam Kaplan wrote: Hi Paul, If D is allocated on the master, then Julia will need to pass D from the master to the workers. I'm guessing that this communication might be more expensive than the compute in your loops. It may be useful to take a look at distributed arrays in the parallel section of the Julia docs. Hope it helps. Sam
Re: [julia-users] Re: How to read the structure of an HDF5 file?
The file IS empty?

julia> fid=open("EECo2.h5")
IOStream(<file EECo2.h5>)

julia> names(fid)
4-element Array{Symbol,1}:
 :handle
 :ios
 :name
 :mark

julia> dump(fid)
ERROR: invalid arguments to dump
 in dump at show.jl:866 (repeats 2 times)

Paul

On Saturday, 31 January 2015 at 13:11:30 UTC+1, Tim Holy wrote: Keith's tips are good. Within julia, you can also try names(fid) or dump(fid). --Tim On Saturday, January 31, 2015 03:52:01 AM Keith Campbell wrote: You can use the freely available HDFView, from the HDF Group. The HDF Group also provides a set of command line tools for inspecting files. Keith On Saturday, January 31, 2015 at 6:04:11 AM UTC-5, paul analyst wrote: How to read the structure of an HDF5 file? Look what is inside? julia> fid=open("EECo2.h5") IOStream(<file EECo2.h5>) I don't know what is in this file. How to read it (its structure)? Paul
[julia-users] Re: Parallel loop, what's wrong? Parallel is slower than normal
Hi Paul, If D is allocated on the master, then Julia will need to pass D from the master to the workers. I'm guessing that this communication might be more expensive than the compute in your loops. It may be useful to take a look at distributed arrays in the parallel section of the Julia docs. Hope it helps. Sam On Saturday, January 31, 2015 at 7:38:22 AM UTC-6, paul analyst wrote: Parallel loop, what's wrong? Parallel is slower than normal.
Re: [julia-users] Re: Parallel loop, what's wrong? Parallel is slower than normal
Nash, big thx.

julia> procs()
1-element Array{Int64,1}:
 1

julia> addprocs(7)
7-element Array{Any,1}:
 2
 3
 4
 5
 6
 7
 8

Now it is 3-4 times faster!!! Paul

On 2015-01-31 at 17:00, Jameson Nash wrote: How many worker threads did you start? Can you make D a SharedArray or DArray?
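For later readers, the fix that worked in this thread can be sketched end-to-end. This is a minimal sketch in Julia 0.3-era syntax, assuming the dimensions mentioned earlier in the thread; it is not the poster's exact code:

```julia
# Start workers before running the parallel loop.
addprocs(7)                        # 7 extra worker processes

l = 10^4
# A SharedArray lets all local workers read D without shipping a copy
# of the whole matrix to every worker on every iteration:
D = SharedArray(Float64, 10^5, l)  # 0.3-era constructor syntax
w = SharedArray(Float64, l)

@sync @parallel for i = 1:l
    w[i] = var(D[:, i])            # each worker fills its share of w
end
```

With a plain Array allocated on the master, @parallel has to serialize D to every worker, which is why the parallel loop was slower than the serial one until workers were added and the data shared.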
Re: [julia-users] Re: How to read the structure of an HDF5 file?
You have to open the file with h5open. --Tim On Saturday, January 31, 2015 05:35:57 AM paul analyst wrote: The file IS empty? julia> fid=open("EECo2.h5") IOStream(<file EECo2.h5>) julia> names(fid) 4-element Array{Symbol,1}: :handle :ios :name :mark julia> dump(fid) ERROR: invalid arguments to dump in dump at show.jl:866 (repeats 2 times) Paul
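To make that concrete, a short sketch using the HDF5.jl API of the time (the file name comes from the thread; treat the exact calls as assumptions about the 0.3-era package):

```julia
using HDF5

fid = h5open("EECo2.h5", "r")  # h5open, not open: returns an HDF5 file handle
println(names(fid))            # list top-level group/dataset names
dump(fid)                      # tree view of the file's structure
close(fid)
```

Plain open() just gives an IOStream over the raw bytes, which is why names() and dump() on it showed only the stream's own fields.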
Re: [julia-users] Re: Parallel loop, what's wrong? Parallel is slower than normal
Paul, until the threads branch gets merged, I recommend that you just accept the fact that you'll only have 1 core active for most operations. --Tim On Saturday, January 31, 2015 07:15:25 AM paul analyst wrote: Thx, but no. For a sparse matrix (10^5 x 10^4, density 0.002) it is the same. Time for both loops is about 48 sec, only 11% of the cores is used. I have 8 cores, 7 sleep :/ Paul On Saturday, 31 January 2015 at 15:50:02 UTC+1, Sam Kaplan wrote: Hi Paul, If D is allocated on the master, then Julia will need to pass D from the master to the workers. I'm guessing that this communication might be more expensive than the compute in your loops. It may be useful to take a look at distributed arrays in the parallel section of the Julia docs. Hope it helps. Sam
[julia-users] Re: poly2mask in Julia?
Andrew, thanks for the link! I spent quite some time rewriting and optimizing the code for Julia, but it was worth it: for a 100x100 matrix the intersection-based algorithm gives almost a 10x speedup, and the larger the matrix, the bigger the difference. Here's my new code:

function fillpoly!{T}(M::Matrix{T}, px::Vector{Int}, py::Vector{Int}, value::T)
    @assert length(px) == length(py)
    left, right = minimum(px), maximum(px)
    top, bottom = minimum(py), maximum(py)
    @inbounds for x=left:right
        ys = Set{Int64}()
        j = length(px)
        for i=1:length(px)
            if (px[i] <= x && x <= px[j]) || (px[j] <= x && x <= px[i])
                # special case: adding the whole cut to ys
                if px[i] == px[j]
                    push!(ys, py[i])
                    push!(ys, py[j])
                else
                    y = py[i] + (x - px[i]) / (px[j] - px[i]) * (py[j] - py[i])
                    push!(ys, int(y))
                end
            end
            j = i
        end
        ys = sort([y for y in ys])
        # if there's an odd number of intersection points, add one imaginary point
        if length(ys) % 2 == 1
            push!(ys, ys[end])
        end
        for i=1:2:length(ys)
            M[ys[i]:ys[i+1], x] = value
        end
    end
    return M
end

function poly2mask(px::Vector{Int}, py::Vector{Int}, m::Int, n::Int)
    mask = zeros(Bool, m, n)
    fillpoly!(mask, px, py, true)
end

On Thursday, January 29, 2015 at 4:15:10 AM UTC+3, Patrick O'Leary wrote: On Wednesday, January 28, 2015 at 4:05:07 PM UTC-6, Andrew McLean wrote: [Apologies if you see this post twice, it's been a number of hours since my original post and it hasn't appeared.] Sorry about that--I have no idea how it sat in the queue for so long. I discarded the original post and kept only the repost. Further posts should appear immediately. Patrick
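For anyone adapting the code above, a minimal usage sketch (the coordinates are made up; note that int() is 0.3-era syntax, later spelled round(Int, ...)):

```julia
# a 10x10 Bool mask with a filled square polygon whose corners are
# (2,2), (8,2), (8,8), (2,8)
px = [2, 8, 8, 2]
py = [2, 2, 8, 8]
mask = poly2mask(px, py, 10, 10)
# interior points such as mask[5,5] should be true, points outside false
```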
[julia-users] Re: Parallel loop, what's wrong? Parallel is slower than normal
Thx, but no. For a sparse matrix (10^5 x 10^4, density 0.002) it is the same. Time for both loops is about 48 sec, only 11% of the cores is used. I have 8 cores, 7 sleep :/ Paul On Saturday, 31 January 2015 at 15:50:02 UTC+1, Sam Kaplan wrote: Hi Paul, If D is allocated on the master, then Julia will need to pass D from the master to the workers. I'm guessing that this communication might be more expensive than the compute in your loops. It may be useful to take a look at distributed arrays in the parallel section of the Julia docs. Hope it helps. Sam
[julia-users] Re: Question on tuple size
Dear Kirill Ignatiev; Crystal clear, and thanks for your kind explanation... Jase On Saturday, January 31, 2015 at 5:09:47 PM UTC+11, Jung Soo Park wrote: Hi, I am generating two functions, and the outcome of the first function will be used as input values of the second function. Like Matlab's [x,y,z] = test_function(input), I used Julia's tuple function to generate [x,y,z] and it worked well.

function test_function(input)
    x=rand(20,5); y=rand(30,50); z=rand(5,100)
    ALL=tuple(x,y,z)
    return ALL
end

I sliced the output ALL with [[ ]] and saved it as xx,yy,zz.

ALL = test_function(best)
xx=ALL[[1]]; yy=ALL[[2]]; zz=ALL[[3]]

But I found that the sizes of the original output (say x) and the slice (say xx) are not identical, so I cannot transfer the values into the second function.

julia> size(x)
(20,5)
julia> size(xx)
ERROR: `size` has no method matching size(::(Array{Float64,2},))
 while loading In[1], in expression starting on line 12

Q) How can I convert the size of xx into the size of x so that I can run my second function like the following?

function test_2(x,y,z)
    smile!!
end

Thank you for your time. Jase
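The mismatch above comes from the double-bracket indexing: ALL[[1]] indexes the tuple with a one-element vector, which returns a one-element tuple, while ALL[1] returns the array itself. A small sketch (sizes chosen to match the post):

```julia
ALL = (rand(20,5), rand(30,50), rand(5,100))

xx_wrong = ALL[[1]]   # vector index -> a 1-element tuple: (Array,)
xx_right = ALL[1]     # scalar index -> the 20x5 array itself

# size(xx_right) == (20,5); size() has no method for the tuple xx_wrong
x, y, z = ALL         # or simply destructure the tuple directly
```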
Re: [julia-users] Show method for types
On Sat, 2015-01-31 at 04:53, i.costi...@me.com wrote: I've got lots of multi-field types in a package I'm developing (https://github.com/imanuelcostigan/FinancialMarkets.jl) and would like to understand how I might best show() them optimally. The default method applied to some types prints what is effectively unreadable garbage to the REPL. Has someone else successfully dealt with this? Try overloading Base.show, something like:

function Base.show(io::IO, mt::MyType)
    print(io, "My type with field $(mt.fld1)")
end
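A self-contained version of that suggestion, with a made-up two-field type (0.3-era `type` syntax; the type and field names are placeholders, not from the package in question):

```julia
type SwapRate          # hypothetical multi-field type
    rate::Float64
    tenor::Int
end

function Base.show(io::IO, s::SwapRate)
    # one compact, human-readable line instead of the default field dump
    print(io, "SwapRate($(s.rate * 100)% for $(s.tenor)y)")
end

# At the REPL, SwapRate(0.025, 10) now displays as: SwapRate(2.5% for 10y)
```

Because the REPL, print, and string interpolation all route through show, this one method cleans up display everywhere.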
[julia-users] Help with the findin method
Hello, For a Molecular Dynamics simulation I have created a composite type named Particle. After that, I constructed an Array consisting of Particle elements. I want to get the index of a specific element within the Array; however, when trying it I get the following error: `start` has no method matching start(::Particle{Float64}). I believe the error is related to the fact that I have not defined how to iterate over an Array of Particle types. I don't know how I can do that. Any help will be appreciated. For concreteness, here is a simple code:

type Particle{T<:Float64}
    r::T
    v::T
end

p = Particle(1.0,1.0)
particles = [p]
for i in 1:10
    p = Particle(rand(),rand())
    push!(particles,p)
end

k = findin(particles, particles[3])  # I expect k = 3
ERROR: `start` has no method matching start(::Particle{Float64})
Re: [julia-users] Help with the findin method
k = findin(particles, particles[3]) findin requires both arguments to be collections. Try findin(particles, [particles[3]]) instead. Thanks, Jiahao Chen Staff Research Scientist MIT Computer Science and Artificial Intelligence Laboratory
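A sketch of the difference, using plain numbers instead of Particle so the shapes are easy to see (0.3-era functions; findin returns a vector of indices, so wrapping the element is what makes the second argument iterable):

```julia
particles = [1.5, 2.5, 3.5]

# findin(particles, particles[3])      # errors: a scalar is not a collection
k = findin(particles, [particles[3]])  # wrap the element -> k == [3]

# 0.3-era alternative returning a single index rather than a vector:
j = findfirst(particles, particles[3]) # j == 3
```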
Re: [julia-users] Help with the findin method
On Saturday, 31 January 2015 at 13:08 -0500, Jiahao Chen wrote: k = findin(particles, particles[3]) findin requires both arguments to be collections. Try findin(particles, [particles[3]]) instead. Maybe we should add a version accepting a single element too? Any reason not to? Regards
Re: [julia-users] Help with the findin method
Oh, I see. Thanks Jiahao. Could you please tell me a bit more about the error message? It is not evident to me what it means. On 2015-01-31 13:24 GMT-06:00, Jiahao Chen jia...@mit.edu wrote: We could have a new function. We have separate functions for push!([1,2,3],4,5,6) and append!([1,2,3],[4,5,6]). Thanks, Jiahao Chen Staff Research Scientist MIT Computer Science and Artificial Intelligence Laboratory On Sat, Jan 31, 2015 at 1:57 PM, Milan Bouchet-Valat nalimi...@club.fr wrote: On Saturday, 31 January 2015 at 13:08 -0500, Jiahao Chen wrote: k = findin(particles, particles[3]) findin requires both arguments to be collections. Try findin(particles, [particles[3]]) instead. Maybe we should add a version accepting a single element too? Any reason not to? Regards
Re: [julia-users] Help with the findin method
We could have a new function. We have separate functions for push!([1,2,3],4,5,6) and append!([1,2,3],[4,5,6]). Thanks, Jiahao Chen Staff Research Scientist MIT Computer Science and Artificial Intelligence Laboratory On Sat, Jan 31, 2015 at 1:57 PM, Milan Bouchet-Valat nalimi...@club.fr wrote: On Saturday, 31 January 2015 at 13:08 -0500, Jiahao Chen wrote: k = findin(particles, particles[3]) findin requires both arguments to be collections. Try findin(particles, [particles[3]]) instead. Maybe we should add a version accepting a single element too? Any reason not to? Regards
Re: [julia-users] ANN: Docile Lexicon update.
I installed Sims and ran `?sim` at the REPL. I got the following back: `dasslsim (generic function with 20 methods)`. I see that there is an extensive docstring against sim in src/sim.jl. Why am I not seeing this at the REPL? I'm on julia 0.3.5 On Thursday, 22 January 2015 03:54:14 UTC+11, Ivan Ogasawara wrote: great! On 21/01/2015 14:22, Michael Hatherly michael...@gmail.com wrote: Hi all, I'm pleased to announce the latest update to the Docile (https://github.com/MichaelHatherly/Docile.jl) and Lexicon (https://github.com/MichaelHatherly/Lexicon.jl) documentation packages. New features include: - Docile now supports plain strings (http://michaelhatherly.github.io/Docile.jl/syntax/#plain-strings), i.e. without @doc, as docstrings. Compatibility with the Julia 0.4 doc system is still present. - Thanks to Tom Short, Lexicon can now output nicely formatted markdown. This can then be used to create static documentation using programs such as MkDocs (http://www.mkdocs.org/). See the documentation from the following packages for examples of the results: Sims (https://tshort.github.io/Sims.jl/), Docile (http://michaelhatherly.github.io/Docile.jl/), and Lexicon (http://michaelhatherly.github.io/Lexicon.jl/). Any bugs or feature requests can be opened in either the Docile or Lexicon repos. Happy documenting! — Mike
Re: [julia-users] ANN: Docile Lexicon update.
You'll need to install Lexicon (I think). On Sat, Jan 31, 2015 at 3:57 PM, i.costi...@me.com wrote: I installed Sims and ran `?sim` at the REPL. I got the following back: `dasslsim (generic function with 20 methods)`. I see that there is an extensive docstring against sim in src/sim.jl. Why am I not seeing this at the REPL? I'm on julia 0.3.5
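In 0.3-era terms, the steps would be something like the following (assuming Lexicon is in the registered package list; on 0.3 the Docile docstrings are only reachable from the REPL once Lexicon is loaded):

```julia
Pkg.add("Lexicon")   # Docile stores the docstrings; Lexicon provides REPL access
using Lexicon, Sims
# now `?sim` at the REPL should show the docstring from src/sim.jl
```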
[julia-users] Re: what path to set in jl_init
Gentlemen, thank you very much, it now works.

gcc -o embedd -I$JULIA_DIR/include/julia -L$JULIA_DIR/lib/julia -Wl,-rpath,$JULIA_DIR/lib/julia -ljulia embedding.c

In the C code, replace jl_init(NULL) with jl_init("JULIA_DIR/bin"); // where JULIA_DIR is the path to a folder with bin and lib/julia subfolders.

Jeff, I have yet to try your approach, but I will keep it in mind. Thanks again, -Eric
[julia-users] what path to set in jl_init
I am trying to compile and run embedding.c on a 10.9.5 OS X Mac. It compiles without a glitch, but when I run the executable I get:

./embedd
dyld: Library not loaded: @rpath/libjulia.dylib
 Referenced from: /Users/doe/./embedd
 Reason: image not found
Trace/BPT trap: 5

I have tried with jl_init set to: NULL, $JULIA_DIR/, $JULIA_DIR/bin, $JULIA_DIR/lib/julia (with $JULIA_DIR fully expanded ;) and get the same error over and over... So what path should I write in jl_init(), or should I do something different? Thxs. PS: Julia Version 0.3.5 Commit a05f87b* (2015-01-08 22:33 UTC) Platform Info: System: Darwin (x86_64-apple-darwin13.4.0) CPU: Intel(R) Core(TM) i7-4650U CPU @ 1.70GHz WORD_SIZE: 64 BLAS: libopenblas (USE64BITINT DYNAMIC_ARCH NO_AFFINITY Haswell) LAPACK: libopenblas LIBM: libopenlibm LLVM: libLLVM-3.3
Re: [julia-users] What is the difference between these two functions?
On Saturday, 31 January 2015 07:58:38 UTC-5, Tim Holy wrote: In principle I don't see why the current Complex{Bool} definition shouldn't be feasible. There might just be an inference bug. If you use @code_warntype, you'll see that `im` gets annotated as `Any`. Thanks very much, replacing im with convert(Complex128, im) seems to help. I filed the issue here: https://github.com/JuliaLang/julia/issues/9989
Re: [julia-users] what path to set in jl_init
Your executable seems to be compiled wrong and cannot be executed (missing the @rpath attribute). Try giving the -Wl,-rpath,$JULIA_DIR/lib option in your linker command where you passed `-ljulia`. On Sat Jan 31 2015 at 6:37:17 PM eric l cdg2...@gmail.com wrote: I am trying to compile and run embedding.c on a 10.9.5 OS X Mac. It compiles without a glitch, but when I run the executable I get: ./embedd dyld: Library not loaded: @rpath/libjulia.dylib Referenced from: /Users/doe/./embedd Reason: image not found Trace/BPT trap: 5
Re: [julia-users] what path to set in jl_init
Here's what I get for debug info on embedding on 0.4-dev. Since you are still getting errors, you might want to check what you are getting, and whether you've set JULIA_HOME correctly (it must point to the folder containing the julia binary).

julia/usr/bin$ otool -l embedding | grep -C4 RPATH
 time stamp 2 Wed Dec 31 19:00:02 1969
 current version 120.0.0
 compatibility version 1.0.0
Load command 18
 cmd LC_RPATH
 cmdsize 48
 path @executable_path/../lib/julia (offset 12)
Load command 19
--
 cmdsize 48
 path @executable_path/../lib/julia (offset 12)
Load command 19
 cmd LC_RPATH
 cmdsize 40
 path @executable_path/../lib (offset 12)
Load command 20
 cmd LC_FUNCTION_STARTS

julia/usr/bin$ otool -L embedding
embedding:
 @rpath/libjulia.dylib (compatibility version 0.0.0, current version 0.0.0)
 /usr/lib/libffi.dylib (compatibility version 1.0.0, current version 1.0.0)
 /usr/lib/libz.1.dylib (compatibility version 1.0.0, current version 1.2.5)
 /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 1213.0.0)
 /System/Library/Frameworks/CoreFoundation.framework/Versions/A/CoreFoundation (compatibility version 150.0.0, current version 1151.16.0)
 /System/Library/Frameworks/CoreServices.framework/Versions/A/CoreServices (compatibility version 1.0.0, current version 62.0.0)
 /usr/lib/libc++.1.dylib (compatibility version 1.0.0, current version 120.0.0)

On Sat Jan 31 2015 at 8:02:11 PM eric l cdg2...@gmail.com wrote: Thx Jameson, By the way, on OSX gcc defaults to this now: Apple LLVM version 6.0 (clang-600.0.51) (based on LLVM 3.5svn) Target: x86_64-apple-darwin13.4.0. So I was able to make some progress: first I got:

./embedd
System image file "/Applications/Julia-0.3.5.app/Contents/Resources/julia/../lib/julia/sys.ji" not found

Which makes sense, as the path should be julia/lib/julia/sys.ji. So I simply created a dir lib/julia/ under Resources/, then copied all the files in lib/julia into the new dir. I ran ./embedd again and got a long list of grievances and the expected output of embedding.c! All the unfound libs are in the same directories as libjulia.dylib. Clearly something is not quite right with the compiler settings I am using...

./embedd
Warning: error initializing module LinAlg: ErrorException("error compiling __init__: error compiling check_blas: error compiling openblas_get_config: could not load module libopenblas: dlopen(libopenblas.dylib, 1): image not found")
Entropy pool not available to seed RNG; using ad-hoc entropy sources.
Warning: error initializing module Random: ErrorException("could not load module libdSFMT: dlopen(libdSFMT.dylib, 1): image not found")
Warning: error initializing module GMP: ErrorException("error compiling __init__: could not load module libgmp: dlopen(libgmp.dylib, 1): image not found")
Warning: error initializing module PCRE: ErrorException("ccall: could not find function pcre_jit_stack_alloc in library libpcre")
sqrt(2.0) in C: 1.414214e+00
sqrt(2.0) in C: 1.414214e+00
x = [9.00e+00 8.00e+00 7.00e+00 6.00e+00 5.00e+00 4.00e+00 3.00e+00 2.00e+00 1.00e+00 0.00e+00 ]
my_func(5.0) = 10.00
UndefVarError(:this_function_does_not_exist)
[julia-users] Re-size image in IJulia Markdown
I am using Markdown in IJulia to write notes amongst my code, and want to control the size of an embedded image. To embed the image I'm currently using:

![alt text](my_pic.jpg "Title")

This works fine, but doesn't give me control over the image size. Is there any code to add to this line to control the image size? Supposedly for some implementations of Markdown (http://stackoverflow.com/questions/14675913/how-to-change-image-size-markdown) the following code works, but this failed for me:

![alt text](my_pic.jpg =100x100)

As an alternative I looked into the ImageView package (https://github.com/timholy/ImageView.jl) by Tim Holy, but got an error when I tried to use this in IJulia (following the suggested syntax):

using ImageView
img = imread("my_pic.jpg")
libversion not defined
 in image2wand at /Users/Nikola/.julia/v0.3/Images/src/io.jl:304
 in _writemime at /Users/Nikola/.julia/v0.3/Images/src/io.jl:198
 in base64 at base64.jl:125
 in display_dict at /Users/Nikola/.julia/v0.3/IJulia/src/execute_request.jl:34

view(img, pixelspacing = [1,1])
view not defined
 while loading In[11], in expression starting on line 1

Any suggestions? Thanks! Best, Nate
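One approach that generally works in notebook Markdown cells is falling back to a raw HTML tag, since the notebook renders inline HTML; the `=100x100` suffix is a non-standard extension that most Markdown renderers ignore. The attribute values here are examples:

```html
<img src="my_pic.jpg" alt="alt text" title="Title" width="100" height="100">
```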
[julia-users] Re: IO writing to a shell command
Thxs that explains it!
[julia-users] How should nested local functions be annotated?
Consider the following function:

function f()
    x::Float64 = 1.0
    function g(y::Float64)
        x += y
    end
    g(2.0)
    g(3.0)
    x
end

Then @code_warntype gives:

julia> @code_warntype f()
Variables:
 x::Float64
 g::F
Body:
 begin # /***.jl, line 310:
  NewvarNode(:x)
  x = 1.0 # line 311:
  $(Expr(:method, :g, :((top(tuple))((top(tuple))(Float64)::Any,(top(tuple))()::Any)::Any), AST(:($(Expr(:lambda, Any[:y], Any[Any[symbol("#s5504")],Any[Any[:y,Any,0],Any[symbol("#s5504"),Any,18]],Any[Any[:x,Float64,7]]], :(begin # /***.jl, line 312:
   #s5504 = x + y::Any
   x = (top(typeassert))((top(convert))(Float64,#s5504)::Any,Float64)::Any
   return #s5504
  end::Any), false)) # line 314:
  (g::F)(2.0)::Any # line 315:
  (g::F)(3.0)::Any # line 316:
  return x::Float64
 end::Float64

Is there anything I can do to make it recognize that the type of x doesn't change here? It seems to think that y can be Any, despite the annotation saying it can only be Float64. Am I missing something here?
Re: [julia-users] How should nested local functions be annotated?
code_warntype is being a bit misleading there. Code inference hasn't yet fully run on the inner function. On Sat Jan 31 2015 at 7:21:56 PM Kirill Ignatiev kirill.ignat...@gmail.com wrote: Consider the following function: function f() x::Float64 = 1.0 function g(y::Float64) x += y end g(2.0) g(3.0) x end [...] Is there anything I can do to make it recognize that the type of x doesn't change here?
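If the captured-variable annotations are still a worry, one classic workaround is to close over a container instead of rebinding the variable itself, so the closure captures a concretely-typed object whose binding never changes. A sketch (not required if inference handles the original, as noted above):

```julia
function f()
    x = [1.0]                 # one-element array: the binding x never changes
    function g(y::Float64)
        x[1] += y             # mutate the contents, don't rebind x
    end
    g(2.0)
    g(3.0)
    return x[1]
end
# f() returns 6.0
```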
Re: [julia-users] what path to set in jl_init
Instead of jl_init, I use the following:

jl_init_with_image((char*)install_directory.c_str(), (char*)"sys.ji");

where install_directory is the directory of libjulia (e.g. /usr/local/julia/lib/julia), but whatever corresponds to how you link using -Wl,-rpath.
Re: [julia-users] Parametric vs Abstract typed collections, design question
Anyone have any thoughts on this? What I'm trying to figure out is a way to dispatch on different elements all in the same collection, without declaring the collection to be of abstract type. Meaning `func(a::Foo)` and `func(a::Bar)` should be different, even though `Bar` and `Foo` have the same internal representation (there should be no underlying difference in memory). This boils down to wondering whether having a collection of an unparametrized parametric type is better/different than just an abstract type. Reiterating my example from above:

Option 1:

abstract Root
type Foo <: Root ... end
type Bar <: Root ... end

(the ... are the same for both Foo and Bar)

Option 2:

type Root{T} ... end
typealias Foo Root{:Foo}
typealias Bar Root{:Bar}

Would either option generate better code when iterating over a collection of `Root`? Or is `Root` in option 2 still treated the same as an abstract parameter (even though the internal structure is invariant among all parametrizations)?
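For concreteness, Option 2 can be written out with a dummy field (0.3-era immutable/typealias syntax; the field and function names are made up for illustration):

```julia
immutable Root{Tag}        # identical memory layout for every Tag
    a::Float64
end

typealias Foo Root{:Foo}
typealias Bar Root{:Bar}

func(r::Foo) = r.a + 1.0   # dispatch selects the method by the Tag parameter
func(r::Bar) = r.a - 1.0

v = Root[Foo(1.0), Bar(1.0)]   # Vector{Root}: element type is non-concrete
```

One caveat worth noting: a Vector{Root} holding mixed Foo/Bar elements still has a non-concrete element type, so iterating over it pays the same boxing and dynamic-dispatch cost as the abstract-type version; the identical layouts only help when a container is parametrized on a single concrete tag like Vector{Foo}.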
[julia-users] Re: ANN: PGF/TikZ packages
Awesome work. On Friday, August 22, 2014 at 7:05:08 AM UTC+10, Mykel Kochenderfer wrote: There are three new Julia packages for interfacing with PGF/TikZ (http://www.ctan.org/pkg/pgf) for making publication-quality graphics. 1. TikzPictures.jl (https://github.com/sisl/TikzPictures.jl). Basic interface to PGF/TikZ. Images can be saved as PDF, SVG, and TEX. If using IJulia, it will output SVG images. 2. PGFPlots.jl (https://github.com/sisl/PGFPlots.jl). Plotting tool that uses the pgfplots package (http://ctan.org/pkg/pgfplots), built on top of TikZ. 3. TikzGraphs.jl (https://github.com/sisl/TikzGraphs.jl). Graph layout package using algorithms built into PGF/TikZ 3.0+. Documentation is provided with each package. Installation of the dependencies (e.g., pdf2svg and pgfplots) is still a bit manual, but instructions are in the documentation.
Re: [julia-users] ANN: Docile Lexicon update.
Ok, will Lexicon be baked into 0.4? If so, then the instructions in the latest documentation make sense: http://docs.julialang.org/en/latest/manual/documentation/ Otherwise, there will need to be an extra line in the Accessing Documentation section. Cheers On Sunday, 1 February 2015 10:09:34 UTC+11, Michael Hatherly wrote: Make sure you've installed Lexicon as Tom has mentioned. Also call using Lexicon prior to trying to search for package docs at the REPL. — Mike On Saturday, 31 January 2015 22:57:36 UTC+2, i.cos...@me.com wrote: I installed Sims and ran `?sim` at the REPL. I got the following back: `dasslsim (generic function with 20 methods)`. I see that there is an extensive docstring against sim in src/sim.jl. Why am I not seeing this at the REPL? I'm on julia 0.3.5 On Thursday, 22 January 2015 03:54:14 UTC+11, Ivan Ogasawara wrote: great! On 21/01/2015 14:22, Michael Hatherly michael...@gmail.com wrote: Hi all, I'm pleased to announce the latest update to the Docile https://github.com/MichaelHatherly/Docile.jl and Lexicon https://github.com/MichaelHatherly/Lexicon.jl documentation packages. New features include: - Docile now supports plain strings http://michaelhatherly.github.io/Docile.jl/syntax/#plain-strings, i.e. without @doc, as docstrings. Compatibility with the Julia 0.4 doc system is still present. - Thanks to Tom Short, Lexicon can now output nicely formatted markdown. This can then be used to create static documentation using programs such as MkDocs http://www.mkdocs.org/. See the documentation from the following packages for examples of the results: Sims https://tshort.github.io/Sims.jl/, Docile http://michaelhatherly.github.io/Docile.jl/, and Lexicon http://michaelhatherly.github.io/Lexicon.jl/. Any bugs or feature requests can be opened in either the Docile or Lexicon repos. Happy documenting! — Mike
[julia-users] How to read structure of HDF5 file ?
How to read the structure of an HDF5 file? How to look at what is inside? julia> fid=open("EECo2.h5") IOStream(<file EECo2.h5>) I don't know what is in this file. How to read its structure? Paul
[julia-users] Re: How to read structure of HDF5 file ?
You can use the freely available HDFView, from the HDF Group. The HDF Group also provides a set of command line tools for inspecting files. Keith On Saturday, January 31, 2015 at 6:04:11 AM UTC-5, paul analyst wrote: How to read the structure of an HDF5 file? How to look at what is inside? julia> fid=open("EECo2.h5") IOStream(<file EECo2.h5>) I don't know what is in this file. How to read its structure? Paul
Re: [julia-users] Re: Is it possible to file jld only read range ?
It works, THX. Paul On Friday, 30 January 2015 15:50:01 UTC+1, Tim Holy wrote: Do Pkg.update() and it will start working. --Tim On Friday, January 30, 2015 03:20:27 AM paul analyst wrote: But why does this not work: dset[:,1] ? julia> dset[:,1] ERROR: `size` has no method matching size(::JldDataset, ::Int64) julia> dset[1:k,1] 30070x1 Array{Float64,2}:
Re: [julia-users] Re: How to read structure of HDF5 file ?
Keith's tips are good. Within julia, you can also try names(fid) or dump(fid). --Tim On Saturday, January 31, 2015 03:52:01 AM Keith Campbell wrote: You can use the freely available HDFView, from the HDF Group. The HDF Group also provides a set of command line tools for inspecting files. Keith On Saturday, January 31, 2015 at 6:04:11 AM UTC-5, paul analyst wrote: How to read the structure of an HDF5 file? How to look at what is inside? julia> fid=open("EECo2.h5") IOStream(<file EECo2.h5>) I don't know what is in this file. How to read its structure? Paul
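A minimal sketch of inspecting a file from inside Julia, assuming the HDF5.jl package is installed. Note the file must be opened with h5open, not Base.open, so the HDF5 structure is parsed; in current HDF5.jl the 0.3-era names(fid) has become keys(fid). The demo.h5 filename and its contents are invented so the sketch is self-contained:

```julia
using HDF5  # assumes the HDF5.jl package is installed

# Write a small demo file so the sketch is self-contained.
h5open("demo.h5", "w") do f
    f["group/data"] = [1.0 2.0; 3.0 4.0]
end

# Open with h5open (not Base.open) to get a structured HDF5 handle.
h5open("demo.h5", "r") do f
    println(keys(f))           # top-level group/dataset names; `names(f)` in 0.3-era HDF5.jl
    println(keys(f["group"]))  # descend into a group to see its members
end
```

For a quick look without writing code, the HDF Group's h5ls and h5dump command line tools mentioned by Keith print the same tree from a shell.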
Re: [julia-users] What is the difference between these two functions?
Hm, this is a tough one. The imaginary unit im is currently defined to be of type Complex{Bool}, which seems to be the cause of the type instability. Previously im was its own ImaginaryUnit type, which meant that very few functions were defined on just im and also led to a lot of unwanted NaNs cropping up in computations. Thanks, Jiahao Chen Staff Research Scientist MIT Computer Science and Artificial Intelligence Laboratory On Sat, Jan 31, 2015 at 5:53 AM, Tim Holy tim.h...@gmail.com wrote: There seem to be some problems with type inference and `im`. If at the beginning of the function I define myim = convert(Complex128, im) and then replace all uses of `im` with `myim`, then everything works as expected. Can you file an issue, please? --Tim On Friday, January 30, 2015 09:48:26 PM Kirill Ignatiev wrote: I have a newbie-type performance question. In some of my code there is a structure that looks like this: type FourierPoly periods :: Vector{Int} radiuses :: Vector{Float64} phase_offsets :: Vector{Float64} end and the following two functions that operate on it: - function polyval(f::FourierPoly, t::Float64) 33694968 s = zero(Complex128) 0 @inbounds for k = 1:length(f.periods) 0 s += exp(2.pi*(t + f.phase_offsets[k]) * f.periods[k] * im) * f.radiuses[k] - end 0 return s::Complex128 0 end 0 0 function polyder(f::FourierPoly, t::Float64) 0 s = zero(Complex128) 492303248 @inbounds for k = 1:length(f.periods) 0 θ = 2.pi * f.periods[k] 164100720 s += θ * im * exp((t + f.phase_offsets[k]) * θ * im) * f.radiuses[k] 257652 end 0 return s::Complex128 - end (copied from output of julia run with --track-allocation=user). What is the difference between these two functions? polyval seems fine, but polyder is called at most as often as polyval from the rest of the code, yet its memory consumption is at least an order of magnitude higher? Can somebody please point out what I'm missing here?
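Tim's workaround of hoisting `im` into a concrete constant can be sketched like this. The function name polyder_typed and the plain-vector arguments are invented for illustration (the thread's version takes a FourierPoly), and Complex128 from the 0.3-era thread is spelled ComplexF64 in modern Julia:

```julia
# Sketch of the workaround: `im` is Complex{Bool}, so hoist it into a
# concrete Complex{Float64} constant before the loop, keeping every
# intermediate value ComplexF64 and the loop type-stable.
function polyder_typed(periods::Vector{Int}, radiuses::Vector{Float64},
                       phase_offsets::Vector{Float64}, t::Float64)
    myim = ComplexF64(0, 1)   # the thread's convert(Complex128, im)
    s = zero(ComplexF64)
    @inbounds for k in eachindex(periods)
        θ = 2pi * periods[k]
        s += θ * myim * exp((t + phase_offsets[k]) * θ * myim) * radiuses[k]
    end
    return s
end
```

The result is numerically identical to the `im`-literal version; only the inferred types inside the loop change.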
Re: [julia-users] What is the difference between these two functions?
On Saturday, 31 January 2015, 15:35:49 (UTC+3), Jiahao Chen wrote: Hm, this is a tough one. The imaginary unit im is currently defined to be of type Complex{Bool}, which seems to be the cause of the type instability. Previously im was its own ImaginaryUnit type, which meant that very few functions were defined on just im and also led to a lot of unwanted NaNs cropping up in computations. What is the reason why `im` is not just defined as `Complex(0,1)`? I'm sure there's a good one, since this must have been thought about at length, but I'd like to know what it is or have a reference to the discussion. Best, David.
Re: [julia-users] What is the difference between these two functions?
On Saturday, 31 January 2015, 15:46:50 (UTC+3), David P. Sanders wrote: What is the reason why `im` is not just defined as `Complex(0,1)`? Oh, I guess there's the obvious issue of what type the real and imaginary parts should be. So really, im should be parametrised on this type. I'm sure there's a good one, since this must have been thought about at length, but I'd like to know what it is or have a reference to the discussion. Best, David.
Re: [julia-users] What is the difference between these two functions?
See #5468 https://github.com/JuliaLang/julia/pull/5468 and linked issues. Thanks, Jiahao Chen Staff Research Scientist MIT Computer Science and Artificial Intelligence Laboratory On Sat, Jan 31, 2015 at 7:49 AM, David P. Sanders dpsand...@gmail.com wrote: What is the reason why `im` is not just defined as `Complex(0,1)`? Oh, I guess there's the obvious issue of what type the real and imaginary parts should be. So really, im should be parametrised on this type.
Re: [julia-users] What is the difference between these two functions?
In principle I don't see why the current Complex{Bool} definition shouldn't be feasible. There might just be an inference bug. If you use @code_warntype, you'll see that `im` gets annotated as `Any`. --Tim On Saturday, January 31, 2015 04:49:10 AM David P. Sanders wrote: What is the reason why `im` is not just defined as `Complex(0,1)`? Oh, I guess there's the obvious issue of what type the real and imaginary parts should be. So really, im should be parametrised on this type.