Re: [julia-users] Re: Finite element code in Julia: Curious overhead in .* product

2014-12-11 Thread Valentin Churavy
Dear Petr,

I thought the point of having the array objects and the associated 
 manipulation functions was to hide the loops while delivering decent 
 performance...


This assumption is very true in Matlab. Matlab has spent an enormous amount 
of engineering effort on optimizing code that uses vectorization and array 
objects (and it is still slow). There have been ongoing discussions on 
GitHub [1] about how to improve the current situation in Julia. One of the 
major selling points of Julia for me is that it is quite transparent about 
which kinds of optimization it needs. I can write in a very dynamic style 
with a lot of matrix operations, as in Matlab, and still get decent 
performance, or I can go in and identify with tools like @profile [2] what 
the pain points in my program are. 

The point of using vectorized operations in Matlab is that it is the one 
reliable way to get good performance in Matlab, because all the underlying 
functions are written in C. In Julia most underlying functions are written 
in Julia (note that most mathematical operations still call out to C 
libraries; no need to reinvent the wheel). There is a Julia package you can 
use when porting Matlab code that devectorizes your code [3], but if the 
operation occurs in a hot loop it is still a good and necessary 
optimization to unroll the vectorized code by hand. Then you no longer get 
just decent performance, but excellent performance instead. 
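As a minimal sketch of that transformation (using the Fe, Ns, f, Jac names 
from the code further down in this thread), the unrolled form replaces the 
temporary array with a scalar and an explicit loop:

    # vectorized: allocates a temporary array on every iteration
    Fe += Ns * (f * Jac)

    # unrolled: no temporaries, Fe is updated in place
    tmp = f * Jac
    for k = 1:length(Fe)
        Fe[k] += Ns[k] * tmp
    end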

Best Valentin

PS:
You can also use @which 1 .* [1] and @edit to see where the operation is 
defined and how it is implemented. The .* operation is implemented by 
allocating an output array and then running * element-wise over the array. 
No magic is going on there.

[1] https://github.com/JuliaLang/julia/issues/8450 
[2] http://docs.julialang.org/en/release-0.3/stdlib/profile/
[3] https://github.com/lindahua/Devectorize.jl
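For instance (output omitted; both macros ship with Base):

    julia> @which 1 .* [1]    # prints the .* method that will be called
    julia> @edit 1 .* [1]     # opens that method's source in your editor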

On Thursday, 11 December 2014 06:23:29 UTC+1, Petr Krysl wrote:

 Thanks.  Now my head is really spinning!

 See, before I posted the original question I tried expanding the loop in 
 the actual FE code, and the code was SLOWER and was using MORE memory:

 With the expression 
 Fe += Ns[j] * (f * Jac * w[j]); :
 6.223416655 seconds (1648832052 bytes

 With the expanded loop 
 for kx=1:length(Fe) # alternative (devectorization)
  Fe[kx] += Ns[j][kx] * (f * Jac * w[j]); 
 end 
  7.340272676 seconds (1776971604 bytes allocated,

 In addition, your argument clearly demonstrates how to avoid the temporary 
 array for doit1(), but doit2() adds only one extra 1x1 temporary on top of 
 the 3x1 (it seems to me), yet it is about 14 times slower. Why is that?

 Finally, if the only way I can get decent performance with lines like

  Fe += Ns[j] * (f * Jac * w[j]);  # Fe and Ns[j] arrays

 is to manually write out all the loops, that would be terrible news 
 indeed.  Not only is that a lot of work when rewriting loads of Matlab code 
 (with several matrices concatenated in many, many expressions), but the 
 legibility and maintainability tank. I thought the point of having the 
 array objects and the associated manipulation functions was to hide the 
 loops while delivering decent performance...

 Petr


 On Wednesday, December 10, 2014 8:28:31 PM UTC-8, Tim Holy wrote:

 Multiplying two Float64s yields another Float64; most likely, this will be 
 stored in the CPU's registers. In contrast, [f]*Jac creates an array on 
 each iteration, which has to be stored on the heap. 

 A faster variant devectorizes: 

 function doit1a(N) 
     Fe=zeros(3,1);Ns=zeros(3,1)+1.0;f=-6.0;Jac=1.0; 
     for i=1:N 
         tmp = f*Jac 
         for j = 1:length(Fe) 
             Fe[j] += Ns[j]*tmp 
         end 
     end 
     Fe 
 end 

 julia> @time doit1(N); 
 elapsed time: 0.810270399 seconds (384000320 bytes allocated, 61.23% gc 
 time) 

 julia> @time doit1a(N); 
 elapsed time: 0.022118726 seconds (320 bytes allocated) 

 Note the tiny allocations in the second case. 

 --Tim 


 On Wednesday, December 10, 2014 07:54:00 PM Petr Krysl wrote: 
  The code is really short: 
  
  N=200 

  function doit1(N) 
      Fe=zeros(3,1);Ns=zeros(3,1)+1.0;f=-6.0;Jac=1.0; 
      for i=1:N 
          Fe += Ns * (f * Jac); 
      end 
      Fe 
  end 

  function doit2(N) 
      Fe=zeros(3,1);Ns=zeros(3,1)+1.0;f=-6.0;Jac=1.0; 
      for i=1:N 
          Fe += Ns .* ([f] * Jac); 
      end 
      Fe 
  end 

  function doit3(N) 
      Fe=zeros(3,1);Ns=zeros(3,1)+1.0;f=-6.0;Jac=1.0; 
      fs=[1.0] 
      for i=1:N 
          fs = ([f] * Jac);  Fe += Ns .* fs; 
      end 
      Fe 
  end 

  function doit4(N) 
      Fe=zeros(3,1);Ns=zeros(3,1)+1.0;f=-6.0;Jac=1.0; 
      fs=[1.0] 
      for i=1:N 
          fs = [f]; fs *= Jac;  Fe += Ns .* fs; 
      end 
      Fe 
  end 

  # 
  @time doit1(N) 
  @time doit2(N) 
  @time doit3(N) 
  @time doit4(N) 
  
  

[julia-users] MatrixDepot.jl: A Test Matrix Collection

2014-12-11 Thread Weijian Zhang
Hello,

So far I have included 20 matrices in Matrix Depot. I just modified the 
function matrixdepot() so it should display information nicely.

The repository is here: https://github.com/weijianzhang/MatrixDepot.jl

The documentation is here: 
http://nbviewer.ipython.org/github/weijianzhang/MatrixDepot.jl/blob/master/doc/juliadoc.ipynb
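For a quick impression, usage looks roughly like this (matrixdepot() is the 
function mentioned above; the exact form of the call for fetching a single 
matrix is an assumption, see the documentation for the real interface):

    julia> using MatrixDepot

    julia> matrixdepot()               # list the available test matrices

    julia> A = matrixdepot("hilb", 4)  # assumed call form: a 4x4 Hilbert matrix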

Let me know how you feel about it and if you have any questions.

Thanks,

Weijian






Re: [julia-users] Installing packages system-wide

2014-12-11 Thread Ján Dolinský


 There's a Julia variable called LOAD_PATH that is arranged to point at two 
 system directories under your julia installation. E.g.:

 julia> LOAD_PATH
 2-element Array{Union(ASCIIString,UTF8String),1}:
  /opt/julia-0.3.3/usr/local/share/julia/site/v0.3
  /opt/julia-0.3.3/usr/share/julia/site/v0.3


 If you install packages under either of those directories, then everyone 
 using that Julia will see them. One way to do this is to run julia as a 
 user who can write to those directories after doing `export 
 JULIA_PKGDIR=/opt/julia-0.3.3/usr/share/julia/site` in the shell. That way 
 Julia will use that as its package directory and normal package commands 
 will allow you to install packages for everyone. Or you can just copy your 
 installed packages and change the ownership and permissions so that 
 everyone can access the files.
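 Condensed into commands, that JULIA_PKGDIR route looks roughly like this 
 (the package name is just a placeholder):

     $ export JULIA_PKGDIR=/opt/julia-0.3.3/usr/share/julia/site
     $ julia -e 'Pkg.init(); Pkg.add("SomePackage")'   # run as a user who can write there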


Thanks for the advice! 

Jan 


Re: [julia-users] Unexpected append! behavior

2014-12-11 Thread Mike Innes
Think of append!(X, Y) as equivalent to X = vcat(X, Y). You called append!
twice, so X gets Y appended twice.

julia> X = [1,2]; Y = [3,4];

julia> X = vcat(X,Y)
[1, 2, 3, 4]

In your example you went ahead and did this again:

julia> X = (X = vcat(X, Y))
[1, 2, 3, 4, 3, 4]

But if you reset X, Y via the first statement and *then* call X =
append!(X, Y), it works as you would expect.

julia> X = [1,2]; Y = [3,4];

julia> X = append!(X, Y) # same as X = (X = vcat(X, Y))
[1, 2, 3, 4]

On 11 December 2014 at 07:51, Alex Ames alexander.m.a...@gmail.com wrote:

 Functions that end with an exclamation point modify their arguments, but
 they can return values just like any other function. For example:

 julia> x = [1,2]; y = [3, 4]
 2-element Array{Int64,1}:
  3
  4

 julia> append!(x,y)
 4-element Array{Int64,1}:
  1
  2
  3
  4

 julia> z = append!(x,y)
 6-element Array{Int64,1}:
  1
  2
  3
  4
  3
  4

 julia> z
 6-element Array{Int64,1}:
  1
  2
  3
  4
  3
  4

 julia> x
 6-element Array{Int64,1}:
  1
  2
  3
  4
  3
  4

 The append! function takes two arrays, appends the second to the first,
 then returns the first array (which now contains the appended values). No
 recursion craziness required.
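 A quick way to see that no copy is involved is to check that the returned 
 value is the very same array object:

 julia> x = [1,2]; y = [3,4];

 julia> z = append!(x,y);

 julia> z === x    # z is not a copy, it is x itself
 true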

 On Thursday, December 11, 2014 1:11:50 AM UTC-6, Sean McBane wrote:

 Ivar is correct; I was running in the Windows command prompt and couldn't
 copy and paste so I copied it by hand and made an error.

 Ok, so I understand that append!(X,Y) is modifying X in place. But I
 still do not see why the output for the second case, where the result of
 append!(X,Y) is assigned back into X, is what it is. It would make sense to
 me if this resulted in a recursion with Y forever getting appended to X,
 but as it is I don't understand.

 Thanks.

 -- Sean

 On Thursday, December 11, 2014 12:42:45 AM UTC-6, Ivar Nesje wrote:

 I assume the first line should be

  X = [1,2]; Y = [3,4];

 Then the results you get make sense. The thing is that Julia has
 mutable arrays, and the ! at the end of append! indicates that it is a
 function that mutates its argument.




Re: [julia-users] scope using let and local/global variables

2014-12-11 Thread Mike Innes
You can do this just fine, but you have to be explicit about what variables
you want to pass in, e.g.

let x=2
  exp=:(x+1)
  eval(:(let x = $x; $exp; end))
end

If you want to call the expression with multiple inputs, wrap it in a
function:

let x=2
  exp=:(x+1)
  f = eval(:(x -> $exp))
  f(x)
end


On 11 December 2014 at 06:32, Jameson Nash vtjn...@gmail.com wrote:

 I'm not quite sure what a genetic program of that sort would look like. I
 would be interested to hear if you get something out of it.

 Another alternative is to use a module as the environment:

 module MyEnv
 end
 eval(MyEnv, :(code block))

 This is (roughly) how the REPL is implemented to work.
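 A tiny sketch of that approach (names are arbitrary):

 module MyEnv
 x = 2
 end

 eval(MyEnv, :(x + 1))    # returns 3, using the x bound inside MyEnv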

 On Thu Dec 11 2014 at 1:26:57 AM Michael Mayo mm...@waikato.ac.nz wrote:

 Thanks, but it's not quite what I'm looking for. I want to be able to edit
 the Expr tree and then evaluate different expressions using variables
 defined in the local scope, not the global scope (e.g. for genetic
 programming, where random changes to an expression are repeatedly evaluated
 to find the best one). Using anonymous functions could work, but modifying
 the .code property of an anonymous function looks much more complex than
 modifying the Expr types.

 Anyway thanks for your answer, maybe your suggestion is the only possible
 way to achieve this!

 Mike


 On Thursday, December 11, 2014 6:56:15 PM UTC+13, Jameson wrote:

 eval, by design, doesn't work that way. There are just too many better
 alternatives. Typically, an anonymous function / lambda is the best and
 most direct replacement:

 let x=2
   println(x)          # Line 1
   exp = () -> x+1
   println(exp())      # Line 2
 end


 On Wed Dec 10 2014 at 10:43:00 PM Michael Mayo mm...@waikato.ac.nz
 wrote:

 Hi folks,

 I have the following code fragment:

 x=1
 let x=2
   println(x)# Line 1
   exp=:(x+1)
   println(eval(exp))# Line 2
 end

 It contains two variables both named x, one inside the scope defined by
 let, and one at global scope.

 If I run this code the output is:
 2
 2

 This indicates (i) that line 1 is using the local version of x,
 and (ii) that line 2 is using the global version of x.

 If I remove this global x I now get an error because eval() is looking
 for the global x which no longer exists:

 let x=2
   println(x)# Line 1
   exp=:(x+1)
   println(eval(exp))# Line 2
 end

 2

 ERROR: x not defined


 My question: when evaluating an expression using eval() such as line 2,
 how can I force Julia to use the local (not global) version of x and thus
 avoid this error?


 Thanks

 Mike




Re: [julia-users] Julia iPython notebook doesn't display inline graphics

2014-12-11 Thread RecentConvert
using PyPlot
PyPlot.backend

qt4agg


Re: [julia-users] Roadmap

2014-12-11 Thread Mike Innes
It seems to me that a lot of FAQs could be answered by a simple list of the
communities'/core developers' priorities. For example:

We care about module load times and static compilation, so that's going to
happen eventually. We care about package documentation, which is basically
done. We don't care as much about deterministic memory management or TCO,
so neither of those things are happening any time soon.

It doesn't have to be a commitment to releases or dates, or even be
particularly detailed, to give a good sense of where Julia is headed from a
user perspective.

Indeed, it's only the same things you end up posting on HN every time
someone complains that Gadfly is slow.

On 11 December 2014 at 03:01, Tim Holy tim.h...@gmail.com wrote:

 Really nice summaries, John and Tony.

 On Thursday, December 11, 2014 02:08:54 AM Boylan, Ross wrote:
  BTW, is 0.4 still in a "you don't want to go there" state for users of
  julia?

 In short, yes---for most users I'd personally recommend sticking with 0.3.
 Unless you simply _must_ have some of its lovely new features. But be
 prepared
 to update your code basically every week or so to deal with changes.

 --Tim




Re: [julia-users] Re: How fast reduce the Vector range of values? (Recode for smaler numbers)

2014-12-11 Thread Paul Analyst

I did the loop. Is there a faster solution?


julia> o45
835969x26 Array{Int64,2}:
 (835969x26 matrix of Int64 codes, e.g. 930070, 1478343, 21474836496,
  4296445417, 1448463, 1475452, ..., with most entries equal to zero)


julia> o45a = (Int=>eltype(o45))[j => i for (i,j) in enumerate(unique(o45))];


julia> o45_output = zeros(o45);

julia> k,l = size(o45_output)
(835969,26)

julia> @time for j=1:l, i=1:k;
           o45_output[i,j] = get(o45a, o45[i,j], 0);
       end;
elapsed time: 7.418340987 seconds (1435414160 bytes allocated, 6.41% gc 
time)


julia> o45_output = o45_output .- 1;  # makes true zeros


Paul
W dniu 2014-12-10 o 21:41, Paul Analyst pisze:

Thx, but it does not work:

julia> JJ = hcat(J,J);

julia> JJa = (Int=>eltype(JJ))[j => i for (i,j) in enumerate(unique(JJ))];

julia> JJcodes = Int64[JJa[j] for j in JJ];

julia convert(Array{Int, 2}, Jcodes)
ERROR: `convert` has no method matching 
convert(::Type{Array{Int64,2}}, ::Array{Int64,1})

 in convert at base.jl:13

julia convert(Array{Int64, 2}, Jcodes)
ERROR: `convert` has no method matching 
convert(::Type{Array{Int64,2}}, ::Array{Int64,1})

 in convert at base.jl:13

julia convert(Array{Int, 2}, JJcodes)
ERROR: `convert` has no method matching 
convert(::Type{Array{Int64,2}}, ::Array{Int64,1})

 in convert at base.jl:13

julia convert(Array{Int64, 2}, JJcodes)
ERROR: `convert` has no method matching 
convert(::Type{Array{Int64,2}}, ::Array{Int64,1})

 in convert at base.jl:13

julia convert(Matrix{Int}, JJcodes)
ERROR: `convert` has no method matching 
convert(::Type{Array{Int64,2}}, ::Array{Int64,1})

 in convert at base.jl:13

julia convert(Matrix{Int64}, JJcodes)
ERROR: `convert` has no method matching 
convert(::Type{Array{Int64,2}}, ::Array{Int64,1})

 in convert at base.jl:13

julia

Paul

W dniu 2014-12-10 o 18:21, Sean Marshallsay pisze:
Vector{T} is just a typealias for Array{T, 1} so it's still an array 
but limited to one dimension. Your problem can be solved with

    convert(Array{Int,2}, Jcodes)

or equivalently

    convert(Matrix{Int}, Jcodes)


On Wednesday, 10 December 2014 11:09:55 UTC, paul analyst wrote:

And how to do it if J is a matrix rather than a vector, so that Jcodes 
is also an array of the same size as J?

julia> J
1557211x2 Array{Int64,2}:
       930070       930070
      1475172      1475172
 ...
  21474836496  21474836496
   4296445417   4296445417

Paul

W dniu 2014-12-04 o 19:03, Steven G. Johnson pisze:

It sounds like you have an array J and you want to map each
element of J to a unique integer in 1:N for N as small as
possible?  This will do it:

d = (Int=>eltype(J))[j => i for (i,j) in enumerate(unique(J))]

Jcodes = [d[j] for j in J]

Here, d is a dictionary mapping the values in J to the corresponding
integers in 1:N, and Jcodes is the re-coded array.
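A tiny worked example of this recipe (made-up values):

julia> J = [10, 40, 10, 20];

julia> d = [j => i for (i,j) in enumerate(unique(J))];

julia> Jcodes = [d[j] for j in J]   # gives [1, 2, 1, 3]: 10 maps to 1, 40 to 2, 20 to 3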









Re: [julia-users] Roadmap

2014-12-11 Thread Mike Innes
https://github.com/JuliaLang/julia/issues/4964

On 11 December 2014 at 11:55, Uwe Fechner uwe.fechner@gmail.com wrote:

 What do you mean with TCO?

 On Thursday, December 11, 2014 10:50:19 AM UTC+1, Mike Innes wrote:

 It seems to me that a lot of FAQs could be answered by a simple list of
 the communities'/core developers' priorities. For example:

 We care about module load times and static compilation, so that's going
 to happen eventually. We care about package documentation, which is
 basically done. We don't care as much about deterministic memory management
 or TCO, so neither of those things are happening any time soon.

 It doesn't have to be a commitment to releases or dates, or even be
 particularly detailed, to give a good sense of where Julia is headed from a
 user perspective.

 Indeed, it's only the same things you end up posting on HN every time
 someone complains that Gadfly is slow.

 On 11 December 2014 at 03:01, Tim Holy tim@gmail.com wrote:

 Really nice summaries, John and Tony.

 On Thursday, December 11, 2014 02:08:54 AM Boylan, Ross wrote:
  BTW, is 0.4 still in a you don't want to go there state for users of
  julia?

 In short, yes---for most users I'd personally recommend sticking with
 0.3.
 Unless you simply _must_ have some of its lovely new features. But be
 prepared
 to update your code basically every week or so to deal with changes.

 --Tim





Re: [julia-users] defining types in macros broken in 0.4

2014-12-11 Thread Sheehan Olver

I tried to reduce the problem to a minimal example, but the bug didn't show 
up in my simplified version. I did find a workaround by splitting things 
into two macros:

macro calculus_operator1(Op,AbstOp,WrappOp)
    return esc(quote
        abstract $AbstOp{T} <: ApproxFun.BandedOperator{T}
    end)
end

macro calculus_operator2(Op,AbstOp,WrappOp)
    return esc(quote
        immutable $Op{S<:FunctionSpace,T<:Number} <: $AbstOp{T}
            space::S        # the domain space
            order::Int
        end
        immutable $WrappOp{S<:BandedOperator} <: $AbstOp{Float64}
            op::S
            order::Int
        end
    end)
end

@calculus_operator1(Derivative,AbstractDerivative,DerivativeWrapper)
@calculus_operator2(Derivative,AbstractDerivative,DerivativeWrapper)


On Thursday, December 11, 2014 12:27:06 AM UTC-3, Tim Holy wrote:

 Seems like it might be worth filing an issue. It would help if it's 
 complete 
 enough that people could copy-paste the `macroexpand`ed code and get it to 
 work. (Things like `BandedOperator`, etc, are not available in this 
 snippet.) 

 --Tim 

 On Wednesday, December 10, 2014 05:53:54 PM Sheehan Olver wrote: 
  The below works fine in 0.3 but I get 
  
  ERROR: error compiling anonymous: type definition not allowed inside a 
  local scope 
  
  In 0.4.  How am I supposed to define types inside a macro?  As far as I 
  can tell, the esc() should cause everything to run in the current scope, 
  which isn't local. 
  
  
  
  
  macro calculus_operator(Op,AbstOp,WrappOp) 
      return esc(quote 
          abstract $AbstOp{T} <: BandedOperator{T} 
          immutable $Op{S<:FunctionSpace,T<:Number} <: $AbstOp{T} 
              space::S        # the domain space 
              order::Int 
          end 
          immutable $WrappOp{S<:BandedOperator} <: $AbstOp{Float64} 
              op::S 
              order::Int 
          end 
          ## More code 
      end) 
  end 
  @calculus_operator(Derivative,AbstractDerivative,DerivativeWrapper) 



Re: [julia-users] defining types in macros broken in 0.4

2014-12-11 Thread Sheehan Olver
Managed to simplify the bug: it only shows up with templated inheritance; 
I'll file an issue.

 module FooModule

     macro foo(Typ,Abs)
         return esc(quote
             abstract $Abs{T}
             immutable $Typ{T} <: $Abs{T} end
         end)
     end

     @foo(FooTyp,FooAbs)

 end


On Thursday, December 11, 2014 9:04:49 AM UTC-3, Sheehan Olver wrote:


 I tried to reduce the problem the bug but it didn't show up in my 
 simplified version, but found a work around by splitting into two macros

 macro calculus_operator1(Op,AbstOp,WrappOp)
 return esc(quote  
 abstract $AbstOp{T} : ApproxFun.BandedOperator{T} 
   
 end)
 end

 macro calculus_operator2(Op,AbstOp,WrappOp)
 return esc(quote
 immutable $Op{S:FunctionSpace,T:Number} : $AbstOp{T}
 space::S# the domain space
 order::Int
 end   
 immutable $WrappOp{S:BandedOperator} : $AbstOp{Float64}
 op::S
 order::Int
 end
end)
 end

 @calculus_operator1(Derivative,AbstractDerivative,DerivativeWrapper)
 @calculus_operator2(Derivative,AbstractDerivative,DerivativeWrapper)


 On Thursday, December 11, 2014 12:27:06 AM UTC-3, Tim Holy wrote:

 Seems like it might be worth filing an issue. It would help if it's 
 complete 
 enough that people could copy-paste the `macroexpand`ed code and get it 
 to 
 work. (Things like `BandedOperator`, etc, are not available in this 
 snippet.) 

 --Tim 

 On Wednesday, December 10, 2014 05:53:54 PM Sheehan Olver wrote: 
  The below works fine in 0.3 but I get 
  
  ERROR: error compiling anonymous: type definition not allowed inside a 
  local scope 
  
  In 0.4.  How am I suppose to define types inside a Macro?  As far as I 
 can 
  tell, the esc() should cause everything to run in the current scope 
 which 
  isn't local. 
  
  
  
  
  macro calculus_operator(Op,AbstOp,WrappOp) 
  return esc(quote 
  abstract $AbstOp{T} : BandedOperator{T} 
  immutable $Op{S:FunctionSpace,T:Number} : $AbstOp{T} 
  space::S# the domain space 
  order::Int 
  end 
  immutable $WrappOp{S:BandedOperator} : $AbstOp{Float64} 
  op::S 
  order::Int 
  end 
  ## More code 
  end) 
  end 
  @calculus_operator(Derivative,AbstractDerivative,DerivativeWrapper) 



[julia-users] How save Dict{Any,Int64} ?

2014-12-11 Thread paul analyst
julia> o2 = vec(readcsv("output_2.txt", String))
835969-element Array{String,1}:
 "146cd6a978544b0168fb11de000"
 ...
 "148b7f63e9e377c9b364000"
 
julia> o2a = [j => i for (i,j) in enumerate(unique(o2))]
Dict{Any,Int64} with 340393 entries:
  "145f91af08573052e942acd1000" => 144188
  ...
  "146131e51b53828163257fa0c00" => 227714

OK, and now:
  
julia> using HDF5,JLD

julia> save("o2a.jld", "o2a", o2a)

It does not work; it crashes! Julia made a 100GB file...

How to save Dict{Any,Int64} ?
Paul



[julia-users] Re: THANKS to Julia core developers!

2014-12-11 Thread Tomas Lycken
I have to chime in here too.

Although the language is in many ways fantastic (and, on top of that, 
growing even better, fast), the community surrounding it is probably the 
best corner of the internet I'll ever point my browser to. The other day I 
told a couple of co-workers about how discussions of controversial changes 
and topics are usually handled here - they didn't believe me. Although of 
course everyone here has an effect on the general attitude of the place, I 
think a lot of the credit in this regard is due to the handful of core 
Julia devs that seem to be involved in almost every thread both here and on 
julia-dev, who have the patience to answer every and any inquiry, 
regardless of how it's worded, with facts and follow-up questions (coupled 
with a sharp remark about tone, when necessary). I'm not the first one, and 
I bet I'm not the last one either, to contribute a significant share of my 
spare time to Julia, simply because I feel so encouraged when I do.

Lots of kudos! (And of course, a case of beer from me too, if you ever 
travel as far as Stockholm, Sweden...)

// T

On Wednesday, December 10, 2014 6:44:29 AM UTC+1, David Smith wrote:

 Hear hear!

 I hope the sheer number of contributors back to the language is evidence 
 of how appreciative we all are to have Julia.  And besides the language 
 being great, the community is really extraordinary. 

 Beers are on me if you guys find yourselves in Nashville.  

 On Monday, December 8, 2014 11:02:10 PM UTC-6, Petr Krysl wrote:

 I've been playing in Julia for the past week or so, but already the 
 results are convincing.  This language is GREAT.   I've coded hundreds of 
 thousands of lines in Fortran, C, C++, Matlab, and this is the first 
 language that feels good. And it is precisely what I envision for my 
 project.

 So, THANKS! 

 Petr Krysl



Re: [julia-users] defining types in macros broken in 0.4

2014-12-11 Thread Tracy Wadleigh
I ran into the same issue when extending SharedArray support to windows. I
found the same workaround (give the type its own macro). I assumed it was
user error, but maybe it is a bug worth filing. My type was not parametric
in that case.

On Thu, Dec 11, 2014 at 7:23 AM, Sheehan Olver dlfivefi...@gmail.com
wrote:

 Managed to simplify the bug: it only shows up with templated inheritence,
 I'll fill an issue.

  *module FooModule*

* macro foo(Typ,Abs)*

* return esc(quote*

*   abstract $Abs{T}*

*   immutable $Typ{T} : $Abs{T} end*

*   end*

*   )*

*   end*

*@foo(FooTyp,FooAbs)*

*end*


 On Thursday, December 11, 2014 9:04:49 AM UTC-3, Sheehan Olver wrote:


 I tried to reduce the problem the bug but it didn't show up in my
 simplified version, but found a work around by splitting into two macros

 macro calculus_operator1(Op,AbstOp,WrappOp)
 return esc(quote
 abstract $AbstOp{T} : ApproxFun.BandedOperator{T}

 end)
 end

 macro calculus_operator2(Op,AbstOp,WrappOp)
 return esc(quote
 immutable $Op{S:FunctionSpace,T:Number} : $AbstOp{T}
 space::S# the domain space
 order::Int
 end
 immutable $WrappOp{S:BandedOperator} : $AbstOp{Float64}
 op::S
 order::Int
 end
end)
 end

 @calculus_operator1(Derivative,AbstractDerivative,DerivativeWrapper)
 @calculus_operator2(Derivative,AbstractDerivative,DerivativeWrapper)


 On Thursday, December 11, 2014 12:27:06 AM UTC-3, Tim Holy wrote:

 Seems like it might be worth filing an issue. It would help if it's
 complete
 enough that people could copy-paste the `macroexpand`ed code and get it
 to
 work. (Things like `BandedOperator`, etc, are not available in this
 snippet.)

 --Tim

 On Wednesday, December 10, 2014 05:53:54 PM Sheehan Olver wrote:
  The below works fine in 0.3 but I get
 
  ERROR: error compiling anonymous: type definition not allowed inside a
  local scope
 
  In 0.4.  How am I suppose to define types inside a Macro?  As far as I
 can
  tell, the esc() should cause everything to run in the current scope
 which
  isn't local.
 
 
 
 
  macro calculus_operator(Op,AbstOp,WrappOp)
  return esc(quote
  abstract $AbstOp{T} : BandedOperator{T}
  immutable $Op{S:FunctionSpace,T:Number} : $AbstOp{T}
  space::S# the domain space
  order::Int
  end
  immutable $WrappOp{S:BandedOperator} : $AbstOp{Float64}
  op::S
  order::Int
  end
  ## More code
  end)
  end
  @calculus_operator(Derivative,AbstractDerivative,DerivativeWrapper)




[julia-users] How save Dict{Any,Int64} ?

2014-12-11 Thread Daniel Høegh
You can serialize it; see 
https://groups.google.com/forum/m/?fromgroups#!topic/julia-users/zN7OmKwnG40
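A minimal sketch of that route (the file name is arbitrary; note that the 
serialized format is tied to the Julia version, so it is best suited to 
short-term storage):

julia> io = open("o2a.jls", "w"); serialize(io, o2a); close(io)

julia> o2a_copy = open(deserialize, "o2a.jls")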

Re: [julia-users] Roadmap

2014-12-11 Thread Kevin Squire
Tail call optimization does make much more sense than total cost of
ownership.  :-)

(Though I was wondering how much the devs care about the latter ... ;-)

Cheers,
   Kevin

On Thursday, December 11, 2014, Mike Innes mike.j.in...@gmail.com wrote:

 https://github.com/JuliaLang/julia/issues/4964

 On 11 December 2014 at 11:55, Uwe Fechner uwe.fechner@gmail.com wrote:

 What do you mean with TCO?

 On Thursday, December 11, 2014 10:50:19 AM UTC+1, Mike Innes wrote:

 It seems to me that a lot of FAQs could be answered by a simple list of
 the communities'/core developers' priorities. For example:

 We care about module load times and static compilation, so that's going
 to happen eventually. We care about package documentation, which is
 basically done. We don't care as much about deterministic memory management
 or TCO, so neither of those things are happening any time soon.

 It doesn't have to be a commitment to releases or dates, or even be
 particularly detailed, to give a good sense of where Julia is headed from a
 user perspective.

 Indeed, it's only the same things you end up posting on HN every time
 someone complains that Gadfly is slow.

 On 11 December 2014 at 03:01, Tim Holy tim@gmail.com wrote:

 Really nice summaries, John and Tony.

 On Thursday, December 11, 2014 02:08:54 AM Boylan, Ross wrote:
  BTW, is 0.4 still in a you don't want to go there state for users of
  julia?

 In short, yes---for most users I'd personally recommend sticking with
 0.3.
 Unless you simply _must_ have some of its lovely new features. But be
 prepared
 to update your code basically every week or so to deal with changes.

 --Tim






[julia-users] convert woes

2014-12-11 Thread Gustavo Goretkin
I'm on v0.3.4-pre

I have two definitions:

convert{T<:Real}(::Type{ComplexDual{T}}, x::Real) =
  ComplexDual(convert(T, x))

convert{T<:Real}(::Type{ComplexDual{T}}, x::Complex{Real}) =
  ComplexDual(convert(T, real(x)), zero(T), convert(T, imag(x)), zero(T))


This works as expected:

julia> @which convert(ComplexDual{Float64},2.0)
convert{T<:Real}(::Type{ComplexDual{T<:Real}},x::Real) at 
/Users/goretkin/.julia/v0.3/ComplexDualNumbers/src/complexdual.jl:32

julia> convert(ComplexDual{Float64},2.0)
complexdual(2.0, 0.0, 0.0, 0.0)


This does not:

julia> convert(ComplexDual{Float64},Complex(2.0,3.0))
ERROR: `convert` has no method matching 
convert(::Type{ComplexDual{Float64}}, ::Complex{Float64})
 in convert at base.jl:13

even though

julia> methods(convert)
...
convert{T<:Real}(::Type{ComplexDual{T<:Real}},x::Real) at 
/Users/goretkin/.julia/v0.3/ComplexDualNumbers/src/complexdual.jl:32
convert{T<:Real}(::Type{ComplexDual{T<:Real}},x::Complex{Real}) at 
/Users/goretkin/.julia/v0.3/ComplexDualNumbers/src/complexdual.jl:35
...

Any idea what gives?



Re: [julia-users] convert woes

2014-12-11 Thread Andreas Noack
Complex{Float64} is not a subtype of Complex{Real}. You could write the
definition as convert{T<:Real}(::Type{ComplexDual{T}}, x::Complex) = ...
instead.

2014-12-11 9:41 GMT-05:00 Gustavo Goretkin gustavo.goret...@gmail.com:

 I'm on v0.3.4-pre

 I have two definitions:

 convert{T:Real}(::Type{ComplexDual{T}}, x::Real) =
   ComplexDual(convert(T, x))

 convert{T:Real}(::Type{ComplexDual{T}}, x::Complex{Real}) =
   ComplexDual(convert(T, real(x)), zero(T), convert(T, imag(x)), zero(T))


 This works as expected:

 julia @which convert(ComplexDual{Float64},2.0)
 convert{T:Real}(::Type{ComplexDual{T:Real}},x::Real) at
 /Users/goretkin/.julia/v0.3/ComplexDualNumbers/src/complexdual.jl:32

 julia convert(ComplexDual{Float64},2.0)
 complexdual(2.0, 0.0, 0.0, 0.0)


 This does not:

 julia convert(ComplexDual{Float64},Complex(2.0,3.0))
 ERROR: `convert` has no method matching
 convert(::Type{ComplexDual{Float64}}, ::Complex{Float64})
  in convert at base.jl:13

 even though

 julia methods(convert)
 ...
 convert{T:Real}(::Type{ComplexDual{T:Real}},x::Real) at
 /Users/goretkin/.julia/v0.3/ComplexDualNumbers/src/complexdual.jl:32
 convert{T:Real}(::Type{ComplexDual{T:Real}},x::Complex{Real}) at
 /Users/goretkin/.julia/v0.3/ComplexDualNumbers/src/complexdual.jl:35
 ...

 Any idea what gives?




[julia-users] Re: convert woes

2014-12-11 Thread Ivar Nesje
None of your signatures match

convert(ComplexDual{Float64},Complex(2.0,3.0))

The first misses because it declares that x is Real (or a subtype of Real):
convert{T<:Real}(::Type{ComplexDual{T}}, x::Real)
The next misses because it declares that x is Complex{Real} (or a subtype), 
but Complex{Float64} is not a subtype of Complex{Real}.

This is a design choice; it has a name in computer science (type 
invariance) and has been discussed to death multiple times.

I think the best rationale is exemplified by the following method:

function f(A::Array{Real})
    a = A[1]
    # Now I know that a is a Real and my method is prepared for that
end

But in another function where I want to calculate and store something, I 
might actually want to ensure that the Array can store any Real value.

function f!(A::Array{Real})
    # do some calculation that gets a Real result
    A[1] = calculate_Real()
end

If you call the second function with Array{Float64}, it will not work 
properly (but will implicitly call convert(Float64, x) on your values).
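A quick REPL illustration of that invariance:

julia> Complex{Float64} <: Complex{Real}
false

julia> Complex{Float64} <: Complex   # an unparameterized Complex matches any element type
true

julia> Float64 <: Real
true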

Ivar

kl. 15:41:17 UTC+1 torsdag 11. desember 2014 skrev Gustavo Goretkin 
følgende:

 I'm on v0.3.4-pre

 I have two definitions:

 convert{T:Real}(::Type{ComplexDual{T}}, x::Real) =
   ComplexDual(convert(T, x))

 convert{T:Real}(::Type{ComplexDual{T}}, x::Complex{Real}) =
   ComplexDual(convert(T, real(x)), zero(T), convert(T, imag(x)), zero(T))


 This works as expected:

 julia @which convert(ComplexDual{Float64},2.0)
 convert{T:Real}(::Type{ComplexDual{T:Real}},x::Real) at 
 /Users/goretkin/.julia/v0.3/ComplexDualNumbers/src/complexdual.jl:32

 julia convert(ComplexDual{Float64},2.0)
 complexdual(2.0, 0.0, 0.0, 0.0)


 This does not:

 julia convert(ComplexDual{Float64},Complex(2.0,3.0))
 ERROR: `convert` has no method matching 
 convert(::Type{ComplexDual{Float64}}, ::Complex{Float64})
  in convert at base.jl:13

 even though

 julia methods(convert)
 ...
 convert{T:Real}(::Type{ComplexDual{T:Real}},x::Real) at 
 /Users/goretkin/.julia/v0.3/ComplexDualNumbers/src/complexdual.jl:32
 convert{T:Real}(::Type{ComplexDual{T:Real}},x::Complex{Real}) at 
 /Users/goretkin/.julia/v0.3/ComplexDualNumbers/src/complexdual.jl:35
 ...

 Any idea what gives?



[julia-users] Re: Julia on android

2014-12-11 Thread Benjamin Lind
I haven't tried it, but it might be possible with GNURoot 
https://play.google.com/store/apps/details?id=champion.gnuroot and one of 
its Linux distributions like Debian 
https://play.google.com/store/apps/details?id=champion.gnuroot.wheezy.

On Wednesday, December 10, 2014 9:59:24 AM UTC+3, googler wrote:

 Hi all

 Do anyone knows how I can run a julia program in android platform? Is 
 there any support given?

 Any help is appreciated!!

 Thanks and Regards
 googler



Re: [julia-users] Re: convert woes

2014-12-11 Thread Gustavo Goretkin
yeah, type invariance. Thanks for explaining it Andreas and Ivar.

Co/contra/in-variance is explained clearly in the manual, but I wasn't even
thinking about it because I got hung up on thinking that

convert{T<:Real,S<:Real}(::Type{ComplexDual{T}}, x::Complex{S})


and

convert{T<:Real}(::Type{ComplexDual{T}}, x::Complex{Real})


were equivalent definitions (if there is no reference to S in the body).
But there's no good reason to think that.

On Thu, Dec 11, 2014 at 10:04 AM, Ivar Nesje iva...@gmail.com wrote:

 None of your signatures match

 convert(ComplexDual{Float64},Complex(2.0,3.0))

 The first misses because it declares that x is Real (or subtype of real)
 convert{T:Real}(::Type{ComplexDual{T}}, x::Real)
 The next misses because it declares that x is Complex{Real} (or a
 subtype), but Complex{Float64} is not a subtype.

 This is a design choice, and has some funny name in computer science, and
 has been discussed to death multiple times

 I think the best rationale exemplified in the following method:

 function f(A::Array{Real})
 a = A[1]
 # Now I know that a is a subtype of Real and my method is prepared for
 that
 end

 But in another function where I want to calculate and store something, I
 might actually want to ensure that the Array can store any Real value.

 function f!(A::Array{Real})
 # do some calculation that gets a Real result
 A[1] = calculate_Real()
 end

 If you call the second function with Array{Float64}, it will not work
 properly (but implicitly call convert(Float64, x) on your values)

 Ivar

 kl. 15:41:17 UTC+1 torsdag 11. desember 2014 skrev Gustavo Goretkin
 følgende:

 I'm on v0.3.4-pre

 I have two definitions:

 convert{T:Real}(::Type{ComplexDual{T}}, x::Real) =
   ComplexDual(convert(T, x))

 convert{T:Real}(::Type{ComplexDual{T}}, x::Complex{Real}) =
   ComplexDual(convert(T, real(x)), zero(T), convert(T, imag(x)), zero(T))


 This works as expected:

 julia @which convert(ComplexDual{Float64},2.0)
 convert{T:Real}(::Type{ComplexDual{T:Real}},x::Real) at
 /Users/goretkin/.julia/v0.3/ComplexDualNumbers/src/complexdual.jl:32

 julia convert(ComplexDual{Float64},2.0)
 complexdual(2.0, 0.0, 0.0, 0.0)


 This does not:

 julia convert(ComplexDual{Float64},Complex(2.0,3.0))
 ERROR: `convert` has no method matching convert(::Type{ComplexDual{Float64}},
 ::Complex{Float64})
  in convert at base.jl:13

 even though

 julia methods(convert)
 ...
 convert{T:Real}(::Type{ComplexDual{T:Real}},x::Real) at
 /Users/goretkin/.julia/v0.3/ComplexDualNumbers/src/complexdual.jl:32
 convert{T:Real}(::Type{ComplexDual{T:Real}},x::Complex{Real}) at
 /Users/goretkin/.julia/v0.3/ComplexDualNumbers/src/complexdual.jl:35
 ...

 Any idea what gives?




Re: [julia-users] Julia on android

2014-12-11 Thread João Felipe Santos
Biggest issue will be that most Android devices have an ARM processor, and ARM 
support is not complete at the moment. You may have more luck with Intel 
Android devices, though.

João

 On Dec 11, 2014, at 10:17 AM, Benjamin Lind lind.benja...@gmail.com wrote:
 
 I haven't tried it, but it might be possible with GNURoot 
 https://play.google.com/store/apps/details?id=champion.gnuroot and one of 
 its Linux distributions like Debian 
 https://play.google.com/store/apps/details?id=champion.gnuroot.wheezy.
 
 On Wednesday, December 10, 2014 9:59:24 AM UTC+3, googler wrote:
 Hi all
 
 Do anyone knows how I can run a julia program in android platform? Is there 
 any support given?
 
 Any help is appreciated!!
 
 Thanks and Regards
 googler



[julia-users] Re: Community Support for Julia on Travis CI

2014-12-11 Thread Tomas Lycken
This is utterly fantastic.

// Tomas

On Thursday, December 11, 2014 5:15:33 AM UTC+1, Pontus Stenetorp wrote:

 Everyone, 

 I am happy to announce that we now have Julia support on Travis [1]. 
 This has been an effort on the part of the newly formed JuliaCI [2] 
 group and concretely this means that a complete `.travis.yml` for a 
 Julia project can now be as concise as: 

 language: julia 
 julia: 
 - release 
 - nightly 

 As opposed to: 

 language: cpp 
 compiler: 
 - clang 
 env: 
 matrix: 
 - JULIAVERSION=juliareleases 
 - JULIAVERSION=julianightlies 
 before_install: 
 - sudo add-apt-repository ppa:staticfloat/julia-deps -y 
 - sudo add-apt-repository ppa:staticfloat/${JULIAVERSION} -y 
 - sudo apt-get update -qq -y 
 - sudo apt-get install libpcre3-dev julia -y 
 - if [[ -a .git/shallow ]]; then git fetch --unshallow; fi 
 script: 
 - julia --check-bounds=yes -e 'versioninfo(); Pkg.init(); 
 Pkg.clone(pwd()); Pkg.build("Foo.jl"); Pkg.test("Foo.jl")' 

 A big thank you to all that made this possible with code, comments, 
 and feedback.  Also a big shout out to all the people over at 
 Travis CI for all their help.  These changes should soon be integrated 
 into `Pkg` so that new packages will make use of it and we will 
 hopefully soon also add support to test specific Julia release 
 branches and numbers. 

 Pontus Stenetorp 

 [1]: 
 http://blog.travis-ci.com/2014-12-10-community-driven-language-support-comes-to-travis-ci/
  
 [2]: https://github.com/JuliaCI 



[julia-users] Re: Community Support for Julia on Travis CI

2014-12-11 Thread Nils Gudat
Excuse the ignorant question, but what exactly does this mean? I haven't 
seen Travis CI before and clicking on the home page is slightly confusing...


Re: [julia-users] Roadmap

2014-12-11 Thread John Myles White
This is a very good point. I'd label this as something like core unsolved 
challenges. Julia #265 (https://github.com/JuliaLang/julia/issues/265) comes 
to mind.

In general, a list of the big issues would be much easier to maintain than a 
list of goals for the future. We could just use a tag like core on the issue 
tracker.

 -- John

On Dec 11, 2014, at 4:49 AM, Mike Innes mike.j.in...@gmail.com wrote:

 It seems to me that a lot of FAQs could be answered by a simple list of the 
 communities'/core developers' priorities. For example:
 
 We care about module load times and static compilation, so that's going to 
 happen eventually. We care about package documentation, which is basically 
 done. We don't care as much about deterministic memory management or TCO, so 
 neither of those things are happening any time soon.
 
 It doesn't have to be a commitment to releases or dates, or even be 
 particularly detailed, to give a good sense of where Julia is headed from a 
 user perspective. 
 
 Indeed, it's only the same things you end up posting on HN every time someone 
 complains that Gadfly is slow.
 
 On 11 December 2014 at 03:01, Tim Holy tim.h...@gmail.com wrote:
 Really nice summaries, John and Tony.
 
 On Thursday, December 11, 2014 02:08:54 AM Boylan, Ross wrote:
  BTW, is 0.4 still in a you don't want to go there state for users of
  julia?
 
 In short, yes---for most users I'd personally recommend sticking with 0.3.
 Unless you simply _must_ have some of its lovely new features. But be prepared
 to update your code basically every week or so to deal with changes.
 
 --Tim
 
 



Re: [julia-users] Re: Finite element code in Julia: Curious overhead in .* product

2014-12-11 Thread Petr Krysl
Valentin,

Thank you so much for this valuable information. 

I would define "decent", as the Julia community does with respect to Julia 
itself, as being within a small factor of the best performance.  As Tim 
Holy demonstrated with his example, these two implementations run about a 
factor of 30 apart:

Fe += Ns * (f * Jac);              # 30 times slower than

for k=1:length(Fe)
    Fe[k] += Ns[k]*(f * Jac);      # this expanded loop
end
   
So that seems like a lot.  As you point out, one might be willing to write 
the loops out by hand if, when choosing between the two pains (losing 
readable code, or losing a lot of performance), the loops win by a lot.

Petr

On Thursday, December 11, 2014 12:18:15 AM UTC-8, Valentin Churavy wrote:

 Dear Petr,

 I tought the point of having the array objects and the associated 
 manipulation functions was to hide the loops while delivering decent 
 performance...


 This assumption is very true in Matlab. Matlab spend an enormous 
 engineering amount on optimizing code that uses vectorization and array 
 objects (and it is still slow). There has been ongoing discussions over on 
 Github [1] on how to improve the current situation in Julia. One of the 
 major selling points of Julia for me is the fact that it is quite 
 transparent on which kind of optimizations it requires. I can write in a 
 very dynamic style with a lot of matrix operations like in Matlab and still 
 get decent performance or I can go in and identify with tools like @profile 
 [2] what the pains point in my program are. 

 The point of using vectorized operations in Matlab is that is the one 
 reliable way to get good performance in matlab, because all underlying 
 functions are written in C. In Julia most underlying functions are written 
 in Julia (note that most mathematical operations call out to C libraries. 
 No need to reinvent the wheel). There is a Julia package you can use for 
 porting over Matlab code that devectorizes you code [3], but if the 
 operation is occurring in a hot loop it still will be a good and necessary 
 optimization to unroll the vectorized code. Then you no longer get just 
 decent performance, but excellent performance instead. 

 Best Valentin

 PS:
 You can also use @which 1 .* [1] and @edit to see where the operation is 
 defined and how it is implemented. The .* operation is implemented by 
 allocating an output array and the running * element wise over the array. 
 No magic is going on there.

 [1] https://github.com/JuliaLang/julia/issues/8450 
 [2] http://docs.julialang.org/en/release-0.3/stdlib/profile/
 [3] https://github.com/lindahua/Devectorize.jl

 On Thursday, 11 December 2014 06:23:29 UTC+1, Petr Krysl wrote:

 Thanks.  Now my head is really spinning!

 See, before I posted the original question I tried expanding the loop in 
 the actual FE code, and the code was SLOWER and was using MORE memory:

 With the expression 
 Fe += Ns[j] * (f * Jac * w[j]); :
 6.223416655 seconds (1648832052 bytes

 With the expanded loop 
 for kx=1:length(Fe) # alternative (devectorization)
  Fe[kx] += Ns[j][kx] * (f * Jac * w[j]); 
 end 
  7.340272676 seconds (1776971604 bytes allocated,

 In addition, your argument clearly demonstrates how to avoid the 
 temporary array for doit1(), but doit2() adds to the 3 x 1 one additional 
 1x1 temporary (it seems to me), yet it is about 14 times slower. Why is 
 that?

 Finally, if the only way I can get decent performance with lines like

  Fe += Ns[j] * (f * Jac * w[j]);  # Fe and Ns[j] arrays

 is to manually write out all the loops, that would be terrible news 
 indeed.  Not only that is a lot of work when rewriting loads of Matlab code 
 (with several matrices concatenated in many many expressions), but the 
 legibility and maintainability tanks. I tought the point of having the 
 array objects and the associated manipulation functions was to hide the 
 loops while delivering decent performance...

 Petr


 On Wednesday, December 10, 2014 8:28:31 PM UTC-8, Tim Holy wrote:

 Multiplying two Float64s yields another Float64; most likely, this will 
 be 
 stored in the CPU's registers. In contrast,  [f]*Jac creates an array, 
 on each 
 iteration, that has to be stored on the heap. 

 A faster variant devectorizes: 
function doit1a(N) 
Fe=zeros(3,1);Ns=zeros(3,1)+1.0;f=-6.0;Jac=1.0; 
for i=1:N 
tmp = f*Jac 
for j = 1:length(Fe) 
Fe[j] += Ns[j]*tmp 
end 
end 
Fe 
end 

 julia @time doit1(N); 
 elapsed time: 0.810270399 seconds (384000320 bytes allocated, 61.23% gc 
 time) 

 julia @time doit1a(N); 
 elapsed time: 0.022118726 seconds (320 bytes allocated) 

 Note the tiny allocations in the second case. 

 --Tim 


 On Wednesday, December 10, 2014 07:54:00 PM Petr Krysl wrote: 
  The code is really short: 
  
  N=200 
  
  function doit1(N) 
  Fe=zeros(3,1);Ns=zeros(3,1)+1.0;f=-6.0;Jac=1.0; 
  

[julia-users] Aren't loops supposed to be faster?

2014-12-11 Thread Petr Krysl
Acting upon the advice that replacing matrix-matrix multiplications in 
vectorized form with loops would help with performance, I chopped out a 
piece of code from my finite element solver 
(https://gist.github.com/anonymous/4ec426096c02faa4354d) and ran some tests 
with the following results:

Vectorized code:
elapsed time: 0.326802682 seconds (134490340 bytes allocated, 17.06% gc 
time)

Loops code:
elapsed time: 4.681451441 seconds (997454276 bytes allocated, 9.05% gc 
time) 

SLOWER and using MORE memory?!

I must be doing something terribly wrong.

Petr



Re: [julia-users] Aren't loops supposed to be faster?

2014-12-11 Thread Andreas Noack
See the comment in the gist.

2014-12-11 11:47 GMT-05:00 Petr Krysl krysl.p...@gmail.com:

 Acting upon the advice that replacing matrix-matrix multiplications in
 vectorized form with loops would help with performance, I chopped out a
 piece of code from my finite element solver (
 https://gist.github.com/anonymous/4ec426096c02faa4354d) and ran some
 tests with the following results:

 Vectorized code:
 elapsed time: 0.326802682 seconds (134490340 bytes allocated, 17.06% gc
 time)

 Loops code:
 elapsed time: 4.681451441 seconds (997454276 bytes allocated, 9.05% gc
 time)

 SLOWER and using MORE memory?!

 I must be doing something terribly wrong.

 Petr




Re: [julia-users] Unexpected append! behavior

2014-12-11 Thread Sean McBane
Ah, I see the error in my thinking now. Knowing what the ! signifies makes 
it make a lot more sense.

Thanks guys for putting up with a newbie. This was probably one of those 1 
in the morning questions that I should have waited to look at the next day 
before asking for help; it seems obvious now.

-- Sean

On Thursday, December 11, 2014 3:26:12 AM UTC-6, Mike Innes wrote:

 Think of append!(X, Y) as equivalent to X = vcat(X, Y). You called append! 
 twice, so X gets Y appended twice.

 julia X = [1,2]; Y = [3,4];

 julia X = vcat(X,Y)
 [1, 2, 3, 4]

 In your example you went ahead and did this again:

 julia X = (X = vcat(X, Y))
 [1, 2, 3, 4, 3, 4]

 But if you reset X, Y via the first statement and *then* call X = 
 append!(X, Y), it works as you would expect.

 julia X = [1,2]; Y = [3,4];

 julia X = append!(X, Y) # same as X = (X = vcat(X, Y))
 [1, 2, 3, 4]

 On 11 December 2014 at 07:51, Alex Ames alexande...@gmail.com wrote:

 Functions that end with an exclamation point modify their arguments, but 
 they can return values just like any other function. For example:

 julia x = [1,2]; y = [3, 4]
 2-element Array{Int64,1}:
  3
  4

 julia append!(x,y)
 4-element Array{Int64,1}:
  1
  2
  3
  4

 julia z = append!(x,y)
 6-element Array{Int64,1}:
  1
  2
  3
  4
  3
  4

 julia z
 6-element Array{Int64,1}:
  1
  2
  3
  4
  3
  4

 julia x
 6-element Array{Int64,1}:
  1
  2
  3
  4
  3
  4

 The append! function takes two arrays, appends the second to the first, 
 then returns the values now contained by the first array. No recursion 
 craziness required.

 On Thursday, December 11, 2014 1:11:50 AM UTC-6, Sean McBane wrote:

 Ivar is correct; I was running in the Windows command prompt and 
 couldn't copy and paste so I copied it by hand and made an error.

 Ok, so I understand that append!(X,Y) is modifying X in place. But I 
 still do not get where the output for the second case, where the result of 
 append!(X,Y) is assigned back into X is what it is. It would make sense to 
 me if this resulted in a recursion with Y forever getting appended to X, 
 but as it is I don't understand.

 Thanks.

 -- Sean

 On Thursday, December 11, 2014 12:42:45 AM UTC-6, Ivar Nesje wrote:

 I assume the first line should be 

  X = [1,2]; Y = [3,4]; 

 Then the results you get makes sense. The thing is that julia has 
 mutable arrays, and the ! at the end of append! indicates that it is a 
 function that mutates it's argument.




Re: [julia-users] Unexpected append! behavior

2014-12-11 Thread Mike Innes
Happy to help

On 11 December 2014 at 17:04, Sean McBane seanmc...@gmail.com wrote:

 Ah, I see the error in my thinking now. Knowing what the ! signifies makes
 it make a lot more sense.

 Thanks guys for putting up with a newbie. This was probably one of those 1
 in the morning questions that I should have waited to look at the next day
 before asking for help; it seems obvious now.

 -- Sean

 On Thursday, December 11, 2014 3:26:12 AM UTC-6, Mike Innes wrote:

 Think of append!(X, Y) as equivalent to X = vcat(X, Y). You called
 append! twice, so X gets Y appended twice.

 julia X = [1,2]; Y = [3,4];

 julia X = vcat(X,Y)
 [1, 2, 3, 4]

 In your example you went ahead and did this again:

 julia X = (X = vcat(X, Y))
 [1, 2, 3, 4, 3, 4]

 But if you reset X, Y via the first statement and *then* call X =
 append!(X, Y), it works as you would expect.

 julia X = [1,2]; Y = [3,4];

 julia X = append!(X, Y) # same as X = (X = vcat(X, Y))
 [1, 2, 3, 4]

 On 11 December 2014 at 07:51, Alex Ames alexande...@gmail.com wrote:

 Functions that end with an exclamation point modify their arguments, but
 they can return values just like any other function. For example:

 julia x = [1,2]; y = [3, 4]
 2-element Array{Int64,1}:
  3
  4

 julia append!(x,y)
 4-element Array{Int64,1}:
  1
  2
  3
  4

 julia z = append!(x,y)
 6-element Array{Int64,1}:
  1
  2
  3
  4
  3
  4

 julia z
 6-element Array{Int64,1}:
  1
  2
  3
  4
  3
  4

 julia x
 6-element Array{Int64,1}:
  1
  2
  3
  4
  3
  4

 The append! function takes two arrays, appends the second to the first,
 then returns the values now contained by the first array. No recursion
 craziness required.

 On Thursday, December 11, 2014 1:11:50 AM UTC-6, Sean McBane wrote:

 Ivar is correct; I was running in the Windows command prompt and
 couldn't copy and paste so I copied it by hand and made an error.

 Ok, so I understand that append!(X,Y) is modifying X in place. But I
 still do not get where the output for the second case, where the result of
 append!(X,Y) is assigned back into X is what it is. It would make sense to
 me if this resulted in a recursion with Y forever getting appended to X,
 but as it is I don't understand.

 Thanks.

 -- Sean

 On Thursday, December 11, 2014 12:42:45 AM UTC-6, Ivar Nesje wrote:

 I assume the first line should be

  X = [1,2]; Y = [3,4];

 Then the results you get makes sense. The thing is that julia has
 mutable arrays, and the ! at the end of append! indicates that it is a
 function that mutates it's argument.





Re: [julia-users] Changes to array are not visible from caller

2014-12-11 Thread Mark Stock
Wow. I thought I read the docs thoroughly, and I didn't see anything about 
how += (without the [] operator!) rebinds. That's a surprising behavior to 
many scientific programmers, and should be better documented.

Thank you for the quick reply!

On Wednesday, December 10, 2014 11:57:28 PM UTC-7, Isaiah wrote:

 `x += ...` is equivalent to writing `x = x + ...` which rebinds the 
 variable within that function. Instead, do an explicit array assignment 
 `x[:,:] = ...`

 This is discussed in the manual with a warning about type changes, but the 
 implication for arrays should probably be made clear as well: 
 http://julia.readthedocs.org/en/latest/manual/mathematical-operations/#updating-operators

 (there are some ongoing discussions about in-place array operators to 
 improve the situation)
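 A small sketch of the difference (f! and g! are made-up names):

 function f!(x)
   x += 1          # rebinds the local name x; the caller's array is untouched
 end

 function g!(x)
   x[:] = x + 1    # writes into the existing array; the caller sees the change
 end

 julia> a = [1.0, 2.0]; f!(a); a    # still [1.0, 2.0]
 julia> b = [1.0, 2.0]; g!(b); b    # now [2.0, 3.0]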

 On Wed, Dec 10, 2014 at 7:44 PM, Mark Stock mark.j...@gmail.com wrote:

 Hello, n00b Julia user here. I have two functions that change the values 
 of a passed-in array. One works (dxFromX), but for the other one 
 (eulerStep) the caller does not see any changes to the array. Why is this?

 function dxFromX!(x,dxdt)
   fill!(dxdt,0.0)

   for itarg = 1:size(x,1)
     for isrc = 1:size(x,1)
       dx = x[isrc,1] - x[itarg,1]
       dy = x[isrc,2] - x[itarg,2]
       coeff = 1.0f0 / (dx^2 + dy^2 + 0.1f0)
       dxdt[itarg,1] -= coeff * dy
       dxdt[itarg,2] += coeff * dx
     end
   end
 end

 function eulerStep!(x)
   dxdt = zeros(x)
   print("\ndxdt before\n", dxdt[1:5,:], "\n")
   dxFromX!(x,dxdt)
   print("\ndxdt after has changed\n", dxdt[1:5,:], "\n")
   x += 1.0f-5*dxdt
   print("\nx inside\n", x[1:5,:], "\n")
 end

 x = float32(rand(1024,2))
 print("\nx before\n", x[1:5,:], "\n")
 @time eulerStep!(x)
 print("\nx after is unchanged!\n", x[1:5,:], "\n")

 I see the same behavior on 0.3.3 and 0.4.0, both release and debug 
 binaries, on OSX and Linux. 




Re: [julia-users] Changes to array are not visible from caller

2014-12-11 Thread Mark Stock
The line now reads

x[:,:] += 1.0f-5*dxdt

And the result is now correct, but memory usage increased. Shouldn't it go 
down if we're re-assigning to the same variable?

On Wednesday, December 10, 2014 11:57:28 PM UTC-7, Isaiah wrote:

 `x += ...` is equivalent to writing `x = x + ...` which rebinds the 
 variable within that function. Instead, do an explicit array assignment 
 `x[:,:] = ...`

 This is discussed in the manual with a warning about type changes, but the 
 implication for arrays should probably be made clear as well: 
 http://julia.readthedocs.org/en/latest/manual/mathematical-operations/#updating-operators

 (there are some ongoing discussions about in-place array operators to 
 improve the situation)

 On Wed, Dec 10, 2014 at 7:44 PM, Mark Stock mark.j...@gmail.com wrote:

 Hello, n00b Julia user here. I have two functions that change the values 
 of a passed-in array. One works (dxFromX), but for the other one 
 (eulerStep) the caller does not see any changes to the array. Why is this?

 function dxFromX!(x,dxdt)
   fill!(dxdt,0.0)

   for itarg = 1:size(x,1)
 for isrc = 1:size(x,1)
   dx = x[isrc,1] - x[itarg,1]
   dy = x[isrc,2] - x[itarg,2]
   coeff = 1.0f0 / (dx^2 + dy^2 + 0.1f0)
   dxdt[itarg,1] -= coeff * dy
   dxdt[itarg,2] += coeff * dx
 end
   end
 end

 function eulerStep!(x)
   dxdt = zeros(x)
   print (\ndxdt before\n,dxdt[1:5,:],\n)
   dxFromX!(x,dxdt)
   print (\ndxdt after has changed\n,dxdt[1:5,:],\n)
   x += 1.0f-5*dxdt
   print (\nx inside\n,x[1:5,:],\n)
 end

 x = float32(rand(1024,2))
 print (\nx before\n,x[1:5,:],\n)
 @time eulerStep!(x)
 print (\nx after is unchanged!\n,x[1:5,:],\n)

 I see the same behavior on 0.3.3 and 0.4.0, both release and debug 
 binaries, on OSX and Linux. 




Re: [julia-users] Changes to array are not visible from caller

2014-12-11 Thread John Myles White
Nope.

You'll find Julia much easier to program in if you always replace x += y with x 
= x + y before attempting to reason about performance. In this case, you'll
get

x[:, :] = x[:, :] + 1.0f-5 * dxdt

In other words, you literally make a copy of the entire matrix x before doing 
any useful work.
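If you want to avoid that copy entirely, a plain loop does the update in 
place (a sketch using the names from the code above):

for i = 1:length(x)
    x[i] += 1.0f-5 * dxdt[i]
end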

 -- John

On Dec 11, 2014, at 12:21 PM, Mark Stock mark.j.st...@gmail.com wrote:

 The line now reads
 
 x[:,:] += 1.0f-5*dxdt
 
 And the result is now correct, but memory usage increased. Shouldn't it go 
 down if we're re-assigning to the same variable?
 
 On Wednesday, December 10, 2014 11:57:28 PM UTC-7, Isaiah wrote:
 `x += ...` is equivalent to writing `x = x + ...` which rebinds the variable 
 within that function. Instead, do an explicit array assignment `x[:,:] = ...`
 
 This is discussed in the manual with a warning about type changes, but the 
 implication for arrays should probably be made clear as well: 
 http://julia.readthedocs.org/en/latest/manual/mathematical-operations/#updating-operators
 
 (there are some ongoing discussions about in-place array operators to improve 
 the situation)
 
 On Wed, Dec 10, 2014 at 7:44 PM, Mark Stock mark.j...@gmail.com wrote:
 Hello, n00b Julia user here. I have two functions that change the values of a 
 passed-in array. One works (dxFromX), but for the other one (eulerStep) the 
 caller does not see any changes to the array. Why is this?
 
 function dxFromX!(x,dxdt)
   fill!(dxdt,0.0)
 
   for itarg = 1:size(x,1)
 for isrc = 1:size(x,1)
   dx = x[isrc,1] - x[itarg,1]
   dy = x[isrc,2] - x[itarg,2]
   coeff = 1.0f0 / (dx^2 + dy^2 + 0.1f0)
   dxdt[itarg,1] -= coeff * dy
   dxdt[itarg,2] += coeff * dx
 end
   end
 end
 
  function eulerStep!(x)
    dxdt = zeros(x)
    print("\ndxdt before\n", dxdt[1:5,:], "\n")
    dxFromX!(x,dxdt)
    print("\ndxdt after has changed\n", dxdt[1:5,:], "\n")
    x += 1.0f-5*dxdt
    print("\nx inside\n", x[1:5,:], "\n")
  end
  
  x = float32(rand(1024,2))
  print("\nx before\n", x[1:5,:], "\n")
  @time eulerStep!(x)
  print("\nx after is unchanged!\n", x[1:5,:], "\n")
 
 I see the same behavior on 0.3.3 and 0.4.0, both release and debug binaries, 
 on OSX and Linux. 
 



Re: [julia-users] Changes to array are not visible from caller

2014-12-11 Thread Tim Holy
On Thursday, December 11, 2014 09:14:32 AM Mark Stock wrote:
 Wow. I thought I read the docs thoroughly, and I didn't see anything about
 how += (without the [] operator!) rebinds. That's a surprising behavior to
 many scientific programmers, and should be better documented.

Please, add that documentation. (Go to 
https://github.com/JuliaLang/julia/tree/master/doc, find the file you need to 
edit, and click on the little pencil icon.)

--Tim

 
 Thank you for the quick reply!
 
 On Wednesday, December 10, 2014 11:57:28 PM UTC-7, Isaiah wrote:
  `x += ...` is equivalent to writing `x = x + ...` which rebinds the
  variable within that function. Instead, do an explicit array assignment
  `x[:,:] = ...`
  
  This is discussed in the manual with a warning about type changes, but the
  implication for arrays should probably be made clear as well:
  http://julia.readthedocs.org/en/latest/manual/mathematical-operations/#upd
  ating-operators
  
  (there are some ongoing discussions about in-place array operators to
  improve the situation)
  
  On Wed, Dec 10, 2014 at 7:44 PM, Mark Stock mark.j...@gmail.com
  
  wrote:
  Hello, n00b Julia user here. I have two functions that change the values
  of a passed-in array. One works (dxFromX), but for the other one
  (eulerStep) the caller does not see any changes to the array. Why is
  this?
  
  function dxFromX!(x,dxdt)
  
fill!(dxdt,0.0)

for itarg = 1:size(x,1)

  for isrc = 1:size(x,1)
  
dx = x[isrc,1] - x[itarg,1]
dy = x[isrc,2] - x[itarg,2]
coeff = 1.0f0 / (dx^2 + dy^2 + 0.1f0)
dxdt[itarg,1] -= coeff * dy
dxdt[itarg,2] += coeff * dx
  
  end

end
  
  end
  
  function eulerStep!(x)
    dxdt = zeros(x)
    print("\ndxdt before\n", dxdt[1:5,:], "\n")
    dxFromX!(x,dxdt)
    print("\ndxdt after has changed\n", dxdt[1:5,:], "\n")
    x += 1.0f-5*dxdt
    print("\nx inside\n", x[1:5,:], "\n")
  end
  
  x = float32(rand(1024,2))
  print("\nx before\n", x[1:5,:], "\n")
  @time eulerStep!(x)
  print("\nx after is unchanged!\n", x[1:5,:], "\n")
  
  I see the same behavior on 0.3.3 and 0.4.0, both release and debug
  binaries, on OSX and Linux.



Re: [julia-users] Aren't loops supposed to be faster?

2014-12-11 Thread Petr Krysl
Dear Andreas,

Thank you very much. True, I have not noticed that. I put the definitions 
of the arrays outside of the two functions so that their results could be 
compared.

What I'm trying to do here is write a simple chunk of code that would 
reproduce the conditions in the FE package.
There the vectorized code and the loops only see local variables, declared 
above the major loop.  So in my opinion the conditions then are the same as 
in the corrected fragment from the gist (only local variables).

Now I can see that the fragment for some reason did not reproduce the 
conditions from the full code.  Indeed, as you predicted the loop 
implementation is almost 10 times faster than the vectorized version. 
 However, in the FE code the loops run twice as slow and consume more 
memory.

Just in case you, Andreas, or anyone else are curious,  here is the full FE 
code that displays the weird behavior of loops being slower than vectorized 
code.
https://gist.github.com/PetrKryslUCSD/ae4a0f218fe50abe370f

Thanks again,

Petr

On Thursday, December 11, 2014 9:02:00 AM UTC-8, Andreas Noack wrote:

 See the comment in the gist.

 2014-12-11 11:47 GMT-05:00 Petr Krysl krysl...@gmail.com:

 Acting upon the advice that replacing matrix-matrix multiplications in 
 vectorized form with loops would help with performance, I chopped out a 
 piece of code from my finite element solver (
 https://gist.github.com/anonymous/4ec426096c02faa4354d) and ran some 
 tests with the following results:

 Vectorized code:
 elapsed time: 0.326802682 seconds (134490340 bytes allocated, 17.06% gc 
 time)

 Loops code:
 elapsed time: 4.681451441 seconds (997454276 bytes allocated, 9.05% gc 
 time) 

 SLOWER and using MORE memory?!

 I must be doing something terribly wrong.

 Petr




Re: [julia-users] Aren't loops supposed to be faster?

2014-12-11 Thread Petr Krysl
One more note: I conjectured that perhaps the compiler was not able to 
infer correctly the type of the matrices,  so I hardwired (in the actual FE 
code)

Jac = 1.0; gradN = gradNparams[j]/(J); # get rid of Rm for the moment

About 10% less memory used, runtime about the same.  So, no effect really. 
Loops are still slower than the vectorized code by a factor of two.

Petr




Re: [julia-users] Roadmap

2014-12-11 Thread Stefan Karpinski
We might want to link to canned searches on GitHub to issues that are
relevant. For example, we do use milestones to categorize issues so we
could link to stable release issues and development release issues. That's
not quite a roadmap but it does help to give visitors some clue about
what's in the works without adding to the burden for developers (since
we're already using the milestones for organizational purposes).

On Thu, Dec 11, 2014 at 10:50 AM, John Myles White johnmyleswh...@gmail.com
 wrote:

 This is a very good point. I'd label this as something like "core unsolved
 challenges". Julia #265 (https://github.com/JuliaLang/julia/issues/265)
 comes to mind.

 In general, a list of the big issues would be much easier to maintain
 than a list of goals for the future. We could just use a tag like "core" on
 the issue tracker.

  -- John

 On Dec 11, 2014, at 4:49 AM, Mike Innes mike.j.in...@gmail.com wrote:

 It seems to me that a lot of FAQs could be answered by a simple list of
 the communities'/core developers' priorities. For example:

 We care about module load times and static compilation, so that's going to
 happen eventually. We care about package documentation, which is basically
 done. We don't care as much about deterministic memory management or TCO,
 so neither of those things are happening any time soon.

 It doesn't have to be a commitment to releases or dates, or even be
 particularly detailed, to give a good sense of where Julia is headed from a
 user perspective.

 Indeed, it's only the same things you end up posting on HN every time
 someone complains that Gadfly is slow.

 On 11 December 2014 at 03:01, Tim Holy tim.h...@gmail.com wrote:

 Really nice summaries, John and Tony.

 On Thursday, December 11, 2014 02:08:54 AM Boylan, Ross wrote:
  BTW, is 0.4 still in a "you don't want to go there" state for users of
  julia?

 In short, yes---for most users I'd personally recommend sticking with 0.3.
 Unless you simply _must_ have some of its lovely new features. But be
 prepared
 to update your code basically every week or so to deal with changes.

 --Tim






Re: [julia-users] Changes to array are not visible from caller

2014-12-11 Thread Mark Stock
Is there any way to update the array in-place without writing an explicit 
loop? I imagine that would be more efficient.

Mark

On Thursday, December 11, 2014 10:23:35 AM UTC-7, John Myles White wrote:

 Nope.

 You'll find Julia much easier to program in if you always replace x += y 
 with x = x + y before attempting to reason about performance. In this case, 
 you'll
 get

 x[:, :] = x[:, :] + 1.0f-5 * dxdt

 In other words, you literally make a copy of the entire matrix x before 
 doing any useful work.

  -- John

 On Dec 11, 2014, at 12:21 PM, Mark Stock mark.j...@gmail.com 
 wrote:

 The line now reads

 x[:,:] += 1.0f-5*dxdt

 And the result is now correct, but memory usage increased. Shouldn't it go 
 down if we're re-assigning to the same variable?

 On Wednesday, December 10, 2014 11:57:28 PM UTC-7, Isaiah wrote:

 `x += ...` is equivalent to writing `x = x + ...` which rebinds the 
 variable within that function. Instead, do an explicit array assignment 
 `x[:,:] = ...`

 This is discussed in the manual with a warning about type changes, but 
 the implication for arrays should probably be made clear as well: 
 http://julia.readthedocs.org/en/latest/manual/mathematical-operations/#updating-operators

 (there are some ongoing discussions about in-place array operators to 
 improve the situation)

 On Wed, Dec 10, 2014 at 7:44 PM, Mark Stock mark.j...@gmail.com wrote:

 Hello, n00b Julia user here. I have two functions that change the values 
 of a passed-in array. One works (dxFromX), but for the other one 
 (eulerStep) the caller does not see any changes to the array. Why is this?

 function dxFromX!(x,dxdt)
   fill!(dxdt,0.0)

   for itarg = 1:size(x,1)
 for isrc = 1:size(x,1)
   dx = x[isrc,1] - x[itarg,1]
   dy = x[isrc,2] - x[itarg,2]
   coeff = 1.0f0 / (dx^2 + dy^2 + 0.1f0)
   dxdt[itarg,1] -= coeff * dy
   dxdt[itarg,2] += coeff * dx
 end
   end
 end

  function eulerStep!(x)
    dxdt = zeros(x)
    print("\ndxdt before\n", dxdt[1:5,:], "\n")
    dxFromX!(x,dxdt)
    print("\ndxdt after has changed\n", dxdt[1:5,:], "\n")
    x += 1.0f-5*dxdt
    print("\nx inside\n", x[1:5,:], "\n")
  end

  x = float32(rand(1024,2))
  print("\nx before\n", x[1:5,:], "\n")
  @time eulerStep!(x)
  print("\nx after is unchanged!\n", x[1:5,:], "\n")

 I see the same behavior on 0.3.3 and 0.4.0, both release and debug 
 binaries, on OSX and Linux. 





Re: [julia-users] Aren't loops supposed to be faster?

2014-12-11 Thread Peter Simon
One thing I noticed after a quick glance:  The ordering of your nested 
loops is very cache-unfriendly.  Julia stores arrays in column-major order 
(same as Fortran) so that nested loops should arrange that the first 
subscripts of multidimensional arrays are varied most rapidly.

--Peter
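
As a toy illustration of that point (my own example, not Petr's FE code), the
first subscript should vary in the innermost loop so that consecutive
iterations touch adjacent memory:

A = rand(1000, 1000)

function sum_fast(A)   # cache-friendly: i (first subscript) varies fastest
    s = 0.0
    for j = 1:size(A,2), i = 1:size(A,1)
        s += A[i,j]
    end
    s
end

function sum_slow(A)   # cache-unfriendly: each inner step strides across a whole column
    s = 0.0
    for i = 1:size(A,1), j = 1:size(A,2)
        s += A[i,j]
    end
    s
end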

On Thursday, December 11, 2014 9:47:33 AM UTC-8, Petr Krysl wrote:

 One more note: I conjectured that perhaps the compiler was not able to 
 infer correctly the type of the matrices,  so I hardwired (in the actual FE 
 code)

 Jac = 1.0; gradN = gradNparams[j]/(J); # get rid of Rm for the moment

 About 10% less memory used, runtime about the same.  So, no effect really. 
 Loops are still slower than the vectorized code by a factor of two.

 Petr




Re: [julia-users] Changes to array are not visible from caller

2014-12-11 Thread Isaiah Norton

 Is there any way to update the array in-place without writing an explicit
 loop? I imagine that would be more efficient.


Not yet - see the discussion in #249 (and many others linked from there):
https://github.com/JuliaLang/julia/issues/249


 Mark

 On Thursday, December 11, 2014 10:23:35 AM UTC-7, John Myles White wrote:

 Nope.

 You'll find Julia much easier to program in if you always replace x += y
 with x = x + y before attempting to reason about performance. In this case,
 you'll
 get

 x[:, :] = x[:, :] + 1.0f-5 * dxdt

 In other words, you literally make a copy of the entire matrix x before
 doing any useful work.

  -- John

 On Dec 11, 2014, at 12:21 PM, Mark Stock mark.j...@gmail.com wrote:

 The line now reads

 x[:,:] += 1.0f-5*dxdt

 And the result is now correct, but memory usage increased. Shouldn't it
 go down if we're re-assigning to the same variable?

 On Wednesday, December 10, 2014 11:57:28 PM UTC-7, Isaiah wrote:

 `x += ...` is equivalent to writing `x = x + ...` which rebinds the
 variable within that function. Instead, do an explicit array assignment
 `x[:,:] = ...`

 This is discussed in the manual with a warning about type changes, but
 the implication for arrays should probably be made clear as well:
 http://julia.readthedocs.org/en/latest/manual/mathematical-
 operations/#updating-operators

 (there are some ongoing discussions about in-place array operators to
 improve the situation)

 On Wed, Dec 10, 2014 at 7:44 PM, Mark Stock mark.j...@gmail.com wrote:

 Hello, n00b Julia user here. I have two functions that change the
 values of a passed-in array. One works (dxFromX), but for the other one
 (eulerStep) the caller does not see any changes to the array. Why is this?

 function dxFromX!(x,dxdt)
   fill!(dxdt,0.0)

   for itarg = 1:size(x,1)
 for isrc = 1:size(x,1)
   dx = x[isrc,1] - x[itarg,1]
   dy = x[isrc,2] - x[itarg,2]
   coeff = 1.0f0 / (dx^2 + dy^2 + 0.1f0)
   dxdt[itarg,1] -= coeff * dy
   dxdt[itarg,2] += coeff * dx
 end
   end
 end

  function eulerStep!(x)
    dxdt = zeros(x)
    print("\ndxdt before\n", dxdt[1:5,:], "\n")
    dxFromX!(x,dxdt)
    print("\ndxdt after has changed\n", dxdt[1:5,:], "\n")
    x += 1.0f-5*dxdt
    print("\nx inside\n", x[1:5,:], "\n")
  end

  x = float32(rand(1024,2))
  print("\nx before\n", x[1:5,:], "\n")
  @time eulerStep!(x)
  print("\nx after is unchanged!\n", x[1:5,:], "\n")

 I see the same behavior on 0.3.3 and 0.4.0, both release and debug
 binaries, on OSX and Linux.






Re: [julia-users] Changes to array are not visible from caller

2014-12-11 Thread Tim Holy
On Thursday, December 11, 2014 09:51:32 AM Mark Stock wrote:
 Is there any way to update the array in-place without writing an explicit
 loop? I imagine that would be more efficient.

It's not. Other languages you may be used to call C, and their underlying C 
code uses...loops. Julia's loops are just as fast (and use SIMD vectorization, 
etc, when possible).

I'll repeat John's point that

   x[:, :] = RHS

updates x in-place, and on its own it is not at all wasteful. It's just a 
question of whether you need to allocate space for RHS---that's where the 
problem comes in.

--Tim
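
For completeness, here is a minimal sketch (assuming the same x and dxdt shapes
as in Mark's example) of an update step whose right-hand side allocates nothing
at all, because the loop writes straight into x:

function eulerUpdate!(x, dxdt, h)
    for j = 1:size(x,2), i = 1:size(x,1)
        x[i,j] += h*dxdt[i,j]    # in-place, no temporary array
    end
    return x
end

# eulerUpdate!(x, dxdt, 1.0f-5)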


 
 Mark
 
 On Thursday, December 11, 2014 10:23:35 AM UTC-7, John Myles White wrote:
  Nope.
  
  You'll find Julia much easier to program in if you always replace x += y
  with x = x + y before attempting to reason about performance. In this
  case,
  you'll
  get
  
   x[:, :] = x[:, :] + 1.0f-5 * dxdt
  
  In other words, you literally make a copy of the entire matrix x before
  doing any useful work.
  
   -- John
  
  On Dec 11, 2014, at 12:21 PM, Mark Stock mark.j...@gmail.com
   wrote:
  
  The line now reads
  
  x[:,:] += 1.0f-5*dxdt
  
  And the result is now correct, but memory usage increased. Shouldn't it go
  down if we're re-assigning to the same variable?
  
  On Wednesday, December 10, 2014 11:57:28 PM UTC-7, Isaiah wrote:
  `x += ...` is equivalent to writing `x = x + ...` which rebinds the
  variable within that function. Instead, do an explicit array assignment
  `x[:,:] = ...`
  
  This is discussed in the manual with a warning about type changes, but
  the implication for arrays should probably be made clear as well:
  http://julia.readthedocs.org/en/latest/manual/mathematical-operations/#up
  dating-operators
  
  (there are some ongoing discussions about in-place array operators to
  improve the situation)
  
  On Wed, Dec 10, 2014 at 7:44 PM, Mark Stock mark.j...@gmail.com wrote:
  Hello, n00b Julia user here. I have two functions that change the values
  of a passed-in array. One works (dxFromX), but for the other one
  (eulerStep) the caller does not see any changes to the array. Why is
  this?
  
  function dxFromX!(x,dxdt)
  
fill!(dxdt,0.0)

for itarg = 1:size(x,1)

  for isrc = 1:size(x,1)
  
dx = x[isrc,1] - x[itarg,1]
dy = x[isrc,2] - x[itarg,2]
coeff = 1.0f0 / (dx^2 + dy^2 + 0.1f0)
dxdt[itarg,1] -= coeff * dy
dxdt[itarg,2] += coeff * dx
  
  end

end
  
  end
  
  function eulerStep!(x)
    dxdt = zeros(x)
    print("\ndxdt before\n", dxdt[1:5,:], "\n")
    dxFromX!(x,dxdt)
    print("\ndxdt after has changed\n", dxdt[1:5,:], "\n")
    x += 1.0f-5*dxdt
    print("\nx inside\n", x[1:5,:], "\n")
  
  end

  x = float32(rand(1024,2))
  print("\nx before\n", x[1:5,:], "\n")
  @time eulerStep!(x)
  print("\nx after is unchanged!\n", x[1:5,:], "\n")
  
  I see the same behavior on 0.3.3 and 0.4.0, both release and debug
  binaries, on OSX and Linux.



[julia-users] Re: Are there julia versions of dynamic time warping and peak finding in noisy data?

2014-12-11 Thread Joe Fowler
Hi g

You and I have discussed this privately and decided to work up a 
DynamicTimeWarp package in Julia ourselves, because we couldn't find one. 
It's not nearly ready for real-world use, I think, but it can be found at 
GitHub: https://github.com/joefowler/DynamicTimeWarp.jl

Our goals for the package include:

   1. Performing Dynamic Time Warping between 2 time series.
   2. Performing DTW with the solution path restricted to a specified 
   window. This restriction speeds up the computation but can fail to find 
   the global optimum.
   3. The FastDTW algorithm (Salvador & Chan, 2007), which effectively 
   chooses a window by downsampling the original problem and running FastDTW 
   on that (or as a base case, running DTW once the down sampled problem is 
   small enough). Also faster but potentially misses the full DTW solution.
   4. DTW Barycenter Averaging (Petitjean, Ketterlin, and Gancarski, 
   _Pattern Recognition_ 44, 2011).  This algorithm aims to create a 
   consensus sequence iteratively from 2 or more sequences, using the 
   identification of samples that DTW generates between the latest consensus 
   and the constituent sequences.
   5. Tools for using DTW to align spectra. In our work, this would mean 
   calibration to unify uncalibrated energy spectra from x-ray spectrometers. 
   This is not a well-defined goal yet, but it's the reason that you and I 
   actually care about DTW.
   6. Demonstrations, documentation, good tests, and the usual things like 
   that.


Peak-finding, e.g. by continuous wavelet transforms or any other method, is 
a separate issue.

--Joe
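
For readers who have not met DTW before, this is a minimal sketch of the classic
quadratic-time recurrence (my own illustration, not code from the
DynamicTimeWarp.jl package), using |a[i]-b[j]| as the local distance:

function dtw_cost(a::Vector{Float64}, b::Vector{Float64})
    m, n = length(a), length(b)
    D = fill(Inf, m+1, n+1)      # D[i+1,j+1] = cost of aligning a[1:i] with b[1:j]
    D[1,1] = 0.0
    for j = 1:n, i = 1:m
        c = abs(a[i] - b[j])
        D[i+1,j+1] = c + min(D[i,j], D[i,j+1], D[i+1,j])   # diagonal, up, left
    end
    D[m+1,n+1]
end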

On Wednesday, December 3, 2014 4:03:41 PM UTC-7, g wrote:

 Hello,

 I'm interested in using dynamic time warping and an algorithm for peak 
 finding in noisy data (like scipy.signal.find_peaks_cwt).  I'm just 
 wondering if there are any Julia implementations around, otherwise I'll 
 probably just use PyCall for now to use existing python code.






Re: [julia-users] Aren't loops supposed to be faster?

2014-12-11 Thread Petr Krysl
I experimented with it a little bit before (mx innermost loop): does not 
make a difference.

On Thursday, December 11, 2014 9:55:46 AM UTC-8, Peter Simon wrote:

 One thing I noticed after a quick glance:  The ordering of your nested 
 loops is very cache-unfriendly.  Julia stores arrays in column-major order 
 (same as Fortran) so that nested loops should arrange that the first 
 subscripts of multidimensional arrays are varied most rapidly.

 --Peter

 On Thursday, December 11, 2014 9:47:33 AM UTC-8, Petr Krysl wrote:

 One more note: I conjectured that perhaps the compiler was not able to 
 infer correctly the type of the matrices,  so I hardwired (in the actual FE 
 code)

 Jac = 1.0; gradN = gradNparams[j]/(J); # get rid of Rm for the moment

 About 10% less memory used, runtime about the same.  So, no effect 
 really. Loops are still slower than the vectorized code by a factor of two.

 Petr




Re: [julia-users] How save Dict{Any,Int64} ?

2014-12-11 Thread Paul Analyst

I do it like this now:
writecsv("dict.csv", dict)

d1=readcsv("dict.csv")
d=Dict(d1[:,1],d1[:,2])
or
d=Dict(d1[:,1],int(d1[:,2])) if the second column is Int

Paul

Is it a bug
save("o2a.jld","o2a",o2a)
?
if o2a is Dict

Paul

W dniu 2014-12-11 o 14:57, Daniel Høegh pisze:

You can serialize it see 
https://groups.google.com/forum/m/?fromgroups#!topic/julia-users/zN7OmKwnG40




[julia-users] Install Julia packages without internet

2014-12-11 Thread Pooja Khanna
Hello,

I have no Internet connection on the server I am running Julia on.
What would be the steps to install packages on this machine?

Any pointers would be appreciated.

Thanks,
Pooja Khanna


Re: [julia-users] Install Julia packages without internet

2014-12-11 Thread Stefan Karpinski
Does the machine have a USB drive or something?

On Thu, Dec 11, 2014 at 3:16 PM, Pooja Khanna ms.poojakha...@gmail.com
wrote:

 Hello,

 I have no Internet connection on the server I am running Julia on.
 What would be the steps to install packages on this machine?

 Any pointers would be appreciated.

 Thanks,
 Pooja Khanna



[julia-users] Help with types (Arrays)

2014-12-11 Thread S


Hi all - very new to julia and am trying to wrap my head around multiple 
dispatch.

I would think that because Int64 <: Integer, an array of Int64 is an 
array of Integer. However, that doesn't appear to be the case.

That is, with c = [1,2,3,4], I can't create a constructor function 
foo(myarr::Array{Integer,1}) - and Vector{Integer} doesn't work either.

How do I create a constructor that takes a vector of Integers (of any 
subtype) - and only integers - and operates on the elements?


julia> c = [1,2,3,4]

julia> c
4-element Array{Int64,1}:
 1
 2
 3
 4

julia> isa(c,Vector)
true

julia> isa(c,Vector{Integer})
false

julia> isa(c,Vector{Int64})
true

julia> isa(c,Array{Integer})
false


[julia-users] Re: MatrixDepot.jl: A Test Matrix Collection

2014-12-11 Thread cdm

thank you for your fine work in this area ...

do not be surprised if you inspired some
to take on some of these collections:

   http://math.nist.gov/MatrixMarket/

   http://www.cise.ufl.edu/research/sparse/matrices/


well done and good show !

best,

cdm



On Thursday, December 11, 2014 12:30:23 AM UTC-8, Weijian Zhang wrote:

 Hello,

 So far I have included 20 matrices in Matrix Depot. I just modified the 
 function matrixdepot() so it should display information nicely.

 The repository is here: https://github.com/weijianzhang/MatrixDepot.jl

 The documentation is here: 
 http://nbviewer.ipython.org/github/weijianzhang/MatrixDepot.jl/blob/master/doc/juliadoc.ipynb

 Let me know how you feel about it and if you have any questions.

 Thanks,

 Weijian






[julia-users] EACCESS errors when starting julia 0.3.3

2014-12-11 Thread Robbin Bonthond
I'm trying to start julia 0.3.3 on a RHEL510 linux 64b machine. The tool 
has been installed centrally via a NFS share where I don't have write 
access to.

$ /tool/julia/0.3.3/bin/julia
   _
   _   _ _(_)_ |  A fresh approach to technical computing
  (_) | (_) (_)|  Documentation: http://docs.julialang.org
   _ _   _| |_  __ _   |  Type help() for help.
  | | | | | | |/ _` |  |
  | | |_| | | | (_| |  |  Version 0.3.3 (2014-11-23 20:19 UTC)
 _/ |\__'_|_|_|\__'_|  |  Official http://julialang.org/ release
|__/   |  x86_64-redhat-linux

ERROR: stat: permission denied (EACCES)
 in stat at ./stat.jl:43
 in isfile at ./stat.jl:103
 in load_juliarc at ./client.jl:322
 in _start at ./client.jl:382
 in _start_3B_1732 at /tool/julia/0.3.3/bin/../lib/julia/sys.so

$ ls -l /tool/julia/0.3.3/bin/../lib/julia/sys.so
-r-xr-xr-x 1 cad cad 6007276 Nov 30 14:57 
/tool/julia/0.3.3/bin/../lib/julia/sys.so*

Any suggestions?

Robbin



Re: [julia-users] Install Julia packages without internet

2014-12-11 Thread Pooja Khanna
I can copy stuff onto it but do not have physical access.

On Thursday, December 11, 2014 12:26:16 PM UTC-8, Stefan Karpinski wrote:

 Does the machine have a USB drive or something?

 On Thu, Dec 11, 2014 at 3:16 PM, Pooja Khanna ms.pooj...@gmail.com 
  wrote:

 Hello,

 I have no Internet connection on the server I am running Julia on.
 What would be the steps to install packages on this machine?

 Any pointers would be appreciated.

 Thanks,
 Pooja Khanna




Re: [julia-users] Help with types (Arrays)

2014-12-11 Thread Isaiah Norton
I suggest to start with this recent thread, and there are some others if
you search for covariance on the list archives:

https://groups.google.com/d/msg/julia-users/dEnCJ-nxAGE/1Bw4a1UvFCEJ

(if it is still unclear how to do what you need after that, feel free to
ask)






On Thu, Dec 11, 2014 at 3:26 PM, S sab...@gmail.com wrote:



 Hi all - very new to julia and am trying to wrap my head around multiple
 dispatch.

  I would think that because Int64 <: Integer, an array of Int64 is an
 array of Integer. However, that doesn't appear to be the case.

 That is, with c = [1,2,3,4], I can't create a constructor function
 foo(myarr::Array{Integer,1}) - and Vector{Integer} doesn't work either.

 How do I create a constructor that takes a vector of Integers (of any
 subtype) - and only integers - and operates on the elements?


 julia> c = [1,2,3,4]

 julia> c
 4-element Array{Int64,1}:
  1
  2
  3
  4

 julia> isa(c,Vector)
 true

 julia> isa(c,Vector{Integer})
 false

 julia> isa(c,Vector{Int64})
 true

 julia> isa(c,Array{Integer})
 false



Re: [julia-users] Help with types (Arrays)

2014-12-11 Thread S


On Thursday, December 11, 2014 12:34:10 PM UTC-8, Isaiah wrote:

 I suggest to start with this recent thread, and there are some others if 
 you search for covariance on the list archives:

 https://groups.google.com/d/msg/julia-users/dEnCJ-nxAGE/1Bw4a1UvFCEJ

 (if it is still unclear how to do what you need after that, feel free to 
 ask)





Thank you. I'm sorry I didn't see this before posting. I'll have a look. 


Re: [julia-users] Install Julia packages without internet

2014-12-11 Thread Stefan Karpinski
I guess I would install the packages you need on another system with a
similar OS and setup and then copy over the julia install directory and the
~/.julia directory.

On Thu, Dec 11, 2014 at 3:30 PM, Pooja Khanna ms.poojakha...@gmail.com
wrote:

 I can copy stuff onto it but do not have physical access.

 On Thursday, December 11, 2014 12:26:16 PM UTC-8, Stefan Karpinski wrote:

 Does the machine have a USB drive or something?

 On Thu, Dec 11, 2014 at 3:16 PM, Pooja Khanna ms.pooj...@gmail.com
 wrote:

 Hello,

 I have no Internet connection on the server I am running Julia on.
 What would be the steps to install packages on this machine?

 Any pointers would be appreciated.

 Thanks,
 Pooja Khanna





[julia-users] Re: EACCESS errors when starting julia 0.3.3

2014-12-11 Thread Robbin Bonthond
Found it: for some reason the binary installer of julia uses restricted 
group permissions.

$ ll -l /tool/julia/0.3.3/./etc
total 4.0K
dr-x--S--- 2 cad cad 4.0K Dec 11 14:56 julia

We fixed all directories to be accessible, and things are working now.

Robbin


On Thursday, December 11, 2014 2:28:53 PM UTC-6, Robbin Bonthond wrote:

 I'm trying to start julia 0.3.3 on a RHEL510 linux 64b machine. The tool 
 has been installed centrally via a NFS share where I don't have write 
 access to.

 $ /tool/julia/0.3.3/bin/julia
_
_   _ _(_)_ |  A fresh approach to technical computing
   (_) | (_) (_)|  Documentation: http://docs.julialang.org
_ _   _| |_  __ _   |  Type help() for help.
   | | | | | | |/ _` |  |
   | | |_| | | | (_| |  |  Version 0.3.3 (2014-11-23 20:19 UTC)
  _/ |\__'_|_|_|\__'_|  |  Official http://julialang.org/ release
 |__/   |  x86_64-redhat-linux

 ERROR: stat: permission denied (EACCES)
  in stat at ./stat.jl:43
  in isfile at ./stat.jl:103
  in load_juliarc at ./client.jl:322
  in _start at ./client.jl:382
  in _start_3B_1732 at /tool/julia/0.3.3/bin/../lib/julia/sys.so

 $ ls -l /tool/julia/0.3.3/bin/../lib/julia/sys.so
 -r-xr-xr-x 1 cad cad 6007276 Nov 30 14:57 
 /tool/julia/0.3.3/bin/../lib/julia/sys.so*

 Any suggestions?

 Robbin



Re: [julia-users] Re: EACCESS errors when starting julia 0.3.3

2014-12-11 Thread Stefan Karpinski
On Thu, Dec 11, 2014 at 4:05 PM, Robbin Bonthond robbin.bonth...@gmail.com
wrote:

 for some reason the binary installer of julia uses restricted group
 permissions


That sounds like a potential problem – would you mind filing an issue?

https://github.com/JuliaLang/julia/issues


Re: [julia-users] Install Julia packages without internet

2014-12-11 Thread Pooja Khanna
That worked. Thank you so much!

Pooja

On Thursday, December 11, 2014 1:00:17 PM UTC-8, Stefan Karpinski wrote:

 I guess I would install the packages you need on another system with a 
 similar OS and setup and then copy over the julia install directory and the 
 ~/.julia directory.

 On Thu, Dec 11, 2014 at 3:30 PM, Pooja Khanna ms.pooj...@gmail.com 
  wrote:

 I can copy stuff onto it but do not have physical access.

 On Thursday, December 11, 2014 12:26:16 PM UTC-8, Stefan Karpinski wrote:

 Does the machine have a USB drive or something?

 On Thu, Dec 11, 2014 at 3:16 PM, Pooja Khanna ms.pooj...@gmail.com 
 wrote:

 Hello,

 I have no Internet connection on the server I am running Julia on.
 What would be the steps to install packages on this machine?

 Any pointers would be appreciated.

 Thanks,
 Pooja Khanna





[julia-users] Is there a function that performs (1:length(v))[v] for v a Vector{Bool} or a BitArray?

2014-12-11 Thread Douglas Bates
I realize it would be a one-liner to write one but, in the interests of not 
reinventing the wheel, I wanted to ask if I had missed a function that does 
this.  In R there is such a function called "which" but that name is 
already taken in Julia for something else.


Re: [julia-users] Reviewing a Julia programming book for Packt

2014-12-11 Thread cdm

i will vote with my green paper and let the market decide ...

here is some competition in the book space:

LJTHW (aka: 'the g–ddamn book')
http://chrisvoncsefalvay.com/2014/12/11/A-change-of-seasons.html


awesome.

cdm


On Friday, December 5, 2014 9:23:29 AM UTC-8, Iain Dunning wrote:

 SNIP 

I don't think such a book should exist (yet). 

SNIP



Re: [julia-users] Is there a function that performs (1:length(v))[v] for v a Vector{Bool} or a BitArray?

2014-12-11 Thread John Myles White
Does find() work?

 -- John
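
For reference, a quick sketch of what that looks like on a Vector{Bool} or a
BitArray:

v = [true, false, true, true]
find(v)                 # => [1, 3, 4], the same result as (1:length(v))[v]
find(rand(5) .> 0.5)    # also works on a BitArray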

On Dec 11, 2014, at 4:19 PM, Douglas Bates dmba...@gmail.com wrote:

 I realize it would be a one-liner to write one but, in the interests of not 
 reinventing the wheel, I wanted to ask if I had missed a function that does 
 this.  In R there is such a function called which but that name is already 
 taken in Julia for something else.



[julia-users] Problem with Domain Error in an aggregator function

2014-12-11 Thread Pileas
Hello all,

I have this function from which I want to make a 3D figure. I have some 
problems though, because I get a domain error. I am sure it must be a 
stupid mistake or something that I do not understand ... but I cannot 
figure out what I am doing wrong.

So, the function is: C_M = ((c_1^(0.3/1.3) + c_2^(0.3/1.3))^(1.3/0.3)

To my understanding, in order to make a 3D graph I need to keep one 
dimension constant while the other changes and gives values to C_M until 
all combinations in both directions have been exhausted (at least in the 
domain that I set). Eventually you get triplets of the form (c1, c2, cm).

Here is the code:

--
m = 100;  # 100 points in each direction
n = 100;  # 100 points in each direction

c1 = randn(m);
c2 = randn(n);
cm = zeros(m*n);  # Cartesian product

csvfile = open("graph.csv","w")
write(csvfile,"c1,c2,cm", "\n")

for i = 1:m
for j = 1:n
   cm[] = (c1[i]^(0.3/1.3) + c2[j]^(0.3/1.3))^(1.3/0.3)
end
write(csvfile, join(graph,","), "\n")
end

--

P.S. I want to save the results and then use PyPlot to make the graph.

Thank you for your time.


Re: [julia-users] Problem with Domain Error in an aggregator function

2014-12-11 Thread Andreas Noack
The problem is the non-integer power of a negative number, so you'll have
to restrict the consumption to be positive.
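
A minimal sketch of the failure mode (illustrative values only, not Pileas'
data): raising a negative Float64 to a non-integer power throws a DomainError,
so the inputs have to be made positive first, for example with abs:

(-0.5)^(0.3/1.3)          # ERROR: DomainError
abs(-0.5)^(0.3/1.3)       # works (about 0.85)

c1 = abs(randn(100))      # one way to keep the draws non-negative
c2 = abs(randn(100))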

2014-12-11 16:29 GMT-05:00 Pileas phoebus.apollo...@gmail.com:

 Hello all,

 I have this function from which I want to make a 3D figure. I have some
 problems though, because I get a domain error. I am sure it must be a
 stupid mistake or something that I do not understand ... but I cannot
 figure out what I am doing wrong.

 So, the function is: C_M = ((c_1^(0.3/1.3) + c_2^(0.3/1.3))^(1.3/0.3)

 To my understanding, in order to make a 3D graph I need to keep one
 dimension constant while the other changes and gives values to C_M until
 all combinations in both directions have been exhausted (at least in the
 domain that I set). Eventually you get triplets of the form (c1, c2, cm).

 Here is the code:

 --
 m = 100;  # 100 points in each direction
 n = 100;  # 100 points in each direction

 c1 = randn(m);
 c2 = randn(n);
 cm = zeros(m*n);  # Cartesian product

 csvfile = open("graph.csv","w")
 write(csvfile,"c1,c2,cm", "\n")

 for i = 1:m
 for j = 1:n
    cm[] = (c1[i]^(0.3/1.3) + c2[j]^(0.3/1.3))^(1.3/0.3)
 end
 write(csvfile, join(graph,","), "\n")
 end

 --

 P.S. I want to save the results and then use PyPlot to make the graph.

 Thank you for your time.



Re: [julia-users] Broadcasting variables

2014-12-11 Thread Madeleine Udell
Amit and Blake, thanks for all your advice. I've managed to cobble together
a shared memory version of LowRankModels.jl, using the workarounds we
devised above. In case you're interested, it's at

https://github.com/madeleineudell/LowRankModels.jl/blob/master/src/shareglrm.jl

and you can run it using eg

julia -p 3 LowRankModels/examples/sharetest.jl

There's a significant overhead, but it's faster than the serial version for
large problem sizes. Any advice for reducing the overhead would be much
appreciated.

However, in that package I'm seeing some unexpected behavior: occasionally
it seems that some of the processes have not finished their jobs at the end
of an @everywhere block, although looking at the code for @everywhere I see
it's wrapped in a @sync already. Is there something else I can use to
synchronize (ie wait for completion of all) the processes?
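
One pattern that may help, sketched under the assumption that the per-worker
work can be phrased as a plain function call (this is not code from
LowRankModels.jl): pair @sync with explicit remotecall_wait calls, so the block
is only left once every worker has finished:

@everywhere function do_chunk!(X)
    for i in localindexes(X)
        X[i] += 1             # placeholder work on this worker's chunk
    end
end

X = Base.shmem_rand(10; pids=workers())
@sync for p in procs(X)
    @async remotecall_wait(p, do_chunk!, X)   # this task blocks until worker p is done
end
# reaching this line means every do_chunk! call has completed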

On Tue, Dec 2, 2014 at 12:21 AM, Amit Murthy amit.mur...@gmail.com wrote:

 Issue - https://github.com/JuliaLang/julia/issues/9219

 On Tue, Dec 2, 2014 at 10:04 AM, Amit Murthy amit.mur...@gmail.com
 wrote:

 From the documentation - Modules in Julia are separate global variable
 workspaces.

 So what is happening is that the anonymous function in remotecall(i,
 x-(global const X=x; nothing), localX) creates X as module global.

 The following works:

 module ParallelStuff
 export doparallelstuff

 function doparallelstuff(m = 10, n = 20)
 # initialize variables
 localX = Base.shmem_rand(m; pids=procs())
 localY = Base.shmem_rand(n; pids=procs())
 localf = [x->i+sum(x) for i=1:m]
 localg = [x->i+sum(x) for i=1:n]

 # broadcast variables to all worker processes (thanks to Amit Murthy
 for suggesting this syntax)
 @sync begin
 for i in procs(localX)
 remotecall(i, x->(global X=x; nothing), localX)
 remotecall(i, x->(global Y=x; nothing), localY)
 remotecall(i, x->(global f=x; nothing), localf)
 remotecall(i, x->(global g=x; nothing), localg)
 end
 end

 # compute
 for iteration=1:1
 @everywhere begin
 X=ParallelStuff.X
 Y=ParallelStuff.Y
 f=ParallelStuff.f
 g=ParallelStuff.g
 for i=localindexes(X)
 X[i] = f[i](Y)
 end
 for j=localindexes(Y)
 Y[j] = g[j](X)
 end
 end
 end
 end

 end #module


 While remotecall, @everywhere, etc run under Main, the fact that the
 closure variables refers to Module ParallelStuff is pretty confusing.
 I think we need a better way to handle this.


 On Tue, Dec 2, 2014 at 4:58 AM, Madeleine Udell 
 madeleine.ud...@gmail.com wrote:

 Thanks to Blake and Amit for some excellent suggestions! Both strategies
 work fine when embedded in functions, but not when those functions are
 embedded in modules. For example, the following throws an error:

 @everywhere include(ParallelStuff.jl)
 @everywhere using ParallelStuff
 doparallelstuff()

 when ParallelStuff.jl contains the following code:

 module ParallelStuff
 export doparallelstuff

 function doparallelstuff(m = 10, n = 20)
 # initialize variables
 localX = Base.shmem_rand(m; pids=procs())
 localY = Base.shmem_rand(n; pids=procs())
  localf = [x->i+sum(x) for i=1:m]
  localg = [x->i+sum(x) for i=1:n]

 # broadcast variables to all worker processes (thanks to Amit Murthy
 for suggesting this syntax)
 @sync begin
 for i in procs(localX)
  remotecall(i, x->(global const X=x; nothing), localX)
  remotecall(i, x->(global const Y=x; nothing), localY)
  remotecall(i, x->(global const f=x; nothing), localf)
  remotecall(i, x->(global const g=x; nothing), localg)
 end
 end

 # compute
 for iteration=1:1
 @everywhere for i=localindexes(X)
 X[i] = f[i](Y)
 end
 @everywhere for j=localindexes(Y)
 Y[j] = g[j](X)
 end
 end
 end

 end #module

 On 3 processes (julia -p 3), the error is as follows:

 exception on 1: exception on 2: exception on 3: ERROR: X not defined
  in anonymous at no file
  in eval at
 /Users/vagrant/tmp/julia-packaging/osx10.7+/julia-master/base/sysimg.jl:7
  in anonymous at multi.jl:1310
  in run_work_thunk at multi.jl:621
  in run_work_thunk at multi.jl:630
  in anonymous at task.jl:6
 ERROR: X not defined
  in anonymous at no file
  in eval at
 /Users/vagrant/tmp/julia-packaging/osx10.7+/julia-master/base/sysimg.jl:7
  in anonymous at multi.jl:1310
  in anonymous at multi.jl:848
  in run_work_thunk at multi.jl:621
  in run_work_thunk at multi.jl:630
  in anonymous at task.jl:6
 ERROR: X not defined
  in anonymous at no file
  in eval at
 /Users/vagrant/tmp/julia-packaging/osx10.7+/julia-master/base/sysimg.jl:7
  in anonymous at multi.jl:1310
  in anonymous at multi.jl:848
  in run_work_thunk at multi.jl:621
  in run_work_thunk at multi.jl:630
  in anonymous at 

Re: [julia-users] scope using let and local/global variables

2014-12-11 Thread Michael Mayo
Thanks for both answers! I figured out a slightly different way of doing it 
by putting the let assignments into a string with a nothing expression, 
parsing the string, and then inserting the actual expression to be 
evaluated into the correct place in the let block:

function run(assignments,program)
  program_string="let "
  for pair in assignments
    program_string="$(program_string) $(pair[1])=$(pair[2]); "
  end
  program_string="$(program_string) nothing; end"
  program_final=parse(program_string)
  program_final.args[1].args[end]=program
  eval(program_final)
end

I can now evaluate the same expression with different inputs in parallel 
without worrying that they might conflict because all the variables are 
local, e.g.:

pmap(dict->run(dict,:(x+y*y)), [{:x=>2,:y=>5},{:x=>6,:y=>10}])

*2-element Array{Any,1}:*

*  27*

* 106*

Thanks for your help!
Mike



On Thursday, December 11, 2014 10:34:22 PM UTC+13, Mike Innes wrote:

 You can do this just fine, but you have to be explicit about what 
 variables you want to pass in, e.g.

 let x=2
   exp=:(x+1)
   eval(:(let x = $x; $exp; end))
 end

 If you want to call the expression with multiple inputs, wrap it in a 
 function:

 let x=2
   exp=:(x+1)
    f = eval(:(x -> $exp))
   f(x)
 end


  On 11 December 2014 at 06:32, Jameson Nash vtj...@gmail.com
  wrote:

 I'm not quite sure what a genetic program of that sort would look like. I 
 would be interested to hear if you get something out of it.

 Another alternative is to use a module as the environment:

 module MyEnv
 end
 eval(MyEnv, :(code block))

 This is (roughly) how the REPL is implemented to work.

 On Thu Dec 11 2014 at 1:26:57 AM Michael Mayo mm...@waikato.ac.nz 
  wrote:

 Thanks, but its not quite what I'm looking for. I want to be able to 
 edit the Expr tree and then evaluate different expressions using variables 
 defined in the local scope,not the global scope (e.g. for genetic 
 programming, where random changes to an expression are repeatedly evaluated 
 to find the best one). Using anonymous functions could work but modifying 
 the .code property of an anonymous function looks much more complex than 
 modifying the Expr types.

 Anyway thanks for your answer, maybe your suggestion is the only 
 possible way to achieve this!

 Mike 


 On Thursday, December 11, 2014 6:56:15 PM UTC+13, Jameson wrote:

 eval, by design, doesn't work that way. there are just too many better 
 alternatives. typically, an anonymous function / lambda is the best and 
 most direct replacement:

 let x=2
   println(x)# Line 1
   exp = () -> x+1
   println(exp())# Line 2
 end


 On Wed Dec 10 2014 at 10:43:00 PM Michael Mayo mm...@waikato.ac.nz 
 wrote:

 Hi folks,

 I have the following code fragment:

 x=1
 let x=2
   println(x)# Line 1
   exp=:(x+1)
   println(eval(exp))# Line 2
 end

 It contains two variables both named x, one inside the scope defined 
 by let, and one at global scope.

 If I run this code the output is:
 2
 2

 This indicates that (i) that line 1 is using the local version of x, 
 and (ii) that line 2 is using the global version of x.

 If I remove this global x I now get an error because eval() is looking 
 for the global x which no longer exists:

 let x=2
   println(x)# Line 1
   exp=:(x+1)
   println(eval(exp))# Line 2
 end

 2

 ERROR: x not defined


 My question: when evaluating an expression using eval() such as line 
 2, how can I force Julia to use the local (not global) version of x and 
 thus avoid this error?


 Thanks

 Mike

  


Re: [julia-users] Is there a function that performs (1:length(v))[v] for v a Vector{Bool} or a BitArray?

2014-12-11 Thread Douglas Bates

On Thursday, December 11, 2014 3:21:07 PM UTC-6, John Myles White wrote:

 Does find() work? 


Yes.  Thanks. 


  -- John 

  On Dec 11, 2014, at 4:19 PM, Douglas Bates dmb...@gmail.com 
 wrote: 

  I realize it would be a one-liner to write one but, in the interests of 
 not reinventing the wheel, I wanted to ask if I had missed a function that 
 does this.  In R there is such a function called which but that name is 
 already taken in Julia for something else. 



Re: [julia-users] Initializing a SharedArray Memory Error

2014-12-11 Thread benFranklin
I have noticed that these remote references can't be fetched:

fetch(zeroMatrix.refs[1]) 

 the driver process just waits until infinity, so I'm thinking that the 
remotecall_wait() 
in 
https://github.com/JuliaLang/julia/blob/f3c355115ab02868ac644a5561b788fc16738443/base/sharedarray.jl#L96
 
exit before it should. Any ideas?

On Wednesday, 10 December 2014 13:47:19 UTC-5, benFranklin wrote:

 I think you are right about some references not being released yet:

 If I change the while loop to include you way of replacing every 
 reference, the put! actually never gets executed, it just waits:

 while true
 zeroMatrix = 
 SharedArray(Float64,(nQ,nQ,3,nQ,nQ,nQ),pids=workers(), init = x->inF(x,nQ))
 println(ran!)

 for i = 1:length(zeroMatrix.refs) 
 put!(zeroMatrix.refs[i], 1) 
 end 
 @everywhere gc()
end
 ran!
 

 Runs once and stalls, after C-c:


 ^CERROR: interrupt
  in process_events at /usr/bin/../lib64/julia/sys.so
  in wait at /usr/bin/../lib64/julia/sys.so (repeats 2 times)
  in wait_full at /usr/bin/../lib64/julia/sys.so
 
 After C-d

 julia 

 WARNING: Forcibly interrupting busy workers
 error in running finalizer: InterruptException()
 error in running finalizer: InterruptException()
 WARNING: Unable to terminate all workers
 [...]


 It seems after the init function not all workers are done. I'll see if 
 there's something weird with that part, but if the SharedArray is being 
 returned, I don't see any reason for this to be so.



 On Wednesday, 10 December 2014 05:19:55 UTC-5, Tim Holy wrote:

 After your gc() it should be able to be unmapped, see 

 https://github.com/JuliaLang/julia/blob/f3c355115ab02868ac644a5561b788fc16738443/base/mmap.jl#L113
  

 My guess is something in the parallel architecture is holding a 
 reference. 
 Have you tried going at this systematically from the internal 
 representation 
 of the SharedArray? For example, I might consider trying to put! new 
 stuff in 
 zeroMatrix.refs: 

 for i = 1:length(zeroMatrix.refs) 
 put!(zeroMatrix.refs[i], 1) 
 end 

 before calling gc(). I don't know if this will work, but it's where I'd 
 start 
 experimenting. 

 If you can fix this, please do submit a pull request. 

 Best, 
 --Tim 

 On Tuesday, December 09, 2014 08:06:10 PM ele...@gmail.com wrote: 
  On Wednesday, December 10, 2014 12:28:29 PM UTC+10, benFranklin wrote: 
   I've made a small example of the memory problems I've been running 
 into. I 
   can't find a way to deallocate a SharedArray, 
  
  Someone more expert might find it, but I can't see anywhere that the 
  mmapped memory is unmapped. 
  
   if the code below runs once, it means the computer has enough memory 
 to 
   run this. If I can properly deallocate the memory I should be able to 
 do 
   it 
   again, however, I run out of memory. Am I misunderstanding something 
 about 
   garbage collection in Julia? 
   
   Thanks for your attention 
   
   Code: 
   
   @everywhere nQ = 60 
   
   @everywhere function inF(x::SharedArray,nQ::Int64) 
   
   number = myid()-1; 
   targetLength = nQ*nQ*3 
   
   startN = floor((number-1)*targetLength/nworkers()) + 1 
   endN = floor(number*targetLength/nworkers()) 
   
   myIndexes = int64(startN:endN) 
   for j in myIndexes 
   inds = ind2sub((nQ,nQ,nQ),j) 
   x[inds[1],inds[2],inds[3],:,:,:] = rand(nQ,nQ,nQ) 
   end 
   
   
   end 
   
   while true 
   zeroMatrix = SharedArray(Float64,(nQ,nQ,3,nQ,nQ,nQ),pids=workers(), 
 init = 
   x->inF(x,nQ)) 
   println(ran!) 
   @everywhere zeroMatrix = 1 
   @everywhere gc() 
   end 
   
   On Monday, 8 December 2014 23:43:03 UTC-5, Isaiah wrote: 
   Hopefully you will get an answer on pmap from someone more familiar 
 with 
   the parallel stuff, but: have you tried splitting the init step? 
 (see the 
   example in the manual for how to init an array in chunks done by 
   different 
   workers). Just guessing though: I'm not sure if/how those will be 
   serialized if each worker is contending for the whole array. 
   
   On Fri, Dec 5, 2014 at 4:23 PM, benFranklin gca...@gmail.com 
 wrote: 
   Hi all, I'm trying to figure out how to best initialize a 
 SharedArray, 
   using a C function to fill it up that computes a huge matrix in 
 parts, 
   and 
   all comments are appreciated. To summarise: Is A, making an empty 
 shared 
   array, computing the matrix in parallel using pmap and then filling 
 it 
   up 
   serially, better than using B, computing in parallel and storing in 
 one 
   step by using an init function in the SharedArray declaration? 
   
   
   The difference tends to be that B uses a lot more memory, each 
 process 
   using the exact same amount of memory. However it is much faster 
 than A, 
   as 
   the copy step takes longer than the computation, but in A most of 
 the 
   memory usage is in one process, using less memory overall. 
   
   Any tips on how to do this better? Also, this pmap is how I'm 
 handling 
   

[julia-users] LLVM3.2 and JULIA BUILD PROBLEM

2014-12-11 Thread Vehbi Eşref Bayraktar
Hi;

I am using llvm 3.2 with libnvvm . However when i try to build julia using 
those 2 flags :
USE_SYSTEM_LLVM = 1
USE_LLVM_SHLIB = 1

I have a bunch of errors. starting as following:

codegen.cpp: In function ‘void jl_init_codegen()’:
codegen.cpp:4886:26: error: ‘getProcessTriple’ is not a member of 
‘llvm::sys’
 Triple TheTriple(sys::getProcessTriple()); // llvm32 doesn't have 
this one instead it has getDefaultTargetTriple()
  ^
codegen.cpp:4919:5: error: ‘mbuilder’ was not declared in this scope
 mbuilder = new MDBuilder(getGlobalContext());  //  include 
llvm/MDBuilder.h would fix this
 ^
codegen.cpp:4919:20: error: expected type-specifier before ‘MDBuilder’
 mbuilder = new MDBuilder(getGlobalContext());

Even you fix these errors, you keep hitting the following ones:
In file included from codegen.cpp:976:0:
intrinsics.cpp: In function ‘llvm::Value* emit_intrinsic(JL_I::intrinsic, 
jl_value_t**, size_t, jl_codectx_t*)’:
intrinsics.cpp:1158:72: error: ‘ceil’ is not a member of ‘llvm::Intrinsic’
 return builder.CreateCall(Intrinsic::getDeclaration(jl_Module, 
Intrinsic::ceil,



So is the master branch currently supporting llvm32? Or is there a patch 
somewhere?

Thanks


Re: [julia-users] LLVM3.2 and JULIA BUILD PROBLEM

2014-12-11 Thread John Myles White
My understanding is that different versions of LLVM are enormously different 
and that there's no safe way to make Julia work with any version of LLVM other 
than the intended one.

 -- John

On Dec 11, 2014, at 4:56 PM, Vehbi Eşref Bayraktar 
vehbi.esref.bayrak...@gmail.com wrote:

 Hi;
 
 I am using llvm 3.2 with libnvvm . However when i try to build julia using 
 those 2 flags :
 USE_SYSTEM_LLVM = 1
 USE_LLVM_SHLIB = 1
 
 I have a bunch of errors. starting as following:
 
 codegen.cpp: In function ‘void jl_init_codegen()’:
 codegen.cpp:4886:26: error: ‘getProcessTriple’ is not a member of ‘llvm::sys’
  Triple TheTriple(sys::getProcessTriple()); // llvm32 doesn't have 
 this one instead it has getDefaultTargetTriple()
   ^
 codegen.cpp:4919:5: error: ‘mbuilder’ was not declared in this scope
  mbuilder = new MDBuilder(getGlobalContext());  //  include 
 llvm/MDBuilder.h would fix this
  ^
 codegen.cpp:4919:20: error: expected type-specifier before ‘MDBuilder’
  mbuilder = new MDBuilder(getGlobalContext());
 
 Even you fix these errors, you keep hitting the following ones:
 In file included from codegen.cpp:976:0:
 intrinsics.cpp: In function ‘llvm::Value* emit_intrinsic(JL_I::intrinsic, 
 jl_value_t**, size_t, jl_codectx_t*)’:
 intrinsics.cpp:1158:72: error: ‘ceil’ is not a member of ‘llvm::Intrinsic’
  return builder.CreateCall(Intrinsic::getDeclaration(jl_Module, 
 Intrinsic::ceil,
 
 
 
 So is the master branch currently supporting llvm32? Or is there a patch 
 somewhere?
 
 Thanks



Re: [julia-users] scope using let and local/global variables

2014-12-11 Thread Mike Innes
Great that you got this working, but I strongly recommend working with
expression objects here as opposed to strings. It's likely to be more
robust and will mean you can use data that isn't parseable (i.e. most
things other than numbers) as inputs.
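
As a sketch of that suggestion (my own variant, not Michael's code): nest one
let per binding by interpolating into a quoted expression, so the values never
have to round-trip through parse:

function run_expr(assignments, program)
    ex = program
    for (name, value) in assignments
        ex = :(let $name = $value; $ex; end)   # wrap the program in one let per binding
    end
    eval(ex)
end

run_expr([(:x,2), (:y,5)], :(x + y*y))    # => 27
run_expr([(:x,6), (:y,10)], :(x + y*y))   # => 106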

On 11 December 2014 at 21:40, Michael Mayo mm...@waikato.ac.nz wrote:

 Thanks for both answers! I figured out a slightly different way of doing
 it by putting the let assignments into a string with a nothing
 expression, parsing the string, and then inserting the actual expression to
 be evaluated into the correct place in the let block:

  function run(assignments,program)
    program_string="let "
    for pair in assignments
      program_string="$(program_string) $(pair[1])=$(pair[2]); "
    end
    program_string="$(program_string) nothing; end"
    program_final=parse(program_string)
    program_final.args[1].args[end]=program
    eval(program_final)
  end

 I can now evaluate the same expression with different inputs in parallel
 without worrying that they might conflict because all the variables are
 local, e.g.:

  pmap(dict->run(dict,:(x+y*y)), [{:x=>2,:y=>5},{:x=>6,:y=>10}])

 *2-element Array{Any,1}:*

 *  27*

 * 106*

 Thanks for your help!
 Mike



 On Thursday, December 11, 2014 10:34:22 PM UTC+13, Mike Innes wrote:

 You can do this just fine, but you have to be explicit about what
 variables you want to pass in, e.g.

 let x=2
   exp=:(x+1)
   eval(:(let x = $x; $exp; end))
 end

 If you want to call the expression with multiple inputs, wrap it in a
 function:

 let x=2
   exp=:(x+1)
    f = eval(:(x -> $exp))
   f(x)
 end


 On 11 December 2014 at 06:32, Jameson Nash vtj...@gmail.com wrote:

 I'm not quite sure what a genetic program of that sort would look like.
 I would be interested to hear if you get something out of it.

 Another alternative is to use a module as the environment:

 module MyEnv
 end
 eval(MyEnv, :(code block))

 This is (roughly) how the REPL is implemented to work.

 On Thu Dec 11 2014 at 1:26:57 AM Michael Mayo mm...@waikato.ac.nz
 wrote:

 Thanks, but its not quite what I'm looking for. I want to be able to
 edit the Expr tree and then evaluate different expressions using variables
 defined in the local scope,not the global scope (e.g. for genetic
 programming, where random changes to an expression are repeatedly evaluated
 to find the best one). Using anonymous functions could work but modifying
 the .code property of an anonymous function looks much more complex than
 modifying the Expr types.

 Anyway thanks for your answer, maybe your suggestion is the only
 possible way to achieve this!

 Mike


 On Thursday, December 11, 2014 6:56:15 PM UTC+13, Jameson wrote:

 eval, by design, doesn't work that way. there are just too many better
 alternatives. typically, an anonymous function / lambda is the best and
 most direct replacement:

 let x=2
   println(x)# Line 1
    exp = () -> x+1
   println(exp())# Line 2
 end


 On Wed Dec 10 2014 at 10:43:00 PM Michael Mayo mm...@waikato.ac.nz
 wrote:

 Hi folks,

 I have the following code fragment:

 x=1
 let x=2
   println(x)# Line 1
   exp=:(x+1)
   println(eval(exp))# Line 2
 end

 It contains two variables both named x, one inside the scope defined
 by let, and one at global scope.

 If I run this code the output is:
 2
 2

 This indicates that (i) that line 1 is using the local version of x,
 and (ii) that line 2 is using the global version of x.

 If I remove this global x I now get an error because eval() is
 looking for the global x which no longer exists:

 let x=2
   println(x)# Line 1
   exp=:(x+1)
   println(eval(exp))# Line 2
 end

 2

 ERROR: x not defined


 My question: when evaluating an expression using eval() such as line
 2, how can I force Julia to use the local (not global) version of x and
 thus avoid this error?


 Thanks

 Mike





[julia-users] Re: Aren't loops supposed to be faster?

2014-12-11 Thread Robert Gates
In any case, this does make me wonder what is going on under the hood... I 
would not call the vectorized code vectorized. IMHO, this should just 
pass to BLAS without overhead. Something appears to be creating a bunch of 
temporaries.

On Thursday, December 11, 2014 5:47:01 PM UTC+1, Petr Krysl wrote:

 Acting upon the advice that replacing matrix-matrix multiplications in 
 vectorized form with loops would help with performance, I chopped out a 
 piece of code from my finite element solver (
 https://gist.github.com/anonymous/4ec426096c02faa4354d) and ran some 
 tests with the following results:

 Vectorized code:
 elapsed time: 0.326802682 seconds (134490340 bytes allocated, 17.06% gc 
 time)

 Loops code:
 elapsed time: 4.681451441 seconds (997454276 bytes allocated, 9.05% gc 
 time) 

 SLOWER and using MORE memory?!

 I must be doing something terribly wrong.

 Petr



[julia-users] Need help to understand method dispatch with modules

2014-12-11 Thread samoconnor
The example below has two modules that define methods of function f for 
different parameter types.
Both modules are imported.
It seems like that using the second module causes the first one to 
disappear.
Is that the intended behaviour?


#!/Applications/Julia-0.3.0-rc4.app/Contents/Resources/julia/bin/julia

module m1

export f

f(x::String) = println("String: " * x)
f(x) = println("     ?: " * string(x))
end


module m2

export f

f(x::Int)= println("   Int: " * string(x))
end

using m1
using m2

f(7)
f("Foo")

output:

   Int: 7
ERROR: `f` has no method matching f(::ASCIIString)




Re: [julia-users] LLVM3.2 and JULIA BUILD PROBLEM

2014-12-11 Thread Stefan Karpinski
LLVM 3.2 is no longer supported – the default Julia version of LLVM is 3.3.


 On Dec 11, 2014, at 4:58 PM, John Myles White johnmyleswh...@gmail.com 
 wrote:
 
 My understanding is that different versions of LLVM are enormously different 
 and that there's no safe way to make Julia work with any version of LLVM 
 other than the intended one.
 
  -- John
 
 On Dec 11, 2014, at 4:56 PM, Vehbi Eşref Bayraktar 
 vehbi.esref.bayrak...@gmail.com wrote:
 
 Hi;
 
 I am using llvm 3.2 with libnvvm . However when i try to build julia using 
 those 2 flags :
 USE_SYSTEM_LLVM = 1
 USE_LLVM_SHLIB = 1
 
 I have a bunch of errors. starting as following:
 
 codegen.cpp: In function ‘void jl_init_codegen()’:
 codegen.cpp:4886:26: error: ‘getProcessTriple’ is not a member of ‘llvm::sys’
  Triple TheTriple(sys::getProcessTriple()); // llvm32 doesn't have 
 this one instead it has getDefaultTargetTriple()
   ^
 codegen.cpp:4919:5: error: ‘mbuilder’ was not declared in this scope
  mbuilder = new MDBuilder(getGlobalContext());  //  include 
 llvm/MDBuilder.h would fix this
  ^
 codegen.cpp:4919:20: error: expected type-specifier before ‘MDBuilder’
  mbuilder = new MDBuilder(getGlobalContext());
 
 Even you fix these errors, you keep hitting the following ones:
 In file included from codegen.cpp:976:0:
 intrinsics.cpp: In function ‘llvm::Value* emit_intrinsic(JL_I::intrinsic, 
 jl_value_t**, size_t, jl_codectx_t*)’:
 intrinsics.cpp:1158:72: error: ‘ceil’ is not a member of ‘llvm::Intrinsic’
  return builder.CreateCall(Intrinsic::getDeclaration(jl_Module, 
 Intrinsic::ceil,
 
 
 
 So is the master branch currently supporting llvm32? Or is there a patch 
 somewhere?
 
 Thanks
 


Re: [julia-users] Problem with Domain Error in an aggregator function

2014-12-11 Thread Andreas Noack
I think the easiest solution would be to store the result in a matrix and
save that with writedlm.
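
A minimal sketch of that approach (the sizes and file name below mirror the
poster's toy example; this is illustrative, not the actual script):

m, n = 10, 5
c1 = abs(randn(m)); c2 = abs(randn(n))
rows = Array(Float64, m*n, 3)        # one row per (c1, c2, cm) triplet
k = 0
for i = 1:m, j = 1:n
    k += 1
    cm = (c1[i]^(0.3/1.3) + c2[j]^(0.3/1.3))^(1.3/0.3)
    rows[k, :] = [c1[i], c2[j], cm]
end
writedlm("Dixit_Stiglitz.csv", rows, ',')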

2014-12-11 17:07 GMT-05:00 Pileas phoebus.apollo...@gmail.com:

 Andreas thanks, you were right about the consumption. I changed the
 example and it worked. My only complain is that I am not able to save my
 results the way I want.

 See for example the modified code:

 --
 m = 10;
 n = 5;

 c1 = abs(randn(m));
 c2 = abs(randn(n));
 cm = Array(Float64, m*n);

 csvfile = open("Dixit_Stiglitz.csv","w")
 write(csvfile,"c1,c2,cm", "\n")

 for i = 1:m
 for j = 1:n
 cm = (c1[i]^(0.3/1.3) + c2[j]^(0.3/1.3))^(1.3/0.3)
 println(csvfile, join(c1, c2, cm), ",")
 end
 end
 -

 What I want to do is to have a csv file of the following format (here for
 simplicity c1 = (1, 2) and c2 = (1,  2, 3))

 c1  c2cm
 ---
 1   1   cm(1,1)
 2   1   cm(2,1)
 1   2   cm(1,2)
 2   2   cm(2,2)
 1   3   cm(1,3)
 2   3   cm(2,3)

 and eventually, with the above numbers I want to plot the 3D graph. But
 the code that I gave does not show the results well.

 Any help will be appreciated.

 Thanks

 On Thursday, 11 December 2014 at 4:33:10 PM UTC-5, Andreas Noack wrote:

 The problem is the non-integer power of a negative number, so you'll have
 to restrict the consumption to be positive.

 2014-12-11 16:29 GMT-05:00 Pileas phoebus@gmail.com:

 Hello all,

 I have this function from which I want to make a 3D figure. I have some
 problems though, because I get a domain error. I am sure it must be a
 stupid mistake or something that I do not understand ... but I cannot
 figure out what I am doing wrong.

 So, the function is: C_M = (c_1^(0.3/1.3) + c_2^(0.3/1.3))^(1.3/0.3)

 To my understanding, in order to make a 3D graph I need to keep one
 dimension constant while the other changes and gives values to C_M until
 all combinations in both directions have been exhausted (at least in the
 domain that I set). Eventually you get triplets of the form (c1, c2, cm).

 Here is the code:

 
 --
 m = 100;   # 100 points in each direction
 n = 100;   # 100 points in each direction

 c1 = randn(m);
 c2 = randn(n);
 cm = zeros(m*n);  # Cartesian product

 csvfile = open("graph.csv","w")
 write(csvfile,"c1,c2,cm", "\n")

 for i = 1:m
 for j = 1:n
    cm[] = (c1[i]^(0.3/1.3) + c2[j]^(0.3/1.3))^(1.3/0.3)
 end
 write(csvfile, join(graph,","), "\n")
 end

 
 --

 P.S. I want to save the results and then use PyPlot to make the graph.

 Thank you for your time.





Re: [julia-users] Re: Aren't loops supposed to be faster?

2014-12-11 Thread Andreas Noack
I wrote a comment in the gist.

2014-12-11 17:08 GMT-05:00 Robert Gates robert.ga...@gmail.com:

 In any case, this does make me wonder what is going on under the hood... I
 would not call the vectorized code vectorized. IMHO, this should just
 pass to BLAS without overhead. Something appears to be creating a bunch of
 temporaries.

 On Thursday, December 11, 2014 5:47:01 PM UTC+1, Petr Krysl wrote:

 Acting upon the advice that replacing matrix-matrix multiplications in
 vectorized form with loops would help with performance, I chopped out a
 piece of code from my finite element solver (https://gist.github.com/
 anonymous/4ec426096c02faa4354d) and ran some tests with the following
 results:

 Vectorized code:
 elapsed time: 0.326802682 seconds (134490340 bytes allocated, 17.06% gc
 time)

 Loops code:
 elapsed time: 4.681451441 seconds (997454276 bytes allocated, 9.05% gc
 time)

 SLOWER and using MORE memory?!

 I must be doing something terribly wrong.

 Petr




Re: [julia-users] Re: EACCESS errors when starting julia 0.3.3

2014-12-11 Thread Robbin Bonthond
https://github.com/JuliaLang/julia/issues/9319 has been filed

On Thursday, December 11, 2014 3:07:06 PM UTC-6, Stefan Karpinski wrote:

 On Thu, Dec 11, 2014 at 4:05 PM, Robbin Bonthond robbin@gmail.com 
 javascript: wrote:

 for some reason the binary installer of julia uses restricted group 
 permissions


 That sounds like a potential problem – would you mind filing an issue?

 https://github.com/JuliaLang/julia/issues



Re: [julia-users] Help with types (Arrays)

2014-12-11 Thread elextr
And it is emphasised in the 
manual 
http://docs.julialang.org/en/release-0.3/manual/types/#man-parametric-types.

On Friday, December 12, 2014 6:34:10 AM UTC+10, Isaiah wrote:

 I suggest to start with this recent thread, and there are some others if 
 you search for covariance on the list archives:

 https://groups.google.com/d/msg/julia-users/dEnCJ-nxAGE/1Bw4a1UvFCEJ

 (if it is still unclear how to do what you need after that, feel free to 
 ask)






 On Thu, Dec 11, 2014 at 3:26 PM, S sab...@gmail.com javascript: wrote:



 Hi all - very new to julia and am trying to wrap my head around multiple 
 dispatch.

 I would think that because Int64 <: Integer, an array of Int64 is an 
 array of Integer. However, that doesn't appear to be the case.

 That is, with c = [1,2,3,4], I can't create a constructor function 
 foo(myarr::Array{Integer,1}) - and Vector{Integer} doesn't work either.

 How do I create a constructor that takes a vector of Integers (of any 
 subtype) - and only integers - and operates on the elements?


 julia> c = [1,2,3,4]

 julia> c
 4-element Array{Int64,1}:
  1
  2
  3
  4

 julia> isa(c,Vector)
 true

 julia> isa(c,Vector{Integer})
 false

 julia> isa(c,Vector{Int64})
 true

 julia> isa(c,Array{Integer})
 false
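
 One hedged way to get what the question asks for (a sketch, not from the 
 thread): parameterize on the element type, since type parameters are 
 invariant and a Vector{Int64} is not a Vector{Integer}.

 type Foo{T<:Integer}            # constructor accepts any vector of Integer subtypes
     vals::Vector{T}
 end

 Foo([1, 2, 3, 4])               # ok: T is inferred as Int64
 # Foo([1.0, 2.0])               # no method: Float64 is not an Integer subtype

 bar{T<:Integer}(myarr::Vector{T}) = sum(myarr)   # same idea for a plain function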




Re: [julia-users] scope using let and local/global variables

2014-12-11 Thread Michael Mayo
Yes you are right! The final version with the strings removed:

function run(assignments,program)
  program_final=Expr(:block)
  args=Expr[]
  for pair in assignments
append!(args, [Expr(:(=), pair[1], pair[2])])
  end
  append!(args,[program])
  program_final.args=args
  eval(program_final)
end

Turns out that making a let expression is not required; a simple block 
expression does the job.
Mike

On Friday, December 12, 2014 11:07:42 AM UTC+13, Mike Innes wrote:

 Great that you got this working, but I strongly recommend working with 
 expression objects here as opposed to strings. It's likely to be more 
 robust and will mean you can use data that isn't parseable (i.e. most 
 things other than numbers) as inputs.

 On 11 December 2014 at 21:40, Michael Mayo mm...@waikato.ac.nz 
 javascript: wrote:

 Thanks for both answers! I figured out a slightly different way of doing 
 it by putting the let assignments into a string with a nothing 
 expression, parsing the string, and then inserting the actual expression to 
 be evaluated into the correct place in the let block:

 function run(assignments,program)
   program_string="let"
   for pair in assignments
     program_string="$(program_string) $(pair[1])=$(pair[2]);"
   end
   program_string="$(program_string) nothing; end"
   program_final=parse(program_string)
   program_final.args[1].args[end]=program
   eval(program_final)
 end

 I can now evaluate the same expression with different inputs in parallel 
 without worrying that they might conflict because all the variables are 
 local, e.g.:

 pmap(dict->run(dict,:(x+y*y)), [{:x=>2,:y=>5},{:x=>6,:y=>10}])

 2-element Array{Any,1}:
   27
  106

 Thanks for your help!
 Mike



 On Thursday, December 11, 2014 10:34:22 PM UTC+13, Mike Innes wrote:

 You can do this just fine, but you have to be explicit about what 
 variables you want to pass in, e.g.

 let x=2
   exp=:(x+1)
   eval(:(let x = $x; $exp; end))
 end

 If you want to call the expression with multiple inputs, wrap it in a 
 function:

 let x=2
   exp=:(x+1)
  f = eval(:(x -> $exp))
   f(x)
 end


 On 11 December 2014 at 06:32, Jameson Nash vtj...@gmail.com wrote:

 I'm not quite sure what a genetic program of that sort would look like. 
 I would be interested to hear if you get something out of it.

 Another alternative is to use a module as the environment:

 module MyEnv
 end
 eval(MyEnv, :(code block))

 This is (roughly) how the REPL is implemented to work.
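
 A minimal sketch of the module-as-environment idea (MyEnv is an illustrative 
 name, and :(code block) above stands for whatever expression you want to run):

 module MyEnv            # an empty module used purely as an evaluation scope
 end

 eval(MyEnv, :(x = 2))   # defines MyEnv.x rather than a global in Main
 eval(MyEnv, :(x + 1))   # returns 3, picking up MyEnv.x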

 On Thu Dec 11 2014 at 1:26:57 AM Michael Mayo mm...@waikato.ac.nz 
 wrote:

 Thanks, but its not quite what I'm looking for. I want to be able to 
 edit the Expr tree and then evaluate different expressions using 
 variables 
 defined in the local scope,not the global scope (e.g. for genetic 
 programming, where random changes to an expression are repeatedly 
 evaluated 
 to find the best one). Using anonymous functions could work but modifying 
 the .code property of an anonymous function looks much more complex than 
 modifying the Expr types.

 Anyway thanks for your answer, maybe your suggestion is the only 
 possible way to achieve this!

 Mike 


 On Thursday, December 11, 2014 6:56:15 PM UTC+13, Jameson wrote:

 eval, by design, doesn't work that way. there are just too many 
 better alternatives. typically, an anonymous function / lambda is the 
 best 
 and most direct replacement:

 let x=2
   println(x)# Line 1
  exp = () -> x+1
   println(exp())# Line 2
 end


 On Wed Dec 10 2014 at 10:43:00 PM Michael Mayo mm...@waikato.ac.nz 
 wrote:

 Hi folks,

 I have the following code fragment:

 x=1
 let x=2
   println(x)# Line 1
   exp=:(x+1)
   println(eval(exp))# Line 2
 end

 It contains two variables both named x, one inside the scope defined 
 by let, and one at global scope.

 If I run this code the output is:
 2
 2

 This indicates (i) that line 1 is using the local version of x, 
 and (ii) that line 2 is using the global version of x.

 If I remove this global x I now get an error because eval() is 
 looking for the global x which no longer exists:

 let x=2
   println(x)# Line 1
   exp=:(x+1)
   println(eval(exp))# Line 2
 end

 2

 ERROR: x not defined


 My question: when evaluating an expression using eval() such as line 
 2, how can I force Julia to use the local (not global) version of x and 
 thus avoid this error?


 Thanks

 Mike

  



Re: [julia-users] Re: Aren't loops supposed to be faster?

2014-12-11 Thread Robert Gates
Yeah, I think I figured it out on my own, hence the message deletion. 
Nonetheless, I don't see your comment.

On Thursday, December 11, 2014 11:29:15 PM UTC+1, Andreas Noack wrote:

 I wrote a comment in the gist.

 2014-12-11 17:08 GMT-05:00 Robert Gates robert...@gmail.com javascript:
 :

 In any case, this does make me wonder what is going on under the hood... 
 I would not call the vectorized code vectorized. IMHO, this should just 
 pass to BLAS without overhead. Something appears to be creating a bunch of 
 temporaries.

 On Thursday, December 11, 2014 5:47:01 PM UTC+1, Petr Krysl wrote:

 Acting upon the advice that replacing matrix-matrix multiplications in 
 vectorized form with loops would help with performance, I chopped out a 
 piece of code from my finite element solver (https://gist.github.com/
 anonymous/4ec426096c02faa4354d) and ran some tests with the following 
 results:

 Vectorized code:
 elapsed time: 0.326802682 seconds (134490340 bytes allocated, 17.06% gc 
 time)

 Loops code:
 elapsed time: 4.681451441 seconds (997454276 bytes allocated, 9.05% gc 
 time) 

 SLOWER and using MORE memory?!

 I must be doing something terribly wrong.

 Petr




Re: [julia-users] Need help to understand method dispatch with modules

2014-12-11 Thread Rob J. Goedman
Sam,

Maybe below slightly expanded version of your example will help.

I think key is to import m1.f in module m2

Regards
Rob J. Goedman
goed...@mac.com


module m1

  export f

  f(x::ASCIIString) = println("ASCIIString: " * x)
  f{T<:String}(x::T) = println("  $(typeof(x)): " * x)
  f(x) = println("  $(typeof(x)): " * string(x))
end


module m2

  import m1.f
  export f

  f(x::Int)= println("Int: " * string(x))
end

using m1
using m2

f(7)
f("Foo")
f("\u2200 x \u2203 y")
f(12.0)
f(2.0+3.0im)




 On Dec 11, 2014, at 2:18 PM, samoconnor samocon...@mac.com wrote:
 
 The example below has two modules that define methods of function f for 
 different parameter types.
 Both modules are imported.
 It seems like that using the second module causes the first one to 
 disappear.
 Is that the intended behaviour?
 
 
 #!/Applications/Julia-0.3.0-rc4.app/Contents/Resources/julia/bin/julia
 
 module m1
 
 export f
 
 f(x::String) = println("String: " * x)
 f(x) = println(" ?: " * string(x))
 end
 
 
 module m2
 
 export f
 
 f(x::Int)= println("   Int: " * string(x))
 end
 
 using m1
 using m2
 
 f(7)
 f("Foo")
 
 output:
 
Int: 7
 ERROR: `f` has no method matching f(::ASCIIString)
 
 



Re: [julia-users] scope using let and local/global variables

2014-12-11 Thread Michael Mayo
My apologies, a let block *is* required, otherwise the variables are being 
defined at global scope. The corrected function:

function run(assignments,program)
  block=Expr(:block)
  args=Expr[]
  for pair in assignments
append!(args, [Expr(:(=), pair[1], pair[2])])
  end
  append!(args,[program])
  block.args=args
  program_final=Expr(:let, block)
  eval(program_final)
end

Mike
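
For reference, a hedged usage of the corrected run above (the Dict literal is 
illustrative; any iterable of name/value pairs works the same way):

run({:x=>2, :y=>5}, :(x + y*y))    # evaluates to 27, with x and y local to the let block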

On Friday, December 12, 2014 11:43:54 AM UTC+13, Michael Mayo wrote:

 Yes you are right! The final version with the strings removed:

 function run(assignments,program)
   program_final=Expr(:block)
   args=Expr[]
   for pair in assignments
 append!(args, [Expr(:(=), pair[1], pair[2])])
   end
   append!(args,[program])
   program_final.args=args
   eval(program_final)
 end

 Turns out that making a let expression is not required; a simple block 
 expression does the job.
 Mike

 On Friday, December 12, 2014 11:07:42 AM UTC+13, Mike Innes wrote:

 Great that you got this working, but I strongly recommend working with 
 expression objects here as opposed to strings. It's likely to be more 
 robust and will mean you can use data that isn't parseable (i.e. most 
 things other than numbers) as inputs.

 On 11 December 2014 at 21:40, Michael Mayo mm...@waikato.ac.nz wrote:

 Thanks for both answers! I figured out a slightly different way of doing 
 it by putting the let assignments into a string with a nothing 
 expression, parsing the string, and then inserting the actual expression to 
 be evaluated into the correct place in the let block:

 function run(assignments,program)
   program_string="let"
   for pair in assignments
     program_string="$(program_string) $(pair[1])=$(pair[2]);"
   end
   program_string="$(program_string) nothing; end"
   program_final=parse(program_string)
   program_final.args[1].args[end]=program
   eval(program_final)
 end

 I can now evaluate the same expression with different inputs in parallel 
 without worrying that they might conflict because all the variables are 
 local, e.g.:

 pmap(dict->run(dict,:(x+y*y)), [{:x=>2,:y=>5},{:x=>6,:y=>10}])

 2-element Array{Any,1}:
   27
  106

 Thanks for your help!
 Mike



 On Thursday, December 11, 2014 10:34:22 PM UTC+13, Mike Innes wrote:

 You can do this just fine, but you have to be explicit about what 
 variables you want to pass in, e.g.

 let x=2
   exp=:(x+1)
   eval(:(let x = $x; $exp; end))
 end

 If you want to call the expression with multiple inputs, wrap it in a 
 function:

 let x=2
   exp=:(x+1)
  f = eval(:(x -> $exp))
   f(x)
 end


 On 11 December 2014 at 06:32, Jameson Nash vtj...@gmail.com wrote:

 I'm not quite sure what a genetic program of that sort would look 
 like. I would be interested to hear if you get something out of it.

 Another alternative is to use a module as the environment:

 module MyEnv
 end
 eval(MyEnv, :(code block))

 This is (roughly) how the REPL is implemented to work.

 On Thu Dec 11 2014 at 1:26:57 AM Michael Mayo mm...@waikato.ac.nz 
 wrote:

 Thanks, but its not quite what I'm looking for. I want to be able to 
 edit the Expr tree and then evaluate different expressions using 
 variables 
 defined in the local scope,not the global scope (e.g. for genetic 
 programming, where random changes to an expression are repeatedly 
 evaluated 
 to find the best one). Using anonymous functions could work but 
 modifying 
 the .code property of an anonymous function looks much more complex than 
 modifying the Expr types.

 Anyway thanks for your answer, maybe your suggestion is the only 
 possible way to achieve this!

 Mike 


 On Thursday, December 11, 2014 6:56:15 PM UTC+13, Jameson wrote:

 eval, by design, doesn't work that way. there are just too many 
 better alternatives. typically, an anonymous function / lambda is the 
 best 
 and most direct replacement:

 let x=2
   println(x)# Line 1
   exp = () -> x+1
   println(exp())# Line 2
 end


 On Wed Dec 10 2014 at 10:43:00 PM Michael Mayo mm...@waikato.ac.nz 
 wrote:

 Hi folks,

 I have the following code fragment:

 x=1
 let x=2
   println(x)# Line 1
   exp=:(x+1)
   println(eval(exp))# Line 2
 end

 It contains two variables both named x, one inside the scope 
 defined by let, and one at global scope.

 If I run this code the output is:
 2
 2

 This indicates (i) that line 1 is using the local version of 
 x, and (ii) that line 2 is using the global version of x.

 If I remove this global x I now get an error because eval() is 
 looking for the global x which no longer exists:

 let x=2
   println(x)# Line 1
   exp=:(x+1)
   println(eval(exp))# Line 2
 end

 2

 ERROR: x not defined


 My question: when evaluating an expression using eval() such as 
 line 2, how can I force Julia to use the local (not global) version of 
 x 
 and thus avoid this error?


 Thanks

 Mike

  



[julia-users] `;` output supressor behaves oddly with commented lines.

2014-12-11 Thread Ismael VC
Hello guys, do you know if this is a bug or if the behavior 
changed? http://bit.ly/1vWmrHx 

`;` suppresses output in IJulia (0.3.3) *only* when it is the very last thing 
on the line; it stops working if a comment follows it, and I can't get into 
JuliaBox to confirm if it's the same on the REPL.


It works as I expect on my PC (0.3.2+2), but I can't install IJulia here to 
test whether it's the same as above in the notebook:

julia> @show x = 5
x = 5 => 5
5

julia> @show x = 5;
x = 5 => 5

julia> @show x = 5; # Yep!
x = 5 => 5

julia> versioninfo()
Julia Version 0.3.2+2
Commit 6babc84 (2014-10-22 01:21 UTC)
Platform Info:
  System: Windows (i686-w64-mingw32)
  CPU: AMD Athlon(tm) XP 2000+
  WORD_SIZE: 32
  BLAS: libopenblas (DYNAMIC_ARCH NO_AFFINITY Athlon)
  LAPACK: libopenblas
  LIBM: libopenlibm
  LLVM: libLLVM-3.3



Cheers!


Re: [julia-users] Need help to understand method dispatch with modules

2014-12-11 Thread samoconnor
Hi Rob,

Ok, I see why that works, but it's a different example.

Assume that m1 and m2 are libraries from different vendors, they know 
nothing about each other, but they both export methods for f().

It is surprising to me that importing two modules would cause one to 
overwrite methods from the other with no warning or error. 

On Friday, December 12, 2014 9:45:37 AM UTC+11, Rob J Goedman wrote:

 Sam,

 Maybe below slightly expanded version of your example will help.

 I think key is to import m1.f in module m2

 Regards
 Rob J. Goedman
 goe...@mac.com javascript:


 module m1

   export f

   f(x::ASCIIString) = println("ASCIIString: " * x)
   f{T<:String}(x::T) = println("  $(typeof(x)): " * x)
   f(x) = println("  $(typeof(x)): " * string(x))
 end


 module m2

   import m1.f
   export f

   f(x::Int)= println("Int: " * string(x))
 end

 using m1
 using m2

 f(7)
 f("Foo")
 f("\u2200 x \u2203 y")
 f(12.0)
 f(2.0+3.0im)



  
 On Dec 11, 2014, at 2:18 PM, samoconnor samoc...@mac.com javascript: 
 wrote:

 The example below has two modules that define methods of function f for 
 different parameter types.
 Both modules are imported.
 It seems like that using the second module causes the first one to 
 disappear.
 Is that the intended behaviour?


 #!/Applications/Julia-0.3.0-rc4.app/Contents/Resources/julia/bin/julia

 module m1

 export f

 f(x::String) = println("String: " * x)
 f(x) = println(" ?: " * string(x))
 end


 module m2

 export f

 f(x::Int)= println("   Int: " * string(x))
 end

 using m1
 using m2

 f(7)
 f("Foo")

 output:

Int: 7
 ERROR: `f` has no method matching f(::ASCIIString)





Re: [julia-users] Need help to understand method dispatch with modules

2014-12-11 Thread elextr
See open Issues https://github.com/JuliaLang/julia/issues/2327 
and https://github.com/JuliaLang/julia/issues/4345

On Friday, December 12, 2014 8:52:38 AM UTC+10, samoconnor wrote:

 Hi Rob,

 Ok, I see why that works, but it's a different example.

 Assume that m1 and m2 are libraries from different vendors, they know 
 nothing about each other, but they both export methods for f().

 It is surprising to me that importing two modules would cause one to 
 overwrite methods from the other with no warning or error. 

 On Friday, December 12, 2014 9:45:37 AM UTC+11, Rob J Goedman wrote:

 Sam,

 Maybe below slightly expanded version of your example will help.

 I think key is to import m1.f in module m2

 Regards
 Rob J. Goedman
 goe...@mac.com


 module m1

   export f

   f(x::ASCIIString) = println("ASCIIString: " * x)
   f{T<:String}(x::T) = println("  $(typeof(x)): " * x)
   f(x) = println("  $(typeof(x)): " * string(x))
 end


 module m2

   import m1.f
   export f

   f(x::Int)= println("Int: " * string(x))
 end

 using m1
 using m2

 f(7)
 f("Foo")
 f("\u2200 x \u2203 y")
 f(12.0)
 f(2.0+3.0im)



  
 On Dec 11, 2014, at 2:18 PM, samoconnor samoc...@mac.com wrote:

 The example below has two modules that define methods of function f for 
 different parameter types.
 Both modules are imported.
 It seems like that using the second module causes the first one to 
 disappear.
 Is that the intended behaviour?


 #!/Applications/Julia-0.3.0-rc4.app/Contents/Resources/julia/bin/julia

 module m1

 export f

 f(x::String) = println("String: " * x)
 f(x) = println(" ?: " * string(x))
 end


 module m2

 export f

 f(x::Int)= println("   Int: " * string(x))
 end

 using m1
 using m2

 f(7)
 f("Foo")

 output:

Int: 7
 ERROR: `f` has no method matching f(::ASCIIString)





[julia-users] `;` output supressor behaves oddly with commented lines.

2014-12-11 Thread Ivar Nesje
I expect it's the same everywhere.

See https://github.com/JuliaLang/julia/issues/6225

The ; to suppress output is not implemented in the parser, but as a simple 
search for a trailing ;
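
A tiny illustration of why a trailing comment defeats it (the helper below is a 
rough model of that naive check, not Julia's or IJulia's actual code):

suppressed(input) = endswith(rstrip(input), ";")

suppressed("x = 5;")         # true  -> output hidden
suppressed("x = 5; # Yep!")  # false -> output shown, even though the ';' is there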

Re: [julia-users] Re: Aren't loops supposed to be faster?

2014-12-11 Thread Ivar Nesje
https://gist.github.com/anonymous/4ec426096c02faa4354d#comment-1354636

Re: [julia-users] Need help to understand method dispatch with modules

2014-12-11 Thread Rob J. Goedman
Yes, I am struggling with this aspect as well and recently posed a related 
question (’Struggling with generic functions’) although slightly differently 
formulated.

John’s answer included below.

In your example that might look like the update below, I think. Like 
you (I think) I was looking to hide this from end users.

Rob J. Goedman
goed...@mac.com

module m1

  export f

  f(x::ASCIIString) = println("ASCIIString: " * x)
  f{T<:String}(x::T) = println("  $(typeof(x)): " * x)
  f(x) = println("  $(typeof(x)): " * string(x))
end


module m2
  export f

  f(x::Int)= println("Int: " * string(x))
end

import m1.f
import m2.f

f(7)
f("Foo")
f("\u2200 x \u2203 y")
f(12.0)
f(2.0+3.0im)



On Dec 2, 2014, at 4:37 PM, John Myles White johnmyleswh...@gmail.com wrote:

There's no clean solution to this. In general, I'd argue that we should stop 
exporting so many names and encourage people to use qualified names much more 
often than we do right now.

But for important abstractions, we can put them into StatsBase, which all stats 
packages should be derived from.

-- John

On Dec 2, 2014, at 4:34 PM, Rob J. Goedman goed...@icloud.com wrote:

 I’ll try to give an example of my problem based on how I’ve seen it occur in 
 Stan.jl and Jags.jl.
 
 Both DataFrames.jl and Mamba.jl export describe(). Stan.jl relies on Mamba, 
 but neither Stan or Mamba need DataFrames. So DataFrames is not imported by 
 default.
 
 Recently someone used Stan and wanted to read in a .csv file and added 
 DataFrames to the using clause in the script, i.e.
 
 ```
 using Gadfly, Stan, Mamba, DataFrames
 ```
 
 After running a simulation, Mamba’s describe(::Mamba.Chains) could no longer 
 be found.
 
 I wonder if someone can point me in the right direction how best to solve 
 these kind of problems (for end users):
 
 1. One way around it is to always qualify describe(), e.g. Mamba.describe().
 2. Use isdefined(Main, :DataFrames) to upfront test for such a collision.
 3. Suggest to end users to import DataFrames and qualify e.g. 
 DataFrames.readtable().
 4. ?
 
 Thanks and regards,
 Rob J. Goedman
 goed...@mac.com mailto:goed...@mac.com



 On Dec 11, 2014, at 2:52 PM, samoconnor samocon...@mac.com wrote:
 
 Hi Rob,
 
 Ok, I see why that works, but it's a different example.
 
 Assume that m1 and m2 are libraries from different vendors, they know nothing 
 about each other, but they both export methods for f().
 
 It is surprising to me that importing two modules would cause one to 
 overwrite methods from the other with no warning or error. 
 
 On Friday, December 12, 2014 9:45:37 AM UTC+11, Rob J Goedman wrote:
 Sam,
 
 Maybe below slightly expanded version of your example will help.
 
 I think key is to import m1.f in module m2
 
 Regards
 Rob J. Goedman
 goe...@mac.com javascript:
 
 
 module m1
 
   export f
 
   f(x::ASCIIString) = println("ASCIIString: " * x)
   f{T<:String}(x::T) = println("  $(typeof(x)): " * x)
   f(x) = println("  $(typeof(x)): " * string(x))
 end
 
 
 module m2
 
   import m1.f
   export f
 
   f(x::Int)= println("Int: " * string(x))
 end
 
 using m1
 using m2
 
 f(7)
 f("Foo")
 f("\u2200 x \u2203 y")
 f(12.0)
 f(2.0+3.0im)
 
 
 
 
 On Dec 11, 2014, at 2:18 PM, samoconnor samoc...@mac.com javascript: 
 wrote:
 
 The example below has two modules that define methods of function f for 
 different parameter types.
 Both modules are imported.
 It seems like that using the second module causes the first one to 
 disappear.
 Is that the intended behaviour?
 
 
 #!/Applications/Julia-0.3.0-rc4.app/Contents/Resources/julia/bin/julia
 
 module m1
 
 export f
 
 f(x::String) = println("String: " * x)
 f(x) = println(" ?: " * string(x))
 end
 
 
 module m2
 
 export f
 
 f(x::Int)= println("   Int: " * string(x))
 end
 
 using m1
 using m2
 
 f(7)
 f("Foo")
 
 output:
 
Int: 7
 ERROR: `f` has no method matching f(::ASCIIString)
 
 
 



[julia-users] Re: Need help to understand method dispatch with modules

2014-12-11 Thread samoconnor
If I change the example to use import instead of using...

import m1: f
import m2: f

... then I get:

Warning: ignoring conflicting import of m2.f into Main
 ?: 7
String: Foo

Now Julia spots the problem, but resolves it the opposite way (i.e. the 
first definition wins).
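
One hedged workaround when two unrelated packages both define f (a sketch, not 
from the thread): skip merging the name and call the qualified versions.

import m1
import m2

m1.f("Foo")   # always dispatches on m1's methods
m2.f(7)       # always dispatches on m2's methods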


Re: [julia-users] BinDeps fails to find a built dependency (but only on Travis)

2014-12-11 Thread Michael Eastwood
The output of BinDeps.debug("CasaCore") on Travis is

INFO: Reading build script...

The package declares 4 dependencies.

- Library libblas

   - Satisfied by:

 - System Paths at /usr/lib/libopenblas.so.0

   - Providers:

 - BinDeps.AptGet package libopenblas-dev

- Library libcasa_tables

   - Satisfied by:

 - Simple Build Process at 
/home/travis/.julia/v0.4/CasaCore/deps/usr/lib/libcasa_tables.so

   - Providers:

 - Simple Build Process

- Library libcasa_measures

   - Satisfied by:

 - Simple Build Process at 
/home/travis/.julia/v0.4/CasaCore/deps/usr/lib/libcasa_measures.so

   - Providers:

 - Simple Build Process

- Library libcasacorewrapper

   - Providers:

 - Simple Build Process

On Wednesday, December 10, 2014 1:55:18 PM UTC-8, Elliot Saba wrote:

 It would be helpful if you could have julia execute using BinDeps; 
 BinDeps.debug("CasaCore") after attempting to build.



 It also looks like you have another error that we should look into:
 Warning: error initializing module GMP:

 ErrorException(The dynamically loaded GMP library (version 5.0.2 with 
 __gmp_bits_per_limb == 64)

 does not correspond to the compile time version (version 5.1.3 with 
 __gmp_bits_per_limb == 64).

 Please rebuild Julia.)
 Has Travis changed Ubuntu releases recently or something?



Re: [julia-users] Re: Aren't loops supposed to be faster?

2014-12-11 Thread Robert Gates
Hi Ivar,

yeah, I know, I thought Andreas was replying to my deleted post. I think 
Petr already solved the problem with the globals (his gist was apparently 
not the right context). However, he still reported:

On Thursday, December 11, 2014 6:47:33 PM UTC+1, Petr Krysl wrote:

 One more note: I conjectured that perhaps the compiler was not able to 
 infer correctly the type of the matrices,  so I hardwired (in the actual FE 
 code)

 Jac = 1.0; gradN = gradNparams[j]/(J); # get rid of Rm for the moment

 About 10% less memory used, runtime about the same.  So, no effect really. 
 Loops are still slower than the vectorized code by a factor of two.

 Petr


Best,

Robert

On Friday, December 12, 2014 12:01:44 AM UTC+1, Ivar Nesje wrote:

 https://gist.github.com/anonymous/4ec426096c02faa4354d#comment-1354636



Re: [julia-users] Re: Need help to understand method dispatch with modules

2014-12-11 Thread Rob J. Goedman
Did you restart the REPL?

Rob J. Goedman
goed...@mac.com





 On Dec 11, 2014, at 3:19 PM, samoconnor samocon...@mac.com wrote:
 
 If I change the example to use import instead of using...
 
 import m1: f
 import m2: f
 
 ... then I get:
 
 Warning: ignoring conflicting import of m2.f into Main
  ?: 7
 String: Foo
 
 Now Julia spots the problem, but resolves it the opposite way (i.e. the first 
 definition wins).



Re: [julia-users] Re: EACCESS errors when starting julia 0.3.3

2014-12-11 Thread Stefan Karpinski
Thank you!

On Thu, Dec 11, 2014 at 5:31 PM, Robbin Bonthond robbin.bonth...@gmail.com
wrote:

 https://github.com/JuliaLang/julia/issues/9319 has been filed

 On Thursday, December 11, 2014 3:07:06 PM UTC-6, Stefan Karpinski wrote:

 On Thu, Dec 11, 2014 at 4:05 PM, Robbin Bonthond robbin@gmail.com
 wrote:

 for some reason the binary installer of julia uses restricted group
 permissions


 That sounds like a potential problem – would you mind filing an issue?

 https://github.com/JuliaLang/julia/issues




Re: [julia-users] Help with types (Arrays)

2014-12-11 Thread Stefan Karpinski
On Thu, Dec 11, 2014 at 3:41 PM, S sab...@gmail.com wrote:

 Thank you. I'm sorry I didn't see this before posting. I'll have a look.


No need to apologize, this part of Julia's type system can be confusing
coming from languages that have covariant parametric types.


Re: [julia-users] Re: Need help to understand method dispatch with modules

2014-12-11 Thread samoconnor
Hi Rob,

I don't use the REPL. I have #!/[...]bin/julia on the first line of the 
script and run ./script.jl from the command line.

On Friday, December 12, 2014 10:27:45 AM UTC+11, Rob J Goedman wrote:

 Did you restart the REPL?

 Rob J. Goedman
 goe...@mac.com javascript:




  
 On Dec 11, 2014, at 3:19 PM, samoconnor samoc...@mac.com javascript: 
 wrote:

 If I change the example to use import instead of using...

 import m1: f
 import m2: f

 ... then I get:

 Warning: ignoring conflicting import of m2.f into Main
  ?: 7
 String: Foo

 Now Julia spots the problem, but resolves it the opposite way (i.e. the 
 first definition wins).




Re: [julia-users] Re: Need help to understand method dispatch with modules

2014-12-11 Thread Rob J. Goedman
Hmmm, now I think early on I saw you’re on Julia 0.3.0-RC4?

I tried it on 0.3.3 and 0.4 both with the same output. Could that explain the 
difference?

Rob J. Goedman
goed...@mac.com


julia> include("/Users/rob/Projects/Julia/Rob/MetaProgramming/meta13.jl")
Int: 7
ASCIIString: Foo
  UTF8String: ∀ x ∃ y
  Float64: 12.0
  Complex{Float64}: 2.0 + 3.0im



 On Dec 11, 2014, at 3:53 PM, samoconnor samocon...@mac.com wrote:
 
 Hi Rob,
 
 I don't use the REPL. I have #!/[...]bin/julia on the first line of the 
 script and run ./script.jl from the command line.
 
 On Friday, December 12, 2014 10:27:45 AM UTC+11, Rob J Goedman wrote:
 Did you restart the REPL?
 
 Rob J. Goedman
 goe...@mac.com javascript:
 
 
 
 
 
 On Dec 11, 2014, at 3:19 PM, samoconnor samoc...@mac.com javascript: 
 wrote:
 
 If I change the example to use import instead of using...
 
 import m1: f
 import m2: f
 
 ... then I get:
 
 Warning: ignoring conflicting import of m2.f into Main
  ?: 7
 String: Foo
 
 Now Julia spots the problem, but resolves it the opposite way (i.e. the 
 first definition wins).
 



Re: [julia-users] Re: Need help to understand method dispatch with modules

2014-12-11 Thread samoconnor
I've just done a fresh build from git HEAD (julia version 0.4.0-dev+2067).
I don't see any difference in behaviour between 0.3.0-RC4 and 0.4 for the 
examples I have posted.

On Friday, December 12, 2014 10:58:19 AM UTC+11, Rob J Goedman wrote:

 Hmmm, now I think early on I saw you’re on Julia 0.3.0-RC4?

 I tried it on 0.3.3 and 0.4 both with the same output. Could that explain 
 the difference?

 Rob J. Goedman
 goe...@mac.com javascript:


 julia> include("/Users/rob/Projects/Julia/Rob/MetaProgramming/meta13.jl")
 Int: 7
 ASCIIString: Foo
   UTF8String: ∀ x ∃ y
   Float64: 12.0
   Complex{Float64}: 2.0 + 3.0im


  
 On Dec 11, 2014, at 3:53 PM, samoconnor samoc...@mac.com javascript: 
 wrote:

 Hi Rob,

 I don't use the REPL. I have #!/[...]bin/julia on the first line of the 
 script and run ./script.jl from the command line.

 On Friday, December 12, 2014 10:27:45 AM UTC+11, Rob J Goedman wrote:

 Did you restart the REPL?

 Rob J. Goedman
 goe...@mac.com




  
 On Dec 11, 2014, at 3:19 PM, samoconnor samoc...@mac.com wrote:

 If I change the example to use import instead of using...

 import m1: f
 import m2: f

 ... then I get:

 Warning: ignoring conflicting import of m2.f into Main
  ?: 7
 String: Foo

 Now Julia spots the problem, but resolves it the opposite way (i.e. the 
 first definition wins).





Re: [julia-users] LLVM3.2 and JULIA BUILD PROBLEM

2014-12-11 Thread Keno Fischer
LLVM 3.2 is no longer supported. I wouldn't be opposed to a patch
supporting 3.2, since we haven't formally dropped support (i.e. there are still
some ifdefs in the code) for it yet - it's just that nobody is using it
anymore.

On Thu, Dec 11, 2014 at 5:21 PM, Stefan Karpinski 
stefan.karpin...@gmail.com wrote:

 LLVM 3.2 is no longer supported – the default Julia version of LLVM is 3.3.


 On Dec 11, 2014, at 4:58 PM, John Myles White johnmyleswh...@gmail.com
 wrote:

 My understanding is that different versions of LLVM are enormously
 different and that there's no safe way to make Julia work with any version
 of LLVM other than the intended one.

  -- John

 On Dec 11, 2014, at 4:56 PM, Vehbi Eşref Bayraktar 
 vehbi.esref.bayrak...@gmail.com wrote:

 Hi;

 I am using llvm 3.2 with libnvvm . However when i try to build julia using
 those 2 flags :
 USE_SYSTEM_LLVM = 1
 USE_LLVM_SHLIB = 1

 I have a bunch of errors. starting as following:

 codegen.cpp: In function ‘void jl_init_codegen()’:
 codegen.cpp:4886:26: error: ‘getProcessTriple’ is not a member of
 ‘llvm::sys’
  Triple TheTriple(sys::getProcessTriple()); // *llvm32 doesn't
 have this one instead it has getDefaultTargetTriple()*
   ^
 codegen.cpp:4919:5: error: ‘mbuilder’ was not declared in this scope
  mbuilder = new MDBuilder(getGlobalContext());  //  *include
 llvm/MDBuilder.h would fix this*
  ^
 codegen.cpp:4919:20: error: expected type-specifier before ‘MDBuilder’
  mbuilder = new MDBuilder(getGlobalContext());

 Even if you fix these errors, you keep hitting the following ones:
 In file included from codegen.cpp:976:0:
 intrinsics.cpp: In function ‘llvm::Value* emit_intrinsic(JL_I::intrinsic,
 jl_value_t**, size_t, jl_codectx_t*)’:
 intrinsics.cpp:1158:72: error: ‘ceil’ is not a member of ‘llvm::Intrinsic’
  return builder.CreateCall(Intrinsic::getDeclaration(jl_Module,
 Intrinsic::ceil,



 So is the master branch currently supporting llvm32? Or is there a patch
 somewhere?

 Thanks





[julia-users] Define composite types in a different file - constructor not defined error

2014-12-11 Thread Test This

I have two files: dataTypes.jl and paramcombos.jl

In dataTypes.jl I have 

type Params
  .
  .   // field names and types
  .
end



In paramcombos.jl I have 

module paramcombos

require("dataTypes.jl")


function baseParams()
   params = Params( field1 = blah1, field2 = blah2, ...)
end


end

 

In the julia repl if I do 

require("paramcombos.jl")


and then, 

basep = paramcombos.baseParams()


I get an error saying:

ERROR: Params not defined
 in baseParams at /Users/code/paramcombos.jl:33 (where 33 is the line 
shown above from the baseParams() function.)


If I move type declaration to paramcombos.jl, things work fine. Is there a 
way to keep type definitions in one file and use the constructor in another 
file?

Thank you


Re: [julia-users] Need help to understand method dispatch with modules

2014-12-11 Thread samoconnor
On Friday, December 12, 2014 9:58:57 AM UTC+11, ele...@gmail.com wrote:

 See open Issues https://github.com/JuliaLang/julia/issues/2327 and 
 https://github.com/JuliaLang/julia/issues/4345


Ok, I take it that the short answer, from #4345 is that the intended 
behaviour is not well thought out or well defined yet. 


Re: [julia-users] Define composite types in a different file - constructor not defined error

2014-12-11 Thread John Myles White
You want include, not require.

 -- John

On Dec 11, 2014, at 7:25 PM, Test This curiousle...@gmail.com wrote:

 
 I have two files: dataTypes.jl and paramcombos.jl
 
 In dataTypes.jl I have 
 
 type Params
  .
  . // field names and types
  .
 end
 
 
 In paramcombos.jl I have 
 
 module paramcombos
 
 require("dataTypes.jl")
 
 function baseParams()
params = Params( field1 = blah1, field2 = blah2, ...)
 end
 
 end
  
 
 In the julia repl if I do 
 
 require("paramcombos.jl")
 
 and then, 
 
 basep = paramcombos.baseParams()
 
 I get an error saying:
 
 ERROR: Params not defined
  in baseParams at /Users/code/paramcombos.jl:33 (where 33 is the line shown 
 above from baseParams() function. 
 
 If I move type declaration to paramcombos.jl, things work fine. Is there a 
 way to keep type definitions in one file and use the constructor in another 
 file?
 
 Thank you



Re: [julia-users] Re: home page content

2014-12-11 Thread cdm

in support of "Why Julia", it seems the fact that Julia is attracting some 
of the best
and brightest minds spanning a diverse collection of fields ought to be 
displayed
prominently ...

as an example, perusing the COIN-OR Cup winners list returns several 
familiar
names  ( see http://www.coin-or.org/coinCup/coinCup.html ) ... speaking of 
which,
it looks as though the cup needs to come home next year, as it is off for a 
stay in
Python land  ( http://www.coin-or.org/coinCup/coinCup2014Winner.html ) and
when it has been won again, the string "Julia" should be featured in the 
team/
application name.


in addition to INFORMS, there are other conferences to target ...
the hit-list:

http://en.wikipedia.org/wiki/List_of_computer_science_conferences


stay the course, Julians ...

cdm



On Thursday, December 11, 2014 3:40:52 AM UTC-8, Christoph Ortner wrote:

 I'm glad people agree with my Why Julia paragraph.



Re: [julia-users] Define composite types in a different file - constructor not defined error

2014-12-11 Thread Test This
Thank you, John. That worked!

Could you please direct me to a reference which explains when one should 
use include/require/import/using?

Thank you.

On Thursday, December 11, 2014 7:39:23 PM UTC-5, John Myles White wrote:

 You want include, not require.

  -- John

 On Dec 11, 2014, at 7:25 PM, Test This curiou...@gmail.com javascript: 
 wrote:


 I have two files: dataTypes.jl and paramcombos.jl

 In dataTypes.jl I have 

 type Params
   .
   .   // field names and types
   .
 end



 In paramcombos.jl I have 

 module paramcombos

 require("dataTypes.jl")


 function baseParams()
    params = Params( field1 = blah1, field2 = blah2, ...)
 end


 end

  

 In the julia repl if I do 

 require("paramcombos.jl")


 and then, 

 basep = paramcombos.baseParams()


 I get an error saying:

 ERROR: Params not defined
  in baseParams at /Users/code/paramcombos.jl:33 (where 33 is the line 
 shown above from the baseParams() function.)


 If I move type declaration to paramcombos.jl, things work fine. Is there a 
 way to keep type definitions in one file and use the constructor in another 
 file?

 Thank you



