[julia-users] Re: facing problem in building Packages

2016-03-21 Thread kunal singh
Try using g++ compiler 4.7 or above.

On Tuesday, March 22, 2016 at 6:28:06 AM UTC+5:30, Jeffrey Sarnoff wrote:
>
> For others, the package is here:  SymEngine.jl 
> 
> There is more difficulty within the package as it is currently configured 
> for distribution.  It does not build properly for me on Linux.
> ERROR: LoadError: failed process: (`cmake ...`) ... while loading 
> /home/jas/.julia/v0.4/SymEngine/deps/build.jl, in expression starting on 
> line 63
> also, try going into the SymEngine build directory and removing any 
> autogenerated files
>
>
>
> On Monday, March 21, 2016 at 1:16:59 PM UTC-4, kunal singh wrote:
>>
>> Hi everyone,
>>
>> I have installed the SymEngine package.
>> I am trying to fix some warning issues in this package.
>>
>> So after I made the correct changes (tested on Travis CI, with no more 
>> warnings there) in this package, I execute the following commands:
>>  julia -e 'Pkg.build("SymEngine")'
>> julia -e 'Pkg.test("SymEngine")'
>>
>> I get the same warnings.
>> What I think is that whatever changes I make are not being compiled 
>> by  julia -e 'Pkg.build("SymEngine")'.
>> I need urgent help, please.
>>
>

[julia-users] Re: Announcing JuDE: autocomplete and jump to definition support for Atom

2016-03-21 Thread James Dang
Hi Evan, no, the console is controlled by Juno (or "julia-client" in Atom), 
and Jude can't hook into that. Besides, Jude is complementary and distinct 
from what Juno provides. Jude does auto-complete within scopes, so you can 
get autocomplete in the middle of a function body you are editing, which 
might be nested deep in a module which is imported into your main script. 
Juno does auto-complete just on your main script, i.e. at global scope, just 
like a Julia REPL. It's probably possible for Juno to eventually hook its 
own auto-completes into the console. It's just not as easy because then it 
can't use Atom's Autocomplete+ package.

Instead, you could type your code directly into your source file, 
leveraging Jude's autocomplete there. Then once you're done typing the 
line, you could type Ctrl-Enter to execute it via Juno. When I use Juno, I 
rarely have to type directly into the console. I usually just edit a script 
file and run the lines individually.




[julia-users] Re: Announcing JuDE: autocomplete and jump to definition support for Atom

2016-03-21 Thread James Dang
Hi Nitin, yeah I haven't set up a way to configure which Julia binary to 
run, so it just expects `julia` on your path. That will be fixed up soon. 
Another user who used homebrew found a workaround: 
https://github.com/jamesdanged/Jude/issues/1


[julia-users] Re: Announcing JuDE: autocomplete and jump to definition support for Atom

2016-03-21 Thread Evan Fields
Looks great - I'm excited to try this out. Does the autocomplete work in 
the console as well? I recently tried Atom flavored Juno and was super 
impressed but found the lack of console completions a major pain point. 
Will this play nicely with Juno's packages?

On Sunday, March 20, 2016 at 2:58:15 PM UTC-4, James Dang wrote:
>
> Hi All, Julia has been great for me, and I wanted to give back a little. 
> LightTable and Atom are great editors, but I was really starting to miss 
> good intellisense-like autocomplete and basic navigation features like 
> jump-to-definition, especially on larger codebases. It's really quite a 
> slog to remember exactly where in which file a function was defined, or 
> what its exact arguments are. And maybe with better tooling, more people 
> will be drawn to the community. So I put a bit of work into a new package 
> for Atom that gives you that!
>
> https://atom.io/packages/jude
>
>
> 
>
>
> This is a bit different from what you get out of julia-client and 
> autocomplete-julia because it does a full syntax parsing and scope 
> resolution of your codebase without executing it in a Julia process. It 
> reparses very quickly on the fly without needing to save. And the matching 
> is precise, not fuzzy, giving you exactly what names are available in the 
> scope you are in currently. It's quite new and unpolished, but please try 
> it out and let me know what you think!
>
> Cheers,
> James
>
>


Re: [julia-users] DataFrame from string

2016-03-21 Thread Eric Forgy
On Tuesday, March 22, 2016 at 12:09:32 AM UTC+8, Milan Bouchet-Valat wrote:
>
> And with the next release (available from git master) you will be able 
> to do this directly: 
> df = csv""" 
> 1, 7.6 
> 2, 45.6 
> 3, 12.1 
> ... 
> """ 
>

This looks cool and my first reaction was "Neat!", but why in Base Julia? 
Will it support string interpolation, e.g.

x = 7.6
df = csv"""
1, $x
2, 45.6
3, 12.1
...
"""

This seems like something suitable for a nice package, but I'm confused why 
it would be added to Base (when I thought one goal was to make Base 
smaller). I am probably confused :)
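In the meantime, one way to get a DataFrame from an inline string is to wrap it in an IOBuffer. This is a sketch against the 0.4-era DataFrames API, assuming its readtable accepts an IO argument; note that an ordinary string literal also answers the interpolation question, since `$x` is substituted before the CSV text is parsed:

```julia
using DataFrames

x = 7.6
# ordinary triple-quoted strings interpolate, so $x is filled in
# before the text is ever treated as CSV
csvdata = """
a,b
1,$x
2,45.6
3,12.1
"""

# assumption: readtable accepts any IO, so an in-memory buffer
# stands in for a file on disk
df = readtable(IOBuffer(csvdata))
```

A dedicated csv"" string macro would have to implement interpolation itself, which is one reason the plain-string route is a reasonable stopgap.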


[julia-users] Declaring a typealias that is also subtype of an abstract type

2016-03-21 Thread Tomas Lycken
Hah - only a couple of hours ago I uploaded this notebook, which outlines one 
possible way of getting there:

http://nbviewer.jupyter.org/github/tlycken/IJulia-Notebooks/blob/master/Using%20macros%20to%20implement%20value-objects%20and%20other%20simple%20wrappers%20in%20Julia.ipynb

// T 

[julia-users] Re: facing problem in building Packages

2016-03-21 Thread Jeffrey Sarnoff
For others, the package is here:  SymEngine.jl 

There is more difficulty within the package as it is currently configured 
for distribution.  It does not build properly for me on Linux.
ERROR: LoadError: failed process: (`cmake ...`) ... while loading 
/home/jas/.julia/v0.4/SymEngine/deps/build.jl, in expression starting on 
line 63
also, try going into the SymEngine build directory and removing any 
autogenerated files



On Monday, March 21, 2016 at 1:16:59 PM UTC-4, kunal singh wrote:

Hi everyone,

I have installed the SymEngine package.
I am trying to fix some warning issues in this package.

So after I made the correct changes (tested on Travis CI, with no more 
warnings there) in this package, I execute the following commands:
 julia -e 'Pkg.build("SymEngine")'
julia -e 'Pkg.test("SymEngine")'

I get the same warnings.
What I think is that whatever changes I make are not being compiled 
by  julia -e 'Pkg.build("SymEngine")'.
I need urgent help, please.
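Putting Jeffrey's suggestion together, a clean-rebuild sequence often clears stale build artifacts. This is a sketch: the path assumes the default Julia 0.4 package layout, and the artifact names (`build`, `CMakeCache.txt`, `CMakeFiles`) are assumptions about what cmake autogenerated:

```shell
# remove autogenerated cmake artifacts, then rebuild and re-test;
# ~/.julia/v0.4 is the default 0.4-era package directory
cd ~/.julia/v0.4/SymEngine/deps
rm -rf build CMakeCache.txt CMakeFiles   # assumed artifact names
julia -e 'Pkg.build("SymEngine")'
julia -e 'Pkg.test("SymEngine")'
```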



[julia-users] Declaring a typealias that is also subtype of an abstract type

2016-03-21 Thread Jean-François Baffier
I am trying to declare a typealias that should be also a subtype of an 
abstract type. For instance, below I would like AliasType to be an alias of 
Vector{Int} and also a subtype of AbstractType. This code will declare the 
alias correctly but not the subtype. (I think it evaluates Vector{Int} <: 
AbstractType instead)

abstract AbstractType

typealias AliasType Vector{Int} <: AbstractType

I can go around the problem by doing something like this:

abstract AbstractType

type AliasType <: AbstractType
  content::Vector{Int}
end

But then many functions that are defined for Vector would not be directly 
available for AliasType, and I would need to overload them.

The reason I want to do this is that I have more complicated types that 
are subtypes of AbstractType. Maybe there is a better solution than the 
second piece of code :)
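One middle ground, given 0.4's type system, is the wrapper type from the second snippet plus explicit delegation of just the Vector methods actually used. A sketch in 0.4 syntax (the forwarded method list is illustrative, not exhaustive):

```julia
abstract AbstractType

immutable AliasType <: AbstractType
    content::Vector{Int}
end

# forward only the Vector methods this type actually needs
Base.length(a::AliasType)      = length(a.content)
Base.getindex(a::AliasType, i) = a.content[i]
Base.push!(a::AliasType, x)    = (push!(a.content, x); a)

v = AliasType([1, 2, 3])
length(v)  # 3
v[2]       # 2
```

The delegation is boilerplate, but it keeps AliasType a genuine subtype of AbstractType, which a typealias cannot be.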


[julia-users] ANN: Julia 0.4.5 released

2016-03-21 Thread Tony Kelman
Hello all! The latest bugfix release of the Julia 0.4.x line has been 
released. Binaries are available from the usual place, and as is typical 
with such things, please report all issues to either the issue tracker, or 
email the julia-users list. 
(If you reply to this message on julia-users, please do not cc julia-news 
which is intended to be low-volume.)

This is a bugfix release; see the commit log for the list of bugs fixed 
between 0.4.3 and 0.4.5. Bugfix backports to the 0.4.x line 
will be continuing with a target of one point release per month. If you are 
a package author and want to rely on functionality that did not work in 
earlier 0.4.x releases but does work in 0.4.5 in your package, please be 
sure to change the minimum julia version in your REQUIRE file to 0.4.5 
accordingly. If you're not sure about this, you can test your package 
specifically against older 0.4.x releases on Travis and/or locally.

This is a recommended upgrade for anyone using previous releases, and 
should act as a drop-in replacement. If you find any regressions relative 
to previous releases, please let us know. There was an issue with 
Pkg.publish in the 0.4.4 tag, thanks to Scott Lundberg for fixing it. 0.4.5 
should be otherwise equivalent.

-Tony



[julia-users] Re: Hello, How do I find my mentor for GSoC2016?

2016-03-21 Thread Jiahao Chen
Tiled matrix-matrix products sound like a good start!


Re: [julia-users] macroexpand entire module

2016-03-21 Thread Yichao Yu
On Mon, Mar 21, 2016 at 5:54 PM,   wrote:
> Oops, maybe name it differently
>
> function expand(ex::Expr)
>   if ex.head == :module
> Expr(:module, ex.args[1], ex.args[2], macroexpand(ex.args[3]))
>   else
> macroexpand(ex)
>   end
> end
>

FYI, expand is another base function.

>
> So if someone were to give me:
>
>
> module M
>
> include("X.jl")
> import X: @y, @z
>
> f(x) = X.@y(3)
>
> end
>
> I would then...
>
> eval(:(module M
> include("X.jl")
> import X: @y, @z
>
> f(x) = X.@y(3)
>
> end))
>
> expand(:(module M ... end))
>
> ?
> Sorry, I'm not sure what this contextual evaluating/expanding would look like.
> It's not clear to me how evaluating the module M will make relevant
> definitions accessible to the expand function, since that ignores the fact
> that I'm inside module M when expanding each form.
>

As I said, you need to manually go through each of the statement in
the module, macro expand and evaluate them.
Try this:

```
julia> function expand_module(ex::Expr)
           @assert ex.head === :module
           std_imports = ex.args[1]::Bool
           name = ex.args[2]::Symbol
           body = ex.args[3]::Expr
           mod = Module(name, std_imports)
           newbody = quote end
           modex = Expr(:module, std_imports, name, newbody)
           for subex in body.args
               expandf = ()->macroexpand(subex)
               subex = eval(mod, :($expandf()))
               push!(newbody.args, subex)
               eval(mod, subex)
           end
           modex, mod
       end
expand_module (generic function with 1 method)

julia> expand_module(:(module A
           macro X()
               1
           end
           b = 1 + @X
           @show b
       end))
b = 2
(:(module A
eval(x) = begin  # none, line 1:
top(Core).eval(A,x)
end
eval(m,x) = begin  # none, line 1:
top(Core).eval(m,x)
end # none, line 2:
$(Expr(:macro, :(X()), quote  # none, line 3:
1
end)) # none, line 5:
b = 1 + 1 # none, line 6:
begin
Base.println("b = ",Base.repr(begin  # show.jl, line 166:
#2#value = b
end))
#2#value
end
end),A)
```

>
>


Re: [julia-users] Re: Parametric splines?

2016-03-21 Thread Kaj Wiik
Tomas,

That's exactly what I was after, thanks!

Kaj

On Sunday, March 20, 2016 at 6:55:19 PM UTC+2, Tomas Lycken wrote:
>
> I tried googling for “parametric splines” but didn’t end up with a concise 
> definition of what they are. If B-splines fit your needs (and it seems from 
> the SciPy documentation that it might do), maybe Interpolations.jl 
>  would be useful enough? 
> The API is a little different, but I think this does what you’re after:
>
> using Interpolations
>
> t = 0:.1:.9
> x = sin(2π*t)
> y = cos(2π*t)
> A = hcat(x,y)
>
> itp = scale(interpolate(A, (BSpline(Cubic(Periodic())), NoInterp()), 
> OnGrid()), t, 1:2)
>
> tfine = 0:.01:1
> xs, ys = [itp[t,1] for t in tfine], [itp[t,2] for t in tfine]
>
> using Gadfly
> plot(layer(x=x,y=y,Geom.point),layer(x=xs,y=ys,Geom.path))
>
> Results on my machine:
>
>
> 
>
> // T
>
> On Sunday, March 20, 2016 at 3:04:12 AM UTC+1, Kyle Barbary wrote:
>
> Hi Kaj,
>>
>> A pull request adding a wrapper for this to Dierckx.jl would be most 
>> welcome. This would be a matter of reading the docstring for the parcur 
>> function here 
>> and then writing a wrapper function that sets up the arguments correctly 
>> and calls the Fortran function with ccall. There are a lot of examples 
>> in Dierckx.jl. It’s a bit tedious but mostly straightforward. Fortunately 
>> (for me anyway) knowing Fortran is not a requirement. I usually consult the 
>> relevant scipy.interpolate wrapper (e.g., this one for splprep) 
>> to see how they handled things, and look at the tests in that package as 
>> well.
>>
>> You can construct an array of vectors by prepending the element type, 
>> which in your example would be Vector{Float64}. For example:
>>
>> julia> a = [1., 2.];
>>
>> julia> b = [3., 4.];
>>
>> julia> Vector{Float64}[a, b]
>> 2-element Array{Array{Float64,1},1}:
>>  [1.0,2.0]
>>  [3.0,4.0]
>>
>> The plan for Julia 0.5 is that you won’t need to prepend the element 
>> type: just [a, b] will do.
>>
>> By the way, collect is often not necessary. For example:
>>
>> julia> t = 0.0:0.1:0.5
>> 0.0:0.1:0.5
>>
>> julia> sin(2*pi*t)
>> 6-element Array{Float64,1}:
>>  0.0
>>  0.587785   
>>  0.951057   
>>  0.951057   
>>  0.587785   
>>  1.22465e-16
>>
>> Best,
>> — Kyle
>> ​
>>
>> On Sat, Mar 19, 2016 at 3:24 PM, Kaj Wiik  wrote:
>>
>>> Replying to myself...sorry.
>>>
>>> It seems that the easiest way for now is to call SciPy:
>>>
>>> using PyCall
>>> @pyimport scipy.interpolate as interpolate
>>> t = collect(0:.1:1)
>>> x = sin(2π*t)
>>> y = cos(2π*t)
>>> p = Array[]
>>> push!(p, x)
>>> push!(p, y)
>>> tck, u = interpolate.splprep(p, s=0)
>>> unew = collect(0:0.01:1)
>>> out = interpolate.splev(unew, tck)
>>>
>>> using Winston
>>> plot(x, y, "o", out[1], out[2], "-r")
>>>
>>> BTW, is there an easier way to create an array of vectors?
>>>
>>> Cheers,
>>> Kaj
>>>
>>>
>>> On Saturday, March 19, 2016 at 4:13:19 PM UTC+2, Kaj Wiik wrote:


 Is there a Julia package that implements parametric splines? 

 I noticed that the Dierckx Fortran library has an implementation but 
 the corresponding Julia package does not have bindings for it.

 Thanks,
 Kaj

>>>
>> ​
>


Re: [julia-users] Re: Parametric splines?

2016-03-21 Thread Kaj Wiik
Hi Kyle,

I'll probably try to write a wrapper when I have more time; I considered it 
already, but I had to finish something quite quickly. Thanks for the tips 
on using collect and arrays of arrays.

Kaj


On Sunday, March 20, 2016 at 4:04:12 AM UTC+2, Kyle Barbary wrote:
>
> Hi Kaj,
>
> A pull request adding a wrapper for this to Dierckx.jl would be most 
> welcome. This would be a matter of reading the docstring for the parcur 
> function here 
> and then writing a wrapper function that sets up the arguments correctly 
> and calls the Fortran function with ccall. There are a lot of examples in 
> Dierckx.jl. It’s a bit tedious but mostly straightforward. Fortunately 
> (for me anyway) knowing Fortran is not a requirement. I usually consult the 
> relevant scipy.interpolate wrapper (e.g., this one for splprep) 
> to see how they handled things, and look at the tests in that package as 
> well.
>
> You can construct an array of vectors by prepending the element type, 
> which in your example would be Vector{Float64}. For example:
>
> julia> a = [1., 2.];
>
> julia> b = [3., 4.];
>
> julia> Vector{Float64}[a, b]
> 2-element Array{Array{Float64,1},1}:
>  [1.0,2.0]
>  [3.0,4.0]
>
> The plan for Julia 0.5 is that you won’t need to prepend the element 
> type: just [a, b] will do.
>
> By the way, collect is often not necessary. For example:
>
> julia> t = 0.0:0.1:0.5
> 0.0:0.1:0.5
>
> julia> sin(2*pi*t)
> 6-element Array{Float64,1}:
>  0.0
>  0.587785   
>  0.951057   
>  0.951057   
>  0.587785   
>  1.22465e-16
>
> Best,
> — Kyle
> ​
>
> On Sat, Mar 19, 2016 at 3:24 PM, Kaj Wiik  > wrote:
>
>> Replying to myself...sorry.
>>
>> It seems that the easiest way for now is to call SciPy:
>>
>> using PyCall
>> @pyimport scipy.interpolate as interpolate
>> t = collect(0:.1:1)
>> x = sin(2π*t)
>> y = cos(2π*t)
>> p = Array[]
>> push!(p, x)
>> push!(p, y)
>> tck, u = interpolate.splprep(p, s=0)
>> unew = collect(0:0.01:1)
>> out = interpolate.splev(unew, tck)
>>
>> using Winston
>> plot(x, y, "o", out[1], out[2], "-r")
>>
>> BTW, is there an easier way to create an array of vectors?
>>
>> Cheers,
>> Kaj
>>
>>
>> On Saturday, March 19, 2016 at 4:13:19 PM UTC+2, Kaj Wiik wrote:
>>>
>>>
>>> Is there a Julia package that implements parametric splines? 
>>>
>>> I noticed that the Dierckx Fortran library has an implementation but the 
>>> corresponding Julia package does not have bindings for it.
>>>
>>> Thanks,
>>> Kaj
>>>
>>
>
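For anyone attempting the Dierckx wrapper Kyle describes, the core mechanism is ccall with a (symbol, library) pair, a return type, and a tuple of argument types. A toy call into libm shows the shape (the actual parcur signature is much more involved and not reproduced here; "libm" naming assumes a Linux-like platform):

```julia
# ccall takes (symbol, library), the return type, a tuple of
# argument types, and then the arguments themselves
y = ccall((:cbrt, "libm"), Float64, (Float64,), 27.0)
# y == 3.0
```

Two Fortran-specific wrinkles to expect: compilers typically append a trailing underscore to the symbol name, and every argument is passed by reference rather than by value.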

Re: [julia-users] macroexpand entire module

2016-03-21 Thread vishesh
Oops, maybe name it differently

function expand(ex::Expr)
  if ex.head == :module
Expr(:module, ex.args[1], ex.args[2], macroexpand(ex.args[3]))
  else
macroexpand(ex)
  end
end


So if someone were to give me:


module M 

include("X.jl")
import X: @y, @z

f(x) = X.@y(3)

end

I would then...

eval(:(module M
include("X.jl")
import X: @y, @z

f(x) = X.@y(3)

end))

expand(:(module M ... end))

? 
Sorry, I'm not sure what this contextual evaluating/expanding would look like. 
It's not clear to me how evaluating the module M will make relevant 
definitions accessible to the expand function, since that ignores the fact 
that I'm inside module M when expanding each form. 





Re: [julia-users] Is it possible to grab REPL contents after a .jl file completes execution ?

2016-03-21 Thread 'Jhan Jar' via julia-users
Thank you Karpinski and Sachs.

It appears that, for now, a workaround outside Julia is the only way.

On Monday, March 21, 2016 at 6:54:22 PM UTC+5, Josef Sachs wrote:
>
> > On Mon, 21 Mar 2016 09:39:40 -0400, Stefan Karpinski said: 
>
> > Could you just pipe the output of non-interactive Julia to the `tee` 
> > command? 
>
> I'd still like to be able to pipe the output of non-non-interactive Julia. 
>
> https://github.com/JuliaLang/julia/issues/14776 
>


Re: [julia-users] macroexpand entire module

2016-03-21 Thread Yichao Yu
On Mon, Mar 21, 2016 at 4:16 PM, Yichao Yu  wrote:
> On Mon, Mar 21, 2016 at 4:11 PM,   wrote:
>> I think I got that much down:
>>
>> function macroexpand(ex::Expr)
>>   if ex.head == :module
>> Expr(:module, ex.args[1], ex.args[2], macroexpand(ex.args[3]))
>>   else
>> macroexpand(ex)
>>   end
>> end
>
> Seems that this will infinitely recurse?
>
>>
>>
>> the issue is that now I want to resolve all imports and such in the module
>> so that when I expand it, I get properly qualified names for the macros.
>>
>> so if I had
>>
>> module M
>>
>> include("X.jl")
>> import X: @y, @z
>>
>> f(x) = X.@y(3)
>>
>> end
>>
>> It would still be able to find X.@y.
>> Evaluating the include/import myself would be kind of risky since someone
>> could clobber a module name in my global namespace (of the program that's
>> running to macroexpand this stuff).
>> Is there a way to sandbox it?

And as I said, you should eval into the new module, not the one your
own toplevel interpreter is running in.

>
> You are looking for a way to sandbox the whole julia runtime. I don't
> think it's currently possible. Also note that macroexpansion is also
> calling arbitrary code.
>
>>
>>
>> On Monday, March 21, 2016 at 1:00:01 PM UTC-7, Yichao Yu wrote:
>>>
>>> On Mon, Mar 21, 2016 at 3:57 PM,   wrote:
>>> > So there's no way to macroexpand the module in the module scope itself?
>>> > I don't mind evaluating the module, but how do I then dump out the
>>> > macroexpanded version of it?
>>> >
>>> > If there's a way to "clear" global scope then it would also be possible
>>> > to
>>> > eval the module, expand in global scope, clean global scope,
>>> > rinse/repeat on
>>> > next module.
>>> > Is there a way to do that instead?
>>> >
>>>
>>> If you want to just repeat what the global interpreter is doing, you
>>> can parse the module, create the module yourself, then macroexpand,
>>> print and evaluate each statement.
>>>
>>> >
>>> > On Monday, March 21, 2016 at 11:55:47 AM UTC-7, Tim Holy wrote:
>>> >>
>>> >> Interesting. julia's `macroexpand` function doesn't seem to work for
>>> >> expressions inside a module:
>>> >>
>>> >> julia> macroexpand( :(module M @time(1+1) end))
>>> >> :(module M
>>> >> eval(x) = begin  # none, line 1:
>>> >> top(Core).eval(M,x)
>>> >> end
>>> >> eval(m,x) = begin  # none, line 1:
>>> >> top(Core).eval(m,x)
>>> >> end # none, line 1:
>>> >> @time 1 + 1
>>> >> end)
>>> >>
>>> >> which is the same thing you get back if you omit the `macroexpand`.
>>> >>
>>> >> Try commenting out the module declaration and see if you like it
>>> >> better.
>>> >>
>>> >> Best,
>>> >> --Tim
>>> >>
>>> >> On Monday, March 21, 2016 11:15:56 AM vis...@stanford.edu wrote:
>>> >> > The MacroExpandJL package seems promising, but maybe I'm not able to
>>> >> > get
>>> >> > it
>>> >> > to work. After updating syntax to match julia 0.4,
>>> >> > MacroExpandJL.macroexpand_jl(STDOUT, :(module M function f(x) 1+@m(2)
>>> >> > end
>>> >> > end))
>>> >> > module M
>>> >> > begin  # line 1:
>>> >> > function f(x) # line 1:
>>> >> > 1 + @m 2
>>> >> > end
>>> >> > endend
>>> >> >
>>> >> > Notice how the @m 2 is still there. Also, why is everything wrapped
>>> >> > in
>>> >> > an
>>> >> > extra do block inside the module? Is this a printing issue, because
>>> >> > that
>>> >> > expression doesn't have one.
>>> >> >
>>> >> > How would I go about evaluating a module and it's macros, macro
>>> >> > expanding
>>> >> > the whole thing, and then dumping it out? @eval seems like, name
>>> >> > wise,
>>> >> > it
>>> >> > should do this but it doesn't.
>>> >> > Do you first eval() the module, then @eval the module? That didn't
>>> >> > work
>>> >> > for
>>> >> > me either.
>>> >> >
>>> >> > Predefining a macro and then trying to evaluate:
>>> >> > > macro m(x) 1 end
>>> >> > > @eval(:(module M function f(x) @m 2 end end))
>>> >> > :
>>> >> > :(module M
>>> >> >
>>> >> > eval(x) = begin  # none, line 1:
>>> >> > top(Core).eval(M,x)
>>> >> > end
>>> >> > eval(m,x) = begin  # none, line 1:
>>> >> > top(Core).eval(m,x)
>>> >> > end # none, line 1:
>>> >> > function f(x) # none, line 1:
>>> >> > @m 2
>>> >> > end
>>> >> > end)
>>> >> >
>>> >> > Also doesn't work.
>>> >> >
>>> >> > On Monday, March 21, 2016 at 7:54:59 AM UTC-7, Tim Holy wrote:
>>> >> > > On Monday, March 21, 2016 09:34:19 AM Stefan Karpinski wrote:
>>> >> > > > Tim, I'm assuming that module must assume that no macros are
>>> >> > > > defined
>>> >> > >
>>> >> > > *and*
>>> >> > >
>>> >> > > > then used within the module body. If that does occur, the only
>>> >> > > > way
>>> >> > > > to do
>>> >> > > > macro expansion correctly is to evaluate the module since the
>>> >> > > > module
>>> >> > > > definition can depend on arbitrary previously evaluated code.
>>> >> > >
>>> >> > > Probably true. I 

Re: [julia-users] macroexpand entire module

2016-03-21 Thread Yichao Yu
On Mon, Mar 21, 2016 at 4:11 PM,   wrote:
> I think I got that much down:
>
> function macroexpand(ex::Expr)
>   if ex.head == :module
> Expr(:module, ex.args[1], ex.args[2], macroexpand(ex.args[3]))
>   else
> macroexpand(ex)
>   end
> end

Seems that this will infinitely recurse?

>
>
> the issue is that now I want to resolve all imports and such in the module
> so that when I expand it, I get properly qualified names for the macros.
>
> so if I had
>
> module M
>
> include("X.jl")
> import X: @y, @z
>
> f(x) = X.@y(3)
>
> end
>
> It would still be able to find X.@y.
> Evaluating the include/import myself would be kind of risky since someone
> could clobber a module name in my global namespace (of the program that's
> running to macroexpand this stuff).
> Is there a way to sandbox it?

You are looking for a way to sandbox the whole julia runtime. I don't
think it's currently possible. Also note that macroexpansion is also
calling arbitrary code.

>
>
> On Monday, March 21, 2016 at 1:00:01 PM UTC-7, Yichao Yu wrote:
>>
>> On Mon, Mar 21, 2016 at 3:57 PM,   wrote:
>> > So there's no way to macroexpand the module in the module scope itself?
>> > I don't mind evaluating the module, but how do I then dump out the
>> > macroexpanded version of it?
>> >
>> > If there's a way to "clear" global scope then it would also be possible
>> > to
>> > eval the module, expand in global scope, clean global scope,
>> > rinse/repeat on
>> > next module.
>> > Is there a way to do that instead?
>> >
>>
>> If you want to just repeat what the global interpreter is doing, you
>> can parse the module, create the module yourself, then macroexpand,
>> print and evaluate each statement.
>>
>> >
>> > On Monday, March 21, 2016 at 11:55:47 AM UTC-7, Tim Holy wrote:
>> >>
>> >> Interesting. julia's `macroexpand` function doesn't seem to work for
>> >> expressions inside a module:
>> >>
>> >> julia> macroexpand( :(module M @time(1+1) end))
>> >> :(module M
>> >> eval(x) = begin  # none, line 1:
>> >> top(Core).eval(M,x)
>> >> end
>> >> eval(m,x) = begin  # none, line 1:
>> >> top(Core).eval(m,x)
>> >> end # none, line 1:
>> >> @time 1 + 1
>> >> end)
>> >>
>> >> which is the same thing you get back if you omit the `macroexpand`.
>> >>
>> >> Try commenting out the module declaration and see if you like it
>> >> better.
>> >>
>> >> Best,
>> >> --Tim
>> >>
>> >> On Monday, March 21, 2016 11:15:56 AM vis...@stanford.edu wrote:
>> >> > The MacroExpandJL package seems promising, but maybe I'm not able to
>> >> > get
>> >> > it
>> >> > to work. After updating syntax to match julia 0.4,
>> >> > MacroExpandJL.macroexpand_jl(STDOUT, :(module M function f(x) 1+@m(2)
>> >> > end
>> >> > end))
>> >> > module M
>> >> > begin  # line 1:
>> >> > function f(x) # line 1:
>> >> > 1 + @m 2
>> >> > end
>> >> > endend
>> >> >
>> >> > Notice how the @m 2 is still there. Also, why is everything wrapped
>> >> > in
>> >> > an
>> >> > extra do block inside the module? Is this a printing issue, because
>> >> > that
>> >> > expression doesn't have one.
>> >> >
>> >> > How would I go about evaluating a module and it's macros, macro
>> >> > expanding
>> >> > the whole thing, and then dumping it out? @eval seems like, name
>> >> > wise,
>> >> > it
>> >> > should do this but it doesn't.
>> >> > Do you first eval() the module, then @eval the module? That didn't
>> >> > work
>> >> > for
>> >> > me either.
>> >> >
>> >> > Predefining a macro and then trying to evaluate:
>> >> > > macro m(x) 1 end
>> >> > > @eval(:(module M function f(x) @m 2 end end))
>> >> > :
>> >> > :(module M
>> >> >
>> >> > eval(x) = begin  # none, line 1:
>> >> > top(Core).eval(M,x)
>> >> > end
>> >> > eval(m,x) = begin  # none, line 1:
>> >> > top(Core).eval(m,x)
>> >> > end # none, line 1:
>> >> > function f(x) # none, line 1:
>> >> > @m 2
>> >> > end
>> >> > end)
>> >> >
>> >> > Also doesn't work.
>> >> >
>> >> > On Monday, March 21, 2016 at 7:54:59 AM UTC-7, Tim Holy wrote:
>> >> > > On Monday, March 21, 2016 09:34:19 AM Stefan Karpinski wrote:
>> >> > > > Tim, I'm assuming that module must assume that no macros are
>> >> > > > defined
>> >> > >
>> >> > > *and*
>> >> > >
>> >> > > > then used within the module body. If that does occur, the only
>> >> > > > way
>> >> > > > to do
>> >> > > > macro expansion correctly is to evaluate the module since the
>> >> > > > module
>> >> > > > definition can depend on arbitrary previously evaluated code.
>> >> > >
>> >> > > Probably true. I haven't played with it in a long time, but it's
>> >> > > possible
>> >> > > you
>> >> > > could load the module (so the macros are defined) and then parse
>> >> > > the
>> >> > > file...but
>> >> > > I can't remember if that works.
>> >> > >
>> >> > > Best,
>> >> > > --Tim
>> >> > >
>> >> > > > On Sun, Mar 20, 2016 at 9:00 PM, Tim Holy 

Re: [julia-users] macroexpand entire module

2016-03-21 Thread vishesh
I think I got that much down:

function macroexpand(ex::Expr)
  if ex.head == :module
Expr(:module, ex.args[1], ex.args[2], macroexpand(ex.args[3]))
  else
macroexpand(ex)
  end
end


the issue is that now I want to resolve all imports and such in the module 
so that when I expand it, I get properly qualified names for the macros.

so if I had

module M 

include("X.jl")
import X: @y, @z

f(x) = X.@y(3)

end

It would still be able to find X.@y. 
Evaluating the include/import myself would be kind of risky since someone 
could clobber a module name in my global namespace (of the program that's 
running to macroexpand this stuff).
Is there a way to sandbox it?


On Monday, March 21, 2016 at 1:00:01 PM UTC-7, Yichao Yu wrote:
>
> On Mon, Mar 21, 2016 at 3:57 PM,   
> wrote: 
> > So there's no way to macroexpand the module in the module scope itself? 
> > I don't mind evaluating the module, but how do I then dump out the 
> > macroexpanded version of it? 
> > 
> > If there's a way to "clear" global scope then it would also be possible 
> to 
> > eval the module, expand in global scope, clean global scope, 
> rinse/repeat on 
> > next module. 
> > Is there a way to do that instead? 
> > 
>
> If you want to just repeat what the global interpreter is doing, you 
> can parse the module, create the module yourself, then macroexpand, 
> print and evaluate each statement. 
>
> > 
> > On Monday, March 21, 2016 at 11:55:47 AM UTC-7, Tim Holy wrote: 
> >> 
> >> Interesting. julia's `macroexpand` function doesn't seem to work for 
> >> expressions inside a module: 
> >> 
> >> julia> macroexpand( :(module M @time(1+1) end)) 
> >> :(module M 
> >> eval(x) = begin  # none, line 1: 
> >> top(Core).eval(M,x) 
> >> end 
> >> eval(m,x) = begin  # none, line 1: 
> >> top(Core).eval(m,x) 
> >> end # none, line 1: 
> >> @time 1 + 1 
> >> end) 
> >> 
Re: [julia-users] macroexpand entire module

2016-03-21 Thread Yichao Yu
On Mon, Mar 21, 2016 at 3:57 PM,   wrote:
> So there's no way to macroexpand the module in the module scope itself?
> I don't mind evaluating the module, but how do I then dump out the
> macroexpanded version of it?
>
> If there's a way to "clear" global scope then it would also be possible to
> eval the module, expand in global scope, clean global scope, rinse/repeat on
> next module.
> Is there a way to do that instead?
>

If you want to just repeat what the global interpreter is doing, you
can parse the module, create the module yourself, then macroexpand,
print and evaluate each statement.
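A rough, untested sketch of that approach (Julia 0.4 syntax; the module name `M` and macro `@m` are just the examples from this thread):

```julia
# Untested sketch: evaluate a module statement-by-statement so that macros
# defined earlier in the module body are visible when later statements are
# expanded. `macroexpand` is run *inside* the new module via eval, so it
# expands relative to that module's scope rather than the global one.
ex = parse("""
module M
macro m(x) 1 end
x = 1 + @m(2)
end
""")
name, body = ex.args[2], ex.args[3]   # Expr(:module, std_imports, name, body)
M = Module(name)
for stmt in body.args
    expanded = eval(M, :(macroexpand($(Expr(:quote, stmt)))))
    println(expanded)                 # dump the expanded form
    eval(M, expanded)                 # then actually evaluate it in M
end
```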

>
> On Monday, March 21, 2016 at 11:55:47 AM UTC-7, Tim Holy wrote:
>>
>> Interesting. julia's `macroexpand` function doesn't seem to work for
>> expressions inside a module:
>>
>> julia> macroexpand( :(module M @time(1+1) end))
>> :(module M
>> eval(x) = begin  # none, line 1:
>> top(Core).eval(M,x)
>> end
>> eval(m,x) = begin  # none, line 1:
>> top(Core).eval(m,x)
>> end # none, line 1:
>> @time 1 + 1
>> end)
>>
>> which is the same thing you get back if you omit the `macroexpand`.
>>
>> Try commenting out the module declaration and see if you like it better.
>>
>> Best,
>> --Tim
>>
>> On Monday, March 21, 2016 11:15:56 AM vis...@stanford.edu wrote:
>> > The MacroExpandJL package seems promising, but maybe I'm not able to get
>> > it
>> > to work. After updating syntax to match julia 0.4,
>> > MacroExpandJL.macroexpand_jl(STDOUT, :(module M function f(x) 1+@m(2)
>> > end
>> > end))
>> > module M
>> > begin  # line 1:
>> > function f(x) # line 1:
>> > 1 + @m 2
>> > end
>> > endend
>> >
>> > Notice how the @m 2 is still there. Also, why is everything wrapped in
>> > an
>> > extra do block inside the module? Is this a printing issue, because that
>> > expression doesn't have one.
>> >
>> > How would I go about evaluating a module and it's macros, macro
>> > expanding
>> > the whole thing, and then dumping it out? @eval seems like, name wise,
>> > it
>> > should do this but it doesn't.
>> > Do you first eval() the module, then @eval the module? That didn't work
>> > for
>> > me either.
>> >
>> > Predefining a macro and then trying to evaluate:
>> > > macro m(x) 1 end
>> > > @eval(:(module M function f(x) @m 2 end end))
>> > :
>> > :(module M
>> >
>> > eval(x) = begin  # none, line 1:
>> > top(Core).eval(M,x)
>> > end
>> > eval(m,x) = begin  # none, line 1:
>> > top(Core).eval(m,x)
>> > end # none, line 1:
>> > function f(x) # none, line 1:
>> > @m 2
>> > end
>> > end)
>> >
>> > Also doesn't work.
>> >
>> > On Monday, March 21, 2016 at 7:54:59 AM UTC-7, Tim Holy wrote:
>> > > On Monday, March 21, 2016 09:34:19 AM Stefan Karpinski wrote:
>> > > > Tim, I'm assuming that module must assume that no macros are defined
>> > >
>> > > *and*
>> > >
>> > > > then used within the module body. If that does occur, the only way
>> > > > to do
>> > > > macro expansion correctly is to evaluate the module since the module
>> > > > definition can depend on arbitrary previously evaluated code.
>> > >
>> > > Probably true. I haven't played with it in a long time, but it's
>> > > possible
>> > > you
>> > > could load the module (so the macros are defined) and then parse the
>> > > file...but
>> > > I can't remember if that works.
>> > >
>> > > Best,
>> > > --Tim
>> > >
>> > > > On Sun, Mar 20, 2016 at 9:00 PM, Tim Holy > > >
>> > > > wrote:
>> > > > > It probably needs updating, but
>> > > > > https://github.com/timholy/MacroExpandJL.jl
>> > > > > might help. It lets you macroexpand a whole source file.
>> > > > >
>> > > > > Best,
>> > > > > --Tim
>> > > > >
>> > > > > On Sunday, March 20, 2016 08:53:49 PM Yichao Yu wrote:
>> > > > > > On Sun, Mar 20, 2016 at 8:26 PM,  > > > > > > >
>> > >
>> > > wrote:
>> > > > > > > Hi all,
>> > > > > > >
>> > > > > > > I'd like to be able to load in a module, then macroexpand the
>> > >
>> > > whole
>> > >
>> > > > > thing,
>> > > > >
>> > > > > > > then print out the macroexpanded version.
>> > > > > > >
>> > > > > > > This should be a full, recursive macroexpand.
>> > > > > > >
>> > > > > > > I've noticed there is a function called macroexpand that
>> > > > > > > normally
>> > >
>> > > does
>> > >
>> > > > > > > what
>> > > > > > >
>> > > > > > > i want:
>> > > > > > >> macro m(x) 1 end
>> > > > > > >
>> > > > > > > ..
>> > > > > > >
>> > > > > > >> @m(2)
>> > > > > > >
>> > > > > > > 1
>> > > > > > >
>> > > > > > >> macroexpand(:(1 + @m(2)))
>> > > > > > >>
>> > > > > > > :(1 + 1)
>> > > > > > >
>> > > > > > > so that is fine and dandy, but inside a module this doesn't
>> > > > > > > seem
>> > >
>> > > to
>> > >
>> > > > > work:
>> > > > > > >> macroexpand(:(
>> > > > > > >>
>> > > > > > >module M
>> > > > > > >macro m(x) 1 end
>> > > > > > >x = 1 + @m(2)
>> > > > > > >  

Re: [julia-users] macroexpand entire module

2016-03-21 Thread vishesh
So there's no way to macroexpand the module in the module scope itself? 
I don't mind evaluating the module, but how do I then dump out the 
macroexpanded version of it?

If there's a way to "clear" global scope then it would also be possible to 
eval the module, expand in global scope, clean global scope, rinse/repeat 
on next module.
Is there a way to do that instead?


On Monday, March 21, 2016 at 11:55:47 AM UTC-7, Tim Holy wrote:
>
> Interesting. julia's `macroexpand` function doesn't seem to work for 
> expressions inside a module: 
>
> julia> macroexpand( :(module M @time(1+1) end)) 
> :(module M 
> eval(x) = begin  # none, line 1: 
> top(Core).eval(M,x) 
> end 
> eval(m,x) = begin  # none, line 1: 
> top(Core).eval(m,x) 
> end # none, line 1: 
> @time 1 + 1 
> end) 
>
> which is the same thing you get back if you omit the `macroexpand`. 
>
> Try commenting out the module declaration and see if you like it better. 
>
> Best, 
> --Tim 
>
> On Monday, March 21, 2016 11:15:56 AM vis...@stanford.edu  
> wrote: 
> > The MacroExpandJL package seems promising, but maybe I'm not able to get 
> it 
> > to work. After updating syntax to match julia 0.4, 
> > MacroExpandJL.macroexpand_jl(STDOUT, :(module M function f(x) 1+@m(2) 
> end 
> > end)) 
> > module M 
> > begin  # line 1: 
> > function f(x) # line 1: 
> > 1 + @m 2 
> > end 
> > endend 
> > 
> > Notice how the @m 2 is still there. Also, why is everything wrapped in 
> an 
> > extra do block inside the module? Is this a printing issue, because that 
> > expression doesn't have one. 
> > 
> > How would I go about evaluating a module and it's macros, macro 
> expanding 
> > the whole thing, and then dumping it out? @eval seems like, name wise, 
> it 
> > should do this but it doesn't. 
> > Do you first eval() the module, then @eval the module? That didn't work 
> for 
> > me either. 
> > 
> > Predefining a macro and then trying to evaluate: 
> > > macro m(x) 1 end 
> > > @eval(:(module M function f(x) @m 2 end end)) 
> > : 
> > :(module M 
> > 
> > eval(x) = begin  # none, line 1: 
> > top(Core).eval(M,x) 
> > end 
> > eval(m,x) = begin  # none, line 1: 
> > top(Core).eval(m,x) 
> > end # none, line 1: 
> > function f(x) # none, line 1: 
> > @m 2 
> > end 
> > end) 
> > 
> > Also doesn't work. 
> > 
> > On Monday, March 21, 2016 at 7:54:59 AM UTC-7, Tim Holy wrote: 
> > > On Monday, March 21, 2016 09:34:19 AM Stefan Karpinski wrote: 
> > > > Tim, I'm assuming that module must assume that no macros are defined 
> > > 
> > > *and* 
> > > 
> > > > then used within the module body. If that does occur, the only way 
> to do 
> > > > macro expansion correctly is to evaluate the module since the module 
> > > > definition can depend on arbitrary previously evaluated code. 
> > > 
> > > Probably true. I haven't played with it in a long time, but it's 
> possible 
> > > you 
> > > could load the module (so the macros are defined) and then parse the 
> > > file...but 
> > > I can't remember if that works. 
> > > 
> > > Best, 
> > > --Tim 
> > > 
> > > > On Sun, Mar 20, 2016 at 9:00 PM, Tim Holy  > > 
> > > > wrote: 
> > > > > It probably needs updating, but 
> > > > > https://github.com/timholy/MacroExpandJL.jl 
> > > > > might help. It lets you macroexpand a whole source file. 
> > > > > 
> > > > > Best, 
> > > > > --Tim 
> > > > > 
> > > > > On Sunday, March 20, 2016 08:53:49 PM Yichao Yu wrote: 
> > > > > > On Sun, Mar 20, 2016 at 8:26 PM,   > > > > > > 
> > > 
> > > wrote: 
> > > > > > > Hi all, 
> > > > > > > 
> > > > > > > I'd like to be able to load in a module, then macroexpand the 
> > > 
> > > whole 
> > > 
> > > > > thing, 
> > > > > 
> > > > > > > then print out the macroexpanded version. 
> > > > > > > 
> > > > > > > This should be a full, recursive macroexpand. 
> > > > > > > 
> > > > > > > I've noticed there is a function called macroexpand that 
> normally 
> > > 
> > > does 
> > > 
> > > > > > > what 
> > > > > > > 
> > > > > > > i want: 
> > > > > > >> macro m(x) 1 end 
> > > > > > > 
> > > > > > > .. 
> > > > > > > 
> > > > > > >> @m(2) 
> > > > > > > 
> > > > > > > 1 
> > > > > > > 
> > > > > > >> macroexpand(:(1 + @m(2))) 
> > > > > > >> 
> > > > > > > :(1 + 1) 
> > > > > > > 
> > > > > > > so that is fine and dandy, but inside a module this doesn't 
> seem 
> > > 
> > > to 
> > > 
> > > > > work: 
> > > > > > >> macroexpand(:( 
> > > > > > >> 
> > > > > > >module M 
> > > > > > >macro m(x) 1 end 
> > > > > > >x = 1 + @m(2) 
> > > > > > >end 
> > > > > > >)) 
> > > > > > > : 
> > > > > > > :(module M 
> > > > > > > : 
> > > > > > > eval(x) = begin  # none, line 2: 
> > > > > > > top(Core).eval(M,x) 
> > > > > > > 
> > > > > > > end 
> > > > > > > 
> > > > > > > eval(m,x) = begin  # 

[julia-users] Re: Hello, How do I find my mentor for GSoC2016?

2016-03-21 Thread Hyun
Thank you for your interest and advice.

How about focusing on "implementations of massively parallel dense linear 
algebra routines" and "implementing native Julia algorithms involving 
efficient, cache-conscious matrix operations on tiled matrices"?
I am interested in cache-conscious parallel algorithms using tiled matrices.
Can you give me advice on narrowing down the proposal?

I'm sorry if the code I showed you is messy; that is because several kinds 
of parallelism are put together, and I don't think I can finish the full 
implementation there by this Friday (it would need to build a dependency 
graph, create a parallel queue from it, and make sub-matrices and coordinate 
them in pipelined parallelism using the lowered dependencies, and so on).
How about I give you a parallel matrix-matrix multiplication implementation 
in Julia using tiled matrices instead? I think I can do that within one or 
two days.
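As a concrete starting point, a minimal serial tiled multiply could look like the sketch below (illustrative only: square matrices and a fixed tile size are assumed, and this is not the pipelined/parallel version described above):

```julia
# Serial cache-blocked (tiled) matrix-matrix multiply sketch.
# Assumes A, B, C are square and of matching size; `ts` is the tile size.
function tiled_mul!(C, A, B, ts=64)
    n = size(A, 1)
    fill!(C, 0.0)
    for jj in 1:ts:n, kk in 1:ts:n, ii in 1:ts:n
        # Multiply one ts-by-ts tile of A with one tile of B into a tile of C.
        for j in jj:min(jj + ts - 1, n), k in kk:min(kk + ts - 1, n)
            @inbounds for i in ii:min(ii + ts - 1, n)
                C[i, j] += A[i, k] * B[k, j]
            end
        end
    end
    return C
end
```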

P.S. I'm sorry for my late reply; I forgot to check google-groups.

Best,
Yonghyun.


On Saturday, March 19, 2016 at 11:13:01 AM UTC+9, Jiahao Chen wrote:
>
> > I tried creating my own matrix power function using Python before, I 
> used data parallelism(tiling), task parallelism(using topology) and 
> pipelining parallelism(using lowered dependency).
>
> As you say, attacking all levels of parallelism is very ambitious. 
> However, I think working on just one of these parallelism structures would 
> make for a good summer project. I would recommend you to pick one of these 
> approaches and apply it to a widely used computation such as matrix-matrix 
> multiplication.
>
> I had a look at your Python implementation
>
> https://github.com/usefulhyun/parallel_mmm/blob/master/prllmpow/prllmpow.py
>
> and it is quite hard to understand. If you can translate the essential 
> parts into Julia and show how you can use features like Julia types and 
> overloading of Julia's generic functions like * to make the code readable 
> yet efficient, then I think we can make a good case for your participation 
> in the Google Summer of Code. Without a Julia code sample to evaluate, it 
> is quite difficult to make a strong case for participation.
>


Re: [julia-users] Crash while returning a large matrix from a function

2016-03-21 Thread Eduardo Lenz
Hi.

Actually, the code was using a smaller data set and, in one of the runs, we 
just set a large number of samples. The obvious approach is to parse the data 
directly from a file, but it makes me a little nervous that it took that long 
and crashed without a warning.

Thanks and sorry for the noise.



On Monday, March 21, 2016 at 4:36:55 PM UTC-3, Stefan Karpinski wrote:
>
> Loading large data by evaluating code that constructs it is not a good 
> approach. You'd be much better off using a standard data format like JLD 
> .
>
> On Mon, Mar 21, 2016 at 3:15 PM, Eduardo Lenz  > wrote:
>
>> Hi.
>>
>> I am trying to load a 180 x 1025 matrix of floats, defined in a .jl file, 
>> like 
>>
>> function Data()
>> data = [1E-4 ... 3.2E-5 
>>  ..
>>  .0.0 ] 
>> end
>>
>> The simple act of including this file takes 50s in my computer and, when 
>> the function
>> is called, 
>>
>> data = Data()
>>
>> it takes more than two hours to finally crash without warning. 
>>
>> I can easily create such matrix with rand(180,1025), such that memory is 
>> not
>> a problem.  I can also use it with a smaller number of lines without any 
>> issues.
>>
>> Is it a known limitation, or am I doing something wrong here?
>>
>> I am using the latest 0.4.3 Windows version (downloaded directly from the 
>> page).
>>
>> Thanks.
>>
>
>

Re: [julia-users] macroexpand entire module

2016-03-21 Thread Yichao Yu
On Mon, Mar 21, 2016 at 2:55 PM, Tim Holy  wrote:
> Interesting. julia's `macroexpand` function doesn't seem to work for
> expressions inside a module:
>
> julia> macroexpand( :(module M @time(1+1) end))
> :(module M
> eval(x) = begin  # none, line 1:
> top(Core).eval(M,x)
> end
> eval(m,x) = begin  # none, line 1:
> top(Core).eval(m,x)
> end # none, line 1:
> @time 1 + 1
> end)
>
> which is the same thing you get back if you omit the `macroexpand`.
>
> Try commenting out the module declaration and see if you like it better.

Right, it can't, for the reason I mentioned (without evaluating the
whole module). It can of course, in principle, blindly expand the
macro in the current global scope, but that won't be what the code is
actually doing, so I don't think it would be useful.

>
> Best,
> --Tim
>
> On Monday, March 21, 2016 11:15:56 AM vish...@stanford.edu wrote:
>> The MacroExpandJL package seems promising, but maybe I'm not able to get it
>> to work. After updating syntax to match julia 0.4,
>> MacroExpandJL.macroexpand_jl(STDOUT, :(module M function f(x) 1+@m(2) end
>> end))
>> module M
>> begin  # line 1:
>> function f(x) # line 1:
>> 1 + @m 2
>> end
>> endend
>>
>> Notice how the @m 2 is still there. Also, why is everything wrapped in an
>> extra do block inside the module? Is this a printing issue, because that
>> expression doesn't have one.
>>
>> How would I go about evaluating a module and it's macros, macro expanding
>> the whole thing, and then dumping it out? @eval seems like, name wise, it
>> should do this but it doesn't.
>> Do you first eval() the module, then @eval the module? That didn't work for
>> me either.
>>
>> Predefining a macro and then trying to evaluate:
>> > macro m(x) 1 end
>> > @eval(:(module M function f(x) @m 2 end end))
>> :
>> :(module M
>>
>> eval(x) = begin  # none, line 1:
>> top(Core).eval(M,x)
>> end
>> eval(m,x) = begin  # none, line 1:
>> top(Core).eval(m,x)
>> end # none, line 1:
>> function f(x) # none, line 1:
>> @m 2
>> end
>> end)
>>
>> Also doesn't work.
>>
>> On Monday, March 21, 2016 at 7:54:59 AM UTC-7, Tim Holy wrote:
>> > On Monday, March 21, 2016 09:34:19 AM Stefan Karpinski wrote:
>> > > Tim, I'm assuming that module must assume that no macros are defined
>> >
>> > *and*
>> >
>> > > then used within the module body. If that does occur, the only way to do
>> > > macro expansion correctly is to evaluate the module since the module
>> > > definition can depend on arbitrary previously evaluated code.
>> >
>> > Probably true. I haven't played with it in a long time, but it's possible
>> > you
>> > could load the module (so the macros are defined) and then parse the
>> > file...but
>> > I can't remember if that works.
>> >
>> > Best,
>> > --Tim
>> >
>> > > On Sun, Mar 20, 2016 at 9:00 PM, Tim Holy > >
>> > > wrote:
>> > > > It probably needs updating, but
>> > > > https://github.com/timholy/MacroExpandJL.jl
>> > > > might help. It lets you macroexpand a whole source file.
>> > > >
>> > > > Best,
>> > > > --Tim
>> > > >
>> > > > On Sunday, March 20, 2016 08:53:49 PM Yichao Yu wrote:
>> > > > > On Sun, Mar 20, 2016 at 8:26 PM,  > > > > > >
>> >
>> > wrote:
>> > > > > > Hi all,
>> > > > > >
>> > > > > > I'd like to be able to load in a module, then macroexpand the
>> >
>> > whole
>> >
>> > > > thing,
>> > > >
>> > > > > > then print out the macroexpanded version.
>> > > > > >
>> > > > > > This should be a full, recursive macroexpand.
>> > > > > >
>> > > > > > I've noticed there is a function called macroexpand that normally
>> >
>> > does
>> >
>> > > > > > what
>> > > > > >
>> > > > > > i want:
>> > > > > >> macro m(x) 1 end
>> > > > > >
>> > > > > > ..
>> > > > > >
>> > > > > >> @m(2)
>> > > > > >
>> > > > > > 1
>> > > > > >
>> > > > > >> macroexpand(:(1 + @m(2)))
>> > > > > >>
>> > > > > > :(1 + 1)
>> > > > > >
>> > > > > > so that is fine and dandy, but inside a module this doesn't seem
>> >
>> > to
>> >
>> > > > work:
>> > > > > >> macroexpand(:(
>> > > > > >>
>> > > > > >module M
>> > > > > >macro m(x) 1 end
>> > > > > >x = 1 + @m(2)
>> > > > > >end
>> > > > > >))
>> > > > > > :
>> > > > > > :(module M
>> > > > > > :
>> > > > > > eval(x) = begin  # none, line 2:
>> > > > > > top(Core).eval(M,x)
>> > > > > >
>> > > > > > end
>> > > > > >
>> > > > > > eval(m,x) = begin  # none, line 2:
>> > > > > > top(Core).eval(m,x)
>> > > > > >
>> > > > > > end # none, line 3:
>> > > > > > $(Expr(:macro, :(m(x)), quote  # none, line 3:
>> > > > > > 1
>> > > > > >
>> > > > > > end)) # none, line 4:
>> > > > > > x = 1 + @m(2)
>> > > > > > end)
>> > > > > >
>> > > > > > As you can see in the second to last line, @m(2) is not expanded,
>> >
>> > and
>> >
>> > > > I'm
>> > > >

Re: [julia-users] Crash while returning a large matrix from a function

2016-03-21 Thread Stefan Karpinski
Loading large data by evaluating code that constructs it is not a good
approach. You'd be much better off using a standard data format like JLD
.
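For instance, with the JLD package (the file name and key below are illustrative):

```julia
using JLD

data = rand(180, 1025)            # stand-in for the real matrix
save("mydata.jld", "data", data)  # write once, as binary HDF5
data = load("mydata.jld", "data") # fast to load; no parsing or compilation
```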

On Mon, Mar 21, 2016 at 3:15 PM, Eduardo Lenz 
wrote:

> Hi.
>
> I am trying to load a 180 x 1025 matrix of floats, defined in a .jl file,
> like
>
> function Data()
> data = [1E-4 ... 3.2E-5
>  ..
>  .0.0 ]
> end
>
> The simple act of including this file takes 50s in my computer and, when
> the function
> is called,
>
> data = Data()
>
> it takes more than two hours to finally crash without warning.
>
> I can easily create such matrix with rand(180,1025), such that memory is
> not
> a problem.  I can also use it with a smaller number of lines without any
> issues.
>
> Is it a known limitation, or am I doing something wrong here?
>
> I am using the latest 0.4.3 Windows version (downloaded directly from the
> page).
>
> Thanks.
>


[julia-users] Re: Crash while returning a large matrix from a function

2016-03-21 Thread Kristoffer Carlsson
You probably shouldn't just include the file, since that means it has to be 
parsed and the function compiled, etc. It will be much faster to just read a 
file containing the data.

This link might be 
useful: 
https://en.wikibooks.org/wiki/Introducing_Julia/Working_with_text_files#Writing_and_reading_array_to_and_from_a_file
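For example, with the `writedlm`/`readdlm` functions from Base (the file name below is illustrative):

```julia
data = rand(180, 1025)      # stand-in for the real matrix
writedlm("data.txt", data)  # write once as delimited text
data = readdlm("data.txt")  # read back a 180x1025 Array{Float64,2}
```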
On Monday, March 21, 2016 at 8:15:02 PM UTC+1, Eduardo Lenz wrote:
>
> Hi.
>
> I am trying to load a 180 x 1025 matrix of floats, defined in a .jl file, 
> like 
>
> function Data()
> data = [1E-4 ... 3.2E-5 
>  ..
>  .0.0 ] 
> end
>
> The simple act of including this file takes 50s in my computer and, when 
> the function
> is called, 
>
> data = Data()
>
> it takes more than two hours to finally crash without warning. 
>
> I can easily create such matrix with rand(180,1025), such that memory is 
> not
> a problem.  I can also use it with a smaller number of lines without any 
> issues.
>
> Is it a known limitation, or am I doing something wrong here?
>
> I am using the latest 0.4.3 Windows version (downloaded directly from the 
> page).
>
> Thanks.
>


Re: [julia-users] CUDArt.jl - segfaults after updating Julia

2016-03-21 Thread Stefan Karpinski
I'm guessing that CUDArt doesn't support the Julia dev branch – you're
probably safe using a stable release of Julia instead – i.e. the latest
v0.4.x release.

On Mon, Mar 21, 2016 at 3:13 PM, Matthew Pearce 
wrote:

> I recently updated Julia to the latest version:
>
> ```julia
> julia> versioninfo()
> Julia Version 0.5.0-dev+3220
> Commit c18bc53 (2016-03-21 11:09 UTC)```
>
> After doing so I get errors when trying to use CUDArt.jl.
> I'm writing to ask whether my best bet is to fully clean and remake my
> julia build or whether this is likely to be something to do with my
> configuration of CUDArt.
> Prior to this I had CUDArt working fine (previous was commit 83eac1e* (
> 2015-10-13 16:00 UTC))
>
> The error message generated is like:
>
> ```julia
> julia> Pkg.test("CUDArt")
> INFO: Testing CUDArt
>
> signal (11): Segmentation fault
> while loading /home/mcp50/.julia/v0.5/CUDArt/test/gc.jl, in expression
> starting on line 1
> jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1577
> [inline] at /home/mcp50/soft/julia/src/dump.c:1169
> jl_deserialize_datatype at /home/mcp50/soft/julia/src/dump.c:1568
> jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1378
> jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1448
> jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1378
> jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1438
> jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1595
> jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1595
> jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1595
> [inline] at /home/mcp50/soft/julia/src/julia.h:573
> jl_gc_wb at /home/mcp50/soft/julia/src/dump.c:1568
> jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1318
> [inline] at /home/mcp50/soft/julia/src/julia.h:573
> jl_gc_wb at /home/mcp50/soft/julia/src/dump.c:1568
> jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1436
> jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1378
> jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1448
> jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1378
> jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1438
> jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1595
> jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1595
> jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1595
> [inline] at /home/mcp50/soft/julia/src/julia.h:573
> jl_gc_wb at /home/mcp50/soft/julia/src/dump.c:1568
> jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1378
> jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1448
> jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1595
> jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1595
> jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1595
> [inline] at /home/mcp50/soft/julia/src/julia.h:573
> jl_gc_wb at /home/mcp50/soft/julia/src/dump.c:1568
> jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1499
> [inline] at /home/mcp50/soft/julia/src/julia.h:573
> jl_gc_wb at /home/mcp50/soft/julia/src/dump.c:1492
> jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1378
> _jl_restore_incremental at /home/mcp50/soft/julia/src/dump.c:2275
> jl_restore_incremental at /home/mcp50/soft/julia/src/dump.c:2353
> _require_from_serialized at ./loading.jl:165
> unknown function (ip: 0x7f60febe4426)
> [inline] at /home/mcp50/soft/julia/src/julia_internal.h:69
> jl_call_method_internal at /home/mcp50/soft/julia/src/gf.c:1863
> _require_from_serialized at ./loading.jl:193
> require at ./loading.jl:323
> unknown function (ip: 0x7f60f3bb71ac)
> [inline] at /home/mcp50/soft/julia/src/julia_internal.h:69
> jl_call_method_internal at /home/mcp50/soft/julia/src/gf.c:1863
> eval_import_path_ at /home/mcp50/soft/julia/src/toplevel.c:379
> jl_toplevel_eval_flex at /home/mcp50/soft/julia/src/toplevel.c:471
> jl_parse_eval_all at /home/mcp50/soft/julia/src/ast.c:784
> jl_load at /home/mcp50/soft/julia/src/toplevel.c:580
> include at ./boot.jl:240
> [inline] at /home/mcp50/soft/julia/src/julia_internal.h:69
> jl_call_method_internal at /home/mcp50/soft/julia/src/gf.c:1863
> include_from_node1 at ./loading.jl:417
> unknown function (ip: 0x7f60f3b56785)
> [inline] at /home/mcp50/soft/julia/src/julia_internal.h:69
> jl_call_method_internal at /home/mcp50/soft/julia/src/gf.c:1863
> do_call at /home/mcp50/soft/julia/src/interpreter.c:66
> eval at /home/mcp50/soft/julia/src/interpreter.c:185
> jl_toplevel_eval_flex at /home/mcp50/soft/julia/src/toplevel.c:557
> jl_parse_eval_all at /home/mcp50/soft/julia/src/ast.c:784
> jl_load at /home/mcp50/soft/julia/src/toplevel.c:580
> include at ./boot.jl:240
> [inline] at /home/mcp50/soft/julia/src/julia_internal.h:69
> jl_call_method_internal at /home/mcp50/soft/julia/src/gf.c:1863
> include_from_node1 at ./loading.jl:417
> unknown function (ip: 0x7f60f3ba3755)
> [inline] at /home/mcp50/soft/julia/src/julia_internal.h:69
> 

[julia-users] Crash while returning a large matrix from a function

2016-03-21 Thread Eduardo Lenz
Hi.

I am trying to load a 180 x 1025 matrix of floats, defined in a .jl file, 
like 

function Data()
data = [1E-4 ... 3.2E-5 
 ..
 .0.0 ] 
end

The simple act of including this file takes 50s in my computer and, when 
the function
is called, 

data = Data()

it takes more than two hours to finally crash without warning. 

I can easily create such matrix with rand(180,1025), such that memory is not
a problem.  I can also use it with a smaller number of lines without any 
issues.

Is it a known limitation, or am I doing something wrong here?

I am using the latest 0.4.3 Windows version (downloaded directly from the 
page).

Thanks.


[julia-users] CUDArt.jl - segfaults after updating Julia

2016-03-21 Thread Matthew Pearce
I recently updated Julia to the latest version:

```julia
julia> versioninfo()
Julia Version 0.5.0-dev+3220
Commit c18bc53 (2016-03-21 11:09 UTC)```

After doing so I get errors when trying to use CUDArt.jl. 
I'm writing to ask whether my best bet is to fully clean and remake my 
julia build or whether this is likely to be something to do with my 
configuration of CUDArt.
Prior to this I had CUDArt working fine (previous was commit 83eac1e* (2015-
10-13 16:00 UTC))

The error message generated is like:

```julia
julia> Pkg.test("CUDArt")
INFO: Testing CUDArt

signal (11): Segmentation fault
while loading /home/mcp50/.julia/v0.5/CUDArt/test/gc.jl, in expression 
starting on line 1
jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1577
[inline] at /home/mcp50/soft/julia/src/dump.c:1169
jl_deserialize_datatype at /home/mcp50/soft/julia/src/dump.c:1568
jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1378
jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1448
jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1378
jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1438
jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1595
jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1595
jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1595
[inline] at /home/mcp50/soft/julia/src/julia.h:573
jl_gc_wb at /home/mcp50/soft/julia/src/dump.c:1568
jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1318
[inline] at /home/mcp50/soft/julia/src/julia.h:573
jl_gc_wb at /home/mcp50/soft/julia/src/dump.c:1568
jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1436
jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1378
jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1448
jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1378
jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1438
jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1595
jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1595
jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1595
[inline] at /home/mcp50/soft/julia/src/julia.h:573
jl_gc_wb at /home/mcp50/soft/julia/src/dump.c:1568
jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1378
jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1448
jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1595
jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1595
jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1595
[inline] at /home/mcp50/soft/julia/src/julia.h:573
jl_gc_wb at /home/mcp50/soft/julia/src/dump.c:1568
jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1499
[inline] at /home/mcp50/soft/julia/src/julia.h:573
jl_gc_wb at /home/mcp50/soft/julia/src/dump.c:1492
jl_deserialize_value_ at /home/mcp50/soft/julia/src/dump.c:1378
_jl_restore_incremental at /home/mcp50/soft/julia/src/dump.c:2275
jl_restore_incremental at /home/mcp50/soft/julia/src/dump.c:2353
_require_from_serialized at ./loading.jl:165
unknown function (ip: 0x7f60febe4426)
[inline] at /home/mcp50/soft/julia/src/julia_internal.h:69
jl_call_method_internal at /home/mcp50/soft/julia/src/gf.c:1863
_require_from_serialized at ./loading.jl:193
require at ./loading.jl:323
unknown function (ip: 0x7f60f3bb71ac)
[inline] at /home/mcp50/soft/julia/src/julia_internal.h:69
jl_call_method_internal at /home/mcp50/soft/julia/src/gf.c:1863
eval_import_path_ at /home/mcp50/soft/julia/src/toplevel.c:379
jl_toplevel_eval_flex at /home/mcp50/soft/julia/src/toplevel.c:471
jl_parse_eval_all at /home/mcp50/soft/julia/src/ast.c:784
jl_load at /home/mcp50/soft/julia/src/toplevel.c:580
include at ./boot.jl:240
[inline] at /home/mcp50/soft/julia/src/julia_internal.h:69
jl_call_method_internal at /home/mcp50/soft/julia/src/gf.c:1863
include_from_node1 at ./loading.jl:417
unknown function (ip: 0x7f60f3b56785)
[inline] at /home/mcp50/soft/julia/src/julia_internal.h:69
jl_call_method_internal at /home/mcp50/soft/julia/src/gf.c:1863
do_call at /home/mcp50/soft/julia/src/interpreter.c:66
eval at /home/mcp50/soft/julia/src/interpreter.c:185
jl_toplevel_eval_flex at /home/mcp50/soft/julia/src/toplevel.c:557
jl_parse_eval_all at /home/mcp50/soft/julia/src/ast.c:784
jl_load at /home/mcp50/soft/julia/src/toplevel.c:580
include at ./boot.jl:240
[inline] at /home/mcp50/soft/julia/src/julia_internal.h:69
jl_call_method_internal at /home/mcp50/soft/julia/src/gf.c:1863
include_from_node1 at ./loading.jl:417
unknown function (ip: 0x7f60f3ba3755)
[inline] at /home/mcp50/soft/julia/src/julia_internal.h:69
jl_call_method_internal at /home/mcp50/soft/julia/src/gf.c:1863
process_options at ./client.jl:266
_start at ./client.jl:318
unknown function (ip: 0x7f60f3b9d2a2)
[inline] at /home/mcp50/soft/julia/src/julia_internal.h:69
jl_call_method_internal at /home/mcp50/soft/julia/src/gf.c:1863
unknown function (ip: 0x401c1d)
unknown function (ip: 0x402e11)
__libc_start_main at /lib64/libc.so.6 (unknown line)
Allocations: 

Re: [julia-users] macroexpand entire module

2016-03-21 Thread Tim Holy
Interesting. julia's `macroexpand` function doesn't seem to work for 
expressions inside a module:

julia> macroexpand( :(module M @time(1+1) end))
:(module M
eval(x) = begin  # none, line 1:
top(Core).eval(M,x)
end
eval(m,x) = begin  # none, line 1:
top(Core).eval(m,x)
end # none, line 1:
@time 1 + 1
end)

which is the same thing you get back if you omit the `macroexpand`.

Try commenting out the module declaration and see if you like it better.

Best,
--Tim
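
A small sketch of the workaround Tim suggests (assuming the 0.4-era `macroexpand`): pull the body out of the `:module` expression and expand that instead, since `macroexpand` will recurse into a plain block.

```julia
ex = :(module M @time(1+1) end)

# A :module Expr carries (std-imports flag, name, body block);
# expanding just the body lets macroexpand recurse into it.
body = ex.args[3]
macroexpand(body)   # the @time call is now expanded
```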

On Monday, March 21, 2016 11:15:56 AM vish...@stanford.edu wrote:
> The MacroExpandJL package seems promising, but maybe I'm not able to get it
> to work. After updating syntax to match julia 0.4,
> MacroExpandJL.macroexpand_jl(STDOUT, :(module M function f(x) 1+@m(2) end
> end))
> module M
> begin  # line 1:
> function f(x) # line 1:
> 1 + @m 2
> end
> endend
> 
> Notice how the @m 2 is still there. Also, why is everything wrapped in an
> extra do block inside the module? Is this a printing issue? The
> expression itself doesn't have one.
> 
> How would I go about evaluating a module and its macros, macro-expanding
> the whole thing, and then dumping it out? @eval seems like, name-wise, it
> should do this, but it doesn't.
> Do you first eval() the module, then @eval the module? That didn't work for
> me either.
> 
> Predefining a macro and then trying to evaluate:
> > macro m(x) 1 end
> > @eval(:(module M function f(x) @m 2 end end))
> :
> :(module M
> 
> eval(x) = begin  # none, line 1:
> top(Core).eval(M,x)
> end
> eval(m,x) = begin  # none, line 1:
> top(Core).eval(m,x)
> end # none, line 1:
> function f(x) # none, line 1:
> @m 2
> end
> end)
> 
> Also doesn't work.
> 
> On Monday, March 21, 2016 at 7:54:59 AM UTC-7, Tim Holy wrote:
> > On Monday, March 21, 2016 09:34:19 AM Stefan Karpinski wrote:
> > > Tim, I'm assuming that module must assume that no macros are defined
> > 
> > *and*
> > 
> > > then used within the module body. If that does occur, the only way to do
> > > macro expansion correctly is to evaluate the module since the module
> > > definition can depend on arbitrary previously evaluated code.
> > 
> > Probably true. I haven't played with it in a long time, but it's possible
> > you
> > could load the module (so the macros are defined) and then parse the
> > file...but
> > I can't remember if that works.
> > 
> > Best,
> > --Tim
> > 
> > > On Sun, Mar 20, 2016 at 9:00 PM, Tim Holy  > 
> > > wrote:
> > > > It probably needs updating, but
> > > > https://github.com/timholy/MacroExpandJL.jl
> > > > might help. It lets you macroexpand a whole source file.
> > > > 
> > > > Best,
> > > > --Tim
> > > > 
> > > > On Sunday, March 20, 2016 08:53:49 PM Yichao Yu wrote:
> > > > > On Sun, Mar 20, 2016 at 8:26 PM,   > > > > >
> > 
> > wrote:
> > > > > > Hi all,
> > > > > > 
> > > > > > I'd like to be able to load in a module, then macroexpand the
> > 
> > whole
> > 
> > > > thing,
> > > > 
> > > > > > then print out the macroexpanded version.
> > > > > > 
> > > > > > This should be a full, recursive macroexpand.
> > > > > > 
> > > > > > I've noticed there is a function called macroexpand that normally
> > 
> > does
> > 
> > > > > > what
> > > > > > 
> > > > > > i want:
> > > > > >> macro m(x) 1 end
> > > > > > 
> > > > > > ..
> > > > > > 
> > > > > >> @m(2)
> > > > > > 
> > > > > > 1
> > > > > > 
> > > > > >> macroexpand(:(1 + @m(2)))
> > > > > >> 
> > > > > > :(1 + 1)
> > > > > > 
> > > > > > so that is fine and dandy, but inside a module this doesn't seem
> > 
> > to
> > 
> > > > work:
> > > > > >> macroexpand(:(
> > > > > >> 
> > > > > >module M
> > > > > >macro m(x) 1 end
> > > > > >x = 1 + @m(2)
> > > > > >end
> > > > > >))
> > > > > > :
> > > > > > :(module M
> > > > > > :
> > > > > > eval(x) = begin  # none, line 2:
> > > > > > top(Core).eval(M,x)
> > > > > > 
> > > > > > end
> > > > > > 
> > > > > > eval(m,x) = begin  # none, line 2:
> > > > > > top(Core).eval(m,x)
> > > > > > 
> > > > > > end # none, line 3:
> > > > > > $(Expr(:macro, :(m(x)), quote  # none, line 3:
> > > > > > 1
> > > > > > 
> > > > > > end)) # none, line 4:
> > > > > > x = 1 + @m(2)
> > > > > > end)
> > > > > > 
> > > > > > As you can see in the second to last line, @m(2) is not expanded,
> > 
> > and
> > 
> > > > I'm
> > > > 
> > > > > > confused as to why that is.
> > > > > > 
> > > > > > Ideally, this macroexpanding of a module would allow me to also
> > > > > > resolve
> > > > > > imports and includes properly, so I could just slurp up a file and
> > > > > > dump
> > > > > > out
> > > > > > the macroexpanded version.
> > > > > 
> > > > > TL;DR this is generally not possible without evaluating the whole
> > > > > module.
> > > > > 
> > > > > Macros are executed 

Re: [julia-users] macroexpand entire module

2016-03-21 Thread vishesh
The MacroExpandJL package seems promising, but I haven't been able to get it 
to work. After updating the syntax to match Julia 0.4,
MacroExpandJL.macroexpand_jl(STDOUT, :(module M function f(x) 1+@m(2) end 
end))
module M
begin  # line 1:
function f(x) # line 1:
1 + @m 2
end
endend

Notice how the @m 2 is still there. Also, why is everything wrapped in an 
extra do block inside the module? Is this a printing issue? The 
expression itself doesn't have one.

How would I go about evaluating a module and its macros, macro-expanding 
the whole thing, and then dumping it out? @eval seems like, name-wise, it 
should do this, but it doesn't.
Do you first eval() the module, then @eval the module? That didn't work for 
me either.
Predefining a macro and then trying to evaluate:

> macro m(x) 1 end
> @eval(:(module M function f(x) @m 2 end end))
:(module M
eval(x) = begin  # none, line 1:
top(Core).eval(M,x)
end
eval(m,x) = begin  # none, line 1:
top(Core).eval(m,x)
end # none, line 1:
function f(x) # none, line 1:
@m 2
end
end)

Also doesn't work.

On Monday, March 21, 2016 at 7:54:59 AM UTC-7, Tim Holy wrote:
>
> On Monday, March 21, 2016 09:34:19 AM Stefan Karpinski wrote: 
> > Tim, I'm assuming that module must assume that no macros are defined 
> *and* 
> > then used within the module body. If that does occur, the only way to do 
> > macro expansion correctly is to evaluate the module since the module 
> > definition can depend on arbitrary previously evaluated code. 
>
> Probably true. I haven't played with it in a long time, but it's possible 
> you 
> could load the module (so the macros are defined) and then parse the 
> file...but 
> I can't remember if that works. 
>
> Best, 
> --Tim 
>
> > 
> > On Sun, Mar 20, 2016 at 9:00 PM, Tim Holy  > wrote: 
> > > It probably needs updating, but 
> > > https://github.com/timholy/MacroExpandJL.jl 
> > > might help. It lets you macroexpand a whole source file. 
> > > 
> > > Best, 
> > > --Tim 
> > > 
> > > On Sunday, March 20, 2016 08:53:49 PM Yichao Yu wrote: 
> > > > On Sun, Mar 20, 2016 at 8:26 PM,   
> wrote: 
> > > > > Hi all, 
> > > > > 
> > > > > I'd like to be able to load in a module, then macroexpand the 
> whole 
> > > 
> > > thing, 
> > > 
> > > > > then print out the macroexpanded version. 
> > > > > 
> > > > > This should be a full, recursive macroexpand. 
> > > > > 
> > > > > I've noticed there is a function called macroexpand that normally 
> does 
> > > > > what 
> > > > > 
> > > > > i want: 
> > > > >> macro m(x) 1 end 
> > > > > 
> > > > > .. 
> > > > > 
> > > > >> @m(2) 
> > > > > 
> > > > > 1 
> > > > > 
> > > > >> macroexpand(:(1 + @m(2))) 
> > > > >> 
> > > > > :(1 + 1) 
> > > > > 
> > > > > so that is fine and dandy, but inside a module this doesn't seem 
> to 
> > > 
> > > work: 
> > > > >> macroexpand(:( 
> > > > >> 
> > > > >module M 
> > > > >macro m(x) 1 end 
> > > > >x = 1 + @m(2) 
> > > > >end 
> > > > >)) 
> > > > > : 
> > > > > :(module M 
> > > > > : 
> > > > > eval(x) = begin  # none, line 2: 
> > > > > top(Core).eval(M,x) 
> > > > > 
> > > > > end 
> > > > > 
> > > > > eval(m,x) = begin  # none, line 2: 
> > > > > top(Core).eval(m,x) 
> > > > > 
> > > > > end # none, line 3: 
> > > > > $(Expr(:macro, :(m(x)), quote  # none, line 3: 
> > > > > 1 
> > > > > 
> > > > > end)) # none, line 4: 
> > > > > x = 1 + @m(2) 
> > > > > end) 
> > > > > 
> > > > > As you can see in the second to last line, @m(2) is not expanded, 
> and 
> > > 
> > > I'm 
> > > 
> > > > > confused as to why that is. 
> > > > > 
> > > > > Ideally, this macroexpanding of a module would allow me to also 
> > > > > resolve 
> > > > > imports and includes properly, so I could just slurp up a file and 
> > > > > dump 
> > > > > out 
> > > > > the macroexpanded version. 
> > > > 
> > > > TL;DR this is generally not possible without evaluating the whole 
> > > > module. 
> > > > 
> > > > Macros are executed at parse time and therefore resolved in global 
> > > > scope (since local scope doesn't even exist yet) or in another word 
> > > > module scope. 
> > > > Therefore when doing macro expansion in a new module, the macros 
> needs 
> > > > to be resolved in the new module and since there's no way to 
> > > > statically know what macros are available in a module you can't do 
> > > > that without evaluating the module. 
> > > > 
> > > > > Thank you! 
> > > > > 
> > > > > Vishesh 
>
>

[julia-users] facing problem in building Pakages

2016-03-21 Thread kunal singh
Hi everyone,

I have installed the SymEngine package.
I am trying to fix some warning issues in this package.

After making what I believe are the correct changes (tested on Travis CI, 
where the warnings no longer appear), 
I execute the following commands in this package:
julia -e 'Pkg.build("SymEngine")'
julia -e 'Pkg.test("SymEngine")'

I still get the same warnings.
What I think is happening is that whatever changes I make are not being 
picked up by julia -e 'Pkg.build("SymEngine")'.
Need urgent help, please.


[julia-users] Re: Warnings on Julia 0.4

2016-03-21 Thread kunal singh
Thank you 

On Monday, March 21, 2016 at 7:05:53 PM UTC+5:30, Kristoffer Carlsson wrote:
>
> import Base.Operators: ^, -, /
>
> On Monday, March 21, 2016 at 2:26:45 PM UTC+1, kunal singh wrote:
>>
>> Hi,
>>
>> I need a little help. I am getting the following warnings on 
>> Pkg.add("")
>> WARNING: module  should explicitly import ^ from Base
>> WARNING: module  should explicitly import - from Base
>> WARNING: module  should explicitly import \ from Base
>> WARNING: module  should explicitly import == from Base
>>
>> I tried import Base: ^, -, \, == and it leads to an error.
>>  
>> Can anybody tell me how to fix these warnings?
>>
>
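
A minimal sketch of how Kristoffer's fix fits into a package module (hypothetical module and type names; 0.4-era syntax — the operators must be imported before methods are added to them, otherwise Julia emits the "should explicitly import" warnings quoted above):

```julia
module MyAlgebra

# Import the operators before extending them (Kristoffer's fix);
# `import Base: ^, -, \` is an equivalent spelling on 0.4.
import Base.Operators: ^, -, \
import Base: ==

immutable Wrapper
    val::Float64
end

^(a::Wrapper, n::Integer)  = Wrapper(a.val ^ n)
-(a::Wrapper, b::Wrapper)  = Wrapper(a.val - b.val)
\(a::Wrapper, b::Wrapper)  = Wrapper(a.val \ b.val)
==(a::Wrapper, b::Wrapper) = a.val == b.val

end
```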

[julia-users] Will Julia adapt to new models of computation?

2016-03-21 Thread Kevin Liu
http://www.cs.cmu.edu/~lblum/PAPERS/TuringMeetsNewton.pdf

Thanks. Kevin


Re: [julia-users] [ANN] JuliaAudio Organization and a slew of packages

2016-03-21 Thread Stefan Karpinski
SampledSignals seems to convey the meaning pretty well.

On Mon, Mar 21, 2016 at 12:40 PM, Spencer Russell  wrote:

> If I were feeling verbose I would say it’s a collection of types for
> managing multi-channel data sampled on a 1D uniform grid. Basically it
> provides stream (read/write) and buffer (array-like, random access) types
> for any regularly-sampled data, like EEG, Software-defined radio (complex
> samples), and audio. The samples themselves (in the sense of individual
> point measurements) are just a type parameter T.
>
> I see your point Tony about “Sample” in the sense of “Example”. My brain
> doesn’t like to say “SampledTypes” though. Maybe just “Sampled.jl” (though
> that could be confused with other “samples”, like sampling from a
> distribution). “SampledSignals.jl” maybe?
>
> -s
>
>
> On Mar 20, 2016, at 7:35 PM, Stefan Karpinski 
> wrote:
>
> I think it's a package of types for defining samples, not a collection of
> types which are sampled, so I don't think that would be clearer (unless I'm
> misunderstanding what the package is for).
>
> On Sun, Mar 20, 2016 at 7:10 AM, Tony Kelman  wrote:
>
>> Would SampledTypes maybe be a bit clearer? Otherwise it reads a bit like
>> it would contain examples.
>>
>>
>>
>> On Sunday, March 20, 2016 at 2:31:28 AM UTC-7, Spencer Russell wrote:
>>>
>>> Hey there, Julians.
>>>
>>> So AudioIO has been languishing for some time now, and I’ve been busily
>>> working away at the next generation. One of the issues with AudioIO is that
>>> it was a lot to swallow if you just wanted to play or record some audio.
>>> I’ve been focusing on getting the fundamental APIs right, so that the fancy
>>> stuff can be built on top.
>>>
>>> There’s a new JuliaAudio  organization and
>>> 5 shiny new packages. They’re still pretty young, but most of them have
>>> good test coverage. I’m planning on registering to METADATA soon, but I
>>> wanted to solicit some feedback first.
>>>
>>> SampleTypes.jl  is the
>>> most important as it defines the architecture that glues together the rest
>>> of the packages. It defines a set of stream and buffer types that should
>>> make it easy to move sampled data around. It’s called SampleTypes instead
>>> of AudioTypes because it should be useful for any sort of
>>> regularly-sampled, multi-channel data (e.g. complex samples from an SDR,
>>> multi-channel EEG, etc.). The types are sample-rate aware, and the
>>> samplerate can be stored in real units using SIUnits.jl. That allows cool
>>> features like reading in data using seconds instead of samples.
>>>
>>> Part of the idea with SampleTypes is to make it really easy to plug in a
>>> streaming audio backend, for instance SampleTypes handles conversion
>>> between formats, channel counts, and sample rates (currently just linear
>>> interpolation), so the underlying device libraries don’t have to. SampleBuf
>>> (the buffer type) is an AbstractArray, and it should be pretty drop-in
>>> replaceable to normal Arrays, but with extra goodies. If there are cases
>>> where it doesn’t act like an Array please file an Issue.
>>>
>>> LibSndFile.jl  and
>>> PortAudio.jl  used to be
>>> part of AudioIO, but are now separate packages. They wrap well-established
>>> cross-platform C libraries for interacting with files and real-time audio
>>> devices, respectively.
>>>
>>> LibSndFile has been designed to work with FileIO, so loading a file is
>>> as easy as `load(“myfile.wav”)`, and it will figure out the format from the
>>> extension and magic bytes in the file header.
>>>
>>> PortAudio.jl has been massively simplified from what was in AudioIO.
>>> Test coverage is at 95%, but because PortAudio doesn’t provide a way to
>>> simulate input the tests aren’t very strong. They also don’t run on Travis.
>>>
>>> JACKAudio.jl  is a wrapper
>>> for libjack, a great audio routing tool designed for low-latency, pro audio
>>> applications. Unfortunately we’re not yet at the point where we can get
>>> super low latency from Julia, but it’s working pretty well now and I think
>>> there’s still room to tune it for better performance.
>>>
>>> RingBuffers.jl  is a
>>> small utility package that provides the RingBuffer type. It’s a fixed-size
>>> multi-channel circular ringbuffer with configurable overflow/underflow
>>> behavior. It uses normal Arrays and is not specific to this architecture,
>>> except that it assumes each channel of data is a column of a 2D Array.
>>>
>>> Here’s a screenshot from the PortAudio README that gives a good flavor
>>> for the sort of things these packages can do together:
>>>
>>>
>>> Please kick the tires and let me know what doesn’t work or is confusing.
>>> Also, if you maintain an 

Re: [julia-users] [ANN] JuliaAudio Organization and a slew of packages

2016-03-21 Thread Spencer Russell
If I were feeling verbose I would say it’s a collection of types for managing 
multi-channel data sampled on a 1D uniform grid. Basically it provides stream 
(read/write) and buffer (array-like, random access) types for any 
regularly-sampled data, like EEG, Software-defined radio (complex samples), and 
audio. The samples themselves (in the sense of individual point measurements) 
are just a type parameter T.

I see your point Tony about “Sample” in the sense of “Example”. My brain 
doesn’t like to say “SampledTypes” though. Maybe just “Sampled.jl” (though that 
could be confused with other “samples”, like sampling from a distribution). 
“SampledSignals.jl” maybe?

-s

> On Mar 20, 2016, at 7:35 PM, Stefan Karpinski  wrote:
> 
> I think it's a package of types for defining samples, not a collection of 
> types which are sampled, so I don't think that would be clearer (unless I'm 
> misunderstanding what the package is for).
> 
> On Sun, Mar 20, 2016 at 7:10 AM, Tony Kelman  > wrote:
> Would SampledTypes maybe be a bit clearer? Otherwise it reads a bit like it 
> would contain examples.
> 
> 
> 
> On Sunday, March 20, 2016 at 2:31:28 AM UTC-7, Spencer Russell wrote:
> Hey there, Julians.
> 
> So AudioIO has been languishing for some time now, and I’ve been busily 
> working away at the next generation. One of the issues with AudioIO is that 
> it was a lot to swallow if you just wanted to play or record some audio. I’ve 
> been focusing on getting the fundamental APIs right, so that the fancy stuff 
> can be built on top.
> 
> There’s a new JuliaAudio  organization and 5 
> shiny new packages. They’re still pretty young, but most of them have good 
> test coverage. I’m planning on registering to METADATA soon, but I wanted to 
> solicit some feedback first.
> 
> SampleTypes.jl  is the most 
> important as it defines the architecture that glues together the rest of the 
> packages. It defines a set of stream and buffer types that should make it 
> easy to move sampled data around. It’s called SampleTypes instead of 
> AudioTypes because it should be useful for any sort of regularly-sampled, 
> multi-channel data (e.g. complex samples from an SDR, multi-channel EEG, 
> etc.). The types are sample-rate aware, and the samplerate can be stored in 
> real units using SIUnits.jl. That allows cool features like reading in data 
> using seconds instead of samples. 
> 
> Part of the idea with SampleTypes is to make it really easy to plug in a 
> streaming audio backend, for instance SampleTypes handles conversion between 
> formats, channel counts, and sample rates (currently just linear 
> interpolation), so the underlying device libraries don’t have to. SampleBuf 
> (the buffer type) is an AbstractArray, and it should be pretty drop-in 
> replaceable to normal Arrays, but with extra goodies. If there are cases 
> where it doesn’t act like an Array please file an Issue.
> 
> LibSndFile.jl  and PortAudio.jl 
>  used to be part of AudioIO, but 
> are now separate packages. They wrap well-established cross-platform C 
> libraries for interacting with files and real-time audio devices, 
> respectively.
> 
> LibSndFile has been designed to work with FileIO, so loading a file is as 
> easy as `load(“myfile.wav”)`, and it will figure out the format from the 
> extension and magic bytes in the file header.
> 
> PortAudio.jl has been massively simplified from what was in AudioIO. Test 
> coverage is at 95%, but because PortAudio doesn’t provide a way to simulate 
> input the tests aren’t very strong. They also don’t run on Travis.
> 
> JACKAudio.jl  is a wrapper for 
> libjack, a great audio routing tool designed for low-latency, pro audio 
> applications. Unfortunately we’re not yet at the point where we can get super 
> low latency from Julia, but it’s working pretty well now and I think there’s 
> still room to tune it for better performance.
> 
> RingBuffers.jl  is a small 
> utility package that provides the RingBuffer type. It’s a fixed-size 
> multi-channel circular ringbuffer with configurable overflow/underflow 
> behavior. It uses normal Arrays and is not specific to this architecture, 
> except that it assumes each channel of data is a column of a 2D Array.
> 
> Here’s a screenshot from the PortAudio README that gives a good flavor for 
> the sort of things these packages can do together:
> 
> 
> 
> Please kick the tires and let me know what doesn’t work or is confusing. 
> Also, if you maintain an audio-related package and want to plug into this 
> architecture, I’d be happy to start growing the JuliaAudio organization both 
> in maintainers and packages.
> 
> -s
> 



Re: [julia-users] DataFrame from string

2016-03-21 Thread Milan Bouchet-Valat
And with the next release (available from git master) you will be able
to do this directly:
df = csv"""
1, 7.6
2, 45.6
3, 12.1
...
"""

Regards

Le lundi 21 mars 2016 à 09:39 -0600, Jacob Quinn a écrit :
> You should be able to wrap the string in an IOBuffer, which satisfies
> the general IO interface.
> 
> e.g.
> 
> io = IOBuffer(csv)
> readtable(io)
> 
> -Jacob
> 
> On Mon, Mar 21, 2016 at 9:28 AM, jw3126  wrote:
> > I have a string which is secretly a csv like
> > """
> > 1, 7.6
> > 2, 45.6
> > 3, 12.1
> > ...
> > """
> > 
> > I want to turn it into a data frame. I guess I have to use
> > readtable, however readtable accepts only IOStreams (or strings
> > that are filepaths) and I don't know how to feed the string in a
> > sane way to it.
> > 


Re: [julia-users] Re: GSoC 2016 - Simple persistent distributed storage Project

2016-03-21 Thread Shashi Gowda
Hello,

I think ComputeFramework is not a good starting point for this project;
it's meant for computing on large matrices and the like. Amit (cc'd
here) has thought about distributed dictionaries and will have more input
on your project.

I would say you should start by listing the features you would like to have
in such a package and then figuring out some implementation plan. You
should look at the documentation on the parallel computing primitives in
Julia http://docs.julialang.org/en/latest/manual/parallel-computing/ since
it's likely you will use them as the basic building blocks.

Thanks
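
A tiny sketch of the parallel primitives the manual page above covers (0.4-era API, where `remotecall` takes the worker id first); these are the building blocks a distributed-storage package would likely sit on:

```julia
addprocs(2)                     # start two local worker processes

r = remotecall(2, rand, 3, 3)   # run rand(3, 3) on worker 2
m = fetch(r)                    # pull the result back to the master

s = @spawn sum(m)               # let the scheduler pick a worker
println(fetch(s))
```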

On Sat, Mar 19, 2016 at 10:03 PM, Mike Innes  wrote:

> For something ComputeFramework related, it's likely that Shashi Gowda
> would be interested in mentoring. You could try opening issues for queries
> as well. We're flexible about mentoring if you have someone in mind, but in
> general we'll try to match proposals with people who are known to the
> community.
>
> On Fri, 18 Mar 2016 at 12:47 Soujanya Ponnpalli <
> soujanya.ponnapa...@gmail.com> wrote:
>
>> Hello!
>>
>> I've started taking a look at the package ComputeFramework.jl. I have a
>> few queries and would want some information regarding them.
>>
>> Could you tell me how I could get in touch with potential mentors for
>> this project? Is it the case that there are a few fixed mentors amongst
>> which one shall get convinced to mentor me, or is it that anyone with
>> domain specific knowledge and experience in Julia can mentor me on this
>> project?
>>
>> Looking forward to hearing from you,
>> Soujanya Ponnapalli.
>>
>> On Sun, Mar 13, 2016 at 11:06 PM, Mike Innes 
>> wrote:
>>
>>> Hey Soujanya,
>>>
>>> Glad to have your interest! I don't know a lot about this project
>>> personally, but you might be interested to take a look at the recent work
>>> on ComputeFramework.jl ,
>>> which is in a similar area. Getting stuck in over there would be a great
>>> way to meet potential mentors and get a head start on the project.
>>>
>>> Cheers,
>>> Mike
>>>
>>> On Friday, 11 March 2016 14:35:06 UTC, Soujanya Ponnpalli wrote:

 Hello!

 I am Soujanya Ponnapalli, a junior at International Institute of
 Information Technology, Hyderabad (IIIT-H), majoring in Computer Science
 and Engineering (CSE). I got introduced to Julia at International
 Parallel and Distributed Processing Symposium - IPDPS 2015 and was
 taken aback by the features of Julia like performance efficiency, parallel
 and distributed computation support.

 I would rate my competency level in C++, Java and Python as
 intermediate, and I am getting my hands on Julia. I would like to contribute to
 the project " Simple persistent distributed storage" as I'm interested
 in Distributed Systems. My experience in this field includes,
 implementing a file system similar to HDFS in java.

 I would be thankful if I could get some information regarding,
 1. What is expected from your side, in context to this project?
 2. How do I contact the mentors and get involved in the project
 discussions?

 Regards,
 Soujanya Ponnapalli.


>>


Re: [julia-users] DataFrame from string

2016-03-21 Thread Jacob Quinn
You should be able to wrap the string in an IOBuffer, which satisfies the
general IO interface.

e.g.

io = IOBuffer(csv)
readtable(io)

-Jacob
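
A self-contained sketch of the IOBuffer approach (assuming the DataFrames package and its `readtable` keyword `header`; since the string has no header row, `header=false` keeps the first line from being consumed as column names):

```julia
using DataFrames

csv = """
1, 7.6
2, 45.6
3, 12.1
"""

# Wrap the string in an IOBuffer so readtable sees a plain IO stream.
df = readtable(IOBuffer(csv), header=false)
```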

On Mon, Mar 21, 2016 at 9:28 AM, jw3126  wrote:

> I have a string which is secretly a csv like
> """
> 1, 7.6
> 2, 45.6
> 3, 12.1
> ...
> """
>
> I want to turn it into a data frame. I guess I have to use readtable
> ,
> however readtable accepts only IOStreams (or strings that are filepaths)
> and I don't know how to feed the string in a sane way to it.
>


Re: [julia-users] macroexpand entire module

2016-03-21 Thread Tim Holy
On Monday, March 21, 2016 09:34:19 AM Stefan Karpinski wrote:
> Tim, I'm assuming that module must assume that no macros are defined *and*
> then used within the module body. If that does occur, the only way to do
> macro expansion correctly is to evaluate the module since the module
> definition can depend on arbitrary previously evaluated code.

Probably true. I haven't played with it in a long time, but it's possible you 
could load the module (so the macros are defined) and then parse the file...but 
I can't remember if that works.

Best,
--Tim

> 
> On Sun, Mar 20, 2016 at 9:00 PM, Tim Holy  wrote:
> > It probably needs updating, but
> > https://github.com/timholy/MacroExpandJL.jl
> > might help. It lets you macroexpand a whole source file.
> > 
> > Best,
> > --Tim
> > 
> > On Sunday, March 20, 2016 08:53:49 PM Yichao Yu wrote:
> > > On Sun, Mar 20, 2016 at 8:26 PM,   wrote:
> > > > Hi all,
> > > > 
> > > > I'd like to be able to load in a module, then macroexpand the whole
> > 
> > thing,
> > 
> > > > then print out the macroexpanded version.
> > > > 
> > > > This should be a full, recursive macroexpand.
> > > > 
> > > > I've noticed there is a function called macroexpand that normally does
> > > > what
> > > > 
> > > > i want:
> > > >> macro m(x) 1 end
> > > > 
> > > > ..
> > > > 
> > > >> @m(2)
> > > > 
> > > > 1
> > > > 
> > > >> macroexpand(:(1 + @m(2)))
> > > >> 
> > > > :(1 + 1)
> > > > 
> > > > so that is fine and dandy, but inside a module this doesn't seem to
> > 
> > work:
> > > >> macroexpand(:(
> > > >> 
> > > >module M
> > > >macro m(x) 1 end
> > > >x = 1 + @m(2)
> > > >end
> > > >))
> > > > :
> > > > :(module M
> > > > :
> > > > eval(x) = begin  # none, line 2:
> > > > top(Core).eval(M,x)
> > > > 
> > > > end
> > > > 
> > > > eval(m,x) = begin  # none, line 2:
> > > > top(Core).eval(m,x)
> > > > 
> > > > end # none, line 3:
> > > > $(Expr(:macro, :(m(x)), quote  # none, line 3:
> > > > 1
> > > > 
> > > > end)) # none, line 4:
> > > > x = 1 + @m(2)
> > > > end)
> > > > 
> > > > As you can see in the second to last line, @m(2) is not expanded, and
> > 
> > I'm
> > 
> > > > confused as to why that is.
> > > > 
> > > > Ideally, this macroexpanding of a module would allow me to also
> > > > resolve
> > > > imports and includes properly, so I could just slurp up a file and
> > > > dump
> > > > out
> > > > the macroexpanded version.
> > > 
> > > TL;DR this is generally not possible without evaluating the whole
> > > module.
> > > 
> > > Macros are executed at parse time and therefore resolved in global
> > > scope (since local scope doesn't even exist yet) or in another word
> > > module scope.
> > > Therefore when doing macro expansion in a new module, the macros needs
> > > to be resolved in the new module and since there's no way to
> > > statically know what macros are available in a module you can't do
> > > that without evaluating the module.
> > > 
> > > > Thank you!
> > > > 
> > > > Vishesh



Re: [julia-users] Re: When will juno-atom bundle become available?

2016-03-21 Thread Stefan Karpinski
Sounds like a great project to get started contributing to Julia with.

On Mon, Mar 21, 2016 at 10:09 AM, Yao Lu  wrote:

> Yes, I'm expecting a Julia/Atom bundle with the client pre-installed.
>
> 2016-03-21 22:07 GMT+08:00 Chris Rackauckas :
>
>> It's available.
>>
>> https://github.com/JunoLab/atom-julia-client/tree/master/manual
>>
>> Or do you mean a Julia/Atom bundle with the client pre-installed?
>>
>> On Monday, March 21, 2016 at 1:08:45 AM UTC-7, Yao Lu wrote:
>>>
>>> I 'm expecting this bundle.
>>>
>>
>


Re: [julia-users] Re: performace of loops Julia vs Blas

2016-03-21 Thread Erik Schnetter
The architecture-specific, manual BLAS optimizations don't just give
you an additional 20%. They can improve speed by a factor of a few.

If you see a factor of 2.6, then that's probably to be accepted,
unless you really dig into the details (generated assembler code,
cache-miss measurements, manual vectorization and loop
unrolling, etc.). And you'll have to repeat that analysis if you're
using a different system.

-erik

On Mon, Mar 21, 2016 at 10:18 AM, Igor Cerovsky
 wrote:
> Well, maybe the subject of the post is confusing. I've tried to write an
> algorithm which runs approximately as fast as the BLAS functions, but
> using a pure Julia implementation. Sure, we know that BLAS is highly
> optimized; I didn't want to beat BLAS, just to be a bit slower, let us say
> ~1.2-times.
>
> If I take a part of the algorithm, and run it separately all works fine.
> Consider code below:
> function rank1update!(A, x, y)
>     for j = 1:size(A, 2)
>         @fastmath @inbounds @simd for i = 1:size(A, 1)
>             A[i,j] += 1.1 * y[j] * x[i]
>         end
>     end
> end
>
> function rank1updateb!(A, x, y)
>     R = BLAS.ger!(1.1, x, y, A)
> end
>
> Here BLAS is ~1.2-times faster.
> However, calling it together with 'mygemv!' in the loop (see code in
> original post), the performance drops to ~2.6 times slower than using BLAS
> functions (gemv, ger)
>
>
>
>
> On Monday, 21 March 2016 13:34:27 UTC+1, Stefan Karpinski wrote:
>>
>> I'm not sure what the expected result here is. BLAS is designed to be as
>> fast as possible at matrix multiply. I'd be more concerned if you write
>> straightforward loop code and beat BLAS, since that means the BLAS is badly
>> mistuned.
>>
>> On Mon, Mar 21, 2016 at 5:58 AM, Igor Cerovsky 
>> wrote:
>>>
>>> Thanks Steven, I've thought there is something more behind...
>>>
>>> I shall note that that I forgot to mention matrix dimensions, which is
>>> 1000 x 1000.
>>>
>>> On Monday, 21 March 2016 10:48:33 UTC+1, Steven G. Johnson wrote:

 You need a lot more than just fast loops to match the performance of an
 optimized BLAS.See e.g. this notebook for some comments on the related
 case of matrix multiplication:


 http://nbviewer.jupyter.org/url/math.mit.edu/~stevenj/18.335/Matrix-multiplication-experiments.ipynb
>>
>>
>



-- 
Erik Schnetter 
http://www.perimeterinstitute.ca/personal/eschnetter/


Re: [julia-users] Re: performace of loops Julia vs Blas

2016-03-21 Thread Igor Cerovsky
Well, maybe the subject of the post is confusing. I've tried to write an 
algorithm which runs approximately as fast as the BLAS functions, but 
using a pure Julia implementation. Sure, we know that BLAS is highly 
optimized; I didn't want to beat BLAS, just to be a bit slower, let us say 
~1.2-times. 

If I take a part of the algorithm, and run it separately all works fine. 
Consider code below:
function rank1update!(A, x, y)
    for j = 1:size(A, 2)
        @fastmath @inbounds @simd for i = 1:size(A, 1)
            A[i,j] += 1.1 * y[j] * x[i]
        end
    end
end

function rank1updateb!(A, x, y)
    R = BLAS.ger!(1.1, x, y, A)
end

Here BLAS is ~1.2-times faster.
However, calling it together with 'mygemv!' in the loop (see code in 
the original post), the performance drops to ~2.6 times slower than the BLAS 
functions (gemv, ger).
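
For reference, a sketch of how the two rank-1 updates above can be timed against each other (plain @time; the 1000×1000 size is from the thread — first calls are warm-ups so compilation isn't measured):

```julia
A = rand(1000, 1000); x = rand(1000); y = rand(1000)

rank1update!(A, x, y)    # warm-up: compile before timing
rank1updateb!(A, x, y)

@time rank1update!(A, x, y)
@time rank1updateb!(A, x, y)
```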




On Monday, 21 March 2016 13:34:27 UTC+1, Stefan Karpinski wrote:
>
> I'm not sure what the expected result here is. BLAS is designed to be as 
> fast as possible at matrix multiply. I'd be more concerned if you write 
> straightforward loop code and beat BLAS, since that means the BLAS is badly 
> mistuned.
>
On Mon, Mar 21, 2016 at 5:58 AM, Igor Cerovsky wrote:
>
>> Thanks Steven, I've thought there is something more behind...
>>
>> I shall note that I forgot to mention the matrix dimensions, which are 
>> 1000 x 1000.
>>
>> On Monday, 21 March 2016 10:48:33 UTC+1, Steven G. Johnson wrote:
>>>
>>> You need a lot more than just fast loops to match the performance of an 
>>> optimized BLAS. See e.g. this notebook for some comments on the related 
>>> case of matrix multiplication:
>>>
>>>
>>> http://nbviewer.jupyter.org/url/math.mit.edu/~stevenj/18.335/Matrix-multiplication-experiments.ipynb
>>>
>>
>

Re: [julia-users] Re: When will juno-atom bundle become available?

2016-03-21 Thread Yao Lu
Yes, I'm expecting a Julia/Atom bundle with the client pre-installed.

2016-03-21 22:07 GMT+08:00 Chris Rackauckas :

> It's available.
>
> https://github.com/JunoLab/atom-julia-client/tree/master/manual
>
> Or do you mean a Julia/Atom bundle with the client pre-installed?
>
> On Monday, March 21, 2016 at 1:08:45 AM UTC-7, Yao Lu wrote:
>>
>> I'm expecting this bundle.
>>
>


[julia-users] Re: When will juno-atom bundle become available?

2016-03-21 Thread Chris Rackauckas
It's available. 

https://github.com/JunoLab/atom-julia-client/tree/master/manual

Or do you mean a Julia/Atom bundle with the client pre-installed?

On Monday, March 21, 2016 at 1:08:45 AM UTC-7, Yao Lu wrote:
>
> I'm expecting this bundle.
>


[julia-users] Re: Where to publish numerical work using julia

2016-03-21 Thread Chris Rackauckas
Why not just go with the Journal of Computational Physics or the SIAM 
Journal on Scientific Computing? I recently submitted numerical work done 
with Julia to the latter and no one cares that it's Julia. In fact, with 
numerical work you could use Whitespace and no one would care: it's all 
about the derivation of the method; the code just shows that the derivation 
is correct. (Unless you mean numerical simulations; in that case people 
still won't care about the code, just the results.)

It will all come down to whether what you coded was novel and interesting 
or not.


Re: [julia-users] Is it possible to grab REPL contents after a .jl file completes execution ?

2016-03-21 Thread Josef Sachs
> On Mon, 21 Mar 2016 09:39:40 -0400, Stefan Karpinski said:

> Could you just pipe the output of non-interactive Julia to the `tee`
> command?

I'd still like to be able to pipe the output of non-non-interactive Julia.

https://github.com/JuliaLang/julia/issues/14776


Re: [julia-users] Is it possible to grab REPL contents after a .jl file completes execution ?

2016-03-21 Thread Stefan Karpinski
Could you just pipe the output of non-interactive Julia to the `tee`
command?
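For example, something like the following should work for a non-interactive run (a sketch; `main.jl` and `run.log` are placeholder names):

```shell
# Run the script non-interactively, merge stderr into stdout, and copy
# everything to a log file while still printing it to the terminal.
julia main.jl 2>&1 | tee run.log
```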

On Sun, Mar 20, 2016 at 11:23 PM, 'Jhan Jar' via julia-users <
julia-users@googlegroups.com> wrote:

> Hi,
> I have a main .jl script which calls other .jl scripts. All scripts
> display output. The sources of output are println()s and tic-toc pair. The
> scripts take hours to complete. So far my guess is that the last thing main
> script should do is save all text displayed on REPL to a file. Is this
> possible?
>
> Does the julia REPL have an equivalent of the Linux "tee" shell command?
>
> I'm on MS Win7 and the Julia 0.3 release.
>
> Thanks!
>
>


[julia-users] Re: Warnings on Julia 0.4

2016-03-21 Thread Kristoffer Carlsson
import Base.Operators: ^, -, /
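A minimal sketch of how that looks in practice (module and type names here are invented for illustration; Julia 0.4 syntax):

```julia
# Explicitly import the operators before adding methods for a new type;
# without the import, Julia 0.4 emits "should explicitly import" warnings.
module MyAlg

import Base.Operators: ^, -

immutable Sym
    name::ASCIIString
end

-(a::Sym, b::Sym) = Sym(string(a.name, "-", b.name))
^(a::Sym, n::Integer) = Sym(string(a.name, "^", n))

end
```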

On Monday, March 21, 2016 at 2:26:45 PM UTC+1, kunal singh wrote:
>
> Hi,
>
> I need a little help. I am getting the following warnings on 
> Pkg.add("")
>  WARNING: module  should explicitly import ^ from Base
> WARNING: module  should explicitly import - from Base
> WARNING: module   should explicitly import \ from Base
> WARNING: module   should explicitly import == from Base
>
> I tried  import Base: ^, -, \, ==  and it leads to an error.
>  
> Can anybody tell me how to fix these warnings?
>


Re: [julia-users] macroexpand entire module

2016-03-21 Thread Stefan Karpinski
Tim, I'm assuming MacroExpandJL must assume that no macros are defined *and*
then used within the module body. If that does occur, the only way to do
macro expansion correctly is to evaluate the module since the module
definition can depend on arbitrary previously evaluated code.

On Sun, Mar 20, 2016 at 9:00 PM, Tim Holy  wrote:

> It probably needs updating, but
> https://github.com/timholy/MacroExpandJL.jl
> might help. It lets you macroexpand a whole source file.
>
> Best,
> --Tim
>
> On Sunday, March 20, 2016 08:53:49 PM Yichao Yu wrote:
> > On Sun, Mar 20, 2016 at 8:26 PM,   wrote:
> > > Hi all,
> > >
> > > I'd like to be able to load in a module, then macroexpand the whole
> thing,
> > > then print out the macroexpanded version.
> > >
> > > This should be a full, recursive macroexpand.
> > >
> > > I've noticed there is a function called macroexpand that normally does
> > > what
> > >
> > > i want:
> > >> macro m(x) 1 end
> > >
> > > ..
> > >
> > >> @m(2)
> > >
> > > 1
> > >
> > >> macroexpand(:(1 + @m(2)))
> > >>
> > > :(1 + 1)
> > >
> > > so that is fine and dandy, but inside a module this doesn't seem to
> work:
> > >> macroexpand(:(
> > >>
> > >module M
> > >macro m(x) 1 end
> > >x = 1 + @m(2)
> > >end
> > >))
> > > :
> > > :(module M
> > > :
> > > eval(x) = begin  # none, line 2:
> > > top(Core).eval(M,x)
> > >
> > > end
> > >
> > > eval(m,x) = begin  # none, line 2:
> > > top(Core).eval(m,x)
> > >
> > > end # none, line 3:
> > > $(Expr(:macro, :(m(x)), quote  # none, line 3:
> > > 1
> > >
> > > end)) # none, line 4:
> > > x = 1 + @m(2)
> > > end)
> > >
> > > As you can see in the second to last line, @m(2) is not expanded, and
> I'm
> > > confused as to why that is.
> > >
> > > Ideally, this macroexpanding of a module would allow me to also resolve
> > > imports and includes properly, so I could just slurp up a file and dump
> > > out
> > > the macroexpanded version.
> >
> > TL;DR this is generally not possible without evaluating the whole module.
> >
> > Macros are executed at parse time and therefore resolved in global
> > scope (since local scope doesn't even exist yet) or in another word
> > module scope.
> > Therefore when doing macro expansion in a new module, the macros needs
> > to be resolved in the new module and since there's no way to
> > statically know what macros are available in a module you can't do
> > that without evaluating the module.
> >
> > > Thank you!
> > >
> > > Vishesh
>
>


[julia-users] Julia 0.4 hangs during Pkg.update

2016-03-21 Thread Stanislav Se
julia> Pkg.update()
INFO: Updating METADATA...
INFO: Computing changes...
INFO: Upgrading BayesianDataFusion: v1.0.0 => v1.1.0

hangs forever.




[julia-users] Is it possible to grab REPL contents after a .jl file completes execution ?

2016-03-21 Thread 'Jhan Jar' via julia-users
Hi,
I have a main .jl script which calls other .jl scripts. All scripts display 
output. The sources of output are println()s and tic-toc pair. The scripts 
take hours to complete. So far my guess is that the last thing main script 
should do is save all text displayed on REPL to a file. Is this possible?

Does the julia REPL have an equivalent of the Linux "tee" shell command?

I'm on MS Win7 and the Julia 0.3 release.

Thanks! 



[julia-users] Is there any package related to fractional calculus for julia-lang.

2016-03-21 Thread aseaday
Hi, everyone:
  
I am working on my paper about the numerical computation of fractional 
calculus. I searched for a fractional calculus package and failed to find 
one. I want to know if there is one; if not, I could write a package.

Best Regards
Aseaday
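For what it's worth, a numerical fractional derivative can be sketched in a few lines of Julia using the Grünwald–Letnikov definition (this is a textbook scheme written for illustration, not code from any existing package):

```julia
# Grünwald–Letnikov fractional derivative of order alpha on a uniform
# grid with spacing h: D^alpha f(x_i) ≈ h^(-alpha) * sum_k w_k * f_{i-k},
# with weights w_0 = 1 and w_k = w_{k-1} * (k - 1 - alpha) / k.
function gl_fracderiv(f::Vector{Float64}, alpha::Float64, h::Float64)
    n = length(f)
    w = ones(n)
    for k = 2:n
        w[k] = w[k-1] * (k - 2 - alpha) / (k - 1)   # recurrence shifted to 1-based indexing
    end
    d = zeros(n)
    for i = 1:n
        s = 0.0
        for k = 1:i
            s += w[k] * f[i - k + 1]   # convolution of weights with samples
        end
        d[i] = s / h^alpha
    end
    d
end
```

For alpha = 1 the weights collapse to (1, -1, 0, ...), so the scheme reduces to an ordinary backward difference, which is a handy sanity check.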


[julia-users] Warnings on Julia 0.4

2016-03-21 Thread kunal singh
Hi,

I need a little help. I am getting the following warnings on 
Pkg.add("")
 WARNING: module  should explicitly import ^ from Base
WARNING: module  should explicitly import - from Base
WARNING: module   should explicitly import \ from Base
WARNING: module   should explicitly import == from Base

I tried  import Base: ^, -, \, ==  and it leads to an error.
Can anybody tell me how to fix these warnings?


Re: [julia-users] Re: performance of loops Julia vs BLAS

2016-03-21 Thread Stefan Karpinski
I'm not sure what the expected result here is. BLAS is designed to be as
fast as possible at matrix multiply. I'd be more concerned if you write
straightforward loop code and beat BLAS, since that means the BLAS is badly
mistuned.

On Mon, Mar 21, 2016 at 5:58 AM, Igor Cerovsky 
wrote:

> Thanks Steven, I've thought there is something more behind...
>
> I shall note that I forgot to mention the matrix dimensions, which are
> 1000 x 1000.
>
> On Monday, 21 March 2016 10:48:33 UTC+1, Steven G. Johnson wrote:
>>
>> You need a lot more than just fast loops to match the performance of an
>> optimized BLAS. See e.g. this notebook for some comments on the related
>> case of matrix multiplication:
>>
>>
>> http://nbviewer.jupyter.org/url/math.mit.edu/~stevenj/18.335/Matrix-multiplication-experiments.ipynb
>>
>


Re: [julia-users] Re: cross-module exports / extending modules

2016-03-21 Thread Andreas Lobinger
Hello colleague,

Thanks, that was the discussion I was looking for (#14472 from Tim is in a 
similar area).
I do not have a clear need, otherwise I would have a full example. However, 
this providing of optional functions in modules - modules being the only 
'real' encapsulation mechanism in Julia - shows up in some places and might 
be a 0.6 issue, because knowing in advance what type of system a package 
should support is tricky...

On Monday, March 21, 2016 at 11:24:36 AM UTC+1, Milan Bouchet-Valat wrote:
>
> On Monday, 21 March 2016 at 03:19 -0700, Andreas Lobinger wrote: 
> > Hello colleague, 
> > 
> > sorry i wasn't clear enough. I'm aware of the MoO for packages and 
> > extensions and i'm a great fan of modularisation. However the topic 
> > of optional include/import or something like "require" (deprecated) 
> > is still around. Not everything can be formulated into a tree of 
> > dependencies that exist on all systems. I'm experimenting with libxcb 
> > (so close to X11) and i assume this will not be available for MS- 
> > Windows based systems.  
> It's still not clear to me why you'd need to add functions to an 
> existing module... 
>
> > But i'm still trying to find this in the issues. 
> Maybe you're looking for this? 
> https://github.com/JuliaLang/julia/pull/6884 
>
>
> Regards 
>
> > > If you want to add functions to the Cairo module, open pull 
> > > requests or use your fork for the time being. eval'ing into modules 
> > > that aren't yours is not a good practice. You can create a 
> > > CairoExtensions.jl package that depends on Cairo and uses its copy 
> > > of libcairo. 
>


Re: [julia-users] Registering / renaming CppWrapper

2016-03-21 Thread Bart Janssens
I like the sound of CxxWrap.jl. I was reluctant to use Cxx, to avoid giving 
the impression there is a dependency on Cxx.jl, but in the future I may 
actually use Cxx.jl to replace the current ccall usage, so then that is no 
longer an objection.

I'll give it a few more days and then proceed with the rename to and 
registration of CxxWrap.jl

Cheers,

Bart

On Sunday, March 20, 2016 at 12:37:52 PM UTC+1, Morten Piibeleht wrote:
>
> I would second Erik, CXX seems to be the standard way of referring to C++ 
> (e.g. in makefiles), and it would be consistent with Cxx.jl.
>
> Also, maybe the shorter "CxxWrap.jl"? Sounds a tiny bit better to me 
> ("wrap C++ code in Julia") whereas CxxWrapper could be interpreted as 
> "wrapper around something (C++ toolchain?)".
>
>

Re: [julia-users] Re: cross-module exports / extending modules

2016-03-21 Thread Milan Bouchet-Valat
On Monday, 21 March 2016 at 03:19 -0700, Andreas Lobinger wrote:
> Hello colleague,
> 
> sorry i wasn't clear enough. I'm aware of the MoO for packages and
> extensions and i'm a great fan of modularisation. However the topic
> of optional include/import or something like "require" (deprecated)
> is still around. Not everything can be formulated into a tree of
> dependencies that exist on all systems. I'm experimenting with libxcb
> (so close to X11) and i assume this will not be available for MS-
> Windows based systems. 
It's still not clear to me why you'd need to add functions to an
existing module...

> But i'm still trying to find this in the issues.
Maybe you're looking for this?
https://github.com/JuliaLang/julia/pull/6884


Regards

> > If you want to add functions to the Cairo module, open pull
> > requests or use your fork for the time being. eval'ing into modules
> > that aren't yours is not a good practice. You can create a
> > CairoExtensions.jl package that depends on Cairo and uses its copy
> > of libcairo.


[julia-users] Re: performance of loops Julia vs BLAS

2016-03-21 Thread Igor Cerovsky
Thanks Steven, I've thought there is something more behind...

I shall note that I forgot to mention the matrix dimensions, which are 
1000 x 1000.

On Monday, 21 March 2016 10:48:33 UTC+1, Steven G. Johnson wrote:
>
> You need a lot more than just fast loops to match the performance of an 
> optimized BLAS. See e.g. this notebook for some comments on the related 
> case of matrix multiplication:
>
>
> http://nbviewer.jupyter.org/url/math.mit.edu/~stevenj/18.335/Matrix-multiplication-experiments.ipynb
>


[julia-users] Re: performance of loops Julia vs BLAS

2016-03-21 Thread Steven G. Johnson
You need a lot more than just fast loops to match the performance of an 
optimized BLAS. See e.g. this notebook for some comments on the related 
case of matrix multiplication:

http://nbviewer.jupyter.org/url/math.mit.edu/~stevenj/18.335/Matrix-multiplication-experiments.ipynb


Re: [julia-users] Re: Nothing conditional operator

2016-03-21 Thread Milan Bouchet-Valat
On Sunday, 20 March 2016 at 16:56 -0700, Jeffrey Sarnoff wrote:
> Redis itself is written in C. They document GET key:
> > Get the value of key. If the key does not exist the special
> > value nil is returned. 
> 
> An error is returned if the value stored at key is not a string,
> because GET only handles string values.
>  
>  digging deeper
> > The client library API should not return an empty string, but a nil
> > object, when the server replies with a Null Bulk String. 
> 
> For example a Ruby library should return 'nil' while a C library
> should return NULL (or set a special flag in the reply object), and
> so forth.
> > Single elements of an Array may be Null. This is used in Redis
> > replies in order to signal that this elements are missing and not
> > empty strings. 
> 
> This can happen with the SORT command when used with the
> GET pattern option when the specified key is missing.
> 
> [For example, if the] second element is a Null. The client library
> should return something like this: ["foo", nil, "bar"]
> The Redis nil indicates a non-present value (missing, unavailable,
> or not extant: a domain |-> range error).
> From a semiotic viewpoint, Julia's nothing is closer to "absence"
> than it is to "an absent value"; of course
> the operational machinery supplies an actual entity to be nothing (a
> singleton realization of the type Void).
> 
> A much more contextual fit would be the use of Nullable, although that may
> require more of the client; a simpler way
> to handle the Redis nil without doing something with nothing is to
> use a dedicated const symbol or a singleton
> to be that sentinel, perhaps:  
>  const RedisNil = :RedisNil          # or
>  type RedisNIL end; RedisNil = RedisNIL()
> 
> Somewhere there are lengthy and informative discussions about Julia
> and nil / NULL / nothing.
> (I noted this thread to the Redis.jl project).
Interesting. Though this wouldn't be an actual sentinel, as a symbol is
a different type from a string, so type instability would remain.

It seems that Redis.jl has chosen to follow the Python API, which isn't
type-stable. In Julia, it could make sense to retain a type-stable
solution, i.e. always return a Nullable. This would be a good test to
check what improvements Julia needs to make working with Nullable
pleasant enough.

Actually, get() from Redis.jl is faced with the very same design issue
as get() from Julia Base. The latter raises an error when a key is
missing, and there's been discussions about offering an alternative
function which would return a Nullable:
https://github.com/JuliaLang/julia/issues/13055

I would argue that Redis.jl should follow the same pattern as Base for
consistency. It would be interesting to get comments from the package
author about this.


Regards


> 
> 
> 
> 
> > Redis.jl returns nothing when requesting a the value of a key that
> > doesn't exist:
> > 
> > using Redis
> > conn = RedisConnection()
> > r = get(conn, "non_existent_key")
> > disconnect(conn)
> > r == nothing    # true
> > 
> > 
> > > For now I don't know of a good solution to this pattern, but
> > > there's 
> > > been some discussion about it: 
> > > https://github.com/JuliaLang/julia/issues/15174 
> > > 
> > > You should definitely use a Nullable instead of returning
> > > nothing. 
> > > 
> > > 
> > > Regards 
> > > 
> > > On Saturday, 19 March 2016 at 02:58 -0700, Jeffrey Sarnoff wrote: 
> > > > You may be misusing nothing.  It is unusual that a function
> > > would 
> > > > return nothing some of the time and something other times. 
> > > > Take a look at http://docs.julialang.org/en/latest/manual/faq/#
> > > nothin 
> > > > gness-and-missing-values 
> > > > If you have additional questions about this, please give an
> > > example 
> > > > of what get_a(...) is getting and why it would be nothing some
> > > of the 
> > > > time. 
> > > > 
> > > > > Hi All 
> > > > > 
> > > > > 
> > > > > I found my self writing code like this a lot: 
> > > > > 
> > > > > x = get_a(...) 
> > > > > 
> > > > > if x != nothing 
> > > > >     y::A = x 
> > > > >     do_sth(y, ...) 
> > > > > end 
> > > > > 
> > > > > In the above, I have to check for nothing first, and if it is
> > > not 
> > > > > nothing, then I do a type assert to make sure the type is
> > > what I 
> > > > > expected. 
> > > > > 
> > > > > Is there any function or macro in Julia that can help this? 
> > > > > 
> > > > > I know in F#, I have option.bind, so option.bind f x is
> > > equivalent 
> > > > > to a pattern match:  if x is None - > None; if x is something
> > > -> 
> > > > > f(something) 
> > > > > 
> > > > > Also in C#, I have "customers?[0]?.Orders?.Count();"  (as
> > > long as 
> > > > > there is null before ?, it returns null immediately) 
> > > > > 
> > > > > Does Julia have something similar? 
> > > > > 
> > > > > 


[julia-users] Re: performance of loops Julia vs BLAS

2016-03-21 Thread Ján Adamčák
You can use blas_set_num_threads(1) in Julia and it will use only 1 
thread... but this is not the right answer.
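For a fair single-threaded comparison, something like this (a sketch; blas_set_num_threads is the Julia 0.4 spelling, later renamed to BLAS.set_num_threads):

```julia
# Pin BLAS to one thread so single-threaded Julia loops and BLAS calls
# are timed on equal footing; the sizes below are illustrative.
blas_set_num_threads(1)

A = rand(1000, 1000)
x = rand(1000)
y = rand(1000)

@time BLAS.ger!(1.0, x, y, A)   # rank-1 update, now single-threaded
```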

On Monday, 21 March 2016 at 9:49:23 UTC+1, Igor Cerovsky wrote:
>
> Hi,
>
> Trying to write custom and using BLAS functions implementation of 
> Gram-Schmidt algorithm I got more than 2-times slower performance for Julia 
> 0.4.3 on my computer Intel i7 6700HQ (on older processor i7 5500 the 
> performance gain is 1.2-times). The code below is a bit longer, but I got 
> the slow performance in the whole context. Trying to profile parts of the 
> algorithm I got only slightly different performance.
>
> Custom implementation:
>
> function rank1update!(DST, A, R, k)
> rows, cols = size(A)
> for j = k+1:cols
> @simd for i = 1:rows
> @inbounds DST[i,j] -= A[i, k] * R[k, j]
> end
> end
> end
>
> function mygemv!(DST, A, k, alpha)
> rows, cols = size(A)
> for j in k+1:cols
> s = 0.0
> @simd for i in 1:rows
> @inbounds s += A[i, k] * A[i, j]
> end  
> DST[k, j] = s * alpha
> end
> end
>
> function mgsf(M)
> rows, cols = size(M)
> Q = copy(M)
> R = eye(cols)
>
> for k in 1:cols
> alpha = 1.0 / sumabs2(sub(Q, :, k))
> mygemv!(R, Q, k, alpha)
> rank1update!(Q, Q, R, k)
> end
> Q, R
> end
>
> Implementation using BLAS functions:
> function mgs_blas(M)
> cols = size(M, 2)
> Q = copy(M)
> R = eye(cols)
>
> for k in 1:cols
> q_k = sub(Q, :, k)
> Q_sub = sub(Q, :, k+1:cols)
> R_sub = sub(R, k, k+1:cols)
>
> alpha = 1.0 / sumabs2(q_k)
> R[k, k+1:cols] = BLAS.gemv('T', alpha, Q_sub, q_k)
> BLAS.ger!(-1.0,  q_k, vec(R_sub), Q_sub)
> end
> 
> Q, R
> end
>
> And results; using BLAS the performance gain is ~2.6 times:
>
> # custom implementation
> Q2, R2 = @time mgsf(T);
>
>   0.714916 seconds (4.99 k allocations: 15.411 MB, 0.08% gc time)
>
>
> # implementation using BLAS functions 
>
> Q5, R5 = @time mgs_blas(T);
>
>   0.339278 seconds (16.45 k allocations: 23.521 MB, 0.76% gc time)
>
>
>
> A hint: Looking at performance graph in the Task Manager it seems BLAS 
> uses more cores.
> The question that remains is: what is going on?
>
> Thanks for explanation.
>
>

[julia-users] performance of loops Julia vs BLAS

2016-03-21 Thread Igor Cerovsky
Hi,

Trying to write a custom implementation of the Gram-Schmidt algorithm 
alongside one using BLAS functions, I got more than 2-times slower 
performance for the custom version on Julia 0.4.3 on my computer with an 
Intel i7 6700HQ (on an older i7 5500 processor the gap is only 1.2-times). 
The code below is a bit longer, but I see the slow performance only in the 
whole context; profiling parts of the algorithm separately, I got only 
slightly different performance.

Custom implementation:

function rank1update!(DST, A, R, k)
    rows, cols = size(A)
    for j = k+1:cols
        @simd for i = 1:rows
            @inbounds DST[i,j] -= A[i, k] * R[k, j]
        end
    end
end

function mygemv!(DST, A, k, alpha)
    rows, cols = size(A)
    for j in k+1:cols
        s = 0.0
        @simd for i in 1:rows
            @inbounds s += A[i, k] * A[i, j]
        end
        DST[k, j] = s * alpha
    end
end

function mgsf(M)
    rows, cols = size(M)
    Q = copy(M)
    R = eye(cols)

    for k in 1:cols
        alpha = 1.0 / sumabs2(sub(Q, :, k))
        mygemv!(R, Q, k, alpha)
        rank1update!(Q, Q, R, k)
    end
    Q, R
end

Implementation using BLAS functions:

function mgs_blas(M)
    cols = size(M, 2)
    Q = copy(M)
    R = eye(cols)

    for k in 1:cols
        q_k = sub(Q, :, k)
        Q_sub = sub(Q, :, k+1:cols)
        R_sub = sub(R, k, k+1:cols)

        alpha = 1.0 / sumabs2(q_k)
        R[k, k+1:cols] = BLAS.gemv('T', alpha, Q_sub, q_k)
        BLAS.ger!(-1.0, q_k, vec(R_sub), Q_sub)
    end

    Q, R
end

And results; using BLAS the performance gain is ~2.6 times:

# custom implementation
Q2, R2 = @time mgsf(T);

  0.714916 seconds (4.99 k allocations: 15.411 MB, 0.08% gc time)


# implementation using BLAS functions 

Q5, R5 = @time mgs_blas(T);

  0.339278 seconds (16.45 k allocations: 23.521 MB, 0.76% gc time)



A hint: looking at the performance graph in the Task Manager, it seems BLAS 
uses more cores.
The question that remains is: what is going on?

Thanks for an explanation.



[julia-users] Re: cross-module exports / extending modules

2016-03-21 Thread Tony Kelman
If you want to add functions to the Cairo module, open pull requests or use 
your fork for the time being. eval'ing into modules that aren't yours is not a 
good practice. You can create a CairoExtensions.jl package that depends on 
Cairo and uses its copy of libcairo.

Re: [julia-users] pre-compile with userimg.jl instructions

2016-03-21 Thread Tony Kelman
You actually don't have to build from source, there is a build_sysimg.jl script 
that has instructions for rebuilding the system image in a binary. This script 
had regressed a bit on some 0.4 releases, try the latest 0.4.5 release and it 
should be working again across platforms.

[julia-users] When will juno-atom bundle become available?

2016-03-21 Thread Yao Lu
I'm expecting this bundle.


Re: [julia-users] broadcast performance is slow

2016-03-21 Thread Igor Cerovsky
Stefan, thanks.

On Friday, 18 March 2016 15:11:22 UTC+1, Stefan Karpinski wrote:
>
> This was one of the motivating issues for the function redesign which is 
> now on Julia master. If you try the latest development version of Julia, 
> these should both be much faster.
>
On Fri, Mar 18, 2016 at 8:50 AM, Igor Cerovsky wrote:
>
>> Hello,
>>
>> While writing code for unit normal scaling I've found a big difference 
>> related to where the function used in broadcast is defined, *globally* vs 
>> *locally*. Consider the functions below:
>>
>> function scun2!(A)
>>     shift = mean(A, 1)
>>     stretch = std(A, 1)
>>
>>     f(a, b, c) = (a - b) / c   # defined locally
>>     broadcast!(f, A, A, shift, stretch)
>>
>>     shift, stretch
>> end
>>
>> f_scun(a, b, c) = (a - b) / c  # defined globally
>> function scun3!(A)
>>     shift = mean(A)
>>     stretch = std(A, 1)
>>
>>     broadcast!(f_scun, A, A, shift, stretch)
>>
>>     shift, stretch
>> end
>>
>> Resulting performance is:
>>
>> R2 = copy(T)
>>
>> @time sh2, sc2 = scun2!(R2);
>>
>>   0.035527 seconds (19.51 k allocations: 967.273 KB)
>>
>>
>> R3 = copy(T)
>>
>> @time sh3, sc3 = scun3!(R3);
>>
>>   0.009705 seconds (54 allocations: 17.547 KB)
>>
>>
>> How can it be explained that if f_scun is defined outside the function the 
>> performance is 3.6 times better (the number of allocations is also large)? 
>> I'm using Julia 0.4.3.
>>
>> Thank you,
>> Igor
>>
>>
>>
>>
>>
>

[julia-users] Re: cross-module exports / extending modules

2016-03-21 Thread Andreas Lobinger
No.

More concretely, I want to put another function into Cairo and I want to 
avoid loading libcairo.so a second time.

On Sunday, March 20, 2016 at 1:01:58 PM UTC+1, Gregory Salvan wrote:
>
> Hi,
> when you want to add methods to the Base module (like getindex, getfield...) 
> you use "import Base.getindex" and then write a new method with new argument 
> types.
> For example, to add methods in B.jl to functionA from A.jl:
> import A.functionA
>
> function functionA(...)
>
> end
>
> Is that what you were looking for?
>
> On Sunday, 20 March 2016 at 11:25:10 UTC+1, Andreas Lobinger wrote:
>>
>> Hello colleagues,
>>
>> i remember a discussion about this, but maybe without conclusion and 
>> maybe without the right keywords.
>>
>> Let's have module A (from package A.jl) with certain funcitionality and 
>> maybe some types.
>> Now module/package/code B.jl that somehow extends A with optional 
>> functions. How to put these functions under the A. API?
>> I'm pretty sure exports across modules don't work, but is there somewhere 
>> some functionality on this?
>>
>> Wishing a happy day,
>> Andreas
>>
>

[julia-users] Re: Announcing JuDE: autocomplete and jump to definition support for Atom

2016-03-21 Thread Nitin Arora
Awesome :) . I will try this tomm.

On Sunday, March 20, 2016 at 11:58:15 AM UTC-7, James Dang wrote:
>
> Hi All, Julia has been great for me, and I wanted to give back a little. 
> LightTable and Atom are great editors, but I was really starting to miss 
> good intellisense-like autocomplete and basic navigation features like 
> jump-to-definition, especially on larger codebases. It's really quite a 
> slog to remember exactly where in which file a function was defined, or 
> what its exact arguments are. And maybe with better tooling, more people 
> will be drawn to the community. So I put a bit of work into a new package 
> for Atom that gives you that!
>
> https://atom.io/packages/jude
>
>
> 
>
>
> This is a bit different from what you get out of julia-client and 
> autocomplete-julia because it does a full syntax parsing and scope 
> resolution of your codebase without executing it in a Julia process. It 
> reparses very quickly on the fly without needing to save. And the matching 
> is precise, not fuzzy, giving you exactly what names are available in the 
> scope you are in currently. It's quite new and unpolished, but please try 
> it out and let me know what you think!
>
> Cheers,
> James
>
>