Re: [julia-users] why sum(abs(A)) is very slow

2014-08-25 Thread Dahua Lin
If A is not a global variable (i.e. it is used within a function), @devec 
would be much faster (comparable to sumabs).
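
For reference, usage would look something like this (a sketch, assuming the 
Devectorize package; I haven't tested this exact expression):

using Devectorize

function sum_abs_devec(A::Vector{Float64})
    # @devec fuses abs and sum into a single loop, so no temporary
    # array abs(A) is materialized
    @devec s = sum(abs(A))
    return s
end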

Dahua


On Monday, August 25, 2014 4:26:22 AM UTC+8, Adam Smith wrote:
>
> I've run into this a few times (and a few hundred times in python), so I 
> made an @iterize macro. Not sure how useful it is, but you can put it in 
> front of a bunch of chained function calls and it will make iterators 
> automatically to avoid creating any temp arrays:
>
> A = randn(1<<26)
> @time sum(abs(A))
> @time @iterize sum(abs(A))
> @time sumabs(A)
>
> println(sum(abs(A)))
> println(@iterize sum(abs(A)))
> println(sumabs(A))
>
> println(sum(A))
> println(@iterize sum(A))
>
> println(sum(ceil(floor(abs(A)))))
> println(@iterize sum(ceil(floor(abs(A)))))
>
> Output:
> elapsed time: 0.367873796 seconds (537878296 bytes allocated, 2.48% gc time)
> elapsed time: 0.107278414 seconds (577616 bytes allocated)
> elapsed time: 0.045590637 seconds (639580 bytes allocated)
> 5.3551932868680775e7
> 5.3551932868672036e7
> 5.3551932868678436e7
> 658.6904827808266
> 658.6904827808266
> 2.4537098e7
> 2.4537098e7
>
> The macro is in a gist: Iterize.jl 
> 
>
> I had tried using @devec, but that actually made it about 100x slower.
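>
> For intuition, the iterator version is roughly equivalent to writing the 
> reduction as a hand-fused loop like this (a sketch of the idea, not the 
> actual macro expansion):
>
> function sum_abs(A)
>     s = zero(eltype(A))
>     for x in A
>         s += abs(x)   # abs applied on the fly; no temporary array
>     end
>     s
> end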
>
> On Saturday, August 23, 2014 8:15:44 AM UTC-4, Stefan Karpinski wrote:
>>
>> On Sat, Aug 23, 2014 at 7:23 AM,  wrote:
>>
>>> >To do any of that justice, you end up with a language that looks 
>>> basically like Haskell. So why not just use Haskell?
>>>
> Because I don't know anything about it (yet), except the name and the 
> fact that it's often associated with lazy evaluation.
>>>
> Because (#2), this could be a way to make sumabs and the like obsolete 
> in *Julia*. :)
>>>
>>
>> We really do want to get rid of things like sumabs, so it's certainly 
>> worth considering. I know I've thought about it many times, but I don't 
>> think it's the right answer – you really want to preserve eager evaluation 
>> semantics, even if you end up moving around the actual evaluation of things.
>>  
>

[julia-users] Re: PyPlot problems, no GUI backend for matplotlib

2014-08-25 Thread Viral Shah
I get the same error. I think there is an old open issue on the topic in 
the IJulia repository.

-viral

On Monday, August 25, 2014 9:33:13 AM UTC+5:30, Stelios Georgiou wrote:
>
> Hello, 
>
> I can't get PyPlot to work in julia or the IJulia notebook; this is the error 
> I get. When I test matplotlib I also get no visuals. I can save images and 
> view them afterwards, but cannot see live figures. Do you have any ideas? 
>
> ( I use homebrew python, with packages installed with pip ) 
> ( Julia is installed from standalone .dmg installer )  
> ( Using OS X 10.9 )
>
> julia> using PyPlot
> WARNING: No working GUI backend found for matplotlib.
>
> INFO: Loading help data...
> ipython (2.2.0)
>
> matplotlib (1.3.1)
>


[julia-users] Re: JuliaCon Opening Session Videos Posted!

2014-08-25 Thread Tomas Lycken
I've seen a lot of the videos posted (read: aggregated) to 
juliabloggers.com now - which is fabulous in itself =) - but is there any 
chance that the JuliaCon site could link to them as well? It would be much 
easier to get an overview and view the videos in the order I find 
interesting that way.

Thanks a lot!

// T

On Sunday, August 24, 2014 6:08:03 AM UTC+2, Joshua Job wrote:
>
> Is there any word on when we may expect the rest of the videos? I'm 
> particularly anxious to see the Gadfly session. :D
>
> On Monday, August 11, 2014 4:53:18 AM UTC-7, Jacob Quinn wrote:
>>
>> Hey all,
>>
>> Gather round and hear the tales of a wondrous new language, presented by 
>> Tim Holy, Pontus Stenetrop, and Arch Robison.
>>
>> Check out the JuliaCon youtube playlist, the blog post announcement, and 
>> feel free to jump in on the discussions at /r/programming and Hacker News.
>>
>> The plan is to release another session of videos every few days, so keep 
>> on the lookout for more Julia goodness.
>>
>> -Jacob
>>
>

Re: [julia-users] Re: JuliaCon Opening Session Videos Posted!

2014-08-25 Thread Viral Shah
Submit a PR to JuliaCon.Github.io? :-)

-viral
On 25 Aug 2014 17:53, "Tomas Lycken"  wrote:

> I've seen a lot of the videos posted (read: aggregated) to
> juliabloggers.com now - which is fabulous in itself =) - but is there any
> chance that the JuliaCon site could link to them as well? It would be much
> easier to get an overview and view the videos in the order I find
> interesting that way.
>
> Thanks a lot!
>
> // T
>
> On Sunday, August 24, 2014 6:08:03 AM UTC+2, Joshua Job wrote:
>>
>> Is there any word on when we may expect the rest of the videos? I'm
>> particularly anxious to see the Gadfly session. :D
>>
>> On Monday, August 11, 2014 4:53:18 AM UTC-7, Jacob Quinn wrote:
>>>
>>> Hey all,
>>>
>>> Gather round and hear the tales of a wondrous new language, presented
>>> by Tim Holy, Pontus Stenetrop, and Arch Robison.
>>>
>>> Check out the JuliaCon youtube playlist, the blog post announcement, and
>>> feel free to jump in on the discussions at /r/programming and Hacker News.
>>>
>>> The plan is to release another session of videos every few days, so keep
>>> on the lookout for more Julia goodness.
>>>
>>> -Jacob
>>>
>>


Re: [julia-users] BinDeps: How to test new provider?

2014-08-25 Thread Lucas Beyer
The corresponding PR: https://github.com/JuliaLang/BinDeps.jl/pull/101. See
the referencing Images.jl PR (and more to come) for usage, though it's
pretty straightforward.


[julia-users] Re: Announcement: Playground.jl

2014-08-25 Thread Steven Sagaert
Nice! This will definitely be useful for playing with different versions.

On Saturday, August 23, 2014 10:01:45 PM UTC+2, Rory Finnegan wrote:
>
> Hi everyone,
>
> I've published my Playground.jl 
>  package to create julia 
> sandboxes like python virtual environments, if anyone wants to give it a 
> try.  So far I've tested it on Funtoo and Linux Mint, but I'm looking for 
> people to try it out on other platforms (like Windows and OSX).
>
> Cheers,
> Rory
>


[julia-users] Blaze

2014-08-25 Thread Steven Sagaert
Hi,
I find PyData's Blaze project (http://blaze.pydata.org) very 
interesting. Some related, but more restricted in scope, approaches are 
dplyr (https://github.com/hadley/dplyr) and LINQ (.NET) & type providers 
(F#). It would be awesome to have something like Blaze in Julia in the 
long term. Are there any plans to build something like that?

Sincerely,
Steven Sagaert.


Re: [julia-users] Re: JuliaCon Opening Session Videos Posted!

2014-08-25 Thread Tomas Lycken
Ha, of course it was hosted on github ;)

And as usual, asking for something means you're the one doing it - I hope I 
copy-pasted all the URLs correctly: 
https://github.com/JuliaCon/juliacon.github.io/pull/20

// T

On Monday, August 25, 2014 2:25:35 PM UTC+2, Viral Shah wrote:
>
> Submit a PR to JuliaCon.Github.io? :-)
>
> -viral
> On 25 Aug 2014 17:53, "Tomas Lycken" 
> wrote:
>
>> I've seen a lot of the videos posted (read: aggregated) to 
>> juliabloggers.com now - which is fabulous in itself =) - but is there 
>> any chance that the JuliaCon site could link to them as well? It would be 
>> much easier to get an overview and view the videos in the order I find 
>> interesting that way.
>>
>> Thanks a lot!
>>
>> // T
>>
>> On Sunday, August 24, 2014 6:08:03 AM UTC+2, Joshua Job wrote:
>>>
>>> Is there any word on when we may expect the rest of the videos? I'm 
>>> particularly anxious to see the Gadfly session. :D
>>>
>>> On Monday, August 11, 2014 4:53:18 AM UTC-7, Jacob Quinn wrote:

 Hey all,

 Gather round and hear the tales of a wondrous new language, presented 
 by Tim Holy, Pontus Stenetrop, and Arch Robison.

 Check out the JuliaCon youtube playlist, the blog post announcement, and 
 feel free to jump in on the discussions at /r/programming and Hacker News.

 The plan is to release another session of videos every few days, so 
 keep on the lookout for more Julia goodness.

 -Jacob


[julia-users] Can this magic square function be further optimized?

2014-08-25 Thread Phillip Berndt
Hi julia-users,

I've recently stumbled over Julia and wanted to give it a try. 

To assess its speed, I've implemented another micro-benchmark, namely a 
version of Matlab's magic() function that generates magic squares. Since I 
have no experience writing optimal Julia code, I started off with literal 
translations of two different implementations - Matlab's and the one from 
magic_square.py from PyPy, which is an optimized version for NumPy. I then 
timed the calculation of all magic squares from N=3 to N=1000. The table 
from Julia's homepage suggests that in most cases, it is significantly 
faster than Python and Matlab. In my case, it's significantly slower, which 
is somehow disappointing ;) My question now is:

Can the implementation be optimized to outperform the other two?

*The times:*

Julia, Matlab version: elapsed time: 18.495374216 seconds (13404087428 
bytes allocated, 12.54% gc time)
Julia, Python version: elapsed time: 8.107275449 seconds (13532473792 bytes 
allocated, 26.99% gc time)
Matlab: Elapsed time is 4.994960 seconds.
Python: 1 loops, best of 3: 2.09 s per loop

My test machine is a 4 Core i7-4600 Notebook with 2.1 GHz and 8 GiB RAM, 
running a current Linux Mint and Julia 0.3 stable. To be fair, Python does 
not seem to gc during this loop (disabling gc doesn't alter the time here), 
so one should compare with 8.1 s * (1.-.2699) = 5.91 s for Julia. That's 
still much slower than Python. (By the way, even Octave only needs 4.46 
seconds.) If I translate the matrices in magic_python to account for 
column-major storage, the execution time does not significantly improve.

*The code:*

Matlab: tic; arrayfun(@magic, 3:1000, 'UniformOutput', false); toc
IPython: import magic_square; %timeit [ magic_square.magic(x) for x in 
range(3, 1001) ];
Julia: I've uploaded the code to a Gist at 
https://gist.github.com/phillipberndt/2db94bf5e0c16161dedc and will paste a 
copy below this post.


Cheers,
Phillip


function magic_matlab(n::Int64)
    # Works exactly as Matlab's magic.m

    if n % 2 == 1
        p = (1:n)
        M = n * mod(broadcast(+, p', p - div(n+3, 2)), n) +
            mod(broadcast(+, p', 2p - 2), n) + 1
        return M
    elseif n % 4 == 0
        J = div([1:n] % 4, 2)
        K = J' .== J
        M = broadcast(+, [1:n:(n*n)]', [0:n-1])
        M[K] = n^2 + 1 - M[K]
        return M
    else
        p = div(n, 2)
        M = magic_matlab(p)
        M = [M M+2p^2; M+3p^2 M+p^2]
        if n == 2
            return M
        end
        i = (1:p)
        k = (n-2)/4
        j = convert(Array{Int}, [(1:k); ((n-k+2):n)])
        M[[i; i+p],j] = M[[i+p; i],j]
        i = k+1
        j = [1; i]
        M[[i; i+p],j] = M[[i+p; i],j]
        return M
    end
end
@vectorize_1arg Int magic_matlab

function magic_python(n::Int64)
    # Works exactly as magic_square.py (from pypy)

    if n % 2 == 1
        m = (n >> 1) + 1
        b = n^2 + 1

        M = reshape(repmat(1:n:b-n, 1, n+2)[m:end-m], n+1, n)[2:end, :] +
            reshape(repmat(0:(n-1), 1, n+2), n+2, n)[2:end-1, :]'
        return M
    elseif n % 4 == 0
        b = n^2 + 1
        d = reshape(1:b-1, n, n)

        d[1:4:n, 1:4:n] = b - d[1:4:n, 1:4:n]
        d[1:4:n, 4:4:n] = b - d[1:4:n, 4:4:n]
        d[4:4:n, 1:4:n] = b - d[4:4:n, 1:4:n]
        d[4:4:n, 4:4:n] = b - d[4:4:n, 4:4:n]
        d[2:4:n, 2:4:n] = b - d[2:4:n, 2:4:n]
        d[2:4:n, 3:4:n] = b - d[2:4:n, 3:4:n]
        d[3:4:n, 2:4:n] = b - d[3:4:n, 2:4:n]
        d[3:4:n, 3:4:n] = b - d[3:4:n, 3:4:n]

        return d
    else
        m = n >> 1
        k = m >> 1
        b = m^2

        d = repmat(magic_python(m), 2, 2)

        d[1:m, 1:k] += 3*b
        d[1+m:end, 1+k:m] += 3*b
        d[1+k, 1+k] += 3*b
        d[1+k, 1] -= 3*b
        d[1+m+k, 1] += 3*b
        d[1+m+k, 1+k] -= 3*b
        d[1:m, 1+m:n-k+1] += b+b
        d[1+m:end, 1+m:n-k+1] += b
        d[1:m, 1+n-k+1:end] += b
        d[1+m:end, 1+n-k+1:end] += b+b

        return d
    end
end
@vectorize_1arg Int magic_python

print("Matlab version: ")
@time magic_matlab(3:1000)

print("Python version: ")
@time magic_python(3:1000)




[julia-users] IJulia loses syntax coloring

2014-08-25 Thread Joosep Pata
Hi,

Does anyone else have issues with persisting the syntax highlighting in IJulia? 
For me, the Julia-specific coloring seems to erratically disappear when I 
restart the kernel. When I convert each cell to something other than code and 
back again, it seems to work, until it again loses formatting. This is a minor 
annoyance and does not hinder productivity, but I would like my notebooks to 
keep all their nice features even when closed.

For the record, I’m using HEAD on OSX 10.9.4, this happens both with Safari and 
Firefox under IPython 2.1.

Joosep

[julia-users] problem after upgrading to v0.3.0

2014-08-25 Thread Steven Sagaert
when running a file non-interactively I get:

 julia VMRecommender.jl
ERROR: syntax: incomplete: unterminated multi-line comment #= ... =#
 in include at ./boot.jl:245
 in include_from_node1 at loading.jl:128
 in process_options at ./client.jl:285
 in _start at ./client.jl:354
 in _start_3B_1716 at /usr/bin/../lib/x86_64-linux-gnu/julia/sys.so


any idea?


[julia-users] Re: problem after upgrading to v0.3.0

2014-08-25 Thread Tobias Knopp
Multi-line comments (#= ... =#) were introduced in version 0.3.
If your VMRecommender.jl code has a line comment that starts with #=, it 
will no longer parse with Julia 0.3.
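
For example, under 0.3 these lines now form a single comment block (a 
minimal sketch):

#= this used to be an ordinary one-line comment in 0.2,
   but in 0.3 it opens a block that only ends at =#
x = 1  # parsed as code again only after the closing =#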

Cheers,

Tobi

Am Montag, 25. August 2014 16:21:43 UTC+2 schrieb Steven Sagaert:
>
> when running a file non-interactively I get:
>
>  julia VMRecommender.jl
> ERROR: syntax: incomplete: unterminated multi-line comment #= ... =#
>  in include at ./boot.jl:245
>  in include_from_node1 at loading.jl:128
>  in process_options at ./client.jl:285
>  in _start at ./client.jl:354
>  in _start_3B_1716 at /usr/bin/../lib/x86_64-linux-gnu/julia/sys.so
>
>
> any idea?
>


[julia-users] Re: How to import module from another file in same directory?

2014-08-25 Thread Andrei Zh
Valentin, thanks for your answer, but it seems like I need to give you some 
more context (sorry for not mentioning it earlier). I'm trying to repeat my 
experience of interactive development in languages like Python or Lisp. In 
these languages I can load some module/file contents to REPL ("__main__" 
module in Python, "user" namespace in Clojure, etc.) and play around with 
the code just like if I was "inside" of module under development. E.g. I 
can modify some function, send new definition to REPL and immediately try 
it out. I can also import any other modules/packages/namespaces. In Python, 
for example, being in __main__ (with loaded definitions from target module) 
I can refer to any other module on PYTHONPATH by its full name. Same thing 
with Clojure - any namespace on CLASSPATH is available for loading. 

In Julia there's Main module too. I can load some code and play around with 
it, just like in REPLs of other languages. E.g. I can start an editor, open 
some file "linreg.jl", send all its contents to REPL, see how it works, 
update, reload, etc. Works like a charm... until I try to import another 
module. 

Unlike Python or Clojure, Julia's module system is decoupled from source 
files and directory structure. Correct me if I'm wrong, but it seems like 
there's no way to load a module other than to include() its source file. At the 
same time, I cannot include files all here and there. E.g. in example above 
when I work on module A (from REPL/Main) I cannot include "P.jl", because 
"P.jl" contains recursive include() of "a.jl", and they just re-include 
each other endlessly.

So the only way to make it work is to load the module system from the top 
level ("P.jl") and then refer to other modules with respect to it (e.g. 
like "using .A" or "import ..U"). It works fine with third party packages, 
but I find it really frustrating when working on some internal module (e.g. 
A). 

Thus any tips and tricks on loading modules when working from REPL/Main are 
welcome. 
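
(One small trick that helps in the meantime is to guard the include so a 
file is only loaded once, e.g.

isdefined(:U) || include("u.jl")   # re-evaluating this line is harmless
U.g()                              # the module is then reachable from Main

but that doesn't solve the mutually recursive include problem described 
above.)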



On Sunday, August 24, 2014 5:38:53 PM UTC+3, Valentin Churavy wrote:
>
> What you are looking for is described in 
> http://julia.readthedocs.org/en/latest/manual/modules/#relative-and-absolute-module-paths
>  
> 
>
> in P.jl you include all your submodules 
> module P
>  include("u.jl")
>  include("a.jl")
>  include("b.jl")
>
>  using .A, .B
>
>  export f, g
> end
>
> u.jl:
> module U
>  g() = 5
>  f() = 6
> end
>
> a.jl and b.jl both look like this:
>
> module A
>  import ..U
>
>  f = U.f
>  g = U.g
>
>  export f, g
> end
>
> so one dot as a prefix looks in the namespace of the current module and 
> two dots as prefix looks in the namespace of the parent module.
>
> Hope that helps
>
> On Sunday, 24 August 2014 14:10:58 UTC+2, Andrei Zh wrote:
>>
>> Let's say I have following project layout: 
>>
>> P.jl that contains module P -- main package module, exposes code from 
>> a.jl and b.jl
>> a.jl that contains module A and
>> b.jl that contains module B -- some domain specific modules
>> u.jl that contains module U -- util functions
>>
>> Now I want to use functions in U from modules A and B. In simplest case I 
>> would just include("u.jl") inside of a.jl and b.jl, but this way functions 
>> from U will be defined in both - A and B. So I really want to import U, not 
>> include u.jl, but I can't do this since u.jl is not on the LOAD_PATH (and 
>> messing with it manually looks somewhat bad to me).  
>>
>> Is there some standard way to tackle it? 
>>
>> (Note, that A, B and U are here just for code splitting, other ways to do 
>> same stuff are ok too.)
>>
>

[julia-users] Re: How to import module from another file in same directory?

2014-08-25 Thread Tobias Knopp
There is https://github.com/JuliaLang/julia/issues/4600, but there was 
recently quite some discussion on the julia-dev mailing list as well as in 
https://github.com/JuliaLang/julia/issues/8014

Cheers,

Tobi

Am Montag, 25. August 2014 16:29:03 UTC+2 schrieb Andrei Zh:
>
> Valentin, thanks for your answer, but it seems like I need to give you 
> some more context (sorry for not mentioning it earlier). I'm trying to 
> repeat my experience of interactive development in languages like Python or 
> Lisp. In these languages I can load some module/file contents to REPL 
> ("__main__" module in Python, "user" namespace in Clojure, etc.) and play 
> around with the code just like if I was "inside" of module under 
> development. E.g. I can modify some function, send new definition to REPL 
> and immediately try it out. I can also import any other 
> modules/packages/namespaces. In Python, for example, being in __main__ 
> (with loaded definitions from target module) I can refer to any other 
> module on PYTHONPATH by its full name. Same thing with Clojure - any 
> namespace on CLASSPATH is available for loading. 
>
> In Julia there's Main module too. I can load some code and play around 
> with it, just like in REPLs of other languages. E.g. I can start an editor, 
> open some file "linreg.jl", send all its contents to REPL, see how it 
> works, update, reload, etc. Works like a charm... until I try to import 
> another module. 
>
> Unlike Python or Clojure, Julia's module system is decoupled from source 
> files and directory structure. Correct me if I'm wrong, but it seems like 
> there's no way to load a module other than to include() its source file. At the 
> same time, I cannot include files all here and there. E.g. in example above 
> when I work on module A (from REPL/Main) I cannot include "P.jl", because 
> "P.jl" contains recursive include() of "a.jl", and they just re-include 
> each other endlessly.
>
> So the only way to make it work is to load the module system from the top 
> level ("P.jl") and then refer to other modules with respect to it (e.g. 
> like "using .A" or "import ..U"). It works fine with third party packages, 
> but I find it really frustrating when working on some internal module (e.g. 
> A). 
>
> Thus any tips and tricks on loading modules when working from REPL/Main 
> are welcome. 
>
>
>
> On Sunday, August 24, 2014 5:38:53 PM UTC+3, Valentin Churavy wrote:
>>
>> What you are looking for is described in 
>> http://julia.readthedocs.org/en/latest/manual/modules/#relative-and-absolute-module-paths
>>  
>> 
>>
>> in P.jl you include all your submodules 
>> module P
>>  include("u.jl")
>>  include("a.jl")
>>  include("b.jl")
>>
>>  using .A, .B
>>
>>  export f, g
>> end
>>
>> u.jl:
>> module U
>>  g() = 5
>>  f() = 6
>> end
>>
>> a.jl and b.jl both look like this:
>>
>> module A
>>  import ..U
>>
>>  f = U.f
>>  g = U.g
>>
>>  export f, g
>> end
>>
>> so one dot as a prefix looks in the namespace of the current module and 
>> two dots as prefix looks in the namespace of the parent module.
>>
>> Hope that helps
>>
>> On Sunday, 24 August 2014 14:10:58 UTC+2, Andrei Zh wrote:
>>>
>>> Let's say I have following project layout: 
>>>
>>> P.jl that contains module P -- main package module, exposes code from 
>>> a.jl and b.jl
>>> a.jl that contains module A and
>>> b.jl that contains module B -- some domain specific modules
>>> u.jl that contains module U -- util functions
>>>
>>> Now I want to use functions in U from modules A and B. In simplest case 
>>> I would just include("u.jl") inside of a.jl and b.jl, but this way 
>>> functions from U will be defined in both - A and B. So I really want to 
>>> import U, not include u.jl, but I can't do this since u.jl is not on the 
>>> LOAD_PATH (and messing with it manually looks somewhat bad to me).  
>>>
>>> Is there some standard way to tackle it? 
>>>
>>> (Note, that A, B and U are here just for code splitting, other ways to 
>>> do same stuff are ok too.)
>>>
>>

[julia-users] Re: PyPlot problems, no GUI backend for matplotlib

2014-08-25 Thread Stelios Georgiou
OK, I fixed it: I didn't have Qt installed, so I installed PyQt and ran 
Pkg.build("PyPlot"). That worked.
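
In case it helps anyone else, selecting the backend by hand looks roughly 
like this (a sketch; it assumes PyQt is importable from the Python that 
PyCall uses):

using PyCall
@pyimport matplotlib
matplotlib.use("Qt4Agg")   # pick a GUI backend before loading PyPlot
using PyPlot
plot(rand(10))             # should now open a live figure window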

On Monday, August 25, 2014 1:11:21 PM UTC+1, Viral Shah wrote:
>
> I get the same error. I think there is an old open issue on the topic in 
> the IJulia repository.
>
> -viral
>
> On Monday, August 25, 2014 9:33:13 AM UTC+5:30, Stelios Georgiou wrote:
>>
>> Hello, 
>>
>> I can't get PyPlot to work in julia or the IJulia notebook; this is the error 
>> I get. When I test matplotlib I also get no visuals. I can save images and 
>> view them afterwards, but cannot see live figures. Do you have any ideas? 
>>
>> ( I use homebrew python, with packages installed with pip ) 
>> ( Julia is installed from standalone .dmg installer )  
>> ( Using OS X 10.9 )
>>
>> julia> using PyPlot
>> WARNING: No working GUI backend found for matplotlib.
>>
>> INFO: Loading help data...
>> ipython (2.2.0)
>>
>> matplotlib (1.3.1)
>>
>

[julia-users] Re: problem after upgrading to v0.3.0

2014-08-25 Thread Steven Sagaert
Hi Tobias,
Thanks. I had lines like "#= blabla" and those were the 
problem.

On Monday, August 25, 2014 4:21:43 PM UTC+2, Steven Sagaert wrote:
>
> when running a file non-interactively I get:
>
>  julia VMRecommender.jl
> ERROR: syntax: incomplete: unterminated multi-line comment #= ... =#
>  in include at ./boot.jl:245
>  in include_from_node1 at loading.jl:128
>  in process_options at ./client.jl:285
>  in _start at ./client.jl:354
>  in _start_3B_1716 at /usr/bin/../lib/x86_64-linux-gnu/julia/sys.so
>
>
> any idea?
>


[julia-users] Re: problem after upgrading to v0.3.0

2014-08-25 Thread Tobias Knopp
:-)

If you are interested in the history of multiline comments: 
https://github.com/JuliaLang/julia/issues/69

Am Montag, 25. August 2014 16:41:13 UTC+2 schrieb Steven Sagaert:
>
> Hi Tobias,
> Thanks. I had lines like "#= blabla" and those were the 
> problem.
>
> On Monday, August 25, 2014 4:21:43 PM UTC+2, Steven Sagaert wrote:
>>
>> when running a file non-interactively I get:
>>
>>  julia VMRecommender.jl
>> ERROR: syntax: incomplete: unterminated multi-line comment #= ... =#
>>  in include at ./boot.jl:245
>>  in include_from_node1 at loading.jl:128
>>  in process_options at ./client.jl:285
>>  in _start at ./client.jl:354
>>  in _start_3B_1716 at /usr/bin/../lib/x86_64-linux-gnu/julia/sys.so
>>
>>
>> any idea?
>>
>

[julia-users] Re: Problem with v 0.3.0 on MacOSX 10.9.4

2014-08-25 Thread Henry Smith
Many thanks for the replies! A big help...
I did the rm thing and also did a Pkg.rm("Stats"), so it no longer appears 
at all in a status request.
Now, however, when I do a Pkg.update() I also get an error which I have not 
succeeded in figuring out.
Below is what I get when I request a Pkg.update(). Do I need to completely 
re-install the pkgs I have, or some such? I would think not, but... And what 
does it mean by "Terminals's requirements..."? (I did this from the Mac 
Terminal program - the default on OSX, it seems...)

TIA again

Henry
  

julia> Pkg.update()
INFO: Updating METADATA...
INFO: Computing changes...
ERROR: Terminals's requirements can't be satisfied because of the following 
fixed packages: julia
 in error at error.jl:22
 in resolve at 
/Applications/Julia-0.3.0.app/Contents/Resources/julia/lib/julia/sys.dylib
 in update at 
/Applications/Julia-0.3.0.app/Contents/Resources/julia/lib/julia/sys.dylib
 in anonymous at pkg/dir.jl:28
 in cd at 
/Applications/Julia-0.3.0.app/Contents/Resources/julia/lib/julia/sys.dylib
 in __cd#227__ at 
/Applications/Julia-0.3.0.app/Contents/Resources/julia/lib/julia/sys.dylib
 in update at 
/Applications/Julia-0.3.0.app/Contents/Resources/julia/lib/julia/sys.dylib 
(repeats 2 times)

julia> 

  
On Friday, August 22, 2014 4:52:16 PM UTC-4, Henry Smith wrote:
>
> Hi,
>
> Just d/led it and tried it out.  I had a couple of old versions of 0.2.x 
> (and still have 0.2.1 installed but trashed the others - some rc's). The 
> computer is an iMac with 20 GB of RAM, 2.7 GHz quad i5.
>
> When I ran Pkg.status(), it came up with an error, and similarly 
> for Pkg.installed() and Pkg.update(). I copy the output below (not too big, 
> I hope). I can't figure out what, if anything, I did "wrong", and did not find 
> anything about problems on the Mac -- TIA for any help
>
> Henry 
>
> Last login: Fri Aug 22 16:07:45 on ttys009
> iMac-162:~ hs$ exec '/Applications/Julia-0.3.0.app/Contents/Resources/julia/bin/julia'
>                _
>    _       _ _(_)_     |  A fresh approach to technical computing
>   (_)     | (_) (_)    |  Documentation: http://docs.julialang.org
>    _ _   _| |_  __ _   |  Type "help()" for help.
>   | | | | | | |/ _` |  |
>   | | |_| | | | (_| |  |  Version 0.3.0 (2014-08-20 20:43 UTC)
>  _/ |\__'_|_|_|\__'_|  |  Official http://julialang.org/ release
> |__/                   |  x86_64-apple-darwin13.3.0
>
> julia> help()
>
>  Welcome to Julia. The full manual is available at
>
> http://docs.julialang.org
>
>  To get help, try help(function), help("@macro"), or help("variable").
>  To search all help text, try apropos("string").
>
> julia> Pkg.status()
> ERROR: failed process: Process(`git 
> --git-dir=/Users/hs/.julia/.cache/Stats merge-base 
> 87d1c8d890962dfcfd0b45b82907464787ac7c64 
> 8208e29af9f80ef633e50884ffb17cb25a9f5113`, ProcessExited(1)) [1]
>  in readbytes at 
> /Applications/Julia-0.3.0.app/Contents/Resources/julia/lib/julia/sys.dylib
>  in readchomp at pkg/git.jl:24
>  in installed_version at 
> /Applications/Julia-0.3.0.app/Contents/Resources/julia/lib/julia/sys.dylib
>  in installed at 
> /Applications/Julia-0.3.0.app/Contents/Resources/julia/lib/julia/sys.dylib
>  in status at pkg/entry.jl:107
>  in anonymous at pkg/dir.jl:28
>  in cd at 
> /Applications/Julia-0.3.0.app/Contents/Resources/julia/lib/julia/sys.dylib
>  in cd at pkg/dir.jl:28
>  in status at pkg.jl:28 (repeats 2 times)
>
> julia> Pkg.installed()
> ERROR: failed process: Process(`git 
> --git-dir=/Users/hs/.julia/.cache/Stats merge-base 
> 87d1c8d890962dfcfd0b45b82907464787ac7c64 
> 8208e29af9f80ef633e50884ffb17cb25a9f5113`, ProcessExited(1)) [1]
>  in readbytes at 
> /Applications/Julia-0.3.0.app/Contents/Resources/julia/lib/julia/sys.dylib
>  in readchomp at pkg/git.jl:24
>  in installed_version at 
> /Applications/Julia-0.3.0.app/Contents/Resources/julia/lib/julia/sys.dylib
>  in installed at 
> /Applications/Julia-0.3.0.app/Contents/Resources/julia/lib/julia/sys.dylib 
> (repeats 3 times)
>  in anonymous at pkg/dir.jl:28
>  in cd at 
> /Applications/Julia-0.3.0.app/Contents/Resources/julia/lib/julia/sys.dylib
>  in cd at pkg/dir.jl:28
>  in installed at pkg.jl:25
>
> julia> Pkg.add("Distributions")
> INFO: Nothing to be done
> INFO: METADATA is out-of-date — you may not have the latest version of 
> Distributions
> INFO: Use `Pkg.update()` to get the latest versions of your packages
>
> julia> 
>
> julia> Pkg.update()
> INFO: Updating METADATA...
> INFO: Updating cache of IniFile...
> INFO: Updating cache of Cairo...
> INFO: Updating cache of PyPlot...
> INFO: Updating cache of Debug...
> INFO: Updating cache of Calculus...
> INFO: Updating cache of Units...
> INFO: Updating cache of HDF5...
> INFO: Updating cache of ICU...
> INFO: Updating cache of Homebrew...
> INFO: Updating cache of BinDeps...
> INFO: Updating cache of Compose...
> INFO: Updating cache of Color...
> INFO: Updating cache of TimeSeries...
> INFO: Updating cache of Gadfly...
> ERROR: failed 

[julia-users] Re: What's new in 0.3?

2014-08-25 Thread Ed Scheinerman
Thanks again for the pointer to the release notes. 

The issue I raised was not dealt with in the release notes: namely, 1:5 == 
[1:5] evaluates as true in Julia 0.2 but as false in Julia 0.3. 

I think the new behavior is a problem. I was happy with the old behavior, 
but if this is a bad idea for some reason, I would prefer that Julia raised 
an error in this situation rather than give a result that (in my opinion) 
is wrong. Certainly Julia compares objects of different type for equality 
(e.g. 0==0.0) so the fact that 1:5 and [1:5] are different types is not the 
issue. 
 

>
> On Saturday, August 23, 2014 9:06:50 AM UTC-4, Valentin Churavy wrote:
>>
>> There is https://github.com/JuliaLang/julia/blob/v0.3.0/NEWS.md 
>>
>> On Saturday, 23 August 2014 15:02:56 UTC+2, Ed Scheinerman wrote:
>>>
>>> Is there a document describing new features and significant changes 
>>> between versions 0.2 and 0.3? 
>>>
>>> One item I noticed is that in 0.2 the expression 1:5 == [1:5] evaluated as 
>>> true, but in 0.3 it's false. 
>>>
>>

Re: [julia-users] Re: What's new in 0.3?

2014-08-25 Thread John Myles White
The NEWS.md file does cover this:

• Ranges and arrays with the same elements are now unequal. This allows hashing 
and comparing ranges to be faster. (#5778)

On Aug 25, 2014, at 8:45 AM, Ed Scheinerman  
wrote:

> Thanks again for the pointer to the release notes. 
> 
> The issue I raised was not dealt with in the release notes: namely, 1:5 == 
> [1:5] evaluates as true in Julia 0.2 but as false in Julia 0.3. 
> 
> I think the new behavior is a problem. I was happy with the old behavior, but 
> if this is a bad idea for some reason, I would prefer that Julia raised an 
> error in this situation rather than give a result that (in my opinion) is 
> wrong. Certainly Julia compares objects of different type for equality (e.g. 
> 0==0.0) so the fact that 1:5 and [1:5] are different types is not the issue. 
>  
> 
> On Saturday, August 23, 2014 9:06:50 AM UTC-4, Valentin Churavy wrote:
> There is https://github.com/JuliaLang/julia/blob/v0.3.0/NEWS.md 
> 
> On Saturday, 23 August 2014 15:02:56 UTC+2, Ed Scheinerman wrote:
> Is there a document describing new features and significant changes between 
> versions 0.2 and 0.3? 
> 
> One item I noticed is that in 0.2 the expression 1:5 == [1:5] evaluated as true, 
> but in 0.3 it's false. 



Re: [julia-users] What's new in 0.3?

2014-08-25 Thread Jacob Quinn
See this issue: https://github.com/JuliaLang/julia/issues/7867 and the
discussion in https://github.com/JuliaLang/julia/issues/5778 for
information on the change.

-Jacob


On Sat, Aug 23, 2014 at 9:02 AM, Ed Scheinerman <
edward.scheiner...@gmail.com> wrote:

> Is there a document describing new features and significant changes
> between versions 0.2 and 0.3?
>
> One item I noticed is that in 0.2 the expression 1:5 == [1:5] evaluated as
> true, but in 0.3 it's false.
>


[julia-users] Re: Announcing Julia 0.3.0 final

2014-08-25 Thread M Long
I am quite curious when Julia Studio may have 0.3 cooked in.

I know this is not the Studio news group, but I am curious when that may 
happen, as I am new and don't have a good feel for the history of when 
these things happen.

Thanks!


Re: [julia-users] What's new in 0.3?

2014-08-25 Thread Stefan Karpinski
If you (or anyone) can come up with a clever scheme for hashing arrays and
ranges so that 1:n and [1:n] hash the same but hash(1:n) isn't an O(n)
operation, I'd be thrilled to switch this back. I could not figure out a
good way to do this, however.
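
To make the trade-off concrete: the obviously consistent scheme hashes a 
range element by element, exactly like an array (a sketch, not what Base 
does):

myhash(v) = reduce((h, x) -> hash(x, h), zero(Uint), v)

myhash(1:10^6) == myhash([1:10^6])    # true, but both sides take O(n) time

The missing piece is an O(1) closed form for that fold over an arbitrary 
arithmetic sequence.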


On Mon, Aug 25, 2014 at 11:47 AM, Jacob Quinn 
wrote:

> See this issue: https://github.com/JuliaLang/julia/issues/7867 and the
> discussion in https://github.com/JuliaLang/julia/issues/5778 for
> information on the change.
>
> -Jacob
>
>
> On Sat, Aug 23, 2014 at 9:02 AM, Ed Scheinerman <
> edward.scheiner...@gmail.com> wrote:
>
>> Is there a document describing new features and significant changes
>> between versions 0.2 and 0.3?
>>
>> One item I noticed is that in 0.2 the expression 1:5 == [1:5] evaluated as
>> true, but in 0.3 it's false.
>>
>
>


[julia-users] NLreg on windows

2014-08-25 Thread Mark Sale
Anyone have any success running NLreg on Windows 7, 64 bit? When I try

using NLreg,
I'm getting the error.
Error: FP not defined
  in include at boot.jl:245
  in include_from_node1 at loading.jl:128
while loading c:\users\Mark\.julia\v0.3\NLreg\src\NLreg.jl, in expression 
starting on line 1.

I've tried version 1.2, 2.0, 3.0 (and the development version, which looks 
like it is identical to 3.0).

Any suggestions?
thanks
Mark



Re: [julia-users] What's new in 0.3?

2014-08-25 Thread Ed Scheinerman
Given a choice between 1:10==[1:10] returning false or throwing an error,
I'd vote for "error".


On Mon, Aug 25, 2014 at 12:06 PM, Stefan Karpinski 
wrote:

> If you (or anyone) can come up with a clever scheme for hashing arrays and
> ranges so that 1:n and [1:n] hash the same but hash(1:n) isn't an O(n)
> operation, I'd be thrilled to switch this back. I could not figure out a
> good way to do this, however.
>
>
> On Mon, Aug 25, 2014 at 11:47 AM, Jacob Quinn 
> wrote:
>
>> See this issue: https://github.com/JuliaLang/julia/issues/7867 and the
>> discussion in https://github.com/JuliaLang/julia/issues/5778 for
>> information on the change.
>>
>> -Jacob
>>
>>
>> On Sat, Aug 23, 2014 at 9:02 AM, Ed Scheinerman <
>> edward.scheiner...@gmail.com> wrote:
>>
>>> Is there a document describing new features and significant changes
>>> between versions 0.2 and 0.3?
>>>
>>> One item I noticed is that in 0.2 the expression 1:5 == [1:5] evaluated as
>>> true, but in 0.3 it's false.
>>>
>>
>>
>


-- 
Ed Scheinerman (e...@scheinerman.net)


Re: [julia-users] Re: What's new in 0.3?

2014-08-25 Thread Ed Scheinerman
My bad. Didn't read this carefully enough to realize this comment applied
to my issue.


On Mon, Aug 25, 2014 at 11:47 AM, John Myles White  wrote:

> The NEWS.md file does cover this:
>
> • Ranges and arrays with the same elements are now unequal. This allows
> hashing and comparing ranges to be faster. (#5778)
>
> On Aug 25, 2014, at 8:45 AM, Ed Scheinerman 
> wrote:
>
> > Thanks again for the pointer to the release notes.
> >
> > The issue I raised was not dealt with in the release notes: namely, 1:5
> == [1:5] evaluates as true in Julia 0.2 but as false in Julia 0.3.
> >
> > I think the new behavior is a problem. I was happy with the old
> behavior, but if this is a bad idea for some reason, I would prefer that
> Julia raised an error in this situation rather than give a result that (in
> my opinion) is wrong. Certainly Julia compares objects of different type
> for equality (e.g. 0==0.0) so the fact that 1:5 and [1:5] are different
> types is not the issue.
> >
> >
> > On Saturday, August 23, 2014 9:06:50 AM UTC-4, Valentin Churavy wrote:
> > There is https://github.com/JuliaLang/julia/blob/v0.3.0/NEWS.md
> >
> > On Saturday, 23 August 2014 15:02:56 UTC+2, Ed Scheinerman wrote:
> > Is there a document describing new features and significant changes
> between versions 0.2 and 0.3?
> >
> > One item I noticed is that in 0.2 the expression 1:5 == [1:5] evaluated as
> true, but in 0.3 it's false.
>
>


-- 
Ed Scheinerman (e...@scheinerman.net)


Re: [julia-users] What's new in 0.3?

2014-08-25 Thread Stefan Karpinski
Any pair of objects can be checked for equality – that is never an error.


On Mon, Aug 25, 2014 at 12:22 PM, Ed Scheinerman <
edward.scheiner...@gmail.com> wrote:

> Given a choice between 1:10==[1:10] returning false or throwing an error,
> I'd vote for "error".
>
>
> On Mon, Aug 25, 2014 at 12:06 PM, Stefan Karpinski 
> wrote:
>
>> If you (or anyone) can come up with a clever scheme for hashing arrays
>> and ranges so that 1:n and [1:n] hash the same but hash(1:n) isn't an O(n)
>> operation, I'd be thrilled to switch this back. I could not figure out a
>> good way to do this, however.
>>
>>
>> On Mon, Aug 25, 2014 at 11:47 AM, Jacob Quinn 
>> wrote:
>>
>>> See this issue: https://github.com/JuliaLang/julia/issues/7867 and the
>>> discussion in https://github.com/JuliaLang/julia/issues/5778 for
>>> information on the change.
>>>
>>> -Jacob
>>>
>>>
>>> On Sat, Aug 23, 2014 at 9:02 AM, Ed Scheinerman <
>>> edward.scheiner...@gmail.com> wrote:
>>>
 Is there a document describing new features and significant changes
 between versions 0.2 and 0.3?

 One item I noticed is that in 0.2 the expression 1:5 == [1:5] evaluated
 as true, but in 0.3 it's false.

>>>
>>>
>>
>
>
> --
> Ed Scheinerman (e...@scheinerman.net)
>


[julia-users] Re: Announcing Julia 0.3.0 final

2014-08-25 Thread Westley Hennigh
Hey M Long,

I'm speaking for myself here, and it's entirely possible there are things I 
don't know, but it's really unlikely Julia Studio will receive any real 
support. There were two of us at Forio working on the project, but neither 
of us is there now and they've marked it as extremely low priority. I think 
it's possible they may release a bundle that comes with Julia 0.3, but it 
probably won't integrate any of the cool new REPL features or anything like 
that.

The project is open source, and I believe that if you build from master and 
download Julia 0.3 independently it will work, but I would really suggest 
just going with something like Sublime Text and using the built-in Julia 
REPL.

At some point I think it would be awesome to use the QtCreator (which Julia 
Studio is based on) GDB interface with whatever comes out of the Julia 
debugger work that's happening - and at that point it might be worth 
investing the time into integrating the Julia REPL into Julia Studio as 
well... but none of that is in the cards right now, we'll have to see what 
happens.

On Monday, August 25, 2014 11:52:09 AM UTC-4, M Long wrote:
>
> I am quite curious when Julia Studio may have 0.3 cooked in.
>
> I know this is not the Studio news group, but I am curious when that may 
> happen, as I am new and don't have a good feel for the history of when 
> these things happen.
>
> Thanks!
>


Re: [julia-users] Can this magic square function be further optimized?

2014-08-25 Thread Kevin Squire
Hi Phillip,

Others may respond with more specific answers, but have you had the chance
to read through the Julia performance tips in the Julia manual?

http://julia.readthedocs.org/en/latest/manual/performance-tips/

Cheers,
   Kevin

On Monday, August 25, 2014, Phillip Berndt 
wrote:

> Hi julia-users,
>
> I've recently stumbled over Julia and wanted to give it a try.
>
> To assess its speed, I've implemented another micro-benchmark, namely a
> version of Matlab's magic() function that generates magic squares. Since I
> have no experience writing optimal Julia code, I started off with literal
> translations of two different implementations - Matlab's and the one from
> magic_square.py from PyPy, which is an optimized version for NumPy. I then
> timed the calculation of all magic squares from N=3 to N=1000. The table
> from Julia's homepage suggests that in most cases, it is significantly
> faster than Python and Matlab. In my case, it's significantly slower, which
> is somehow disappointing ;) My question now is:
>
> Can the implementation be optimized to outperform the other two?
>
> *The times:*
>
> Julia, Matlab version: elapsed time: 18.495374216 seconds (13404087428
> bytes allocated, 12.54% gc time)
> Julia, Python version: elapsed time: 8.107275449 seconds (13532473792
> bytes allocated, 26.99% gc time)
> Matlab: Elapsed time is 4.994960 seconds.
> Python: 1 loops, best of 3: 2.09 s per loop
>
> My test machine is a 4 Core i7-4600 Notebook with 2.1 GHz and 8 GiB RAM,
> running a current Linux Mint and Julia 0.3 stable. To be fair, Python does
> not seem to gc during this loop (disabling gc doesn't alter the time here),
> so one should compare with 8.1 s * (1.-.2699) = 5.91 s for Julia. That's
> still much slower than Python. (By the way, even Octave only needs 4.46
> seconds.) If I translate the matrices in magic_python to account for
> column-major storage, the execution time does not significantly improve.
>
> *The code:*
>
> Matlab: tic; arrayfun(@magic, 3:1000, 'UniformOutput', false); toc
> IPython: import magic_square; %timeit [ magic_square.magic(x) for x in
> range(3, 1001) ];
> Julia: I've uploaded the code to a Gist at
> https://gist.github.com/phillipberndt/2db94bf5e0c16161dedc and will paste
> a copy below this post.
>
>
> Cheers,
> Phillip
>
>
> [code quoted from the original post snipped]


Re: [julia-users] Can this magic square function be further optimized?

2014-08-25 Thread Tim Holy
As usual, it depends on how much you value speed vs simplicity. You can go 
farther towards the simplicity direction by noting that, in the matlab-
inspired version,
broadcast(+, a, b)
is equivalent to
a .+ b
which is a much nicer syntax than is available in Matlab. 

The Python version looks more tuned for speed, but the fact that you're 
allocating so much memory is an indication there's much more you can do. The 
manual section that Kevin pointed out will help. Basically, you want to 
replace operations like
d[1:4:n, 1:4:n] = b - d[1:4:n, 1:4:n]
with something like
subtract_d_from_b!(d, 1:4, 1:4, b)
and write that function in devectorized form.
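
For instance, a devectorized version of that operation could look like this 
(a sketch; the helper name is just the placeholder used above):

function subtract_d_from_b!(d, rows, cols, b)
    for j in cols, i in rows
        d[i, j] = b - d[i, j]   # updates in place, no temporary array
    end
    d
end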

--Tim

On Monday, August 25, 2014 06:38:11 AM Phillip Berndt wrote:
> Hi julia-users,
> 
> I've recently stumbled over Julia and wanted to give it a try.
> 
> To assess its speed, I've implemented another micro-benchmark, namely a
> version of Matlab's magic() function that generates magic squares. Since I
> have no experience writing optimal Julia code, I started off with literal
> translations of two different implementations - Matlab's and the one from
> magic_square.py from PyPy, which is an optimized version for NumPy. I then
> timed the calculation of all magic squares from N=3 to N=1000. The table
> from Julia's homepage suggests that in most cases, it is significantly
> faster than Python and Matlab. In my case, it's significantly slower, which
> is somehow disappointing ;) My question now is:
> 
> Can the implementation be optimized to outperform the other two?
> 
> *The times:*
> 
> Julia, Matlab version: elapsed time: 18.495374216 seconds (13404087428
> bytes allocated, 12.54% gc time)
> Julia, Python version: elapsed time: 8.107275449 seconds (13532473792 bytes
> allocated, 26.99% gc time)
> Matlab: Elapsed time is 4.994960 seconds.
> Python: 1 loops, best of 3: 2.09 s per loop
> 
> My test machine is a 4 Core i7-4600 Notebook with 2.1 GHz and 8 GiB RAM,
> running a current Linux Mint and Julia 0.3 stable. To be fair, Python does
> not seem to gc during this loop (disabling gc doesn't alter the time here),
> so one should compare with 8.1 s * (1.-.2699) = 5.91 s for Julia. That's
> still much slower than Python. (By the way, even Octave only needs 4.46
> seconds.) If I translate the matrices in magic_python to account for
> column-major storage, the execution time does not significantly improve.
> 
> *The code:*
> 
> Matlab: tic; arrayfun(@magic, 3:1000, 'UniformOutput', false); toc
> IPython: import magic_square; %timeit [ magic_square.magic(x) for x in
> range(3, 1001) ];
> Julia: I've uploaded the code to a Gist at
> https://gist.github.com/phillipberndt/2db94bf5e0c16161dedc and will paste a
> copy below this post.
> 
> 
> Cheers,
> Phillip
> 
> 
> [code quoted from the original post snipped]

[julia-users] Re: NLreg on windows

2014-08-25 Thread Iain Dunning
I don't think NLreg is maintained, see automatic testing info here 


On Monday, August 25, 2014 12:20:31 PM UTC-4, Mark Sale wrote:
>
> Anyone have any success running NLreg on Windows 7, 64 bit? When I try
>
> using NLreg,
> I'm getting the error.
> Error:  FP not defined
>   in include at boot.j1:245
>   in include_from_node1 at loading.j1:128
> while loading c:\users\Mark\.julia\v0.3\NLreg\src\Nlreg.j1, in expression 
> starting on line 1.
>
> I've tried version 1.2, 2.0, 3.0 (and the development, which looks like 
> it it identical to 3.0).
>
> Any suggestions?
> thanks
> Mark
>
>

[julia-users] Re: Can this magic square function be further optimized?

2014-08-25 Thread Iain Dunning
Hah, I usually love making things faster, but that code is so impenetrable I 
think I'd rather implement it from scratch.

It's definitely un-Julian though, so I'm not surprised it's slower.

Profile reports these lines in the MATLAB version as being problematic:

94 ...unning/Desktop/magic.jl magic_matlab  6
31 ...unning/Desktop/magic.jl magic_matlab  10
19 ...unning/Desktop/magic.jl magic_matlab  11
13 ...unning/Desktop/magic.jl magic_matlab  27

On Monday, August 25, 2014 9:38:11 AM UTC-4, Phillip Berndt wrote:
>
> Hi julia-users,
>
> I've recently stumbled over Julia and wanted to give it a try. 
>
> To assess its speed, I've implemented another micro-benchmark, namely a 
> version of Matlab's magic() function that generates magic squares. Since I 
> have no experience writing optimal Julia code, I started off with literal 
> translations of two different implementations - Matlab's and the one from 
> magic_square.py from PyPy, which is an optimized version for NumPy. I then 
> timed the calculation of all magic squares from N=3 to N=1000. The table 
> from Julia's homepage suggests that in most cases, it is significantly 
> faster than Python and Matlab. In my case, it's significantly slower, which 
> is somehow disappointing ;) My question now is:
>
> Can the implementation be optimized to outperform the other two?
>
> *The times:*
>
> Julia, Matlab version: elapsed time: 18.495374216 seconds (13404087428 
> bytes allocated, 12.54% gc time)
> Julia, Python version: elapsed time: 8.107275449 seconds (13532473792 
> bytes allocated, 26.99% gc time)
> Matlab: Elapsed time is 4.994960 seconds.
> Python: 1 loops, best of 3: 2.09 s per loop
>
> My test machine is a 4 Core i7-4600 Notebook with 2.1 GHz and 8 GiB RAM, 
> running a current Linux Mint and Julia 0.3 stable. To be fair, Python does 
> not seem to gc during this loop (disabling gc doesn't alter the time here), 
> so one should compare with 8.1 s * (1.-.2699) = 5.91 s for Julia. That's 
> still much slower than Python. (By the way, even Octave only needs 4.46 
> seconds.) If I translate the matrices in magic_python to account for 
> column-major storage, the execution time does not significantly improve.
>
> *The code:*
>
> Matlab: tic; arrayfun(@magic, 3:1000, 'UniformOutput', false); toc
> IPython: import magic_square; %timeit [ magic_square.magic(x) for x in 
> range(3, 1001) ];
> Julia: I've uploaded the code to a Gist at 
> https://gist.github.com/phillipberndt/2db94bf5e0c16161dedc and will paste 
> a copy below this post.
>
>
> Cheers,
> Phillip
>
>
> [code quoted from the original post snipped]
>
>
>

[julia-users] Re: Can this magic square function be further optimized?

2014-08-25 Thread Iain Dunning
(using wikipedia page to implement my own now)

On Monday, August 25, 2014 1:08:48 PM UTC-4, Iain Dunning wrote:
>
> Hah, I usually love making things faster, but that code is so impenetrable I 
> think I'd rather implement it from scratch.
>
> It's definitely un-Julian though, so I'm not surprised it's slower.
>
> Profile reports these lines in the MATLAB version as being problematic:
>
> 94 ...unning/Desktop/magic.jl magic_matlab  6
> 31 ...unning/Desktop/magic.jl magic_matlab  10
> 19 ...unning/Desktop/magic.jl magic_matlab  11
> 13 ...unning/Desktop/magic.jl magic_matlab  27
>
> On Monday, August 25, 2014 9:38:11 AM UTC-4, Phillip Berndt wrote:
>>
>> Hi julia-users,
>>
>> I've recently stumbled over Julia and wanted to give it a try. 
>>
>> To assess its speed, I've implemented another micro-benchmark, namely a 
>> version of Matlab's magic() function that generates magic squares. Since I 
>> have no experience writing optimal Julia code, I started off with literal 
>> translations of two different implementations - Matlab's and the one from 
>> magic_square.py from PyPy, which is an optimized version for NumPy. I then 
>> timed the calculation of all magic squares from N=3 to N=1000. The table 
>> from Julia's homepage suggests that in most cases, it is significantly 
>> faster than Python and Matlab. In my case, it's significantly slower, which 
>> is somehow disappointing ;) My question now is:
>>
>> Can the implementation be optimized to outperform the other two?
>>
>> *The times:*
>>
>> Julia, Matlab version: elapsed time: 18.495374216 seconds (13404087428 
>> bytes allocated, 12.54% gc time)
>> Julia, Python version: elapsed time: 8.107275449 seconds (13532473792 
>> bytes allocated, 26.99% gc time)
>> Matlab: Elapsed time is 4.994960 seconds.
>> Python: 1 loops, best of 3: 2.09 s per loop
>>
>> My test machine is a 4 Core i7-4600 Notebook with 2.1 GHz and 8 GiB RAM, 
>> running a current Linux Mint and Julia 0.3 stable. To be fair, Python does 
>> not seem to gc during this loop (disabling gc doesn't alter the time here), 
>> so one should compare with 8.1 s * (1.-.2699) = 5.91 s for Julia. That's 
>> still much slower than Python. (By the way, even Octave only needs 4.46 
>> seconds.) If I translate the matrices in magic_python to account for 
>> column-major storage, the execution time does not significantly improve.
>>
>> *The code:*
>>
>> Matlab: tic; arrayfun(@magic, 3:1000, 'UniformOutput', false); toc
>> IPython: import magic_square; %timeit [ magic_square.magic(x) for x in 
>> range(3, 1001) ];
>> Julia: I've uploaded the code to a Gist at 
>> https://gist.github.com/phillipberndt/2db94bf5e0c16161dedc and will 
>> paste a copy below this post.
>>
>>
>> Cheers,
>> Phillip
>>
>>
>> function magic_matlab(n::Int64)
>> # Works exactly as Matlab's magic.m
>>
>> if n % 2 == 1
>> p = (1:n)
>> M = n * mod(broadcast(+, p', p - div(n+3, 2)), n) + 
>> mod(broadcast(+, p', 2p - 2), n) + 1
>> return M
>> elseif n % 4 == 0
>> J = div([1:n] % 4, 2)
>> K = J' .== J
>> M = broadcast(+, [1:n:(n*n)]', [0:n-1])
>> M[K] = n^2 + 1 - M[K]
>> return M
>> else
>> p = div(n, 2)
>> M = magic_matlab(p)
>> M = [M M+2p^2; M+3p^2 M+p^2]
>> if n == 2
>> return M
>> end
>> i = (1:p)
>> k = (n-2)/4
>> j = convert(Array{Int}, [(1:k); ((n-k+2):n)])
>> M[[i; i+p],j] = M[[i+p; i],j]
>> i = k+1
>> j = [1; i]
>> M[[i; i+p],j] = M[[i+p; i],j]
>> return M
>> end
>> end
>> @vectorize_1arg Int magic_matlab
>>
>> function magic_python(n::Int64)
>> # Works exactly as magic_square.py (from pypy)
>>
>> if n % 2 == 1
>> m = (n >> 1) + 1
>> b = n^2 + 1
>>
>> M = reshape(repmat(1:n:b-n, 1, n+2)[m:end-m], n+1, n)[2:end, :] +
>> reshape(repmat(0:(n-1), 1, n+2), n+2, n)[2:end-1, :]'
>> return M
>> elseif n % 4 == 0
>> b = n^2 + 1
>> d = reshape(1:b-1, n, n)
>>
>> d[1:4:n, 1:4:n] = b - d[1:4:n, 1:4:n]
>> d[1:4:n, 4:4:n] = b - d[1:4:n, 4:4:n]
>> d[4:4:n, 1:4:n] = b - d[4:4:n, 1:4:n]
>> d[4:4:n, 4:4:n] = b - d[4:4:n, 4:4:n]
>> d[2:4:n, 2:4:n] = b - d[2:4:n, 2:4:n]
>> d[2:4:n, 3:4:n] = b - d[2:4:n, 3:4:n]
>> d[3:4:n, 2:4:n] = b - d[3:4:n, 2:4:n]
>> d[3:4:n, 3:4:n] = b - d[3:4:n, 3:4:n]
>>
>> return d
>> else
>> m = n >> 1
>> k = m >> 1
>> b = m^2
>>
>> d = repmat(magic_python(m), 2, 2)
>>
>> d[1:m, 1:k] += 3*b
>> d[1+m:end, 1+k:m] += 3*b
>> d[1+k, 1+k] += 3*b
>> d[1+k, 1] -= 3*b
>> d[1+m+k, 1] += 3*b
>> d[1+m+k, 1+k] -= 3*b
>> d[1:m,1+m:n-k+1] += b+b
>> d[1+m:end, 1+m:n-k+1] += b
>> d[1:m, 1+n-k+1:end] += b
>> d[1+m:end, 1+n-k+1:end] += b+b
>>
>> return d
>>     end
>> end
>> @vectorize_1arg Int magic_python
>>
>> print("Matlab version: ")
>> @time magic_matlab(3:1000)
>>
>> print("Python version: ")
>> @time magic_python(3:1000)

[julia-users] Question about returning an Array from a function

2014-08-25 Thread Roy Wang
I have some ideas from my experience in C++11, but I'd like to learn the 
proper "Julian" way :)

My goal is to implement a function that computes an Array. I do this if I 
want speed:

function computestuff!(A::Array{FloatingPoint,1})
    A = fill!(A, 0) # reset array values
    # modify the contents of A
end


#in the calling routine...
A::Array{FloatingPoint,1}=Array(FloatingPoint,5);
computestuff!(A);

Question 1: Is there an even faster way in Julia?


Question 2: I wish to hide the details inside that function (i.e. allocate 
the size of A inside of the function) without sacrificing speed. This way, 
in the calling routine, I can just write

#in the calling routine...
A=computestuff(5);


Is this possible?
I think any new memory allocated inside that function will be 
undefined/freed once the function exits. If the pointer A is assigned to 
this memory, I'd get undefined results. Julia probably checks against this 
kind of situation and assigns a deep copy instead, which slows things down.

Thanks!




[julia-users] ANN: major upgrade to HDF5/JLD

2014-08-25 Thread Tim Holy
I'd like to announce the merger of a massive revamping of HDF5 & JLD. The 
bottom line is that now many immutables and types will be saved in a far more 
efficient fashion to disk, particularly for arrays of immutables. In such 
cases, 
the performance gains are quite extraordinary, often hundred-fold or larger in 
terms of time, and the resulting files are much smaller. For many situations, 
HDF5 is now comparable to the serializer (and sometimes faster) for reading 
from and writing to disk.

In addition to the performance enhancement, the whole system in JLD for saving 
and loading Julia types has been given a facelift. Rather than barfing on 
broken or missing types, there's a sophisticated method to reconstruct types 
based on data in the disk file. You might appreciate this if you save data, 
change your type definitions in your code, and then want the data back again. 
While these types won't allow you to proceed entirely as if nothing is wrong, 
it does give you an easier path to recovery.

Finally, all this has been done with an effort to preserve backwards 
compatibility. All tests pass with the new format (and many new tests have 
been added), and the JLDArchives tests pass, suggesting that at least for the 
files checked into JLDArchives, there's been no hit to the ability to read old 
files.

Before tagging a new version, we're hoping that a few hardy souls will 
experiment by checking out master for the HDF5 package. Testing is needed 
because we're talking about a long-term data storage format here. Once this 
becomes "official" (by tagging a new version), I'll deposit a copy of a test 
file 
in JLDArchives, so that format becomes something we have to support going 
forward. It would be much nicer to uncover any problems quickly so we don't 
have to worry about ugly kludges :-).
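
For anyone willing to try it, a minimal sketch (the Point type and file name 
are placeholders; JLD currently ships inside the HDF5 package):

Pkg.checkout("HDF5")   # switch to master instead of the last tagged version

using HDF5, JLD

immutable Point
    x::Float64
    y::Float64
end

pts = [Point(rand(), rand()) for i = 1:10^5]
@save "points.jld" pts   # arrays of immutables are now stored compactly
@load "points.jld"       # rebinds pts from the file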

My role in this has exclusively been that of cheerleader: the work was done by 
Simon Kornblith and incorporated some great preliminary work by Matt Baumann. 
This is a 30-commit, 2,000-line change, so it's a huge effort and very beautifully 
done. My sincere thanks go to them!

--Tim



[julia-users] Re: Can this magic square function be further optimized?

2014-08-25 Thread Iain Dunning
For a magic square of size , my times are

my version: 1.9 seconds
MATLAB-style: 9.2 seconds
Python-style: 5.1 seconds

function iain_magic(n::Int)
    if n % 2 == 1
        # Odd-order magic square
        # http://en.wikipedia.org/wiki/Magic_square#Method_for_constructing_a_magic_square_of_odd_order
        M = zeros(Int, n, n)
        for I = 1:n      # row
            for J = 1:n  # column
                @inbounds M[I,J] = n*((I+J-1+div(n,2))%n) + ((I+2J-2)%n) + 1
            end
        end
        return M
    end
end
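
A quick sanity check, sketched (the magic constant for order n is 
n*(n^2+1)/2, so 65 for n = 5):

M = iain_magic(5)
magic_const = div(5 * (5^2 + 1), 2)     # 65
@assert all(sum(M, 1) .== magic_const)  # column sums
@assert all(sum(M, 2) .== magic_const)  # row sums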

On Monday, August 25, 2014 1:11:39 PM UTC-4, Iain Dunning wrote:
>
> (using wikipedia page to implement my own now)
>
> [...]

Re: [julia-users] Question about returning an Array from a function

2014-08-25 Thread John Myles White
Yes, this is possible. A common Julian pattern is:

function foo!(dest::Array, src::Array)
mutate!(dest, src)
end

function foo(src::Array)
dest = copy(src)
foo!(dest, src)
return dest
end
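
Applied to the question's computestuff, a sketch might look like this (the 
loop body is a placeholder, and Float64 is used instead of the abstract 
FloatingPoint):

# In-place version: the caller owns the storage; nothing is allocated here.
function computestuff!(A::Vector{Float64})
    fill!(A, 0.0)          # reset array values
    for i = 1:length(A)
        A[i] = sqrt(i)     # placeholder computation
    end
    return A
end

# Allocating convenience wrapper: hides the allocation from the caller.
computestuff(n::Int) = computestuff!(Array(Float64, n))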

Some other points to note:

Array{FloatingPoint} isn't related to Array{Float64}. Julia's type system 
always employs invariance for parametric types: 
https://en.wikipedia.org/wiki/Covariance_and_contravariance_(computer_science)

The numeric parameter of an Array type is its tensor order, not its number of 
elements. So it's rare you'd work with Array{FP, 5}.

 -- John

On Aug 25, 2014, at 10:16 AM, Roy Wang wrote:

> I have some ideas from my experience in C++11, but I'd like to learn the 
> proper "Julian" way :)
> 
> [...]



[julia-users] Re: Announcing Julia 0.3.0 final

2014-08-25 Thread gentlebeldin
I installed Julia via the Ubuntu Software Center months ago, so I have v0.2.1. 
To switch to v0.3, do I have to remove that installation and install the new 
version with the (Linux) package manager, or is there a gentler way? Probably 
there isn't. And what will happen to IJulia etc.? I guess I'll have to 
re-install that, too, with the Julia package manager, is that correct? 

On Thursday, 21 August 2014 01:46:21 UTC+2, Elliot Saba wrote:
>
> We are pleased to announce the immediate release of Julia 0.3.0.  This 
> release contains numerous improvements across the board from standard 
> library changes to pure performance enhancements as well as an expanded 
> ecosystem of packages as compared to the 0.2 releases. A summary of changes 
> is available in NEWS.md in our main repository, and binaries are now 
> available on our main download page.
>
> We are now transitioning into the 0.4 development cycle, and encourage 
> users to use the 0.3.X line if they need a stable julia environment.  Many 
> breaking changes will be entering the environment over the course of the 
> next few months, and to denote this, builds will use the versioning 
> scheme 0.4.0-dev. Once the major breaking changes have been merged and the 
> development cycle progresses towards a stable release, the version will 
> shift to 0.4.0-pre, at which point package authors and users should start 
> to think about transitioning their codebases over to the 0.4.X line.
>
> The release-0.3 branch of the codebase will remain open for bugfixes 
> during this time, and we encourage users facing problems to open issues on 
> our GitHub tracker, or email the julia-users mailing list.
>
> Happy coding.
>


[julia-users] Trouble deducing type parameter with inner constructor

2014-08-25 Thread Magnus Lie Hetland
What is the right approach if I want (1) to use an inner constructor, which 
does some modification/normalization to the arguments, and (2) I want 
proper type parameter deduction from the arguments? Do I need to write a 
separate function or something?

I'm not even sure what's going on here (i.e., why it doesn't work with the 
inner constructor) – or how I could make it work with the outer constructor…

immutable A{N}
    x::NTuple{N, Int}
    y::Int

    A(x, y) = new(x, y + 1)
end

#a = A((1, 2), 3)   # Doesn't work
a = A{2}((1, 2), 3) # Works (w/cumbersome param)
println(a)

abstract C

immutable B{N}
    x::NTuple{N, Int}
    y::Int
end

#B(x, y) = B(x, y + 1)  # Stack overflow...
b = B((1, 2), 3)    # Works (w/wrong answer)
println(b)

What am I doing wrong?
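
One standard resolution, sketched here (not spelled out in the thread): keep 
the inner constructor, and add an outer constructor that deduces N from the 
tuple argument. This also avoids the self-recursion in the commented-out 
B(x, y) definition above.

immutable A{N}
    x::NTuple{N, Int}
    y::Int
    A(x, y) = new(x, y + 1)   # inner constructor normalizes the arguments
end

# Outer constructor: deduces N from the tuple type, then forwards to A{N}.
A{N}(x::NTuple{N, Int}, y::Int) = A{N}(x, y)

a = A((1, 2), 3)   # N == 2 is now inferred; a.y == 4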


[julia-users] Re: Can this magic square function be further optimized?

2014-08-25 Thread Iain Dunning
Updated gist for the doubly-even order case
https://gist.github.com/IainNZ/9b5f1eb1bcf923ed02d9

For a magic square of size 1:
Mine: 0.47
Matlab-style: 1.7
Python-style: 1.0

So, probably faster than MATLAB-in-MATLAB at this point, maybe same as PyPy?

On Monday, August 25, 2014 9:38:11 AM UTC-4, Phillip Berndt wrote:
>
> Hi julia-users,
>
> [...]

Re: [julia-users] Re: Computing colors of molecules with Julia

2014-08-25 Thread Steven G. Johnson
This is now implemented in Color.jl; not tagged yet, but you can of course 
do Pkg.checkout("Color")

Fun thing to try:

using Interact, Color
@manipulate for m = 1:50, n = 1:100
RGB[RGB(i/m,j/n,0) for i=1:m, j=1:n]
end
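
A non-Interact variant, sketched, relying on the new writemime display of 
color vectors in IJulia:

using Color
# A ten-swatch gradient from blue-ish to red-ish:
[RGB(i/10, 0.2, 1 - i/10) for i = 1:10]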

On Monday, June 9, 2014 2:07:22 PM UTC-4, Stefan Karpinski wrote:
>
> That does seem like a rather nice solution. Makes sense for matrices too – 
> displaying a color matrix as a 2D color swatch would be handy.
>
>
>> On Mon, Jun 9, 2014 at 1:54 PM, Steven G. Johnson wrote:
>>
>>> Rather than defining a ColorVector type to display color vectors as 
>>> rainbow swatches, it might be nice to update the writemime function for 
>>> AbstractVector{<:ColorValue} in Color.jl so that it displays long vectors 
>>> more nicely. That is, shrink the width of the swatch size further for 
>>> long vectors, e.g. in order to fix the overall width.
>>
>
>

Re: [julia-users] Re: Can this magic square function be further optimized?

2014-08-25 Thread Stefan Karpinski
This is a classic case of the need for vectorization in Python and Matlab
making the algorithm completely incomprehensible. You're often much better
off porting to Julia from simple for-loop based C or Java codes instead.


On Mon, Aug 25, 2014 at 2:05 PM, Iain Dunning wrote:

> Updated gist for the doubly-even order case
> https://gist.github.com/IainNZ/9b5f1eb1bcf923ed02d9
>
> For a magic square of size 1:
> Mine: 0.47
> Matlab-style: 1.7
> Python-style: 1.0
>
> So, probably faster than MATLAB-in-MATLAB at this point, maybe same as
> PyPy?
>
>
> On Monday, August 25, 2014 9:38:11 AM UTC-4, Phillip Berndt wrote:
>
>> Hi julia-users,
>>
>> [...]

Re: [julia-users] ANN: major upgrade to HDF5/JLD

2014-08-25 Thread Stefan Karpinski
That sounds amazing.


On Mon, Aug 25, 2014 at 1:18 PM, Tim Holy wrote:

> I'd like to announce the merger of a massive revamping of HDF5 & JLD.
>
> [...]


Re: [julia-users] Question about returning an Array from a function

2014-08-25 Thread Patrick O'Leary
On Monday, August 25, 2014 12:28:00 PM UTC-5, John Myles White wrote:
>
> Array{FloatingPoint} isn't related to Array{Float64}. Julia's type system 
> always employs invariance for parametric types: 
> https://en.wikipedia.org/wiki/Covariance_and_contravariance_(computer_science)
>  
> 
>

To underline this point a bit, it's even a bit worse than that: 
Array{FloatingPoint} will work just fine for a lot of things, but it stores 
all elements as heap pointers, so array-like operations (such as linear 
algebra routines) will often be extremely slow.

As a rule, you almost never use an abstract type as the type parameter of a 
parametric type for this reason. Where you wish to be generic over a 
specific family of types under an abstract type, you can use type 
constraints:

function foo{T<:FloatingPoint}(src::Array{T,1})
 ...
end

But often type annotations can be omitted completely.
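
A quick way to see the cost, sketched (illustrative, not measured for this 
thread):

a = rand(10^6)                           # Array{Float64,1}: unboxed storage
b = convert(Array{FloatingPoint,1}, a)   # same values, but boxed elements

@time sum(a)   # fast path over contiguous Float64 values
@time sum(b)   # much slower: each element dereferences a heap pointer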


Re: [julia-users] Question about returning an Array from a function

2014-08-25 Thread Roy Wang

Thanks Patrick. So to clarify: *FloatingPoint* is not a concrete type, so 
explicitly defining variables or function inputs using it will not speed 
things up. Instead, I should use *Float64*, *Float32*, etc. 

Is *Int* an abstract type as well? I'm wondering if I should go back and 
rename everything *my_var::Int* to *my_var::Int32*.


On Monday, 25 August 2014 14:54:14 UTC-4, Patrick O'Leary wrote:
>
> [...]


Re: [julia-users] Question about returning an Array from a function

2014-08-25 Thread Roy Wang

Thanks guys. So to clarify: FloatingPoint is not a concrete type, so 
explicitly defining variables or function inputs using it will not speed 
things up. Instead, I should use Float64, Float32, etc.

Is Int an abstract type as well? I'm wondering if I should go back and 
rename everything my_var::Int to my_var::Int32.

John: I couldn't find the mutate!() function in the Julia Standard Library 
v0.3. Do you mean my own function that mutates the source array?

On Monday, 25 August 2014 14:54:14 UTC-4, Patrick O'Leary wrote:
>
> [...]


Re: [julia-users] Question about returning an Array from a function

2014-08-25 Thread Tomas Lycken


Actually, Int (and UInt) are aliases to the “native size integer”, so if 
you specify Int you will get Int32 on a 32-bit system and Int64 on a 64-bit 
system. So no, don’t change my_var::Int to my_var::Int32 - that’ll make 
your code *worse* on 64-bit systems ;)

// T

On Monday, August 25, 2014 9:05:06 PM UTC+2, Roy Wang wrote:
>
> [...]


Re: [julia-users] ANN: major upgrade to HDF5/JLD

2014-08-25 Thread Ross Boylan
Is the new work for Julia 0.3, 0.4, or both?
Ross Boylan


Re: [julia-users] Re: Announcing Julia 0.3.0 final

2014-08-25 Thread Elliot Saba
If you add the official Ubuntu releases PPA, it will automatically pick up
Julia 0.3 as a normal software update.  Your packages will likely need to
be reinstalled since major versions of julia separate their packages from
one another.
-E
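
For reference, a sketch of the commands involved (the PPA name is an 
assumption based on the official instructions at the time):

sudo add-apt-repository ppa:staticfloat/juliareleases
sudo apt-get update
sudo apt-get install julia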


Re: [julia-users] Question about returning an Array from a function

2014-08-25 Thread Roy Wang
Thanks Tom. Phew, that's what I suspected. 

I glanced at boot.jl, and it doesn't seem that Julia has a typealias for 
doubles. I'll define my own to check for 32- vs. 64-bit systems.

On Monday, 25 August 2014 15:10:30 UTC-4, Tomas Lycken wrote:
>
> [...]


Re: [julia-users] Question about returning an Array from a function

2014-08-25 Thread Patrick O'Leary
Even 32-bit systems have double-precision floating point units.

In general, explicitly defining variables or function inputs will not speed 
things up.

On Monday, August 25, 2014 2:38:44 PM UTC-5, Roy Wang wrote:
>
> [...]

Re: [julia-users] Question about returning an Array from a function

2014-08-25 Thread Tobias Knopp
That's for a reason. Float64 and Float32 behave the same on 64- and 32-bit 
computers. It's only for the integer types that this matters.

On Monday, 25 August 2014 21:38:44 UTC+2, Roy Wang wrote:
>
> [...]

Re: [julia-users] Question about returning an Array from a function

2014-08-25 Thread Roy Wang
I didn't know Float64 and Float32 are the same on 32-bit systems. Thanks.

On Monday, 25 August 2014 15:48:30 UTC-4, Tobias Knopp wrote:
>
> [...]

Re: [julia-users] Question about returning an Array from a function

2014-08-25 Thread Tobias Knopp
Sorry, I was not clear enough. They are not the same, but there is no reason 
to use Float32 on a 32-bit system and Float64 on a 64-bit system. On both 
32-bit and 64-bit CPUs you will usually have 80-bit floating point units, so 
both will be equally fast (if we ignore caching for a moment). In contrast, 
integers, and in particular array indices, will be slower if one uses Int64 
on a 32-bit system. Therefore we have Int as a shortcut for the "native" 
integer type.
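
A quick check of that alias, sketched (output shown for a 64-bit build; a 
32-bit build prints false / 4 / 32):

println(Int === Int64)   # true
println(sizeof(Int))     # 8
println(WORD_SIZE)       # 64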

Cheers,

Tobi

On Monday, 25 August 2014 21:57:40 UTC+2, Roy Wang wrote:
>
> [...]

Re: [julia-users] ANN: major upgrade to HDF5/JLD

2014-08-25 Thread Tim Holy
On Monday, August 25, 2014 12:13:53 PM Ross Boylan wrote:
> Is the new work for Julia 0.3, 0.4, or both?

Both



Re: [julia-users] Re: Announcing Julia 0.3.0 final

2014-08-25 Thread gentlebeldin
Works like a charm, thanks!

On Monday, 25 August 2014 21:24:44 UTC+2, Elliot Saba wrote:
>
> If you add the official Ubuntu releases PPA, it will automatically pick up 
> Julia 0.3 as a normal software update.  Your packages will likely need to 
> be reinstalled since major versions of julia separate their packages from 
> one another.
> -E
>


[julia-users] "Correct" (i.e. optimal) way to initialize package in Travis script

2014-08-25 Thread Tomas Lycken
Now that 0.3 is released, I want to update my Travis scripts and get rid of 
those ugly conditionals on `JULIAVERSION` in favor of just running package 
tests with `Pkg.test()`, since it also handles dependency resolution etc.

However, I've seen a multitude of ways to install the package under test 
correctly on Travis, and I can't figure out what way is "the best", or 
recommended, way to do it. I've seen the following "in the wild":

* Manually installing dependencies, `ls`-ing and pinning:

```
- julia -e 'Pkg.init(); Pkg.add("ImmutableArrays")'
- julia -e 'run(`ln -s $(pwd()) $(Pkg.dir("Contour"))`); 
Pkg.pin("Contour"); Pkg.resolve();' 
```

followed by `julia test/runtest.jl`, possibly with `--coverage`.

* Cloning pwd

```
- julia -e 'Pkg.clone(pwd()); Pkg.test("Contour")'
```

possibly with a `coverage=true` kwarg to `Pkg.test()`. This seems to be 
what the file generated by `Pkg.generate()` does at the moment, too. There 
are certainly other possibilities out there as well.

I definitely prefer the latter version - if nothing else because it's 
shorter - but I can't say I know how Travis works well enough to say for 
sure if it has drawbacks that I don't understand. Is there a "recommended 
best practice" on how to do this? Is the file generated by `Pkg` updated to 
reflect this best practice?

// T


Re: [julia-users] "Correct" (i.e. optimal) way to initialize package in Travis script

2014-08-25 Thread Elliot Saba
I think the simplest and easiest way of doing this is to init(), clone(),
and test():

julia -e 'Pkg.init(); Pkg.clone(pwd()); Pkg.test("")'

If you want to include code coverage and whatnot, you'll need to put your
Pkg.test() call on a separate line and pass --code-coverage to the julia
executable.
-E
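
A sketch of how that could look in the .travis.yml script section (the 
package name "MyPkg" is a placeholder, and the coverage flag follows the 
advice above):

script:
  - julia -e 'Pkg.init(); Pkg.clone(pwd())'
  - julia --code-coverage test/runtests.jl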


Re: [julia-users] Re: How to import module from another file in same directory?

2014-08-25 Thread Andrei
Tobias, thanks a lot. From first link I figured out that if module name
matches file name, then I can actually import it directly. That is:

# U.jl
module U
f() = 42
end

# A.jl
module A
import U
U.f()
end


Which looks pretty obvious, so it's strange that I haven't found it earlier.




On Mon, Aug 25, 2014 at 5:36 PM, Tobias Knopp wrote:

> There is https://github.com/JuliaLang/julia/issues/4600 but there was
> recently quite some dicussion on the julia-dev mailing list as well as
> https://github.com/JuliaLang/julia/issues/8014
>
> Cheers,
>
> Tobi
>
> Am Montag, 25. August 2014 16:29:03 UTC+2 schrieb Andrei Zh:
>
>> Valentin, thanks for your answer, but it seems like I need to give you
>> some more context (sorry for not mentioning it earlier). I'm trying to
>> repeat my experience of interactive development in languages like Python or
>> Lisp. In these languages I can load some module/file contents to REPL
>> ("__main__" module in Python, "user" namespace in Clojure, etc.) and play
>> around with the code just like if I was "inside" of module under
>> development. E.g. I can modify some function, send new definition to REPL
>> and immediately try it out. I can also import any other
>> modules/packages/namespaces. In Python, for example, being in __main__
>> (with loaded definitions from target module) I can refer to any other
>> module on PYTHONPATH by its full name. Same thing with Clojure - any
>> namespace on CLASSPATH is available for loading.
>>
>> In Julia there's Main module too. I can load some code and play around
>> with it, just like in REPLs of other lanuages. E.g. I can start editor,
>> open some file "linreg.jl", send all its contents to REPL, see how it
>> works, update, reload, etc. Works like a charm... until I try to import
>> another module.
>>
>> Unlike Python or Clojure, Julia's module system is decoupled from source
>> files and directory structure. Correct me if I'm wrong, but it seems like
>> there's no way to load module other than include() its source file. At the
>> same time, I cannot include files all here and there. E.g. in example above
>> when I work on module A (from REPL/Main) I cannot include "P.jl", because
>> "P.jl" contains recursive include() of "a.jl", and they just re-include
>> each other endlessly.
>>
>> So the only way we can make it work is to load module system from the top
>> level ("P.jl") and then refer to other modules with respect to it (e.g.
>> like "using .A" or "import ..U"). It works fine with third party packages,
>> but I find it really frustrating when working on some internal module (e.g.
>> A).
>>
>> Thus any tips and tricks on loading modules when working from REPL/Main
>> are welcome.
>>
>>
>>
>> On Sunday, August 24, 2014 5:38:53 PM UTC+3, Valentin Churavy wrote:
>>>
>>> What you are looking for is described in 
>>> http://julia.readthedocs.org/en/latest/manual/modules/#relative-and-absolute-module-paths
>>>
>>> in P.jl you include all your submodules
>>> module P
>>>  include("u.jl")
>>>  include("a.jl")
>>>  include("b.jl")
>>>
>>>  using .A, .B
>>>
>>>  export f, g
>>> end
>>>
>>> # u.jl
>>> module U
>>>  g() = 5
>>>  f() = 6
>>> end
>>>
>>> a.jl and b.jl both look like this:
>>>
>>> module A
>>>  import ..U
>>>
>>>  f = U.f
>>>  g = U.g
>>>
>>>  export f, g
>>> end
>>>
>>> so one dot as a prefix looks in the namespace of the current module and
>>> two dots as prefix looks in the namespace of the parent module.
>>>
>>> Hope that helps
>>>
>>> On Sunday, 24 August 2014 14:10:58 UTC+2, Andrei Zh wrote:

 Let's say I have following project layout:

 P.jl that contains module P -- main package module, exposes code from
 a.jl and b.jl
 a.jl that contains module A and
 b.jl that contains module B -- some domain specific modules
 u.jl that contains module U -- util functions

 Now I want to use functions in U from modules A and B. In simplest case
 I would just include("u.jl") inside of a.jl and b.jl, but this way
 functions from U will be defined in both - A and B. So I really want to
 import U, not include u.jl, but I can't do this since u.jl is not on the
 LOAD_PATH (and messing with it manually looks somewhat bad to me).

 Is there some standard way to tackle it?

 (Note, that A, B and U are here just for code splitting, other ways to
 do same stuff are ok too.)

>>>


Re: [julia-users] Question about returning an Array from a function

2014-08-25 Thread John Myles White
Yes, I meant for mutate! to be your mutating implementation of the function in 
question.

 -- John

On Aug 25, 2014, at 12:05 PM, Roy Wang wrote:

> [...]



Re: [julia-users] Re: Does Julia have something similar to Python's documentation string?

2014-08-25 Thread Job van der Zwan
On Monday, 25 August 2014 01:23:26 UTC+2, Jason Knight wrote:

> Happy reading: https://github.com/JuliaLang/julia/issues/3988 :)
>

Thanks, that was indeed interesting :)

On Monday, 25 August 2014 01:43:11 UTC+2, Stefan Karpinski wrote:
>
> I really like godoc – that's basically what I want plus a convention that 
> the doc strings are markdown.
>

From what I understand of the discussion linked above, the suggested 
approach is a @doc macro followed by a string, making documentation part of 
compiling the code, correct? The godoc approach is different in two ways: 
documentation is not part of the runtime but a separate tool that parses Go 
source files, and it extracts documentation from the *comments*, based on 
where they are placed.

The former part of the difference is just a consequence of how Go and Julia 
are used differently, so probably not that relevant, but Go's approach of 
using comments to indicate documentation sounds more sensible to me - 
documentation is what comments are for, are they not? Then why not suggest 
an idiomatic way to use the comments, and make a tool/the Julia runtime 
capable of extracting documentation information from that structure?

Mind you, I don't use Python so perhaps this is also a personal matter of 
not being used to docstrings.


Re: [julia-users] Re: Does Julia have something similar to Python's documentation string?

2014-08-25 Thread John Myles White
The issue is that you want to have all code documentation show up in the REPL. 
In the GoDoc approach, this might require an explicit "build" step -- which is 
a non-trivial cost in usability.

 -- John

On Aug 25, 2014, at 3:01 PM, Job van der Zwan wrote:

> [...]



Re: [julia-users] Multivariate Normal versus Multivariate Normal Canon in Distributions package

2014-08-25 Thread John Myles White
This looks like a failure to find functions from NumericFuns.

What versions of Julia and stats packages are you using?

 -- John

On Aug 25, 2014, at 9:03 AM, asim wrote:

> [...]



Re: [julia-users] Question about returning an Array from a function

2014-08-25 Thread Roy Wang
Okay guys, thanks!

On Monday, 25 August 2014 17:59:39 UTC-4, John Myles White wrote:
>
> Yes, I meant for mutate! to be your mutating implementation of the 
> function in question.
>
>  -- John
>
> On Aug 25, 2014, at 12:05 PM, Roy Wang wrote:
>
>
> Thanks guys. So to clarify: FloatingPoint is not a concrete type, so 
> explicitly defining variables or function inputs using it will not speed 
> things up. Instead, I should use Float64, Float32, etc.
>
> Is Int an abstract type as well? I'm wondering if I should go back and 
> rename everything my_var::Int to my_var::Int32.
>
> John: I couldn't find the mutate!() function in the Julia Standard Library 
> v0.3. Do you mean my own function that mutates the source array?
>
> On Monday, 25 August 2014 14:54:14 UTC-4, Patrick O'Leary wrote:
>>
>> On Monday, August 25, 2014 12:28:00 PM UTC-5, John Myles White wrote:
>>>
>>> Array{FloatingPoint} isn't related to Array{Float64}. Julia's type 
>>> system always employs invariance for parametric types: 
>>> https://en.wikipedia.org/wiki/Covariance_and_contravariance_(computer_science)
>>>  
>>> 
>>>
>>
>> To underline this point a bit, it's even a bit worse than that: 
>> Array{FloatingPoint} will work just fine for a lot of things, but it stores 
>> all elements as heap pointers, so array-like operations (such as linear 
>> algebra routines) will often be extremely slow.
>>
>> As a rule, you almost never use an abstract type as the type parameter of 
>> a parametric type for this reason. Where you wish to be generic over a 
>> specific family of types under an abstract type, you can use type 
>> constraints:
>>
>> function foo{T<:FloatingPoint}(src::Array{T,1})
>>  ...
>> end
>>
>> But often type annotations can be omitted completely.
>>
>
>
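
A minimal sketch of the invariance point above (Julia 0.3-era syntax; `mysum` 
is just an illustration):

println(Float64 <: FloatingPoint)               # true
println(Array{Float64} <: Array{FloatingPoint}) # false: type parameters are invariant

# An Array{FloatingPoint} stores boxed heap pointers, so numeric loops over
# it are slow. A type-constrained method stays generic and fast:
function mysum{T<:FloatingPoint}(src::Array{T,1})
    s = zero(T)
    for x in src
        s += x
    end
    s
end

mysum(randn(10))   # dispatches with T = Float64, elements stay unboxed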

Re: [julia-users] Re: Computing colors of molecules with Julia

2014-08-25 Thread Yakir Gagnon
wow, that Interact package is interesting... I guess I'll have to start
using IJulia then. I'm still stuck with a Vim session and a Julia terminal.
I tried the checkout version of Color, and it's the same (see attached),
i.e. wrong: the blues should be close to the 400 mark and the reds closer
to the 700. the UV purple and IR "black" should be closer to the ends than
what we see. Any idea what's going wrong?


Yakir Gagnon
The Queensland Brain Institute (Building #79)
The University of Queensland
Brisbane QLD 4072
Australia

cell +61 (0)424 393 332
work +61 (0)733 654 089


On Tue, Aug 26, 2014 at 4:21 AM, Steven G. Johnson 
wrote:

> This is now implemented in Color.jl; not tagged yet, but you can of course
> do Pkg.checkout("Color")
>
> Fun thing to try:
>
> using Interact, Color
> @manipulate for m = 1:50, n = 1:100
>     RGB[RGB(i/m,j/n,0) for i=1:m, j=1:n]
> end
>
> On Monday, June 9, 2014 2:07:22 PM UTC-4, Stefan Karpinski wrote:
>>
>> That does seem like a rather nice solution. Makes sense for matrices too
>> – displaying a color matrix as a 2D color swatch would be handy.
>>
>>
>> On Mon, Jun 9, 2014 at 1:54 PM, Steven G. Johnson 
>> wrote:
>>
>>> Rather than defining a ColorVector type to display color vectors as
>>> rainbow swatches, it might be nice to update the writemime function for
>>> AbstractVector{<:ColorValue} in Color.jl
>>>  
>>> so
>>> that it displays long vectors more nicely.  That is, shrink the width of
>>> the swatch size further for long vectors, e.g. in order to fix the overall
>>> width.
>>>
>>
>>


Re: [julia-users] "Correct" (i.e. optimal) way to initialize package in Travis script

2014-08-25 Thread Tony Kelman
+1 to what Elliot said.

If you have binary dependencies you may also need to add a call to 
Pkg.build("<package name>") in there, since Pkg.clone doesn't trigger bindeps.

And I personally like adding a versioninfo() so I can cross-reference the 
Julia sha in the build logs.


On Monday, August 25, 2014 2:05:29 PM UTC-7, Elliot Saba wrote:
>
> I think the simplest and easiest way of doing this is to init(), clone(), 
> and test():
>
> julia -e 'Pkg.init(); Pkg.clone(pwd()); Pkg.test("<package name>")'
>
> If you want to include code coverage and whatnot, you'll need to put your 
> Pkg.test() call on a separate line and pass --code-coverage to the julia 
> executable.
> -E
>
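
Putting those suggestions together, a sketch of what the script section might 
look like (the package name "MyPackage" is a placeholder):

script:
- julia -e 'versioninfo(); Pkg.init(); Pkg.clone(pwd()); Pkg.build("MyPackage")'
- julia -e 'Pkg.test("MyPackage")'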


Re: [julia-users] "Correct" (i.e. optimal) way to initialize package in Travis script

2014-08-25 Thread Elliot Saba
I think that should be considered a bug, since clone() calls resolve()
which calls build()... but only for the stuff that resolve() thinks has 
changed.  We should probably just call build() after resolve() in clone.
-E


Re: [julia-users] Re: Computing colors of molecules with Julia

2014-08-25 Thread Steven G. Johnson


On Monday, August 25, 2014 6:35:43 PM UTC-4, Yakir Gagnon wrote:

> I tried the checkout version of Color, and it's the same (see attached), 
> i.e. wrong: the blues should be close to the 400 mark and the reds closer 
> to the 700. the UV purple and IR "black" should be closer to the ends than 
> what we see. Any idea what's going wrong? 
>

You didn't give any indication of how you made that plot... 


Re: [julia-users] "Correct" (i.e. optimal) way to initialize package in Travis script

2014-08-25 Thread Iain Dunning
Check what I just did for HttpCommon; the only thing lacking is a Pkg.build.

https://github.com/JuliaLang/HttpCommon.jl/blob/master/.travis.yml

script: 
- julia -e 'Pkg.init(); Pkg.clone(pwd())'
- julia -e 'Pkg.test("HttpCommon", coverage=true)'
after_success:
- julia -e 'cd(Pkg.dir("HttpCommon")); Pkg.add("Coverage"); using Coverage; Coveralls.submit(Coveralls.process_folder())'

This is what I hope most packages will look like now.

On Monday, August 25, 2014 7:52:21 PM UTC-4, Elliot Saba wrote:
>
> I think that should be considered a bug, since clone() calls resolve() 
> which calls build(). but only for the stuff that resolve() thinks has 
> changed.  We should probably just call build() after resolve() in clone.
> -E
>


Re: [julia-users] "Correct" (i.e. optimal) way to initialize package in Travis script

2014-08-25 Thread Iain Dunning
The symlink style needs to go away, I think. If you see something, say 
something, and hopefully Travis configs will become a happier place.

On Monday, August 25, 2014 9:45:42 PM UTC-4, Iain Dunning wrote:
>
> Check what I just did for HttpCommon, only thing lacking is a Pkg.build
>
> https://github.com/JuliaLang/HttpCommon.jl/blob/master/.travis.yml
>
> script: 
> - julia -e 'Pkg.init(); Pkg.clone(pwd())'
> - julia -e 'Pkg.test("HttpCommon", coverage=true)'
> after_success:
> - julia -e 'cd(Pkg.dir("HttpCommon")); Pkg.add("Coverage"); using 
> Coverage; Coveralls.submit(Coveralls.process_folder())'
>
> This what I hope to see most packages look like now
>
> On Monday, August 25, 2014 7:52:21 PM UTC-4, Elliot Saba wrote:
>>
>> I think that should be considered a bug, since clone() calls resolve() 
>> which calls build(). but only for the stuff that resolve() thinks has 
>> changed.  We should probably just call build() after resolve() in clone.
>> -E
>>
>

Re: [julia-users] "Correct" (i.e. optimal) way to initialize package in Travis script

2014-08-25 Thread Jacob Quinn
And I've already stolen Iain's work for Dates.jl and Datetime.jl. +1 for
consistent travis configs!


On Mon, Aug 25, 2014 at 9:46 PM, Iain Dunning  wrote:

> The symlink style needs to go away, I think. If you see something, say
> something, and hopefully Travis configs will become a happier place.
>
>
> On Monday, August 25, 2014 9:45:42 PM UTC-4, Iain Dunning wrote:
>>
>> Check what I just did for HttpCommon, only thing lacking is a Pkg.build
>>
>> https://github.com/JuliaLang/HttpCommon.jl/blob/master/.travis.yml
>>
>> script:
>> - julia -e 'Pkg.init(); Pkg.clone(pwd())'
>> - julia -e 'Pkg.test("HttpCommon", coverage=true)'
>> after_success:
>> - julia -e 'cd(Pkg.dir("HttpCommon")); Pkg.add("Coverage"); using
>> Coverage; Coveralls.submit(Coveralls.process_folder())'
>>
>> This what I hope to see most packages look like now
>>
>> On Monday, August 25, 2014 7:52:21 PM UTC-4, Elliot Saba wrote:
>>>
>>> I think that should be considered a bug, since clone() calls resolve()
>>> which calls build(). but only for the stuff that resolve() thinks has
>>> changed.  We should probably just call build() after resolve() in clone.
>>> -E
>>>
>>


Re: [julia-users] "Correct" (i.e. optimal) way to initialize package in Travis script

2014-08-25 Thread Tim Holy
Once nightly and release become different again, will we need to reinstate the 
if clauses?

--Tim

On Monday, August 25, 2014 06:45:41 PM Iain Dunning wrote:
> Check what I just did for HttpCommon, only thing lacking is a Pkg.build
> 
> https://github.com/JuliaLang/HttpCommon.jl/blob/master/.travis.yml
> 
> script:
> - julia -e 'Pkg.init(); Pkg.clone(pwd())'
> - julia -e 'Pkg.test("HttpCommon", coverage=true)'
> after_success:
> - julia -e 'cd(Pkg.dir("HttpCommon")); Pkg.add("Coverage"); using Coverage;
> Coveralls.submit(Coveralls.process_folder())'
> 
> This what I hope to see most packages look like now
> 
> On Monday, August 25, 2014 7:52:21 PM UTC-4, Elliot Saba wrote:
> > I think that should be considered a bug, since clone() calls resolve()
> > which calls build(). but only for the stuff that resolve() thinks has
> > changed.  We should probably just call build() after resolve() in clone.
> > -E



Re: [julia-users] "Correct" (i.e. optimal) way to initialize package in Travis script

2014-08-25 Thread Elliot Saba
Only if there are features you need that exist in one and not the other, I
suppose.
-E


On Mon, Aug 25, 2014 at 10:07 PM, Tim Holy  wrote:

> Once nightly and release become different again, will we need to reinstate
> the
> if clauses?
>
> --Tim
>
> On Monday, August 25, 2014 06:45:41 PM Iain Dunning wrote:
> > Check what I just did for HttpCommon, only thing lacking is a Pkg.build
> >
> > https://github.com/JuliaLang/HttpCommon.jl/blob/master/.travis.yml
> >
> > script:
> > - julia -e 'Pkg.init(); Pkg.clone(pwd())'
> > - julia -e 'Pkg.test("HttpCommon", coverage=true)'
> > after_success:
> > - julia -e 'cd(Pkg.dir("HttpCommon")); Pkg.add("Coverage"); using
> Coverage;
> > Coveralls.submit(Coveralls.process_folder())'
> >
> > This what I hope to see most packages look like now
> >
> > On Monday, August 25, 2014 7:52:21 PM UTC-4, Elliot Saba wrote:
> > > I think that should be considered a bug, since clone() calls resolve()
> > > which calls build(). but only for the stuff that resolve() thinks
> has
> > > changed.  We should probably just call build() after resolve() in
> clone.
> > > -E
>
>


Re: [julia-users] "Correct" (i.e. optimal) way to initialize package in Travis script

2014-08-25 Thread Tim Holy
In particular, Coveralls won't get confused if we push both?

--Tim

On Monday, August 25, 2014 10:10:10 PM Elliot Saba wrote:
> Only if there are features you need that exist in one and not the other, I
> suppose.
> -E
> 
> On Mon, Aug 25, 2014 at 10:07 PM, Tim Holy  wrote:
> > Once nightly and release become different again, will we need to reinstate
> > the
> > if clauses?
> > 
> > --Tim
> > 
> > On Monday, August 25, 2014 06:45:41 PM Iain Dunning wrote:
> > > Check what I just did for HttpCommon, only thing lacking is a Pkg.build
> > > 
> > > https://github.com/JuliaLang/HttpCommon.jl/blob/master/.travis.yml
> > > 
> > > script:
> > > - julia -e 'Pkg.init(); Pkg.clone(pwd())'
> > > - julia -e 'Pkg.test("HttpCommon", coverage=true)'
> > > after_success:
> > > - julia -e 'cd(Pkg.dir("HttpCommon")); Pkg.add("Coverage"); using
> > 
> > Coverage;
> > 
> > > Coveralls.submit(Coveralls.process_folder())'
> > > 
> > > This what I hope to see most packages look like now
> > > 
> > > On Monday, August 25, 2014 7:52:21 PM UTC-4, Elliot Saba wrote:
> > > > I think that should be considered a bug, since clone() calls resolve()
> > > > which calls build(). but only for the stuff that resolve() thinks
> > 
> > has
> > 
> > > > changed.  We should probably just call build() after resolve() in
> > 
> > clone.
> > 
> > > > -E



Re: [julia-users] "Correct" (i.e. optimal) way to initialize package in Travis script

2014-08-25 Thread Elliot Saba
Ah, that I have no idea.
-E


On Mon, Aug 25, 2014 at 10:20 PM, Tim Holy  wrote:

> In particular, Coveralls won't get confused if we push both?
>
> --Tim
>
> On Monday, August 25, 2014 10:10:10 PM Elliot Saba wrote:
> > Only if there are features you need that exist in one and not the other,
> I
> > suppose.
> > -E
> >
> > On Mon, Aug 25, 2014 at 10:07 PM, Tim Holy  wrote:
> > > Once nightly and release become different again, will we need to
> reinstate
> > > the
> > > if clauses?
> > >
> > > --Tim
> > >
> > > On Monday, August 25, 2014 06:45:41 PM Iain Dunning wrote:
> > > > Check what I just did for HttpCommon, only thing lacking is a
> Pkg.build
> > > >
> > > > https://github.com/JuliaLang/HttpCommon.jl/blob/master/.travis.yml
> > > >
> > > > script:
> > > > - julia -e 'Pkg.init(); Pkg.clone(pwd())'
> > > > - julia -e 'Pkg.test("HttpCommon", coverage=true)'
> > > > after_success:
> > > > - julia -e 'cd(Pkg.dir("HttpCommon")); Pkg.add("Coverage"); using
> > >
> > > Coverage;
> > >
> > > > Coveralls.submit(Coveralls.process_folder())'
> > > >
> > > > This what I hope to see most packages look like now
> > > >
> > > > On Monday, August 25, 2014 7:52:21 PM UTC-4, Elliot Saba wrote:
> > > > > I think that should be considered a bug, since clone() calls
> resolve()
> > > > > which calls build(). but only for the stuff that resolve()
> thinks
> > >
> > > has
> > >
> > > > > changed.  We should probably just call build() after resolve() in
> > >
> > > clone.
> > >
> > > > > -E
>
>


[julia-users] Re: Function with state calling stateless version, or the other way around?

2014-08-25 Thread Abe Schneider
You can think of a Task like a function that maintains state automatically. 
However, instead of returning values with 'return x', you use 'produce(x)'.

This causes execution of the Task to pause until it is called again. Tasks 
are resumed with the 'consume' function, which in turn returns the value 
passed to 'produce'.

The official Julia documentation gives code examples of how to create and use 
tasks. In the documentation's example, a task is returned from within a 
function; the variables defined in that function stay bound to the returned 
Task and are updated between each call to consume. Julia also allows you to 
use Tasks as iterators.

You can get similar behavior by returning a function from a function. The 
inner function closes over the outer function's variables and can modify 
them, so state persists between calls.
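
A minimal sketch of both patterns (Julia 0.3-era produce/consume; names are 
illustrative):

# Task version: produce() pauses the task, consume() resumes it.
function counter(n)
    total = 0
    for i = 1:n
        total += i
        produce(total)   # pause here until the next consume()
    end
end

t = Task(() -> counter(3))
consume(t)   # 1
consume(t)   # 3
consume(t)   # 6

# Closure version: the inner function keeps state in the captured variable.
function make_counter()
    total = 0
    return () -> (total += 1)
end

next = make_counter()
next()   # 1
next()   # 2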


Re: [julia-users] Re: Computing colors of molecules with Julia

2014-08-25 Thread Yakir Gagnon
Oh sorry, I vaguely mentioned it in my first reply.
The short answer is:

using Color,Images
n = 500
wl1 = 380.
wl2 = 780.
wl = linspace(wl1,wl2,n)
I = Array(Float64,n,n,3)
for i = 1:n
    xyz = colormatch(wl[i])
    rgb = convert(RGB,xyz)
    for (j,f) in enumerate([:r,:g,:b])
        I[i,:,j] = rgb.(f)
    end
end
imwrite(I,"a.png")

This results in the attached image. While I'm sure there's a much better
way of getting that done (feel free to show me btw, I'd love to know how to
improve), you can immediately see that the blues and reds are too close
to each other and that the UV violet and IR black are overly represented.
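
One possibly tidier variant of the loop above (same 0.3-era Color/Images 
calls; an untested sketch):

using Color, Images
n  = 500
wl = linspace(380., 780., n)
I  = Array(Float64, n, n, 3)
for i = 1:n
    rgb = convert(RGB, colormatch(wl[i]))
    I[i,:,1] = rgb.r   # scalar assignment broadcasts across the slice
    I[i,:,2] = rgb.g
    I[i,:,3] = rgb.b
end
imwrite(I, "a.png")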

The long answer is that I used pgfplots with Julia to generate that first
image. So the pgfplots part is this:

\begin{tikzpicture}
\draw (0,0) node {\pgfuseshading{mySpectrum}};
\foreach \x/\xl in {-3/400,-1/500,1/600,3/700}{
    \draw[gray] (\x,-.75) -- (\x,-1.25) node[anchor=north,black] {\xl};
}
\node at (0,-2) {Wavelength (nm)};
\end{tikzpicture}

and the julia part is this:

using Color

n = 50
wl1 = 380
wl2 = 780
width = 8
wl = linspace(wl1,wl2,n)

d = linspace(0,width,n)
f = open("spectrum.txt","w")
write(f,"\\pgfdeclarehorizontalshading{mySpectrum}{2cm}{\n")
for i = 1:n-1
    xyz = colormatch(wl[i])
    rgb = convert(RGB,xyz)

    txt = "\trgb($(d[i])cm)=($(rgb.r),$(rgb.g),$(rgb.b));\n"
    write(f,txt)
end
i = n
xyz = colormatch(wl[i])
rgb = convert(RGB,xyz)
txt = "\trgb($(d[i])cm)=($(rgb.r),$(rgb.g),$(rgb.b))}"
write(f,txt)
close(f)

xl = [400:100:700]
nxl = length(xl)
wli = wl2-wl1
w = zeros(nxl)
for i = 1:nxl
    r = (xl[i]-wl1)/wli
    w[i] = width*r
end
w .-= width/2




Yakir Gagnon
The Queensland Brain Institute (Building #79)
The University of Queensland
Brisbane QLD 4072
Australia

cell +61 (0)424 393 332
work +61 (0)733 654 089


On Tue, Aug 26, 2014 at 11:43 AM, Steven G. Johnson 
wrote:

>
>
> On Monday, August 25, 2014 6:35:43 PM UTC-4, Yakir Gagnon wrote:
>
>> I tried the checkout version of Color, and it's the same (see attached),
>> i.e. wrong: the blues should be close to the 400 mark and the reds closer
>> to the 700. the UV purple and IR "black" should be closer to the ends than
>> what we see. Any idea what's going wrong?
>>
>
> You didn't give any indication of how you made that plot...
>


[julia-users] Re: deepcopy() immutable types working as intended?

2014-08-25 Thread Philippe Maincon
Stephan, does this imply that "immutable" is intended to be "deep" read-only, 
and should be handled as such by the programmer, in spite of the 
compiler only enforcing "shallow immutability"?
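
A minimal illustration of shallow immutability (Julia 0.3 syntax; the Point 
type is hypothetical):

immutable Point
    x::Float64
    v::Vector{Float64}
end

p = Point(1.0, [1.0, 2.0])
# p.x = 2.0     # error: fields of an immutable cannot be reassigned
p.v[1] = 99.0   # allowed: the array p.v points to is still mutable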


Re: [julia-users] "Correct" (i.e. optimal) way to initialize package in Travis script

2014-08-25 Thread Iain Dunning
Not too confused, at least. Check out this build:
https://coveralls.io/builds/1128440
You can see it shows the two "jobs". Which one you get for the badge... 
unclear to me, possibly configurable?

On Monday, August 25, 2014 10:25:42 PM UTC-4, Elliot Saba wrote:
>
> Ah, that I have no idea.
> -E
>
>
> On Mon, Aug 25, 2014 at 10:20 PM, Tim Holy wrote:
>
>> In particular, Coveralls won't get confused if we push both?
>>
>> --Tim
>>
>> On Monday, August 25, 2014 10:10:10 PM Elliot Saba wrote:
>> > Only if there are features you need that exist in one and not the 
>> other, I
>> > suppose.
>> > -E
>> >
>> > On Mon, Aug 25, 2014 at 10:07 PM, Tim Holy wrote:
>> > > Once nightly and release become different again, will we need to 
>> reinstate
>> > > the
>> > > if clauses?
>> > >
>> > > --Tim
>> > >
>> > > On Monday, August 25, 2014 06:45:41 PM Iain Dunning wrote:
>> > > > Check what I just did for HttpCommon, only thing lacking is a 
>> Pkg.build
>> > > >
>> > > > https://github.com/JuliaLang/HttpCommon.jl/blob/master/.travis.yml
>> > > >
>> > > > script:
>> > > > - julia -e 'Pkg.init(); Pkg.clone(pwd())'
>> > > > - julia -e 'Pkg.test("HttpCommon", coverage=true)'
>> > > > after_success:
>> > > > - julia -e 'cd(Pkg.dir("HttpCommon")); Pkg.add("Coverage"); using
>> > >
>> > > Coverage;
>> > >
>> > > > Coveralls.submit(Coveralls.process_folder())'
>> > > >
>> > > > This what I hope to see most packages look like now
>> > > >
>> > > > On Monday, August 25, 2014 7:52:21 PM UTC-4, Elliot Saba wrote:
>> > > > > I think that should be considered a bug, since clone() calls 
>> resolve()
>> > > > > which calls build(). but only for the stuff that resolve() 
>> thinks
>> > >
>> > > has
>> > >
>> > > > > changed.  We should probably just call build() after resolve() in
>> > >
>> > > clone.
>> > >
>> > > > > -E
>>
>>
>

Re: [julia-users] Re: Computing colors of molecules with Julia

2014-08-25 Thread Daniel Jones

I looked into xcolor, and the color matching function they implement is 
only a rough approximation (page 55 of the xcolor manual), whereas Color.jl 
actually matches wavelengths to the CIE standard observer measurements. In 
this case, I think Color is more correct. Here's someone else's plot made 
from the CIE data that looks close to the Color.jl one: 
http://en.wikipedia.org/wiki/Luminosity_function#mediaviewer/File:Srgbspectrum.png


On Monday, August 25, 2014 8:43:13 PM UTC-7, Yakir Gagnon wrote:
>
> Oh sorry, I vaguely mentioned it in my first reply. 
> The short answer is:
>
> using Color,Images
> n = 500
> wl1 = 380.
> wl2 = 780.
> wl = linspace(wl1,wl2,n)
> I = Array(Float64,n,n,3)
> for i = 1:n
> xyz = colormatch(wl[i])
> rgb = convert(RGB,xyz)
> for (j,f) in enumerate([:r,:g,:b])
> I[i,:,j] = rgb.(f)
> end
> end
> imwrite(I,"a.png")
>
> This results in the attached image. While I'm sure there's a much better 
> way of getting that done (feel free to show be btw, I'd love to know how to 
> improve), you can immediately see that the blues and reds are too far close 
> to each other and that the UV violet and IR black are overly represented. 
>
> The long answer is that I used pgfplots with Julia to generate that first 
> image. So the pgfplots part is this:
>
> \begin{tikzpicture}
> \draw (0,0) node {\pgfuseshading{mySpectrum}};
> \foreach \x/\xl in {-3/400,-1/500,1/600,3/700}{
> \draw[gray] (\x,-.75) -- (\x,-1.25) node[anchor=north,black] {\xl};
> }
> \node at (0,-2) {Wavelength (nm)};
> \end{tikzpicture}
>
> and the julia part is this:
>
> using Color
>
> n = 50
> wl1 = 380
> wl2 = 780
> width = 8
> wl = linspace(wl1,wl2,n)
>
> d = linspace(0,width,n)
> f = open("spectrum.txt","w")
> write(f,"\\pgfdeclarehorizontalshading{mySpectrum}{2cm}{\n")
> for i = 1:n-1
> xyz = colormatch(wl[i])
> rgb = convert(RGB,xyz)
>
> txt = "\trgb($(d[i])cm)=($(rgb.r),$(rgb.g),$(rgb.b));\n"
> write(f,txt)
> end
> i = n
> xyz = colormatch(wl[i])
> rgb = convert(RGB,xyz)
> txt = "\trgb($(d[i])cm)=($(rgb.r),$(rgb.g),$(rgb.b))}"
> write(f,txt)
> close(f)
>
> xl = [400:100:700]
> nxl = length(xl)
> wli = wl2-wl1
> w = zeros(nxl)
> for i  = 1:nxl
> r = (xl[i]-wl1)/wli
> w[i] = width*r
> end
> w .-= width/2
>
>
>
>
> Yakir Gagnon
> The Queensland Brain Institute (Building #79)
> The University of Queensland
> Brisbane QLD 4072
> Australia
>
> cell +61 (0)424 393 332
> work +61 (0)733 654 089
>  
>
> On Tue, Aug 26, 2014 at 11:43 AM, Steven G. Johnson wrote:
>
>>
>>
>> On Monday, August 25, 2014 6:35:43 PM UTC-4, Yakir Gagnon wrote:
>>
>>> I tried the checkout version of Color, and it's the same (see attached), 
>>> i.e. wrong: the blues should be close to the 400 mark and the reds closer 
>>> to the 700. the UV purple and IR "black" should be closer to the ends than 
>>> what we see. Any idea what's going wrong? 
>>>
>>
>> You didn't give any indication of how you made that plot... 
>>
>
>

[julia-users] Package rename: Distance ==> Distances

2014-08-25 Thread Dahua Lin
Following Julia package naming convention, the package Distance was renamed 
to Distances.

New package page: https://github.com/JuliaStats/Distances.jl

All materials in Distance.jl have been migrated. New issues and PRs should 
go to Distances.jl.

Thanks,
Dahua
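
For anyone updating an existing setup, the switch should presumably be just:

Pkg.rm("Distance")
Pkg.add("Distances")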



Re: [julia-users] Package rename: Distance ==> Distances

2014-08-25 Thread John Myles White
Dahua, did you keep the original package around?

 —  John

On Aug 25, 2014, at 10:07 PM, Dahua Lin  wrote:

> Following Julia package naming convention, the package Distance was renamed 
> to Distances.
> 
> New package page: https://github.com/JuliaStats/Distances.jl
> 
> All materials in Distance.jl has been migrated. New issues and PRs should go 
> to Distances.jl.
> 
> Thanks,
> Dahua
> 



Re: [julia-users] Package rename: Distance ==> Distances

2014-08-25 Thread Dahua Lin
The original package is still there. 

I just added a warning to direct people to the new one. 

See the discussion here: https://github.com/JuliaStats/Distance.jl/issues/25

Dahua

On Tuesday, August 26, 2014 1:08:48 PM UTC+8, John Myles White wrote:
>
> Dahua, did you keep the original package around? 
>
>  —  John 
>
> On Aug 25, 2014, at 10:07 PM, Dahua Lin wrote: 
>
> > Following Julia package naming convention, the package Distance was 
> renamed to Distances. 
> > 
> > New package page: https://github.com/JuliaStats/Distances.jl 
> > 
> > All materials in Distance.jl has been migrated. New issues and PRs 
> should go to Distances.jl. 
> > 
> > Thanks, 
> > Dahua 
> > 
>
>

Re: [julia-users] Re: Computing colors of molecules with Julia

2014-08-25 Thread Yakir Gagnon
I stand corrected. See attached image for a "comparison" between the three
scales we've discussed. The one in the background is the Julia one, the one
on the bottom is the one you showed from Wikipedia, and the one on top is
the one from xcolor. You can see that the point where Julia "disagrees"
most with xcolor is at 440 nm: Julia says 440 nm is violet while xcolor
says it's blue. I grabbed a 440 nm interference filter (I'm in a lab) and
looked. It was violet.
Thanks for your time!


Yakir Gagnon
The Queensland Brain Institute (Building #79)
The University of Queensland
Brisbane QLD 4072
Australia

cell +61 (0)424 393 332
work +61 (0)733 654 089


On Tue, Aug 26, 2014 at 2:29 PM, Daniel Jones 
wrote:

>
> I looked into xcolor, and the color matching function they implement is
> only a rough approximation (page 55 of the xcolor manual), whereas Color.jl
> actually matches wavelengths to the CIE standard observer measurements. In
> this case, I think Color is more correct. Here's someone else's plot made
> from the CIE data that looks close to the Color.jl one:
> http://en.wikipedia.org/wiki/Luminosity_function#mediaviewer/File:Srgbspectrum.png
>
>
> On Monday, August 25, 2014 8:43:13 PM UTC-7, Yakir Gagnon wrote:
>
>> Oh sorry, I vaguely mentioned it in my first reply.
>> The short answer is:
>>
>> using Color,Images
>> n = 500
>> wl1 = 380.
>> wl2 = 780.
>> wl = linspace(wl1,wl2,n)
>> I = Array(Float64,n,n,3)
>> for i = 1:n
>> xyz = colormatch(wl[i])
>> rgb = convert(RGB,xyz)
>> for (j,f) in enumerate([:r,:g,:b])
>> I[i,:,j] = rgb.(f)
>> end
>> end
>> imwrite(I,"a.png")
>>
>> This results in the attached image. While I'm sure there's a much better
>> way of getting that done (feel free to show be btw, I'd love to know how to
>> improve), you can immediately see that the blues and reds are too far close
>> to each other and that the UV violet and IR black are overly represented.
>>
>> The long answer is that I used pgfplots with Julia to generate that first
>> image. So the pgfplots part is this:
>>
>> \begin{tikzpicture}
>> \draw (0,0) node {\pgfuseshading{mySpectrum}};
>> \foreach \x/\xl in {-3/400,-1/500,1/600,3/700}{
>> \draw[gray] (\x,-.75) -- (\x,-1.25) node[anchor=north,black]
>> {\xl};
>> }
>> \node at (0,-2) {Wavelength (nm)};
>> \end{tikzpicture}
>>
>> and the julia part is this:
>>
>> using Color
>>
>> n = 50
>> wl1 = 380
>> wl2 = 780
>> width = 8
>> wl = linspace(wl1,wl2,n)
>>
>> d = linspace(0,width,n)
>> f = open("spectrum.txt","w")
>> write(f,"\\pgfdeclarehorizontalshading{mySpectrum}{2cm}{\n")
>> for i = 1:n-1
>> xyz = colormatch(wl[i])
>> rgb = convert(RGB,xyz)
>>
>> txt = "\trgb($(d[i])cm)=($(rgb.r),$(rgb.g),$(rgb.b));\n"
>> write(f,txt)
>> end
>> i = n
>> xyz = colormatch(wl[i])
>> rgb = convert(RGB,xyz)
>> txt = "\trgb($(d[i])cm)=($(rgb.r),$(rgb.g),$(rgb.b))}"
>> write(f,txt)
>> close(f)
>>
>> xl = [400:100:700]
>> nxl = length(xl)
>> wli = wl2-wl1
>> w = zeros(nxl)
>> for i  = 1:nxl
>> r = (xl[i]-wl1)/wli
>> w[i] = width*r
>> end
>> w .-= width/2
>>
>>
>>
>>
>> Yakir Gagnon
>> The Queensland Brain Institute (Building #79)
>> The University of Queensland
>> Brisbane QLD 4072
>> Australia
>>
>> cell +61 (0)424 393 332
>> work +61 (0)733 654 089
>>
>>
>> On Tue, Aug 26, 2014 at 11:43 AM, Steven G. Johnson 
>> wrote:
>>
>>>
>>>
>>> On Monday, August 25, 2014 6:35:43 PM UTC-4, Yakir Gagnon wrote:
>>>
>>>> I tried the checkout version of Color, and it's the same (see
>>>> attached), i.e. wrong: the blues should be close to the 400 mark and the
>>>> reds closer to the 700. the UV purple and IR "black" should be closer to
>>>> the ends than what we see. Any idea what's going wrong?

>>>
>>> You didn't give any indication of how you made that plot...
>>>
>>
>>


Re: [julia-users] Package rename: Distance ==> Distances

2014-08-25 Thread John Myles White
Great. Thanks for making the name change.

 — John

On Aug 25, 2014, at 10:10 PM, Dahua Lin  wrote:

> The original package is still there. 
> 
> I just added a warning to direct people to the new one. 
> 
> See the discussion here: https://github.com/JuliaStats/Distance.jl/issues/25
> 
> Dahua
> 
> On Tuesday, August 26, 2014 1:08:48 PM UTC+8, John Myles White wrote:
> Dahua, did you keep the original package around? 
> 
>  —  John 
> 
> On Aug 25, 2014, at 10:07 PM, Dahua Lin  wrote: 
> 
> > Following Julia package naming convention, the package Distance was renamed 
> > to Distances. 
> > 
> > New package page: https://github.com/JuliaStats/Distances.jl 
> > 
> > All materials in Distance.jl has been migrated. New issues and PRs should 
> > go to Distances.jl. 
> > 
> > Thanks, 
> > Dahua 
> > 
>