[julia-users] Re: how can I create an incidence matrix

2016-06-13 Thread Ford O.
Are you sure you know what an incidence matrix is? See the Wikipedia article.


You don't need any dictionaries to implement it. Just use a plain Array:

matrix = fill(false, row_size, col_size)
from_node = scan_input()
to_node   = scan_input()
matrix[from_node, to_node] = true
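To connect this to the original question: a minimal sketch (my illustration, with hypothetical variable names and modern generator syntax) that maps each value to its sorted-unique column index and marks the corresponding entries:

```julia
ids = ["A", "B", "C", "D", "E", "F"]
xy  = [22 2; 4 21; 21 360; 26 2; 22 58; 2 347]

vals = sort(unique(vec(xy)))                       # column labels: 2,4,21,22,26,58,347,360
col  = Dict(v => j for (j, v) in enumerate(vals))  # value -> column index

M = zeros(Int, length(ids), length(vals))
for i in 1:length(ids), k in 1:size(xy, 2)
    M[i, col[xy[i, k]]] = 1                        # mark row i at that value's column
end
# row 1 ("A") is now [1 0 0 1 0 0 0 0], matching the table in the question
```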



Re: [julia-users] how can I create an incidence matrix

2016-06-13 Thread Tamas Papp
Check out ModelMatrix in DataFrames.jl.

On Mon, Jun 13 2016, Gerson Jr. wrote:

> Hi
> I would like to create an incidence matrix.
> I have a file with 3 columns, like:
>
> id  x   y
> A 22   2
> B  4  21
> C 21 360
> D 26   2
> E 22  58
> F  2 347
>
> And I want a matrix like (without col and row names):
>
>   2 4 21 22 26 58 347 360
> A 1 0  0  1  0  0   0   0
> B 0 1  1  0  0  0   0   0
> C 0 0  1  0  0  0   0   1
> D 1 0  0  0  1  0   0   0
> E 0 0  0  1  0  1   0   0
> F 1 0  0  0  0  0   1   0
>
> I have started the code like:
>
> haps = readdlm("File.txt",header=true) #read the file
> hap1_2 = map(Int64,haps[1][:,2:end])  # create a subfile with 2 and
> 3 column
> ID = (haps[1][:,1])  #subfile with
> the 1 column
> dic1 = Dict()
>
> for (i in 1:21)
> dic1[ID[i]] = hap1_2[i,:]
> end
>
> X=[zeros(21,22)]; #the original file has 21 rows
> and 22 columns
> hcat(ID,X)
>
>
>
> The problem now is that I don't know how to fill the matrix with 1s in the
> specific columns as in the example above.
> I'm also not sure if I'm on the right way.
>
> Any suggestion??
>
> Thanks!


Re: [julia-users] Standard wrapper for global variable optimization hack?

2016-06-13 Thread Eric Forgy
Creating a global instance of a special type seems reasonable to me. I also 
use global Dicts for this kind of thing.

Instead of inc_global, I might define 

Base.(:+)(w::wrapper,s) = w.x + s 

so you can just do `w += 1` :)

If you find a better way, I would love to see it.

PS: Minor point, but looks like you have a typo. Should be `const w = 
wrapper(0.0)` I think. No biggie.

On Tuesday, June 14, 2016 at 8:41:40 AM UTC+8, Arch Robison wrote:
>
> Indeed the one-element vector case was what inspired my wrapper.  I 
> created the wrapper to avoid the run-time bounds check on the vector, 
> though I suppose I could sprinkle @inbounds to eliminate it.
>
> On Mon, Jun 13, 2016 at 6:40 PM, Eric Forgy wrote:
>
>> Not sure if this is the best way (curious about 'Ref'), but you could 
>> also make x a 1-element vector:
>>
>> const x = [0]
>>
>> x[1] += 1
>>
>
>

[julia-users] Re: (How?) Has parallel execution changed in 0.5?

2016-06-13 Thread Amit Murthy
The following did not hang on master on my system:

Pkg.add("Optim")
using Optim   # precompile Optim
addprocs(4)
@everywhere using Optim  # Load on all workers



followed by the function definition



On Tuesday, 14 June 2016 04:35:55 UTC+5:30, Nils Gudat wrote:
>
> I feel like I've posted on here with the same title when we went from 0.3 
> to 0.4, but I'm finding myself in a similar situation. When running my code 
> on 0.5, it simply stops at a function definition - there's no error, the 
> program just doesn't move on. 
>
> The function definition is this:
>
> @everywhere function consdec(xt::Float64, σ::Float64, r::Float64, 
> υ::Float64)
> obj(c::Float64, xt=xt, σ=σ, r=r, υ=υ) = -(u(c,σ) + bq(r*(xt-c), υ))
> opt = Optim.optimize(obj, 0.1, xt)
> return -opt.f_minimum, opt.minimum
>   end
>
> When I run the code on a single core, this works as expected. However, 
> when I add workers using addprocs at the start, the code just stops at the 
> function definition. Any ideas what could be going wrong here?
>


Re: [julia-users] Standard wrapper for global variable optimization hack?

2016-06-13 Thread Arch Robison
Indeed the one-element vector case was what inspired my wrapper.  I created
the wrapper to avoid the run-time bounds check on the vector, though I
suppose I could sprinkle @inbounds to eliminate it.

On Mon, Jun 13, 2016 at 6:40 PM, Eric Forgy  wrote:

> Not sure if this is the best way (curious about 'Ref'), but you could also
> make x a 1-element vector:
>
> const x = [0]
>
> x[1] += 1
>


[julia-users] Standard wrapper for global variable optimization hack?

2016-06-13 Thread Eric Forgy
Not sure if this is the best way (curious about 'Ref'), but you could also make 
x a 1-element vector:

const x = [0]

x[1] += 1


[julia-users] (How?) Has parallel execution changed in 0.5?

2016-06-13 Thread Nils Gudat
I feel like I've posted on here with the same title when we went from 0.3 
to 0.4, but I'm finding myself in a similar situation. When running my code 
on 0.5, it simply stops at a function definition - there's no error, the 
program just doesn't move on. 

The function definition is this:

@everywhere function consdec(xt::Float64, σ::Float64, r::Float64, υ::Float64)
    obj(c::Float64, xt=xt, σ=σ, r=r, υ=υ) = -(u(c,σ) + bq(r*(xt-c), υ))
    opt = Optim.optimize(obj, 0.1, xt)
    return -opt.f_minimum, opt.minimum
end

When I run the code on a single core, this works as expected. However, when 
I add workers using addprocs at the start, the code just stops at the 
function definition. Any ideas what could be going wrong here?


[julia-users] Re: How to install 0.4.5 on Ubuntu?

2016-06-13 Thread cdm

T. Kelman's advice is apt here ...

unpacking the generic Linux binaries has worked well on Ubuntu systems for 
me over the past months.


the current release, 0.4.5, for 64-bit environments can be had here:

https://julialang.s3.amazonaws.com/bin/linux/x64/0.4/julia-0.4.5-linux-x86_64.tar.gz



there was some time when the Ubuntu PPA was lagging due to other priorities ...

however, based upon the information here:

   https://launchpad.net/~staticfloat/+archive/ubuntu/juliareleases


the current release PPA looks to be in good order (green check marks in the 
"Latest updates" box).

so the following should work, in theory:


sudo add-apt-repository ppa:staticfloat/juliareleases

sudo add-apt-repository ppa:staticfloat/julia-deps

sudo apt-get update

sudo apt-get install julia

sudo apt-get update

sudo apt-get upgrade



if you were getting v0.5.xxx, then it would seem that your initial
"sudo add-apt-repository ..." line is not pointed to the correct PPA
and/or "sudo apt-get update" needs to be run again.
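A diagnostic worth running (my suggestion, assuming stock Ubuntu apt tooling): `apt-cache` can show which version apt considers the candidate and which configured source it comes from, which tells you immediately whether the juliareleases PPA is the one winning.

```shell
# Show installed/candidate versions of julia and the repository each comes from
apt-cache policy julia

# List all versions apt can see, across all configured sources
apt-cache madison julia
```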

hoping that helps ...
  


On Monday, June 13, 2016 at 2:49:59 PM UTC-7, Nils Gudat wrote:
>
> Sorry to come back to this, but I've now tried the instructions given 
> above (which are also on the julialang website as I've now seen), but I 
> keep getting v0.5.0-2975-ubuntu14.04.1. When I do sudo apt-get install 
> julia=0.4.5 as before, I'm being told that version 0.4.5 is not found.
>
> What could be causing this?
>


[julia-users] Re: Standard wrapper for global variable optimization hack?

2016-06-13 Thread Kristoffer Carlsson
Modifying const values is technically "undefined behaviour" (hence the 
warning) and is only provided as a convenience when working interactively.

[julia-users] Re: Standard wrapper for global variable optimization hack?

2016-06-13 Thread 'Greg Plowman' via julia-users
I have a feeling this is a stupid question, but here it is anyway:

Why do you need a wrapper? Why not just declare the object const directly?

const x = 0.0

function inc_global()
    x += 1
end


[julia-users] Re: How to install 0.4.5 on Ubuntu?

2016-06-13 Thread Nils Gudat
Sorry to come back to this, but I've now tried the instructions given above 
(which are also on the julialang website as I've now seen), but I keep 
getting v0.5.0-2975-ubuntu14.04.1. When I do sudo apt-get install 
julia=0.4.5 as before, I'm being told that version 0.4.5 is not found.

What could be causing this?  


[julia-users] Standard wrapper for global variable optimization hack?

2016-06-13 Thread Kristoffer Carlsson
Ref?
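Presumably this is suggesting `Base.Ref`. A minimal sketch of how it could replace the hand-rolled wrapper (my illustration, not part of the original post; `x[]` indexing works in recent Julia versions):

```julia
const x = Ref(0.0)   # a const binding to a mutable, concretely typed box

function inc_global()
    x[] += 1         # x[] reads/writes the boxed Float64
end

inc_global()
# x[] now holds 1.0
```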

[julia-users] Standard wrapper for global variable optimization hack?

2016-06-13 Thread Arch Robison
I'm revising the notes for my JuliaCon workshop, and wondering if there is 
a standard way/convention for the following hack to reduce the overhead of 
accessing a global variable.  The hack is to replace a global variable, if 
known to always be bound to objects of the same type, with a const variable 
that wraps a reference to the object.  For example, instead of writing:
x = 0.0

function inc_global()
    global x
    x += 1
end

we would write:

type wrapper{T}
    x::T
end

const x = wrapper(0.0)

function inc_global()
    w.x += 1
end

Is there a standard library type that serves the purpose of `wrapper`, or a 
standard type that can be abused to do so?


[julia-users] Re: A naive question regarding the definition of a type with arguments of varying types

2016-06-13 Thread Kristoffer Carlsson
Perhaps:

type RealorComplex{T1 <: Number, T2 <: Real}
    v::Vector{T1}
    w::Vector{Complex{T2}}
    function RealorComplex(v, w)
        @assert T1 <: AbstractFloat || T1 <: Complex
        new(v, w)
    end
end

RealorComplex{T1,T2}(v::Vector{T1}, w::Vector{Complex{T2}}) = RealorComplex{T1, T2}(v, w)
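Assuming the definitions above are in scope, both of the original constructions should now go through (my usage sketch, not part of the original reply):

```julia
r = RealorComplex([1.0, 2.0], [1.0 + 1.0im, 2.0 + 0.0im])   # real first field
t = RealorComplex([1.0 + 1.0im, 2.0], [1.0 + 1.0im, 2.0])   # complex first field
```

For `t`, `T1` is `Complex{Float64}`, which passes the `@assert`; the second argument promotes to `Vector{Complex{Float64}}` either way.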



On Monday, June 13, 2016 at 9:31:06 PM UTC+2, Kuan Xu wrote:
>
>
>
> I'm trying to define a type which has two fields, the first of which can 
> be either real or complex and the second is always complex. 
>
> type RealorComplex{T<:Number}
> v::Vector{T}
> w::Vector{Complex{T}}
> end
>
> r = RealorComplex([1.0, 2.0], [1.0+1.0im, 2.0])
> t = RealorComplex([1.0+1.0im, 2.0], [1.0+1.0im, 2.0])
>
> The construction of r is totally fine, but not that of t, because of the
> type mismatch. What is the best way to fix my constructor so that it can
> take either type of first argument?
>
> Hope I have made my question clear. Thanks.
>


[julia-users] how can I create an incidence matrix

2016-06-13 Thread Gerson Jr.
Hi
I would like to create an incidence matrix.
I have a file with 3 columns, like:

id  x   y
A 22   2
B  4  21
C 21 360
D 26   2
E 22  58
F  2 347

And I want a matrix like (without col and row names):

  2 4 21 22 26 58 347 360
A 1 0  0  1  0  0   0   0
B 0 1  1  0  0  0   0   0
C 0 0  1  0  0  0   0   1
D 1 0  0  0  1  0   0   0
E 0 0  0  1  0  1   0   0
F 1 0  0  0  0  0   1   0

I have started the code like:

haps = readdlm("File.txt", header=true)   # read the file
hap1_2 = map(Int64, haps[1][:,2:end])     # subfile with columns 2 and 3
ID = haps[1][:,1]                         # subfile with column 1
dic1 = Dict()

for i in 1:21
    dic1[ID[i]] = hap1_2[i,:]
end

X = [zeros(21,22)];   # the original file has 21 rows and 22 columns
hcat(ID,X)



The problem now is that I don't know how to fill the matrix with 1s in the 
specific columns, as in the example above.
I'm also not sure if I'm on the right track.

Any suggestion??

Thanks!



[julia-users] A naive question regarding the definition of a type with arguments of varying types

2016-06-13 Thread Kuan Xu


I'm trying to define a type which has two fields, the first of which can be 
either real or complex and the second is always complex. 

type RealorComplex{T<:Number}
    v::Vector{T}
    w::Vector{Complex{T}}
end

r = RealorComplex([1.0, 2.0], [1.0+1.0im, 2.0])
t = RealorComplex([1.0+1.0im, 2.0], [1.0+1.0im, 2.0])

The construction of r is totally fine, but not that of t, because of the type 
mismatch. What is the best way to fix my constructor so that it can take 
either type of first argument?

Hope I have made my question clear. Thanks.


Re: [julia-users] ControKTH

2016-06-13 Thread Toivo Henningsson
There is also https://github.com/JuliaControl/ControlSystems.jl which is 
MIT licensed.


Re: [julia-users] Re: Packages Distances problem with Distances.Jaccard : very slow

2016-06-13 Thread Kristoffer Carlsson
Please try https://github.com/JuliaStats/Distances.jl/pull/44

On Monday, June 13, 2016 at 8:14:01 PM UTC+2, Mauro wrote:
>
> > function myjaccard2(a::Array{Float64,1}, b::Array{Float64,1}) 
> > num = 0. 
> > den = 0. 
> > for I in 1:length(a) 
> > @inbounds ai = a[I] 
> > @inbounds bi = b[I] 
> > num = num + min(ai,bi) 
> > den = den + max(ai,bi) 
> > end 
> > 1. - num/den 
> > end 
> > 
> > 
> > 
> > function testDistances2(v1::Array{Float64,1}, v2::Array{Float64,1}) 
> > for i in 1:5 
> > myjaccard2(v1,v2) 
> > end 
> > end 
>
> I recommend using the values returned for something, otherwise the 
> compiler sometimes eliminates the loop (but not here): 
>
> julia> function testDistances2(v1::Array{Float64,1}, v2::Array{Float64,1}) 
>out = 0.0 
>for i in 1:5 
>out += myjaccard2(v1,v2) 
>end 
>out 
>end 
>
> > @time testDistances2(v1,v2) 
> > machine   3.217329 seconds (200.01 M allocations: 2.981 GB, 19.91% gc 
> time) 
>
> I cannot reproduce this, when I run it I get no allocations: 
>
> julia> v2 = rand(10^4); 
>
> # warm-up 
> julia> @time testDistances2(v1,v2) 
>   3.604478 seconds (8.15 k allocations: 401.797 KB, 0.42% gc time) 
> 24999.00112162811 
>
> julia> @time testDistances2(v1,v2) 
>   3.647563 seconds (5 allocations: 176 bytes) 
> 24999.00112162811 
>
> What version of Julia are you running? Me: 0.4.5. 
>
> > function myjaccard5(a::Array{Float64,1}, b::Array{Float64,1}) 
> > num = 0. 
> > den = 0. 
> > for I in 1:length(a) 
> > @inbounds ai = a[I] 
> > @inbounds bi = b[I] 
> > abs_m = abs(ai-bi) 
> > abs_p = abs(ai+bi) 
> > num += abs_p - abs_m 
> > den += abs_p + abs_m 
> > end 
> > 1. - num/den 
> > end 
> > 
> > 
> > function testDistances5(a::Array{Float64,1}, b::Array{Float64,1}) 
> > for i in 1:5000 
> > myjaccard5(a,b) 
> > end 
> > end 
> > 
> > end 
> > 
> > 
> > julia> @time testDistances5(v1,v2) 
> >   0.166979 seconds (4 allocations: 160 bytes) 
> > 
> > 
> > 
> > We see that using abs is faster. 
> > 
> > I do not do a pull request beccause 
> > 
> > I would expect a good implementation to be 2 or 3 times slower than 
> > Euclidean, and I have not 
> > that yet. 
> > 
> > Le lundi 13 juin 2016 13:43:00 UTC+2, Kristoffer Carlsson a écrit : 
> >> 
> >> It seems weird to me that you guys want to call Jaccard distance with 
> >> float arrays. AFAIK Jaccard distance measures the distance between two 
> >> distinct samples from a pair of sets so basically between two 
> Vector{Bool}, 
> >> see: 
> >> 
> http://docs.scipy.org/doc/scipy/reference/generated/scipy.spatial.distance.jaccard.html
>  
> >> 
> >> "Computes the Jaccard-Needham dissimilarity between two boolean 1-D 
> >> arrays." 
> >> 
> >> Is there some more general formulation of it that extends to vectors in 
> a 
> >> continuous vector space? 
> >> 
> >> And, to note, Jaccard is type stable for inputs of Vector{Bool} in 
> >> Distances.jl. 
> >> 
> >> On Monday, June 13, 2016 at 3:53:14 AM UTC+2, jean-pierre both wrote: 
> >>> 
> >>> 
> >>> 
> >>> I encountered in my application with Distances.Jaccard compared with 
> >>> Distances.Euclidean 
> >>> It was very slow. 
> >>> 
> >>> For example with 2 vecteurs Float64 of size 11520 
> >>> 
> >>> I get the following 
> >>> julia> D=Euclidean() 
> >>> Distances.Euclidean() 
> >>> julia> @time for i in 1:500 
> >>>evaluate(D,v1,v2) 
> >>>end 
> >>>   0.002553 seconds (500 allocations: 7.813 KB) 
> >>> 
> >>> and with Jaccard 
> >>> 
> >>> julia> D=Jaccard() 
> >>> Distances.Jaccard() 
> >>> @time for i in 1:500 
> >>>   evaluate(D,v1,v2) 
> >>>   end 
> >>>   1.995046 seconds (40.32 M allocations: 703.156 MB, 9.68% gc time) 
> >>> 
> >>> With a simple loop for computing jaccard : 
> >>> 
> >>> 
> >>> function myjaccard2(a::Array{Float64,1}, b::Array{Float64,1}) 
> >>>num = 0 
> >>>den = 0 
> >>>for i in 1:length(a) 
> >>>num = num + min(a[i],b[i]) 
> >>>den = den + max(a[i],b[i]) 
> >>>end 
> >>>1. - num/den 
> >>>end 
> >>> myjaccard2 (generic function with 1 method) 
> >>> 
> >>> julia> @time for i in 1:500 
> >>>   myjaccard2(v1,v2) 
> >>>   end 
> >>>   0.451582 seconds (23.04 M allocations: 351.592 MB, 20.04% gc time) 
> >>> 
> >>> I do not see the problem in jaccard distance implementation in the 
> >>> Distances packages 
> >>> 
> >> 
>


[julia-users] Re: Pivoting when inverting a sparse matrix

2016-06-13 Thread Kristoffer Carlsson
Pardiso also has an academic license.

On Monday, June 13, 2016 at 8:09:48 PM UTC+2, Patrick Belliveau wrote:
>
> Julia doesn't seem to have a built in sparse direct solver specifically 
> for symmetric indefinite matrices. However, as Tony alluded to, there are 
> Julia packages that interface with the mumps and pardiso libraries, which 
> do pivoting for symmetric indefinite matrices while taking advantage of 
> symmetry. Using pardiso requires a paid license or mkl but mumps is free 
> and open source.



[julia-users] Re: Packages Distances problem with Distances.Jaccard : very slow

2016-06-13 Thread Kristoffer Carlsson
Please try https://github.com/JuliaStats/Distances.jl/pull/44

Using:

function testDistances5(a::Array{Float64,1}, b::Array{Float64,1})
    @time for i in 1:5000
        myjaccard5(a, b)
    end
    d = Jaccard()
    @time for i in 1:5000
        evaluate(d, a, b)
    end
end


I get:
julia> testDistances5(rand(100), rand(100))
  0.000849 seconds
  0.001670 seconds




On Monday, June 13, 2016 at 7:57:20 PM UTC+2, jean-pierre both wrote:
>
> It makes perfect sense to use Jaccard distances for float values
> Cf  for example http://www.ncbi.nlm.nih.gov/pubmed/16794951
>
> Nevertheless the problem is just an implementation, the time spent should 
> be
> comparable with the one with Euclidean.
>
> The problem I mention is that the nice  implementation used in
>  packages Distances is a problem for this distance as a simple loop is 
> really faster.
> I presume there is an optimization issue as the difference in time with 
> Euclidean is many orders of magnitude 
> larger than what can be expected from the complexity.
>
>
> The funny thing is that min and max seems also part of the problem as can 
> be seen in the following:
>
>
> function myjaccard2(a::Array{Float64,1}, b::Array{Float64,1})
> num = 0.
> den = 0.
> for I in 1:length(a)
> @inbounds ai = a[I]
> @inbounds bi = b[I]
> num = num + min(ai,bi)
> den = den + max(ai,bi)  
> end
> 1. - num/den
> end
>
>
>
> function testDistances2(v1::Array{Float64,1}, v2::Array{Float64,1})
> for i in 1:5
> myjaccard2(v1,v2)
> end
> end
>
> @time testDistances2(v1,v2)
> machine   3.217329 seconds (200.01 M allocations: 2.981 GB, 19.91% gc time)
>
>
>
> function myjaccard5(a::Array{Float64,1}, b::Array{Float64,1})
> num = 0.
> den = 0.
> for I in 1:length(a)
> @inbounds ai = a[I]
> @inbounds bi = b[I]
> abs_m = abs(ai-bi)
> abs_p = abs(ai+bi)
> num += abs_p - abs_m
> den += abs_p + abs_m   
> end
> 1. - num/den
> end
>
>
> function testDistances5(a::Array{Float64,1}, b::Array{Float64,1})
> for i in 1:5000
> myjaccard5(a,b)
> end
> end
>
> end
>
>
> julia> @time testDistances5(v1,v2)
>   0.166979 seconds (4 allocations: 160 bytes)
>
>
>
> We see that using abs is faster.
>
> I do not do a pull request beccause
>
> I would expect a good implementation to be 2 or 3 times slower than 
> Euclidean, and I have not 
> that yet.
>
> Le lundi 13 juin 2016 13:43:00 UTC+2, Kristoffer Carlsson a écrit :
>>
>> It seems weird to me that you guys want to call Jaccard distance with 
>> float arrays. AFAIK Jaccard distance measures the distance between two 
>> distinct samples from a pair of sets so basically between two Vector{Bool}, 
>> see: 
>> http://docs.scipy.org/doc/scipy/reference/generated/scipy.spatial.distance.jaccard.html
>>
>> "Computes the Jaccard-Needham dissimilarity between two boolean 1-D 
>> arrays."
>>
>> Is there some more general formulation of it that extends to vectors in a 
>> continuous vector space?
>>
>> And, to note, Jaccard is type stable for inputs of Vector{Bool} in 
>> Distances.jl.
>>
>> On Monday, June 13, 2016 at 3:53:14 AM UTC+2, jean-pierre both wrote:
>>>
>>>
>>>
>>> I encountered in my application with Distances.Jaccard compared with 
>>> Distances.Euclidean
>>> It was very slow.
>>>
>>> For example with 2 vecteurs Float64 of size 11520
>>>
>>> I get the following 
>>> julia> D=Euclidean()
>>> Distances.Euclidean()
>>> julia> @time for i in 1:500
>>>evaluate(D,v1,v2)
>>>end
>>>   0.002553 seconds (500 allocations: 7.813 KB)
>>>
>>> and with Jaccard
>>>
>>> julia> D=Jaccard()
>>> Distances.Jaccard()
>>> @time for i in 1:500
>>>   evaluate(D,v1,v2)
>>>   end
>>>   1.995046 seconds (40.32 M allocations: 703.156 MB, 9.68% gc time)
>>>
>>> With a simple loop for computing jaccard :
>>>
>>>
>>> function myjaccard2(a::Array{Float64,1}, b::Array{Float64,1})
>>>num = 0
>>>den = 0
>>>for i in 1:length(a)
>>>num = num + min(a[i],b[i])
>>>den = den + max(a[i],b[i])  
>>>end
>>>1. - num/den
>>>end
>>> myjaccard2 (generic function with 1 method)
>>>
>>> julia> @time for i in 1:500
>>>   myjaccard2(v1,v2)
>>>   end
>>>   0.451582 seconds (23.04 M allocations: 351.592 MB, 20.04% gc time)
>>>
>>> I do not see the problem in jaccard distance implementation in the 
>>> Distances packages
>>>
>>

Re: [julia-users] Re: Packages Distances problem with Distances.Jaccard : very slow

2016-06-13 Thread Mauro
> function myjaccard2(a::Array{Float64,1}, b::Array{Float64,1})
> num = 0.
> den = 0.
> for I in 1:length(a)
> @inbounds ai = a[I]
> @inbounds bi = b[I]
> num = num + min(ai,bi)
> den = den + max(ai,bi)
> end
> 1. - num/den
> end
>
>
>
> function testDistances2(v1::Array{Float64,1}, v2::Array{Float64,1})
> for i in 1:5
> myjaccard2(v1,v2)
> end
> end

I recommend using the values returned for something, otherwise the
compiler sometimes eliminates the loop (but not here):

julia> function testDistances2(v1::Array{Float64,1}, v2::Array{Float64,1})
   out = 0.0
   for i in 1:5
   out += myjaccard2(v1,v2)
   end
   out
   end

> @time testDistances2(v1,v2)
> machine   3.217329 seconds (200.01 M allocations: 2.981 GB, 19.91% gc time)

I cannot reproduce this; when I run it I get no allocations:

julia> v2 = rand(10^4);

# warm-up
julia> @time testDistances2(v1,v2)
  3.604478 seconds (8.15 k allocations: 401.797 KB, 0.42% gc time)
24999.00112162811

julia> @time testDistances2(v1,v2)
  3.647563 seconds (5 allocations: 176 bytes)
24999.00112162811

What version of Julia are you running? Me: 0.4.5.

> function myjaccard5(a::Array{Float64,1}, b::Array{Float64,1})
> num = 0.
> den = 0.
> for I in 1:length(a)
> @inbounds ai = a[I]
> @inbounds bi = b[I]
> abs_m = abs(ai-bi)
> abs_p = abs(ai+bi)
> num += abs_p - abs_m
> den += abs_p + abs_m
> end
> 1. - num/den
> end
>
>
> function testDistances5(a::Array{Float64,1}, b::Array{Float64,1})
> for i in 1:5000
> myjaccard5(a,b)
> end
> end
>
> end
>
>
> julia> @time testDistances5(v1,v2)
>   0.166979 seconds (4 allocations: 160 bytes)
>
>
>
> We see that using abs is faster.
>
> I do not do a pull request beccause
>
> I would expect a good implementation to be 2 or 3 times slower than
> Euclidean, and I have not
> that yet.
>
> Le lundi 13 juin 2016 13:43:00 UTC+2, Kristoffer Carlsson a écrit :
>>
>> It seems weird to me that you guys want to call Jaccard distance with
>> float arrays. AFAIK Jaccard distance measures the distance between two
>> distinct samples from a pair of sets so basically between two Vector{Bool},
>> see:
>> http://docs.scipy.org/doc/scipy/reference/generated/scipy.spatial.distance.jaccard.html
>>
>> "Computes the Jaccard-Needham dissimilarity between two boolean 1-D
>> arrays."
>>
>> Is there some more general formulation of it that extends to vectors in a
>> continuous vector space?
>>
>> And, to note, Jaccard is type stable for inputs of Vector{Bool} in
>> Distances.jl.
>>
>> On Monday, June 13, 2016 at 3:53:14 AM UTC+2, jean-pierre both wrote:
>>>
>>>
>>>
>>> I encountered in my application with Distances.Jaccard compared with
>>> Distances.Euclidean
>>> It was very slow.
>>>
>>> For example with 2 vecteurs Float64 of size 11520
>>>
>>> I get the following
>>> julia> D=Euclidean()
>>> Distances.Euclidean()
>>> julia> @time for i in 1:500
>>>evaluate(D,v1,v2)
>>>end
>>>   0.002553 seconds (500 allocations: 7.813 KB)
>>>
>>> and with Jaccard
>>>
>>> julia> D=Jaccard()
>>> Distances.Jaccard()
>>> @time for i in 1:500
>>>   evaluate(D,v1,v2)
>>>   end
>>>   1.995046 seconds (40.32 M allocations: 703.156 MB, 9.68% gc time)
>>>
>>> With a simple loop for computing jaccard :
>>>
>>>
>>> function myjaccard2(a::Array{Float64,1}, b::Array{Float64,1})
>>>num = 0
>>>den = 0
>>>for i in 1:length(a)
>>>num = num + min(a[i],b[i])
>>>den = den + max(a[i],b[i])
>>>end
>>>1. - num/den
>>>end
>>> myjaccard2 (generic function with 1 method)
>>>
>>> julia> @time for i in 1:500
>>>   myjaccard2(v1,v2)
>>>   end
>>>   0.451582 seconds (23.04 M allocations: 351.592 MB, 20.04% gc time)
>>>
>>> I do not see the problem in jaccard distance implementation in the
>>> Distances packages
>>>
>>


[julia-users] Re: Pivoting when inverting a sparse matrix

2016-06-13 Thread Patrick Belliveau
Julia doesn't seem to have a built-in sparse direct solver specifically for 
symmetric indefinite matrices. However, as Tony alluded to, there are Julia 
packages that interface with the MUMPS and Pardiso libraries, which do pivoting 
for symmetric indefinite matrices while taking advantage of symmetry. Using 
Pardiso requires a paid license or MKL, but MUMPS is free and open source.

[julia-users] Re: Packages Distances problem with Distances.Jaccard : very slow

2016-06-13 Thread jean-pierre both
It makes perfect sense to use Jaccard distances for float values.
Cf. for example http://www.ncbi.nlm.nih.gov/pubmed/16794951

Nevertheless the problem is just one of implementation: the time spent should be
comparable with the Euclidean case.

The problem I mention is that the generic implementation used in the
Distances package is a problem for this distance, as a simple loop is
really faster.
I presume there is an optimization issue, as the difference in time with
Euclidean is many orders of magnitude larger than what can be expected
from the complexity.


The funny thing is that min and max seem also to be part of the problem, as can
be seen in the following:


function myjaccard2(a::Array{Float64,1}, b::Array{Float64,1})
    num = 0.
    den = 0.
    for i in 1:length(a)
        @inbounds ai = a[i]
        @inbounds bi = b[i]
        num = num + min(ai, bi)
        den = den + max(ai, bi)
    end
    1. - num/den
end


function testDistances2(v1::Array{Float64,1}, v2::Array{Float64,1})
    for i in 1:5
        myjaccard2(v1, v2)
    end
end

@time testDistances2(v1,v2)
machine   3.217329 seconds (200.01 M allocations: 2.981 GB, 19.91% gc time)



function myjaccard5(a::Array{Float64,1}, b::Array{Float64,1})
    num = 0.
    den = 0.
    for i in 1:length(a)
        @inbounds ai = a[i]
        @inbounds bi = b[i]
        abs_m = abs(ai - bi)
        abs_p = abs(ai + bi)
        num += abs_p - abs_m
        den += abs_p + abs_m
    end
    1. - num/den
end


function testDistances5(a::Array{Float64,1}, b::Array{Float64,1})
    for i in 1:5000
        myjaccard5(a, b)
    end
end


julia> @time testDistances5(v1,v2)
  0.166979 seconds (4 allocations: 160 bytes)



We see that using abs is faster.
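The two variants agree because, for nonnegative inputs (where |a+b| = a+b), min(a,b) = (a+b-|a-b|)/2 and max(a,b) = (a+b+|a-b|)/2, so the abs formulation scales numerator and denominator by the same factor of 2 and leaves the ratio unchanged. A quick sanity check, assuming the two definitions above are in scope:

```julia
v1 = rand(1000); v2 = rand(1000)   # nonnegative test vectors

# Both formulations should give the same distance up to rounding
@assert abs(myjaccard2(v1, v2) - myjaccard5(v1, v2)) < 1e-10
```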

I do not do a pull request because

I would expect a good implementation to be 2 or 3 times slower than 
Euclidean, and I am not there yet.

Le lundi 13 juin 2016 13:43:00 UTC+2, Kristoffer Carlsson a écrit :
>
> It seems weird to me that you guys want to call Jaccard distance with 
> float arrays. AFAIK Jaccard distance measures the distance between two 
> distinct samples from a pair of sets so basically between two Vector{Bool}, 
> see: 
> http://docs.scipy.org/doc/scipy/reference/generated/scipy.spatial.distance.jaccard.html
>
> "Computes the Jaccard-Needham dissimilarity between two boolean 1-D 
> arrays."
>
> Is there some more general formulation of it that extends to vectors in a 
> continuous vector space?
>
> And, to note, Jaccard is type stable for inputs of Vector{Bool} in 
> Distances.jl.
>
> On Monday, June 13, 2016 at 3:53:14 AM UTC+2, jean-pierre both wrote:
>>
>>
>>
>> I encountered in my application with Distances.Jaccard compared with 
>> Distances.Euclidean
>> It was very slow.
>>
>> For example with 2 vecteurs Float64 of size 11520
>>
>> I get the following 
>> julia> D=Euclidean()
>> Distances.Euclidean()
>> julia> @time for i in 1:500
>>evaluate(D,v1,v2)
>>end
>>   0.002553 seconds (500 allocations: 7.813 KB)
>>
>> and with Jaccard
>>
>> julia> D=Jaccard()
>> Distances.Jaccard()
>> @time for i in 1:500
>>   evaluate(D,v1,v2)
>>   end
>>   1.995046 seconds (40.32 M allocations: 703.156 MB, 9.68% gc time)
>>
>> With a simple loop for computing jaccard :
>>
>>
>> function myjaccard2(a::Array{Float64,1}, b::Array{Float64,1})
>>num = 0
>>den = 0
>>for i in 1:length(a)
>>num = num + min(a[i],b[i])
>>den = den + max(a[i],b[i])  
>>end
>>1. - num/den
>>end
>> myjaccard2 (generic function with 1 method)
>>
>> julia> @time for i in 1:500
>>   myjaccard2(v1,v2)
>>   end
>>   0.451582 seconds (23.04 M allocations: 351.592 MB, 20.04% gc time)
>>
>> I do not see the problem in jaccard distance implementation in the 
>> Distances packages
>>
>

Re: [julia-users] Question in using JuMP

2016-06-13 Thread JC
Thank you very much!

On Monday, June 13, 2016 at 1:17:27 PM UTC-4, Mauro wrote:
>
> Better repost this to julia-opt list.  Mauro 
>
> On Mon, 2016-06-13 at 04:27, JC wrote: 
> > I have been trying to use JuMP for an optimization problem and ran into 
> > many issues. 
> > 
> > 1. I made my own function called myf and try to register that so that I 
> can 
> > read it into my objective function but am getting error message saying 
> > "Unrecognized function :myf used in nonlinear expression". 
> > 
> > 2. my input is a vector and I am having hard time reading vector into 
> the 
> > objective function. 
> > 
> > My code is below where x1 and y1 is 3 by 50 matrix and alpha beta are 
> > projection matrix : 
> > 
> > - 
> > using JuMP 
> > m = Model(solver=NLoptSolver(algorithm=:SLSQP)) 
> > 
> > #- define variables 
> > @variable(m, alpha) 
> > @variable(m, beta) 
> > 
> > function myf(x,y,a,b) 
> >   # apply projection 
> >   w1 = a * x 
> >   w2 = b * x 
> >   z1 = a * y 
> >   z2 = b * y 
> > 
> >   # calculate covariance matrix 
> >   S1 = [var(w1) cov(w1',w2'); cov(w2',w1') var(w2)] 
> >   S2 = [var(z1) cov(z1',z2'); cov(z2',z1') var(z2)] 
> >   S0 = ((nperdimension-1)*S1 + (nperdimension-1)*S2)/(ntotal-2) 
> > 
> >   S1d = abs(S1[1,1]*S1[2,2] - S1[1,2]*S1[2,1]) 
> >   S2d = abs(S2[1,1]*S2[2,2] - S2[1,2]*S1[2,1]) 
> >   S0d = abs(S0[1,1]*S0[2,2] - S0[1,2]*S0[2,1]) 
> > 
> >   test = - log(S1d) - log(S2d) + log(S0d) 
> > 
> >   return test 
> > end 
> > 
> > JuMP.register(:myf, 1, myf, autodiff = true) 
> > 
> > @NLobjective(m, Max, myf(x1,y1,alpha,beta)) 
> > 
> > setvalue(alpha, [1 0 0]) 
> > setvalue(beta, [0 0 1]) 
> > 
> > status = solve(m) 
> > --- 
> > 
> > My objective is to start with two projection vector [1 0 0] and [0 0 1] 
> and 
> > find the projection vector that gives max of test values. 
> > Can someone please help me to get this working? 
> > Thank you. 
>


[julia-users] Interact.jl + ipywidgets 5.x.x

2016-06-13 Thread Josef Heinen
Did anyone manage to get Interact.jl working with an up-to-date version of 
ipywidgets? I had to set up a special environment containing ipywidgets 
v4.1.1 and notebook v4.2.0 to use Interact.jl in Jupyter with a Julia 0.4.5 
kernel, overwriting the PYTHONPATH before starting jupyter notebook:

export PYTHONPATH=/usr/local/envs/ipywidgets-4.1.1:${PYTHONPATH}


Re: [julia-users] Question in using JuMP

2016-06-13 Thread Mauro
Better repost this to the julia-opt list.  Mauro

On Mon, 2016-06-13 at 04:27, JC  wrote:
> I have been trying to use JuMP for an optimization problem and ran into
> many issues.
>
> 1. I made my own function called myf and tried to register it so that I can
> use it in my objective function, but I am getting an error message saying
> "Unrecognized function :myf used in nonlinear expression".
>
> 2. My input is a vector and I am having a hard time reading the vector into
> the objective function.
>
> My code is below, where x1 and y1 are 3-by-50 matrices and alpha, beta are
> projection matrices:
>
> -
> using JuMP
> m = Model(solver=NLoptSolver(algorithm=:SLSQP))
>
> #- define variables
> @variable(m, alpha)
> @variable(m, beta)
>
> function myf(x,y,a,b)
>   # apply projection
>   w1 = a * x
>   w2 = b * x
>   z1 = a * y
>   z2 = b * y
>
>   # calculate covariance matrix
>   S1 = [var(w1) cov(w1',w2'); cov(w2',w1') var(w2)]
>   S2 = [var(z1) cov(z1',z2'); cov(z2',z1') var(z2)]
>   S0 = ((nperdimension-1)*S1 + (nperdimension-1)*S2)/(ntotal-2)
>
>   S1d = abs(S1[1,1]*S1[2,2] - S1[1,2]*S1[2,1])
>   S2d = abs(S2[1,1]*S2[2,2] - S2[1,2]*S1[2,1])
>   S0d = abs(S0[1,1]*S0[2,2] - S0[1,2]*S0[2,1])
>
>   test = - log(S1d) - log(S2d) + log(S0d)
>
>   return test
> end
>
> JuMP.register(:myf, 1, myf, autodiff = true)
>
> @NLobjective(m, Max, myf(x1,y1,alpha,beta))
>
> setvalue(alpha, [1 0 0])
> setvalue(beta, [0 0 1])
>
> status = solve(m)
> ---
>
> My objective is to start with two projection vector [1 0 0] and [0 0 1] and
> find the projection vector that gives max of test values.
> Can someone please help me to get this working?
> Thank you.


[julia-users] ANN: Updated "lite" branch of Julia

2016-06-13 Thread Scott Jones
Since people have been looking at paring down the size of Julia recently, I 
went back and made sure my "lite" branch (
https://github.com/ScottPJones/julia/tree/spj/lite) of Julia was sync'ed 
with master, as that has had quite a lot of changes recently.

Here is some information comparing the "lite" branch to a normal Julia 
build:

After doing make install for my lite branch (with Make.user additionally 
turning off DOCS, HELP, and REPL, which is nice for making a version with 
low memory overhead for running scripts), I get the following:


When run, I get a Sys.maxrss() of 84,856,832; with DOCS, HELP, and REPL it’s 
108,204,032; and a normal build of master gives 161,628,160 (though the 
number returned tends to fluctuate a lot).

v0.4.5 returns much less, only 95,952,896, when called from a script.


sys.dylib is 37,992,484 bytes for the normal build and 17,089,520 for the 
lite build, so it takes about 45% of the space.


The following shows the amount of space taken by the installation (it’s 
less than half the size of a normal v0.5 install).


/j/julia/julia-lite

12:26 $ du -k -d 1

56 ./bin

4 ./etc

280 ./include

100284 ./lib

0 ./libexec

19104 ./share

119728 .


I get the following for a normal build:

/j/julia/julia-0613

12:26 $ du -k -d 1

56 ./bin

4 ./etc

280 ./include

229484 ./lib

0 ./libexec

19056 ./share

248880 .


[julia-users] how to learn about function arguments

2016-06-13 Thread ggggg
Hello,

I've been thinking about writing a Julia package that's similar to lmfit, 
which provides a nice API for developing and exploring fitting models. One 
of the key parts of the API: you create a Model from a function, and the 
model knows about all the arguments, which become the fit parameters. You 
can then give parameters values and bounds, fix or unfix them easily, and 
pass a Parameters object into a fit function call to be used for starting 
guesses.
In [1]: import lmfit

In [8]: def myfunc(x,a,b,c=4): return x*a+b*c

In [9]: model=lmfit.Model(myfunc)

In [10]: params = model.make_params(myfunc);params.pretty_print()
Parameters({
'a': ,
'b': ,
'c': ,
})

Clearly, we can't do exactly the same thing in Julia, because myfunc could 
have multiple definitions to work with multiple dispatch. So perhaps 
something like the following could work?

julia> myfunc(x,a,b;c=4) = x*a+b*c
myfunc (generic function with 1 method)

julia> method = @which myfunc(4,5,6;c=7)
myfunc(x, a, b) at none:1

julia> model = Model(method)

It looks like Base.arg_decl_parts will get me a list of the arguments, but 
it doesn't know about the keyword arguments or default values (neither does 
@which). Maybe use this API, but require keyword arguments to be added 
manually?

Or I could try something like

julia> x,a,b=collect(1:10),4,5

julia> @Model myfunc(x,a,b;c=4)

Then all the information about the desired arguments is in the argument to 
the macro. But it seems annoying for you to define a bunch of probably 
useless variables just to make the macro call to build the model.
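For reference, the variables don't actually need to exist for a macro to read the call: macros receive the unevaluated expression. A rough, hypothetical sketch of how an `@Model`-style macro could pull out the argument names and keyword defaults (all names here are made up for illustration):

```julia
# A macro sees the call as an Expr, so x, a, b need not be defined.
macro argnames(ex)
    ex.head == :call || error("expected a call expression")
    positional = Symbol[]
    keywords = Any[]
    for arg in ex.args[2:end]
        if isa(arg, Expr) && arg.head == :parameters   # everything after ';'
            for kw in arg.args                         # each is Expr(:kw, name, default)
                push!(keywords, (kw.args[1], kw.args[2]))
            end
        else
            push!(positional, arg)
        end
    end
    # Return the extracted info as a literal value, not code to evaluate.
    QuoteNode((ex.args[1], positional, keywords))
end

@argnames myfunc(x, a, b; c = 4)
# gives the function name, the positional argument names, and
# (keyword, default) pairs, without evaluating x, a, or b
```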

I don't have a good sense for how any of these would interact with callable 
types; do any problems come to mind?

Any thoughts would be appreciated.


Re: [julia-users] Abstract version of rng=1:end ?

2016-06-13 Thread Dan
A reason such 'extended' ranges might be good is the ability to dispatch 
on them for efficiency. For example, `deleteat!(vec,5:end)` could be 
implemented faster than a `deleteat!` with an arbitrary range. This would be 
applicable to other such structures with edge effects. Indeed, we can learn 
from mathematics, which has often embraced infinity for ease of use and 
expression.
Again, this complication might not be worth the benefits.

So how about half-bounded types of the form: 
typeof(5:end) == ExtendedUnitRange{BU} # constructed as 
ExtendedUnitRange{BU}(5); the BU stands for Bounded-Unbounded

An UnboundedUnbounded unit range could be like `:`, meaning unbounded in both 
directions.

To summarize: there are algorithmic optimizations which are enabled by 
knowing that the range being operated on spans to the end (or even just up 
close to the end by 
> On Mon, Jun 13, 2016 at 10:47 AM, Matthew Pearce  > wrote: 
> > Hello 
> > 
> > I find myself frequently wishing to define functions that accept a range 
> > argument to operate on like: 
> > 
> > function foo{T}(M::Array{T,1}, rng::UnitRange) 
> > return sum(M[rng].^2) 
> > end 
> > 
> > It would be nice to be able to pass in `rng = 1:end`, but this isn't 
> legal. 
> > So I need a signature like, 
>
> rng=1:endof(M) 
>
> > 
> > function foo{T}(M::Array{T,1}) 
> > foo(M, 1:length(M)) 
> > end 
> > 
> > Is there a way to set `rng` to achieve my intended effect without having 
> to 
> > resort to declaring additional functions? 
> > 
> > Cheers 
> > 
> > Matthew 
>
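A bare-bones sketch of the half-bounded idea (hypothetical names, Julia 0.4-era syntax; the `5:end` surface syntax itself would need parser support, so the endpoint type is spelled out explicitly here):

```julia
# Hypothetical half-bounded unit range: known start, unbounded above.
immutable BoundedUnbounded
    start::Int
end

# Resolve to a concrete UnitRange once the container's length is known.
resolve(r::BoundedUnbounded, v::AbstractVector) = r.start:endof(v)

# Dispatch enables the optimization: deleting a tail is just a resize,
# O(1) instead of shifting the surviving elements.
mydeleteat!(v::Vector, r::BoundedUnbounded) = resize!(v, r.start - 1)

v = collect(1:10)
mydeleteat!(v, BoundedUnbounded(5))   # v is now [1, 2, 3, 4]
```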


[julia-users] Re: Why does adding type assertion to field slow down this example?

2016-06-13 Thread Jason Merrill
On Thursday, June 9, 2016 at 6:36:35 PM UTC-4, Arch Robison wrote:
>
> Yes, I know if I declare x with a concrete type such as Int, the code will 
> do blazingly fast.  But I was poking around to see if declaring it with an 
> abstract type narrower than Any would help.
>

I think the other piece of standard advice in situations like these is to 
make a parametric type instead of a single type with abstract fields, so:

type NotAsSlow{T<:Unsigned}
  x::T
end


for i=1:3
@time flog(NotAsSlow(UInt64(42)))
end

0.014765 seconds (8.38 k allocations: 342.502 KB)
0.000945 seconds (6 allocations: 192 bytes)
0.001423 seconds (6 allocations: 192 bytes)





Re: [julia-users] ControKTH

2016-06-13 Thread Linus Härenstam-Nielsen
I have to admit I did not take any licensing issues into consideration before 
uploading the code. I've deleted the repository for now, but I will post 
here again if I end up re-uploading.

On Monday, June 13, 2016 at 4:34:01 PM UTC+2, Stefan Karpinski wrote:
>
> There is no LICENSE file in that directory, so using this code is legally 
> iffy. Linus says he "copied part of MatLab's algorithm", which makes 
> legality sound even more questionable. Linus, can you address the license 
> issues here and in the repository? What license was the original code 
> under? What license is this code under?
>
> On Sun, Jun 12, 2016 at 2:54 PM, Linus Härenstam-Nielsen <
> linu...@gmail.com > wrote:
>
>> Here we go: https://github.com/Linusnie/ControlBasic
>>
>> Note that at the time I used a 0.4.0 dev build so everything might not 
>> work properly. 
>>
>>
>> On Sunday, June 12, 2016 at 7:35:51 PM UTC+2, Linus Härenstam-Nielsen 
>> wrote:
>>>
>>> Hi author of (part of) ControlKTH here
>>>
>>> This was indeed a student project last summer, we based the development 
>>> on Control.jl . We decided 
>>> not to post the code publicly under the university name since I copied part 
>>> of MatLab's algorithm for drawing root locus diagrams, but since you ask I 
>>> could post it on my private repository. 
>>>
>>> I'll write here once I've done so. 
>>>
>>> On Sunday, June 12, 2016 at 5:47:34 AM UTC+2, Christian Peel wrote:

 Here are two pertinent links
 * 
 https://www.kth.se/polopoly_fs/1.627826!/harenstam-nielsen_lindemann.pdf
 * 
 https://www.kth.se/social/files/55f124f8f276547bba8a16f9/controldoc.pdf

 I guess they refer to a student project to implement basic control 
 theory concepts in Julia. Even if it is not too fancy, it would be useful 
 for the author (cc'd) to post the code publicly.  

 On Sat, Jun 11, 2016 at 4:48 PM, Stefan Karpinski >>> > wrote:

> The top Google hit on "ControKTH" is this message thread. After that 
> the hits are all nonsense. It's also completely unclear what EL1000 is 
> supposed to be.
>
> On Sat, Jun 11, 2016 at 6:06 AM, Henri Girard  
> wrote:
>
>> Hi,
>> Does somebody knows where I can get the control toolbox ?
>>
>> EL1000 home page and download the ControlKTH.zip
>> Is there other toolbox for julia ?
>> Regards
>> Henri
>>
>>
>>
>>
>


 -- 
 chris...@ieee.org

>>>
>

Re: [julia-users] Plotting lots of data

2016-06-13 Thread Tom Breloff
If you check out the dev branch of Plots, you can do 'plot(y,
t=:scattergl)' and it should use the WebGL method in Plotly.  Try it out if
you want... (and report back if you find anything interesting)

On Mon, Jun 13, 2016 at 9:11 AM, Jonathan Malmaud  wrote:

> Does the Plots.jl wrapper over PlotlyJS support the OpenGL-based scatter
> plotting?
>
> On Monday, June 13, 2016 at 8:08:42 AM UTC-4, Tom Breloff wrote:
>>
>> I recommend testing your stuff with Plots... The overhead should be
>> constant among backends, so you can use the same code to benchmark PyPlot,
>> GR and Plotly/PlotlyJS. See the "backends" page of the Plots docs for more
>> info. I recommend checking out master as there's been some good
>> fixes/improvements since my last tag (I'll try to tag again today)
>>
>> Also I'm hoping that there can be some hacking on the GLVisualize backend
>> during JuliaCon!
>>
>> On Monday, June 13, 2016, Mauro  wrote:
>>
>>> I also found that GR is a lot faster than PyPlot, so give that a try.
>>>
>>> On Mon, 2016-06-13 at 09:57, Andreas Lobinger 
>>> wrote:
>>> > Hello colleague,
>>> >
>>> > On Sunday, June 12, 2016 at 10:31:06 PM UTC+2, CrocoDuck O'Ducks wrote:
>>> >>
>>> >> Hi there!
>>> >>
>>> >> I have been experimenting a little with many plotting packages
>>> recently.
>>> >> Being used to Matlab PyPlot seemed to work well for me... until this
>>> bug
>>> >>  I did not figure
>>> out
>>> >> how to workaround. I often need to plot a lot of data. The fact is
>>> that I
>>> >> often work with sampled data, like audio. For example, I could have
>>> to plot
>>> >> 10 seconds of 192 kHz sampled audio. Even when PyPlot was working it
>>> was
>>> >> hard to plot so many data: PyPlot was used to give up after few
>>> errors. I
>>> >> tried also other packages (Gadfly and few others) but seems like they
>>> >> really struggle to plot so much stuff: they often kinda freeze. I am
>>> not
>>> >> sure wether I am missing something or using the packages improperly
>>> or the
>>> >> packages are somewhat limited at this stage. I have resorted to
>>> export the
>>> >> data to .mat files and plot with matlab...
>>> >>
>>> >> My question is:  how do you guys plot large data sets? Do you suggest
>>> a
>>> >> plot package in particular?
>>> >>
>>> >
>>> > I use some home-grown solutions and (still) Winston. I just tested
>>> Plots.jl
>>> > with GR backend and could plot 5e6 random lines. Maybe you look into
>>> this.
>>> >
>>> > Wishing a happy day,
>>> >  Andreas
>>>
>>


Re: [julia-users] Abstract version of rng=1:end ?

2016-06-13 Thread Yichao Yu
On Mon, Jun 13, 2016 at 10:47 AM, Matthew Pearce  wrote:
> Hello
>
> I find myself frequently wishing to define functions that accept a range
> argument to operate on like:
>
> function foo{T}(M::Array{T,1}, rng::UnitRange)
> return sum(M[rng].^2)
> end
>
> It would be nice to be able to pass in `rng = 1:end`, but this isn't legal.
> So I need a signature like,

rng=1:endof(M)

>
> function foo{T}(M::Array{T,1})
> foo(M, 1:length(M))
> end
>
> Is there a way to set `rng` to achieve my intended effect without having to
> resort to declaring additional functions?
>
> Cheers
>
> Matthew
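For what it's worth, the `endof` suggestion can also be folded into an optional argument, so a single definition covers both cases (a sketch in the 0.4-era syntax of the thread; optional arguments do still generate an extra method behind the scenes):

```julia
# A default value may refer to earlier arguments, so rng can default
# to the whole array without a separate method definition.
function foo{T}(M::Array{T,1}, rng::UnitRange=1:endof(M))
    return sum(M[rng].^2)
end

foo([1.0, 2.0, 3.0])       # sum of squares over 1:3 -> 14.0
foo([1.0, 2.0, 3.0], 2:3)  # sum of squares over 2:3 -> 13.0
```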


[julia-users] Abstract version of rng=1:end ?

2016-06-13 Thread Matthew Pearce
Hello

I find myself frequently wishing to define functions that accept a range 
argument to operate on like:

function foo{T}(M::Array{T,1}, rng::UnitRange)
return sum(M[rng].^2)
end

It would be nice to be able to pass in `rng = 1:end`, but this isn't legal. 
So I need a signature like,

function foo{T}(M::Array{T,1})
foo(M, 1:length(M))
end

Is there a way to set `rng` to achieve my intended effect without having to 
resort to declaring additional functions?

Cheers

Matthew


Re: [julia-users] ControKTH

2016-06-13 Thread Stefan Karpinski
There is no LICENSE file in that directory, so using this code is legally
iffy. Linus says he "copied part of MatLab's algorithm", which makes
legality sound even more questionable. Linus, can you address the license
issues here and in the repository? What license was the original code
under? What license is this code under?

On Sun, Jun 12, 2016 at 2:54 PM, Linus Härenstam-Nielsen <
linush...@gmail.com> wrote:

> Here we go: https://github.com/Linusnie/ControlBasic
>
> Note that at the time I used a 0.4.0 dev build so everything might not
> work properly.
>
>
> On Sunday, June 12, 2016 at 7:35:51 PM UTC+2, Linus Härenstam-Nielsen
> wrote:
>>
>> Hi author of (part of) ControlKTH here
>>
>> This was indeed a student project last summer, we based the development
>> on Control.jl . We decided not
>> to post the code publicly under the university name since I copied part of
>> MatLab's algorithm for drawing root locus diagrams, but since you ask I
>> could post it on my private repository.
>>
>> I'll write here once I've done so.
>>
>> On Sunday, June 12, 2016 at 5:47:34 AM UTC+2, Christian Peel wrote:
>>>
>>> Here are two pertinent links
>>> *
>>> https://www.kth.se/polopoly_fs/1.627826!/harenstam-nielsen_lindemann.pdf
>>> *
>>> https://www.kth.se/social/files/55f124f8f276547bba8a16f9/controldoc.pdf
>>>
>>> I guess they refer to a student project to implement basic control
>>> theory concepts in Julia. Even if it is not too fancy, it would be useful
>>> for the author (cc'd) to post the code publicly.
>>>
>>> On Sat, Jun 11, 2016 at 4:48 PM, Stefan Karpinski 
>>> wrote:
>>>
 The top Google hit on "ControKTH" is this message thread. After that
 the hits are all nonsense. It's also completely unclear what EL1000 is
 supposed to be.

 On Sat, Jun 11, 2016 at 6:06 AM, Henri Girard 
 wrote:

> Hi,
> Does somebody knows where I can get the control toolbox ?
>
> EL1000 home page and download the ControlKTH.zip
> Is there other toolbox for julia ?
> Regards
> Henri
>
>
>
>

>>>
>>>
>>> --
>>> chris...@ieee.org
>>>
>>


[julia-users] Question in using JuMP

2016-06-13 Thread JC
I have been trying to use JuMP for an optimization problem and ran into 
many issues.

1. I made my own function called myf and tried to register it so that I can 
use it in my objective function, but I am getting an error message saying 
"Unrecognized function :myf used in nonlinear expression".

2. My input is a vector, and I am having a hard time reading the vector into 
the objective function.

My code is below, where x1 and y1 are 3 by 50 matrices and alpha and beta 
are projection matrices:

-
using JuMP
m = Model(solver=NLoptSolver(algorithm=:SLSQP))

#- define variables
@variable(m, alpha)
@variable(m, beta)

function myf(x,y,a,b)
  # apply projection
  w1 = a * x
  w2 = b * x
  z1 = a * y
  z2 = b * y

  # calculate covariance matrix
  S1 = [var(w1) cov(w1',w2'); cov(w2',w1') var(w2)]
  S2 = [var(z1) cov(z1',z2'); cov(z2',z1') var(z2)]
  S0 = ((nperdimension-1)*S1 + (nperdimension-1)*S2)/(ntotal-2)

  S1d = abs(S1[1,1]*S1[2,2] - S1[1,2]*S1[2,1])
  S2d = abs(S2[1,1]*S2[2,2] - S2[1,2]*S1[2,1])
  S0d = abs(S0[1,1]*S0[2,2] - S0[1,2]*S0[2,1])

  test = - log(S1d) - log(S2d) + log(S0d)

  return test
end

JuMP.register(:myf, 1, myf, autodiff = true)

@NLobjective(m, Max, myf(x1,y1,alpha,beta))

setvalue(alpha, [1 0 0])
setvalue(beta, [0 0 1])

status = solve(m)
---

My objective is to start with the two projection vectors [1 0 0] and [0 0 1] 
and find the projection vector that gives the max of the test values.
Can someone please help me get this working?
Thank you.



Re: [julia-users] Plotting lots of data

2016-06-13 Thread Jonathan Malmaud
Does the Plots.jl wrapper over PlotlyJS support the OpenGL-based scatter 
plotting? 

On Monday, June 13, 2016 at 8:08:42 AM UTC-4, Tom Breloff wrote:
>
> I recommend testing your stuff with Plots... The overhead should be 
> constant among backends, so you can use the same code to benchmark PyPlot, 
> GR and Plotly/PlotlyJS. See the "backends" page of the Plots docs for more 
> info. I recommend checking out master as there's been some good 
> fixes/improvements since my last tag (I'll try to tag again today)
>
> Also I'm hoping that there can be some hacking on the GLVisualize backend 
> during JuliaCon! 
>
> On Monday, June 13, 2016, Mauro > wrote:
>
>> I also found that GR is a lot faster than PyPlot, so give that a try.
>>
>> On Mon, 2016-06-13 at 09:57, Andreas Lobinger  
>> wrote:
>> > Hello colleague,
>> >
>> > On Sunday, June 12, 2016 at 10:31:06 PM UTC+2, CrocoDuck O'Ducks wrote:
>> >>
>> >> Hi there!
>> >>
>> >> I have been experimenting a little with many plotting packages 
>> recently.
>> >> Being used to Matlab PyPlot seemed to work well for me... until this 
>> bug
>> >>  I did not figure 
>> out
>> >> how to workaround. I often need to plot a lot of data. The fact is 
>> that I
>> >> often work with sampled data, like audio. For example, I could have to 
>> plot
>> >> 10 seconds of 192 kHz sampled audio. Even when PyPlot was working it 
>> was
>> >> hard to plot so many data: PyPlot was used to give up after few 
>> errors. I
>> >> tried also other packages (Gadfly and few others) but seems like they
>> >> really struggle to plot so much stuff: they often kinda freeze. I am 
>> not
>> >> sure wether I am missing something or using the packages improperly or 
>> the
>> >> packages are somewhat limited at this stage. I have resorted to export 
>> the
>> >> data to .mat files and plot with matlab...
>> >>
>> >> My question is:  how do you guys plot large data sets? Do you suggest a
>> >> plot package in particular?
>> >>
>> >
>> > I use some home-grown solutions and (still) Winston. I just tested 
>> Plots.jl
>> > with GR backend and could plot 5e6 random lines. Maybe you look into 
>> this.
>> >
>> > Wishing a happy day,
>> >  Andreas
>>
>

[julia-users] making Juliadoc available to sphinx/python

2016-06-13 Thread Tamas Papp
Hi,

Could someone please explain what the recommended approach is for making
sure that the juliadoc module is available when I want to compile
documentation for _packages_ for offline reading (e.g. "make singlehtml")?

I tried

export PYTHONPATH="~/src/julia/deps/build/julia-env/src/juliadoc/"

which does not work (the path is correct). The particular error message
is

$ make singlehtml
sphinx-build -b singlehtml -d _build/doctrees   . _build/singlehtml
Running Sphinx v1.3.6

Exception occurred:
  File "conf.py", line 17, in 
import juliadoc
ImportError: No module named juliadoc

Googling leads to some solutions mentioning "virtualenv", but I am
unfamiliar with Python, so I am looking for the simplest setup on Linux
that would allow me to install into my home directory and compile package
docs for offline reading.

Best,

Tamas
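One thing worth checking (an assumption, since the shell setup isn't shown): the tilde is not expanded inside double quotes, so that export sets a literal `~/...` path that Python will never find. Using `$HOME` avoids the problem:

```shell
# Tilde expansion does not happen inside double quotes, so use $HOME:
export PYTHONPATH="$HOME/src/julia/deps/build/julia-env/src/juliadoc:${PYTHONPATH}"
echo "$PYTHONPATH"
```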


Re: [julia-users] Plotting lots of data

2016-06-13 Thread Tom Breloff
I recommend testing your stuff with Plots... The overhead should be
constant among backends, so you can use the same code to benchmark PyPlot,
GR and Plotly/PlotlyJS. See the "backends" page of the Plots docs for more
info. I recommend checking out master as there's been some good
fixes/improvements since my last tag (I'll try to tag again today)

Also I'm hoping that there can be some hacking on the GLVisualize backend
during JuliaCon!

On Monday, June 13, 2016, Mauro  wrote:

> I also found that GR is a lot faster than PyPlot, so give that a try.
>
> On Mon, 2016-06-13 at 09:57, Andreas Lobinger  > wrote:
> > Hello colleague,
> >
> > On Sunday, June 12, 2016 at 10:31:06 PM UTC+2, CrocoDuck O'Ducks wrote:
> >>
> >> Hi there!
> >>
> >> I have been experimenting a little with many plotting packages recently.
> >> Being used to Matlab PyPlot seemed to work well for me... until this bug
> >>  I did not figure out
> >> how to workaround. I often need to plot a lot of data. The fact is that
> I
> >> often work with sampled data, like audio. For example, I could have to
> plot
> >> 10 seconds of 192 kHz sampled audio. Even when PyPlot was working it was
> >> hard to plot so many data: PyPlot was used to give up after few errors.
> I
> >> tried also other packages (Gadfly and few others) but seems like they
> >> really struggle to plot so much stuff: they often kinda freeze. I am not
> >> sure wether I am missing something or using the packages improperly or
> the
> >> packages are somewhat limited at this stage. I have resorted to export
> the
> >> data to .mat files and plot with matlab...
> >>
> >> My question is:  how do you guys plot large data sets? Do you suggest a
> >> plot package in particular?
> >>
> >
> > I use some home-grown solutions and (still) Winston. I just tested
> Plots.jl
> > with GR backend and could plot 5e6 random lines. Maybe you look into
> this.
> >
> > Wishing a happy day,
> >  Andreas
>


[julia-users] Re: Packages Distances problem with Distances.Jaccard : very slow

2016-06-13 Thread Kristoffer Carlsson
It seems weird to me that you guys want to call Jaccard distance with float 
arrays. AFAIK Jaccard distance measures the distance between two distinct 
samples from a pair of sets, so basically between two Vector{Bool}; 
see: 
http://docs.scipy.org/doc/scipy/reference/generated/scipy.spatial.distance.jaccard.html

"Computes the Jaccard-Needham dissimilarity between two boolean 1-D arrays."

Is there some more general formulation of it that extends to vectors in a 
continuous vector space?

And, to note, Jaccard is type stable for inputs of Vector{Bool} in 
Distances.jl.

On Monday, June 13, 2016 at 3:53:14 AM UTC+2, jean-pierre both wrote:
>
>
>
> I encountered a problem in my application with Distances.Jaccard compared 
> with Distances.Euclidean: it was very slow.
>
> For example, with two Float64 vectors of size 11520
>
> I get the following 
> julia> D=Euclidean()
> Distances.Euclidean()
> julia> @time for i in 1:500
>evaluate(D,v1,v2)
>end
>   0.002553 seconds (500 allocations: 7.813 KB)
>
> and with Jaccard
>
> julia> D=Jaccard()
> Distances.Jaccard()
> @time for i in 1:500
>   evaluate(D,v1,v2)
>   end
>   1.995046 seconds (40.32 M allocations: 703.156 MB, 9.68% gc time)
>
> With a simple loop for computing jaccard :
>
>
> function myjaccard2(a::Array{Float64,1}, b::Array{Float64,1})
>num = 0
>den = 0
>for i in 1:length(a)
>num = num + min(a[i],b[i])
>den = den + max(a[i],b[i])  
>end
>1. - num/den
>end
> myjaccard2 (generic function with 1 method)
>
> julia> @time for i in 1:500
>   myjaccard2(v1,v2)
>   end
>   0.451582 seconds (23.04 M allocations: 351.592 MB, 20.04% gc time)
>
> I do not see the problem in jaccard distance implementation in the 
> Distances packages
>
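On the more general formulation: the min/max loop in the quoted code is (up to the `1 -`) the weighted Jaccard, also known as the Ruzicka, similarity, which extends Jaccard to nonnegative real vectors and reduces to the set version on 0/1 input. A sketch:

```julia
# Weighted (Ruzicka) Jaccard distance for nonnegative vectors:
#   d(a, b) = 1 - sum(min(a_i, b_i)) / sum(max(a_i, b_i))
# On Bool vectors, min is intersection and max is union, so this
# recovers the usual Jaccard distance between sets.
function weighted_jaccard(a, b)
    num = 0.0
    den = 0.0
    for i in 1:length(a)
        num += min(a[i], b[i])
        den += max(a[i], b[i])
    end
    return 1.0 - num/den
end

weighted_jaccard([true, false, true], [true, true, false])  # 1 - 1/3
```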


[julia-users] Re: Packages Distances problem with Distances.Jaccard : very slow

2016-06-13 Thread Simon Danisch
It also seems unnecessary to restrict it to Float64 and Array:

function myjaccard2{T}(A::AbstractArray{T,1}, B::AbstractArray{T, 1})
num = zero(T)
den = zero(T) 
for (a,b) in zip(A,B) 
num += min(a,b) 
den += max(a,b) 
end 
return 1.0 - num/den 
end

Maybe also use T1, T2 and then promote to a common type ;)

Best,
Simon

Am Montag, 13. Juni 2016 03:53:14 UTC+2 schrieb jean-pierre both:
>
>
>
> I encountered a problem in my application with Distances.Jaccard compared 
> with Distances.Euclidean: it was very slow.
>
> For example, with two Float64 vectors of size 11520
>
> I get the following 
> julia> D=Euclidean()
> Distances.Euclidean()
> julia> @time for i in 1:500
>evaluate(D,v1,v2)
>end
>   0.002553 seconds (500 allocations: 7.813 KB)
>
> and with Jaccard
>
> julia> D=Jaccard()
> Distances.Jaccard()
> @time for i in 1:500
>   evaluate(D,v1,v2)
>   end
>   1.995046 seconds (40.32 M allocations: 703.156 MB, 9.68% gc time)
>
> With a simple loop for computing jaccard :
>
>
> function myjaccard2(a::Array{Float64,1}, b::Array{Float64,1})
>num = 0
>den = 0
>for i in 1:length(a)
>num = num + min(a[i],b[i])
>den = den + max(a[i],b[i])  
>end
>1. - num/den
>end
> myjaccard2 (generic function with 1 method)
>
> julia> @time for i in 1:500
>   myjaccard2(v1,v2)
>   end
>   0.451582 seconds (23.04 M allocations: 351.592 MB, 20.04% gc time)
>
> I do not see the problem in jaccard distance implementation in the 
> Distances packages
>


Re: [julia-users] Packages Distances problem with Distances.Jaccard : very slow

2016-06-13 Thread Mauro
I am not sure I quite understand.  But it looks like you know how to
improve the Jaccard distance code, so why not make a pull request at
Distances.jl?

> function myjaccard2(a::Array{Float64,1}, b::Array{Float64,1})
>num = 0
>den = 0
>for i in 1:length(a)
>num = num + min(a[i],b[i])
>den = den + max(a[i],b[i])
>end
>1. - num/den
>end

Initialising num and den as Int gives you a type instability.  This
should work better (plus a few cosmetic changes):

function myjaccard2(a::Array{Float64,1}, b::Array{Float64,1})
num = 0.0
den = 0.0
for i in 1:length(a)
num += min(a[i],b[i])
den += max(a[i],b[i])
end
return 1. - num/den
end


Re: [julia-users] Re: Plotting lots of data

2016-06-13 Thread Mauro
I also found that GR is a lot faster than PyPlot, so give that a try.

On Mon, 2016-06-13 at 09:57, Andreas Lobinger  wrote:
> Hello colleague,
>
> On Sunday, June 12, 2016 at 10:31:06 PM UTC+2, CrocoDuck O'Ducks wrote:
>>
>> Hi there!
>>
>> I have been experimenting a little with many plotting packages recently.
>> Being used to Matlab PyPlot seemed to work well for me... until this bug
>>  I did not figure out
>> how to workaround. I often need to plot a lot of data. The fact is that I
>> often work with sampled data, like audio. For example, I could have to plot
>> 10 seconds of 192 kHz sampled audio. Even when PyPlot was working it was
>> hard to plot so many data: PyPlot was used to give up after few errors. I
>> tried also other packages (Gadfly and few others) but seems like they
>> really struggle to plot so much stuff: they often kinda freeze. I am not
>> sure wether I am missing something or using the packages improperly or the
>> packages are somewhat limited at this stage. I have resorted to export the
>> data to .mat files and plot with matlab...
>>
>> My question is:  how do you guys plot large data sets? Do you suggest a
>> plot package in particular?
>>
>
> I use some home-grown solutions and (still) Winston. I just tested Plots.jl
> with GR backend and could plot 5e6 random lines. Maybe you look into this.
>
> Wishing a happy day,
>  Andreas


[julia-users] Re: Plotting lots of data

2016-06-13 Thread Andreas Lobinger
Hello colleague,

On Sunday, June 12, 2016 at 10:31:06 PM UTC+2, CrocoDuck O'Ducks wrote:
>
> Hi there!
>
> I have been experimenting a little with many plotting packages recently. 
> Being used to Matlab PyPlot seemed to work well for me... until this bug 
>  I did not figure out 
> how to workaround. I often need to plot a lot of data. The fact is that I 
> often work with sampled data, like audio. For example, I could have to plot 
> 10 seconds of 192 kHz sampled audio. Even when PyPlot was working it was 
> hard to plot so many data: PyPlot was used to give up after few errors. I 
> tried also other packages (Gadfly and few others) but seems like they 
> really struggle to plot so much stuff: they often kinda freeze. I am not 
> sure wether I am missing something or using the packages improperly or the 
> packages are somewhat limited at this stage. I have resorted to export the 
> data to .mat files and plot with matlab...
>
> My question is:  how do you guys plot large data sets? Do you suggest a 
> plot package in particular?
>

I use some home-grown solutions and (still) Winston. I just tested Plots.jl 
with GR backend and could plot 5e6 random lines. Maybe you look into this.

Wishing a happy day,
 Andreas