Re: [julia-users] Got an exception of type ErrorException outside of a @test: type DataType has no field FactorMargin

2016-11-23 Thread Kevin Liu
Hey Yichao, it's late here, I will continue tomorrow. Thanks for your help!

> On Nov 23, 2016, at 22:59, Yichao Yu  wrote:
> 
>> On Wed, Nov 23, 2016 at 7:31 PM, Kevin Liu  wrote:
>> In `permute_dims = [Remain_dims,Remove_dims]`, both Remain_dims and Remove_dims
> 
> Which is why it is wrong.
> 
>> are vectors. Even if I `permute_dims = [Remain_dims]`, I still get the same
>> error.
> 
> 
> julia> [[1]]
> 1-element Array{Array{Int64,1},1}:
> [1]
> 
> julia> [[1];]
> 1-element Array{Int64,1}:
> 1
> 
> 
>> 
>>> On Wed, Nov 23, 2016 at 9:54 PM, Yichao Yu  wrote:
>>> 
>>>> On Wed, Nov 23, 2016 at 6:50 PM, Kevin Liu  wrote:
>>>> Attached!
>>> 
>>> ```
>>> help?> permutedims
>>> search: permutedims permutedims! ipermutedims
>>> 
>>>  permutedims(A, perm)
>>> 
>>>  Permute the dimensions of array A. perm is a vector specifying a
>>> permutation
>>>  of length ndims(A). This is a generalization of transpose for
>>>  multi-dimensional arrays. Transpose is equivalent to permutedims(A,
>>> [2,1]).
>>> 
>>>  julia> A = reshape(collect(1:8), (2,2,2))
>>>  2×2×2 Array{Int64,3}:
>>>  [:, :, 1] =
>>>   1  3
>>>   2  4
>>>  
>>>  [:, :, 2] =
>>>   5  7
>>>   6  8
>>> 
>>>  julia> permutedims(A, [3, 2, 1])
>>>  2×2×2 Array{Int64,3}:
>>>  [:, :, 1] =
>>>   1  3
>>>   5  7
>>>  
>>>  [:, :, 2] =
>>>   2  4
>>>   6  8
>>> ```
>>> 
>>> You are not giving `permutedims` the correct second parameter
>>> 
>>> (https://github.com/hpoit/MLN.jl/blob/1c13725666f34587e57c4a1757e6222cacaeab73/BN/src/FactorOperations.jl#L66).
>>> 
>>> 
>>>> 
>>>>> On Wed, Nov 23, 2016 at 9:44 PM, Yichao Yu  wrote:
>>>>> 
>>>>>> On Wed, Nov 23, 2016 at 4:02 PM, Kevin Liu  wrote:
>>>>>> Yichao, would you give me some direction? I am a bit lost.
>>>>> 
>>>>> Post and/or identify the error after you've fixed the `Factor.` problem
>>>>> 
>>>>>> 
>>>>>>> On Tue, Nov 22, 2016 at 7:58 PM, Kevin Liu  wrote:
>>>>>>> 
>>>>>>> Do you want a cut in the profits for helping me get it to work? It's
>>>>>>> a
>>>>>>> marathon. I still have Markov Random Field and Markov Logic Network
>>>>>>> in
>>>>>>> line... and two of the largest private Brazilian banks on standby.
>>>>>>> 
>>>>>>>>> On Nov 22, 2016, at 19:39, Yichao Yu  wrote:
>>>>>>>>> 
>>>>>>>>> On Tue, Nov 22, 2016 at 4:23 PM, Kevin Liu 
>>>>>>>>> wrote:
>>>>>>>>> I would like to remove variable "c" from factor C. I tried
>>>>>>>>> removing
>>>>>>>>> `Factor.` but it didn't work.
>>>>>>>> 
>>>>>>>> There might be (almost certainly) multiple mistakes in the code so
>>>>>>>> fixing one won't fix all of them.
>>>>>>>> 
>>>>>>>>> 
>>>>>>>>>> On Tue, Nov 22, 2016 at 6:54 PM, Yichao Yu 
>>>>>>>>>> wrote:
>>>>>>>>>> 
>>>>>>>>>>> On Tue, Nov 22, 2016 at 3:45 PM, Kevin Liu 
>>>>>>>>>>> wrote:
>>>>>>>>>>> Yichao, I used a hashtag in the last message to show you what I
>>>>>>>>>>> want
>>>>>>>>>>> to
>>>>>>>>>>> do. Is it clear?
>>>>>>>>>> 
>>>>>>>>>> No
>>>>>>>>>> 
>>>>>>>>>> I'm just talking about the `Factor.` in the line I linked. I
>>>>>>>>>> don't
>>>>>>>>>> know what you want to access. Do you just want `FactorMargin`?
>>>>>>>>>> What's
>>>>>>>>>> the extra `Factor.` for?
>>>>>>>>>> 
>>>>>>>>>>> 
>>>>>>>>>>> On Nov 22, 2016, at 18:27, Yichao Yu  wrote:
>>>>>>>>>>> 
>>>>>>>>>>>>> Yichao and DPSanders, I have already used instances of Factor
>>>>>>>>>>>>> on
>>>>>>>>>>>>> runtests.jl, instances A, B, and C
>>>>>>>>>>>> 
>>>>>>>>>>>> AFAICT you are still accessing a nonexistent field of a
>>>>>>>>>>>> type[1]
>>>>>>>>>>>> and
>>>>>>>>>>>> it's unclear what you actually want to do.
>>>>>>>>>>>> 
>>>>>>>>>>>> [1]
>>>>>>>>>>>> 
>>>>>>>>>>>> 
>>>>>>>>>>>> 
>>>>>>>>>>>> https://github.com/hpoit/MLN.jl/blob/1c13725666f34587e57c4a1757e6222cacaeab73/BN/src/FactorOperations.jl#L87
>>>>>>>>>>>> 
>>>>>>>>>>>>> 
>>>>>>>>>>>>> A=Factor(["a", "b"],[3, 2],[0.5, 0.1, 0.3, 0.8, 0, 0.9])
>>>>>>>>>>>>> B=Factor(["b", "c"],[2, 2],[0.5, 0.1, 0.7, 0.2])
>>>>>>>>>>>>> C = FactorProduct(A, B)
>>>>>>>>>>>>> FactorDropMargin(C, ["c"])
>>>>>>>>>>>>> 
>>>>>>>>>>>>> Do you recommend I make any of the functions in
>>>>>>>>>>>>> FactorOperations.jl
>>>>>>>>>>>>> into inner constructors of `type Factor` in Factor.jl?
>>>>>>>>> 
>>>>>>>>> 
>>>>>> 
>>>>>> 
>>>> 
>>>> 
>> 
>> 
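Yichao's `[[1]]` vs `[[1];]` comparison above is the crux of the `permute_dims` problem: a comma builds a vector of vectors, while a semicolon concatenates. A minimal sketch on current Julia (the `Remain_dims`/`Remove_dims` values here are made up to stand in for the thread's index vectors):

```julia
Remain_dims = [1, 2]   # dims to keep, as computed via indexin in the thread
Remove_dims = [3]      # dims to marginalize out

bad  = [Remain_dims, Remove_dims]   # Vector of Vectors -- not a valid perm
good = [Remain_dims; Remove_dims]   # flattens to [1, 2, 3], same as vcat(...)

A = reshape(collect(1:8), (2, 2, 2))
permutedims(A, good)                # ok: perm is an Int vector of length ndims(A)
```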


Re: [julia-users] Got an exception of type ErrorException outside of a @test: type DataType has no field FactorMargin

2016-11-23 Thread Kevin Liu
In `permute_dims = [Remain_dims,Remove_dims]`, both Remain_dims and Remove_dims
are vectors. Even if I `permute_dims = [Remain_dims]`, I still get the same
error.

On Wed, Nov 23, 2016 at 9:54 PM, Yichao Yu  wrote:

> On Wed, Nov 23, 2016 at 6:50 PM, Kevin Liu  wrote:
> > Attached!
>
> ```
> help?> permutedims
> search: permutedims permutedims! ipermutedims
>
>   permutedims(A, perm)
>
>   Permute the dimensions of array A. perm is a vector specifying a
> permutation
>   of length ndims(A). This is a generalization of transpose for
>   multi-dimensional arrays. Transpose is equivalent to permutedims(A,
> [2,1]).
>
>   julia> A = reshape(collect(1:8), (2,2,2))
>   2×2×2 Array{Int64,3}:
>   [:, :, 1] =
>1  3
>2  4
>   
>   [:, :, 2] =
>5  7
>6  8
>
>   julia> permutedims(A, [3, 2, 1])
>   2×2×2 Array{Int64,3}:
>   [:, :, 1] =
>1  3
>5  7
>   
>   [:, :, 2] =
>2  4
>6  8
> ```
>
> You are not giving `permutedims` the correct second parameter
> (https://github.com/hpoit/MLN.jl/blob/1c13725666f34587e57c4a1757e6222cacaeab73/BN/src/FactorOperations.jl#L66).


Re: [julia-users] Got an exception of type ErrorException outside of a @test: type DataType has no field FactorMargin

2016-11-22 Thread Kevin Liu
Thanks. Is there anything that sticks out to you?

On Tue, Nov 22, 2016 at 7:39 PM, Yichao Yu  wrote:

> On Tue, Nov 22, 2016 at 4:23 PM, Kevin Liu  wrote:
> > I would like to remove variable "c" from factor C. I tried removing
> > `Factor.` but it didn't work.
>
> There might be (almost certainly) multiple mistakes in the code so
> fixing one won't fix all of them.
>


Re: [julia-users] Got an exception of type ErrorException outside of a @test: type DataType has no field FactorMargin

2016-11-22 Thread Kevin Liu
I would like to remove variable "c" from factor C. I tried removing
`Factor.` but it didn't work.

On Tue, Nov 22, 2016 at 6:54 PM, Yichao Yu  wrote:

> On Tue, Nov 22, 2016 at 3:45 PM, Kevin Liu  wrote:
> > Yichao, I used a hashtag in the last message to show you what I want to
> do. Is it clear?
>
> No
>
> I'm just talking about the `Factor.` in the line I linked. I don't
> know what you want to access. Do you just want `FactorMargin`? What's
> the extra `Factor.` for?
>


Re: [julia-users] Got an exception of type ErrorException outside of a @test: type DataType has no field FactorMargin

2016-11-22 Thread Kevin Liu
Yichao, I used a hashtag in the last message to show you what I want to do. Is 
it clear? 

On Nov 22, 2016, at 18:27, Yichao Yu  wrote:

>> Yichao and DPSanders, I have already used instances of Factor on 
>> runtests.jl, instances A, B, and C
> 
> AFAICT you are still accessing a nonexistent field of a type[1] and
> it's unclear what you actually want to do.
> 
> [1] 
> https://github.com/hpoit/MLN.jl/blob/1c13725666f34587e57c4a1757e6222cacaeab73/BN/src/FactorOperations.jl#L87
> 
>> 
>> A=Factor(["a", "b"],[3, 2],[0.5, 0.1, 0.3, 0.8, 0, 0.9])
>> B=Factor(["b", "c"],[2, 2],[0.5, 0.1, 0.7, 0.2])
>> C = FactorProduct(A, B)
>> FactorDropMargin(C, ["c"])
>> 
>> Do you recommend I make any of the functions in FactorOperations.jl into 
>> inner constructors of `type Factor` in Factor.jl?


Re: [julia-users] Got an exception of type ErrorException outside of a @test: type DataType has no field FactorMargin

2016-11-22 Thread Kevin Liu
julia> A=Factor(["a", "b"],[3, 2],[0.5, 0.1, 0.3, 0.8, 0, 0.9]);

julia> B=Factor(["b", "c"],[2, 2],[0.5, 0.1, 0.7, 0.2]);

julia> C = FactorProduct(A, B)
Factor(["a", "b", "c"],[3, 2, 2],[0.25, 0.05, 0.15, 0.08, 0.0, 0.09, 0.35,
0.07, 0.21, 0.16, 0.0, 0.18])

julia> FactorDropMargin(C, ["c"]) # Yichao, this is what I want to do
Factor(["a", "b"],[3, 2],[0.6, 0.12, 0.36, 0.24, 0.0, 0.27]) # Yichao, this
is what I want to do

julia> FactorKeepMargin(C, ["b", "a"])
Factor(["b", "a"],[2, 3],[0.6, 0.24, 0.12, 0.0, 0.36, 0.27])

julia> FactorPermute(ans, [2, 1])
Factor(["a", "b"],[3, 2],[0.6, 0.12, 0.36, 0.24, 0.0, 0.27])

julia> FactorKeepMargin(C, ["a", "b"])
Factor(["a", "b"],[3, 2],[0.6, 0.12, 0.36, 0.24, 0.0, 0.27])

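For reference, the `FactorDropMargin(C, ["c"])` result in the session above can be reproduced with plain array operations. A sketch, assuming `val` is laid out column-major over `card` (which is what Julia's `reshape` uses):

```julia
# values of C = FactorProduct(A, B) from the session above
val  = [0.25, 0.05, 0.15, 0.08, 0.0, 0.09, 0.35, 0.07, 0.21, 0.16, 0.0, 0.18]
card = (3, 2, 2)                  # cardinalities of ["a", "b", "c"]

T = reshape(val, card)            # column-major tensor over (a, b, c)
M = sum(T; dims=3)                # marginalize out "c"; spelled sum(T, 3) on Julia 0.5
vec(M)                            # ≈ [0.6, 0.12, 0.36, 0.24, 0.0, 0.27], matching FactorDropMargin
```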


Re: [julia-users] Got an exception of type ErrorException outside of a @test: type DataType has no field FactorMargin

2016-11-22 Thread Kevin Liu
On Tuesday, November 22, 2016 at 5:53:14 PM UTC-2, Kevin Liu wrote:
> On Friday, November 18, 2016 at 4:07:44 PM UTC-2, Kevin Liu wrote:
> > Have a look please https://github.com/hpoit/MLN.jl/tree/master/BN
> > 

Re: [julia-users] Got an exception of type ErrorException outside of a @test: type DataType has no field FactorMargin

2016-11-22 Thread Kevin Liu
On Friday, November 18, 2016 at 4:07:44 PM UTC-2, Kevin Liu wrote:
> Have a look please https://github.com/hpoit/MLN.jl/tree/master/BN
> 

Re: [julia-users] Got an exception of type ErrorException outside of a @test: type DataType has no field FactorMargin

2016-11-18 Thread Kevin Liu
Have a look please https://github.com/hpoit/MLN.jl/tree/master/BN

On Friday, November 18, 2016 at 11:48:58 AM UTC-2, Yichao Yu wrote:
>
> On Thu, Nov 17, 2016 at 2:39 PM, Kevin Liu wrote: 
> > Right, I need the instance of Factor 
>
> Then use the instance of Factor. 
>
> > 
> > On Thursday, November 17, 2016 at 5:33:05 PM UTC-2, Yichao Yu wrote: 
> >> 
> >> On Thu, Nov 17, 2016 at 2:27 PM, Kevin Liu  wrote: 
> >> > I replaced Factor[:FactorMargin]() with Factor.FactorMargin() back 
> >> > again. 
> >> > 
> >> > Still, for FactorOperations.jl on Atom, I get {UndefVarError: Factor 
> not 
> >> > defined} at the end of each block. 
> >> > 
> >> > Factor is defined on Factor.jl, and that file evaluates fine. 
> >> 
> >> AFAICT the `Factor` is a type so `Factor.FactorMargin` is definitely 
> wrong. 
> >> 
> >> > 
> >> > The main file, BN.jl, includes Factor.jl and FactorOperations.jl and 
> >> > exports 
> >> > Factor, and also evaluates fine. 
> >> > 
> >> > On Wednesday, November 16, 2016 at 11:25:24 PM UTC-2, Yichao Yu 
> wrote: 
> >> >> 
> >> >> On Wed, Nov 16, 2016 at 7:24 PM, Kevin Liu  
> wrote: 
> >> >> > Hi Yichao! 
> >> >> 
> >> >> In general there's nothing from the code you posted that shows what 
> >> >> you want to do. 
> >> >> 
> >> >> > 
> >> >> > Here is the function from FactorOperations.jl 
> >> >> > 
> >> >> > function FactorDropMargin(A::Factor, Remove_var::Vector{String}) 
> >> >> > Remove_dims = indexin(Remove_var, A.var) 
> >> >> > if any(Remove_dims==0) 
> >> >> > error("Wrong variable!") 
> >> >> > end 
> >> >> > 
> >> >> > Remain_var = symdiff(A.var, Remove_var) 
> >> >> > Remain_dims = indexin(Remain_var, A.var) 
> >> >> > 
> >> >> > Factor[:FactorMargin](A, Remove_var, Remain_var, Remove_dims, 
> >> >> > Remain_dims) # line 85 
> >> >> 
> >> >> Unless you overloaded getindex on this type (which you should 
> include) 
> >> >> you are constructing a Vector of `Factor` from a symbol and then 
> >> >> calling it. It's impossible to tell what you actually want to do. 
> >> >> 
> >> >> And as I previously mentioned, unless you are using PyCall, the 
> issue 
> >> >> you linked is totally unrelated to this. 
> >> >> 
> >> >> > end 
> >> >> > 
> >> >> > runtests.jl: 
> >> >> > 
> >> >> > @testset "Multiply and marginalize factor" begin 
> >> >> > 
> >> >> >   A=Factor(["a", "b"],[3, 2],[0.5, 0.1, 0.3, 0.8, 0, 0.9]) 
> >> >> >   B=Factor(["b", "c"],[2, 2],[0.5, 0.1, 0.7, 0.2]) 
> >> >> >   C = FactorProduct(A, B) 
> >> >> >   FactorDropMargin(C, ["c"]) # line 19 
> >> >> >   FactorKeepMargin(C, ["b", "a"]) 
> >> >> >   FactorPermute(ans, [2, 1]) 
> >> >> >   FactorKeepMargin(C, ["a", "b"]) 
> >> >> > 
> >> >> > end 
> >> >> > 
> >> >> > what I got on the REPL: 
> >> >> > 
> >> >> > julia> Pkg.test("BN") 
> >> >> > 
> >> >> > INFO: Testing BN 
> >> >> > 
> >> >> > Test Summary: | 
> >> >> > 
> >> >> >   Define, permute factor, and call (var, card, val) | No tests 
> >> >> > 
> >> >> > Multiply and marginalize factor: Error During Test 
> >> >> > 
> >> >> >   Got an exception of type ErrorException outside of a @test 
> >> >> > 
> >> >> >   type DataType has no field FactorMargin 
> >> >> > 
> >> >> >in FactorDropMargin(::BN.Factor, ::Array{String,1}) at 
> >> >> > /Users/Corvus/.julia/v0.5/BN/src/FactorOperations.jl:85 
> >> >> > 
> >> >> >in macro expansion; at 
> >> >> > /Users/Corvus/.julia/v0.5/
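The two error messages in this exchange follow directly from how Julia lowers the two spellings: `Factor.FactorMargin` is field access on the `DataType` object itself (hence "type DataType has no field FactorMargin"), and `Factor[:FactorMargin]` lowers to `getindex(Factor, :FactorMargin)`, i.e. an attempt to construct a `Vector{Factor}` from a `Symbol`. A sketch with a hypothetical stand-in for the real `FactorMargin` (modern `struct` syntax; the thread's Julia 0.5 code spells it `type Factor`):

```julia
struct Factor
    var::Vector{String}
    card::Vector{Int}
    val::Vector{Float64}
end

# hypothetical stand-in for the real FactorMargin in FactorOperations.jl
FactorMargin(A::Factor) = A

A = Factor(["a"], [2], [0.5, 0.5])

# Factor.FactorMargin(A)  -> field access on the DataType, not a method lookup
# Factor[:FactorMargin]   -> getindex(Factor, :FactorMargin), i.e. an attempt
#                            to build a Vector{Factor} from a Symbol
FactorMargin(A)   # the fix: call the module-level function directly
```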

Re: [julia-users] Got an exception of type ErrorException outside of a @test: type DataType has no field FactorMargin

2016-11-17 Thread Kevin Liu
Right, I need the instance of Factor

On Thursday, November 17, 2016 at 5:33:05 PM UTC-2, Yichao Yu wrote:
>
> On Thu, Nov 17, 2016 at 2:27 PM, Kevin Liu wrote: 
> > I replaced Factor[:FactorMargin]() with Factor.FactorMargin() back 
> again. 
> > 
> > Still, for FactorOperations.jl on Atom, I get {UndefVarError: Factor not 
> > defined} at the end of each block. 
> > 
> > Factor is defined on Factor.jl, and that file evaluates fine. 
>
> AFAICT the `Factor` is a type so `Factor.FactorMargin` is definitely wrong. 
>
> > 
> > The main file, BN.jl, includes Factor.jl and FactorOperations.jl and 
> exports 
> > Factor, and also evaluates fine. 
> > 
> > On Wednesday, November 16, 2016 at 11:25:24 PM UTC-2, Yichao Yu wrote: 
> >> 
> >> On Wed, Nov 16, 2016 at 7:24 PM, Kevin Liu  wrote: 
> >> > Hi Yichao! 
> >> 
> >> In general there's nothing from the code you posted that shows what 
> >> you want to do. 
> >> 
> >> > 
> >> > Here is the function from FactorOperations.jl 
> >> > 
> >> > function FactorDropMargin(A::Factor, Remove_var::Vector{String}) 
> >> > Remove_dims = indexin(Remove_var, A.var) 
> >> > if any(Remove_dims==0) 
> >> > error("Wrong variable!") 
> >> > end 
> >> > 
> >> > Remain_var = symdiff(A.var, Remove_var) 
> >> > Remain_dims = indexin(Remain_var, A.var) 
> >> > 
> >> > Factor[:FactorMargin](A, Remove_var, Remain_var, Remove_dims, 
> >> > Remain_dims) # line 85 
> >> 
> >> Unless you overloaded getindex on this type (which you should include) 
> >> you are constructing a Vector of `Factor` from a symbol and then 
> >> calling it. It's impossible to tell what you actually want to do. 
> >> 
> >> And as I previously mentioned, unless you are using PyCall, the issue 
> >> you linked is totally unrelated to this. 
> >> 
> >> > end 
> >> > 
> >> > runtests.jl: 
> >> > 
> >> > @testset "Multiply and marginalize factor" begin 
> >> > 
> >> >   A=Factor(["a", "b"],[3, 2],[0.5, 0.1, 0.3, 0.8, 0, 0.9]) 
> >> >   B=Factor(["b", "c"],[2, 2],[0.5, 0.1, 0.7, 0.2]) 
> >> >   C = FactorProduct(A, B) 
> >> >   FactorDropMargin(C, ["c"]) # line 19 
> >> >   FactorKeepMargin(C, ["b", "a"]) 
> >> >   FactorPermute(ans, [2, 1]) 
> >> >   FactorKeepMargin(C, ["a", "b"]) 
> >> > 
> >> > end 
> >> > 
> >> > what I got on the REPL: 
> >> > 
> >> > julia> Pkg.test("BN") 
> >> > 
> >> > INFO: Testing BN 
> >> > 
> >> > Test Summary: | 
> >> > 
> >> >   Define, permute factor, and call (var, card, val) | No tests 
> >> > 
> >> > Multiply and marginalize factor: Error During Test 
> >> > 
> >> >   Got an exception of type ErrorException outside of a @test 
> >> > 
> >> >   type DataType has no field FactorMargin 
> >> > 
> >> >in FactorDropMargin(::BN.Factor, ::Array{String,1}) at 
> >> > /Users/Corvus/.julia/v0.5/BN/src/FactorOperations.jl:85 
> >> > 
> >> >    in macro expansion; at 
> >> > /Users/Corvus/.julia/v0.5/BN/test/runtests.jl:19 
> >> > [inlined] 
> >> > 
> >> >in macro expansion; at ./test.jl:672 [inlined] 
> >> > 
> >> >in anonymous at ./:? 
> >> > 
> >> >in include_from_node1(::String) at ./loading.jl:488 
> >> > 
> >> >in include_from_node1(::String) at 
> >> > 
> >> > 
> /Applications/Julia-0.5.app/Contents/Resources/julia/lib/julia/sys.dylib:? 
> >> > 
> >> >in process_options(::Base.JLOptions) at ./client.jl:262 
> >> > 
> >> >in _start() at ./client.jl:318 
> >> > 
> >> >in _start() at 
> >> > 
> >> > 
> /Applications/Julia-0.5.app/Contents/Resources/julia/lib/julia/sys.dylib:? 
> >> > 
> >> > Test Summary:   | Error  Total 
> >> > 
> >> >   Multiply and marginalize factor | 1  1 
> >> > 
> >> > 
> >> > On Wednesday, November 16, 2016 at 10:02:48 PM UTC-2, Yichao Yu 
> wrote: 
> >> >> 
> >> >> On Wed, Nov 16, 2016 at 6:50 PM, Kevin Liu  
> wrote: 
> >> >> > From this issue https://github.com/JuliaPy/PyPlot.jl/issues/157 I 
> >> >> > understand 
> >> >> 
> >> >> ^^ This is irrelevant unless you are using PyCall 
> >> >> 
> >> >> > 
> >> >> > `Factor[:FactorMargin](A, Remove_var, Remain_var, Remove_dims, 
> >> >> > Remain_dims)` 
> >> >> > (line 85 of FactorOperations.jl) should pass, as it does on Atom, 
> but 
> >> >> > not on 
> >> >> > the REPL, which throws 
> >> >> > 
> >> >> > Got an exception of type ErrorException outside of a @test 
> >> >> > 
> >> >> >   type DataType has no field FactorMargin 
> >> >> > 
> >> >> >in FactorDropMargin(::BN.Factor, ::Array{String,1}) at 
> >> >> > /Users/Corvus/.julia/v0.5/BN/src/FactorOperations.jl:85 
> >> >> 
> >> >> Impossible to tell without code. 
> >> >> 
> >> >> > 
> >> >> > 
> >> >> > Help, please. 
>
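One more latent bug in the quoted `FactorDropMargin`, separate from the `Factor.` issue: the guard `any(Remove_dims==0)` can never fire, because `==` between a `Vector` and a scalar is a single `Bool` comparison, not elementwise; the elementwise form is `.==`. A sketch of the intended check (note: on Julia 1.x, `indexin` marks misses with `nothing` rather than `0`):

```julia
Remove_dims = indexin(["z"], ["a", "b"])   # "z" is not a variable of the factor

# Julia 0.5: indexin returns 0 for a miss, and
#   any(Remove_dims == 0)    # false -- whole-vector comparison, guard never fires
#   any(Remove_dims .== 0)   # true  -- elementwise, what the check intends
#
# Julia 1.x: a miss is `nothing` instead of 0
any(isequal(nothing), Remove_dims)         # true -> should raise "Wrong variable!"
```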


Re: [julia-users] Got an exception of type ErrorException outside of a @test: type DataType has no field FactorMargin

2016-11-17 Thread Kevin Liu
`Factor.FactorMargin()` simply marginalizes the non-stated variables in Factor.

On Thursday, November 17, 2016 at 5:27:38 PM UTC-2, Kevin Liu wrote:
>
> I replaced Factor[:FactorMargin]() with Factor.FactorMargin() back again. 
>
> Still, for FactorOperations.jl on Atom, I get {UndefVarError: Factor not 
> defined} at the end of each block. 
>
> Factor is defined on Factor.jl, and that file evaluates fine. 
>
> The main file, BN.jl, includes Factor.jl and FactorOperations.jl and 
> exports Factor, and also evaluates fine. 
>
>

Re: [julia-users] Got an exception of type ErrorException outside of a @test: type DataType has no field FactorMargin

2016-11-17 Thread Kevin Liu
I reverted Factor[:FactorMargin]() back to Factor.FactorMargin(). 

Still, for FactorOperations.jl on Atom, I get {UndefVarError: Factor not 
defined} at the end of each block. 

Factor is defined in Factor.jl, and that file evaluates fine. 

The main file, BN.jl, includes Factor.jl and FactorOperations.jl and 
exports Factor, and also evaluates fine. 
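
For anyone hitting the same `UndefVarError`: a file that refers to `Factor` only sees that type when it is evaluated inside the module that defines it. Evaluating FactorOperations.jl on its own (e.g. block-by-block in Atom) runs it in `Main`, where `Factor` is not bound. A minimal layout sketch, with the file names from the thread and the contents illustrative:

```julia
# BN.jl -- the main module file
module BN

export Factor, FactorProduct, FactorDropMargin, FactorKeepMargin

include("Factor.jl")            # defines `type Factor ... end`
include("FactorOperations.jl")  # uses Factor, so it must come after Factor.jl

end # module
```

With this layout, `using BN` (or `Pkg.test`) evaluates both files inside `BN`, and `Factor` resolves in FactorOperations.jl.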

On Wednesday, November 16, 2016 at 11:25:24 PM UTC-2, Yichao Yu wrote:
>
> On Wed, Nov 16, 2016 at 7:24 PM, Kevin Liu > 
> wrote: 
> > Hi Yichao! 
>
> In general there's nothing from the code you posted that shows what 
> you want to do. 
>
> > 
> > Here is the function from FactorOperations.jl 
> > 
> > function FactorDropMargin(A::Factor, Remove_var::Vector{String}) 
> > Remove_dims = indexin(Remove_var, A.var) 
> > if any(Remove_dims==0) 
> > error("Wrong variable!") 
> > end 
> > 
> > Remain_var = symdiff(A.var, Remove_var) 
> > Remain_dims = indexin(Remain_var, A.var) 
> > 
> > Factor[:FactorMargin](A, Remove_var, Remain_var, Remove_dims, 
> > Remain_dims) # line 85 
>
> Unless you overloaded getindex on this type (which you should include) 
> you are constructing a Vector of `Factor` from a symbol and then 
> calling it. It's impossible to tell what you actually want to do. 
>
> And as I previously mentioned, unless you are using PyCall, the issue 
> you linked is totally unrelated to this. 
>
> > end 
> > 
> > runtests.jl: 
> > 
> > @testset "Multiply and marginalize factor" begin 
> > 
> >   A=Factor(["a", "b"],[3, 2],[0.5, 0.1, 0.3, 0.8, 0, 0.9]) 
> >   B=Factor(["b", "c"],[2, 2],[0.5, 0.1, 0.7, 0.2]) 
> >   C = FactorProduct(A, B) 
> >   FactorDropMargin(C, ["c"]) # line 19 
> >   FactorKeepMargin(C, ["b", "a"]) 
> >   FactorPermute(ans, [2, 1]) 
> >   FactorKeepMargin(C, ["a", "b"]) 
> > 
> > end 
> > 
> > what I got on the REPL: 
> > 
> > julia> Pkg.test("BN") 
> > 
> > INFO: Testing BN 
> > 
> > Test Summary: | 
> > 
> >   Define, permute factor, and call (var, card, val) | No tests 
> > 
> > Multiply and marginalize factor: Error During Test 
> > 
> >   Got an exception of type ErrorException outside of a @test 
> > 
> >   type DataType has no field FactorMargin 
> > 
> >in FactorDropMargin(::BN.Factor, ::Array{String,1}) at 
> > /Users/Corvus/.julia/v0.5/BN/src/FactorOperations.jl:85 
> > 
> >in macro expansion; at 
> /Users/Corvus/.julia/v0.5/BN/test/runtests.jl:19 
> > [inlined] 
> > 
> >in macro expansion; at ./test.jl:672 [inlined] 
> > 
> >in anonymous at ./:? 
> > 
> >in include_from_node1(::String) at ./loading.jl:488 
> > 
> >in include_from_node1(::String) at 
> > 
> /Applications/Julia-0.5.app/Contents/Resources/julia/lib/julia/sys.dylib:? 
> > 
> >in process_options(::Base.JLOptions) at ./client.jl:262 
> > 
> >in _start() at ./client.jl:318 
> > 
> >in _start() at 
> > 
> /Applications/Julia-0.5.app/Contents/Resources/julia/lib/julia/sys.dylib:? 
> > 
> > Test Summary:   | Error  Total 
> > 
> >   Multiply and marginalize factor | 1  1 
> > 
> > 
> > On Wednesday, November 16, 2016 at 10:02:48 PM UTC-2, Yichao Yu wrote: 
> >> 
> >> On Wed, Nov 16, 2016 at 6:50 PM, Kevin Liu  wrote: 
> >> > From this issue https://github.com/JuliaPy/PyPlot.jl/issues/157 I 
> >> > understand 
> >> 
> >> ^^ This is irrelevant unless you are using PyCall 
> >> 
> >> > 
> >> > `Factor[:FactorMargin](A, Remove_var, Remain_var, Remove_dims, 
> >> > Remain_dims)` 
> >> > (line 85 of FactorOperations.jl) should pass, as it does on Atom, but 
> >> > not on 
> >> > the REPL, which throws 
> >> > 
> >> > Got an exception of type ErrorException outside of a @test 
> >> > 
> >> >   type DataType has no field FactorMargin 
> >> > 
> >> >in FactorDropMargin(::BN.Factor, ::Array{String,1}) at 
> >> > /Users/Corvus/.julia/v0.5/BN/src/FactorOperations.jl:85 
> >> 
> >> Impossible to tell without code. 
> >> 
> >> > 
> >> > 
> >> > Help, please. 
>


Re: [julia-users] Got an exception of type ErrorException outside of a @test: type DataType has no field FactorMargin

2016-11-16 Thread Kevin Liu
Hi Yichao!

Here is the function from FactorOperations.jl

function FactorDropMargin(A::Factor, Remove_var::Vector{String})
Remove_dims = indexin(Remove_var, A.var)
if any(Remove_dims==0)
error("Wrong variable!")
end

Remain_var = symdiff(A.var, Remove_var)
Remain_dims = indexin(Remain_var, A.var)

Factor[:FactorMargin](A, Remove_var, Remain_var, Remove_dims, 
Remain_dims) # line 85
end
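
As Yichao explains elsewhere in the thread, `Factor[:FactorMargin]` does not look up a method: it constructs a one-element `Vector{Factor}` from the symbol `:FactorMargin` and then tries to call that array. A direct call is what was intended. A minimal corrected sketch (assuming `FactorMargin` is a plain function defined in the same module as `Factor`; the `.==` fix is needed because `Remove_dims == 0` compares the whole vector to a scalar):

```julia
function FactorDropMargin(A::Factor, Remove_var::Vector{String})
    Remove_dims = indexin(Remove_var, A.var)
    # indexin returns 0 (in Julia 0.5) for variables not found in A.var;
    # use elementwise .== so each entry is checked
    any(Remove_dims .== 0) && error("Wrong variable!")

    Remain_var  = symdiff(A.var, Remove_var)
    Remain_dims = indexin(Remain_var, A.var)

    # call the function directly instead of indexing the Factor type
    FactorMargin(A, Remove_var, Remain_var, Remove_dims, Remain_dims)
end
```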

runtests.jl:

@testset "Multiply and marginalize factor" begin

  A=Factor(["a", "b"],[3, 2],[0.5, 0.1, 0.3, 0.8, 0, 0.9])
  B=Factor(["b", "c"],[2, 2],[0.5, 0.1, 0.7, 0.2])
  C = FactorProduct(A, B)
  FactorDropMargin(C, ["c"]) # line 19
  FactorKeepMargin(C, ["b", "a"])
  FactorPermute(ans, [2, 1])
  FactorKeepMargin(C, ["a", "b"])

end

what I got on the REPL:

julia> Pkg.test("BN")

INFO: Testing BN

Test Summary: | 

  Define, permute factor, and call (var, card, val) | No tests

Multiply and marginalize factor: Error During Test

  Got an exception of type ErrorException outside of a @test

  type DataType has no field FactorMargin

   in FactorDropMargin(::BN.Factor, ::Array{String,1}) at 
/Users/Corvus/.julia/v0.5/BN/src/FactorOperations.jl:85

   in macro expansion; at /Users/Corvus/.julia/v0.5/BN/test/runtests.jl:19 
[inlined]

   in macro expansion; at ./test.jl:672 [inlined]

   in anonymous at ./:?

   in include_from_node1(::String) at ./loading.jl:488

   in include_from_node1(::String) at 
/Applications/Julia-0.5.app/Contents/Resources/julia/lib/julia/sys.dylib:?

   in process_options(::Base.JLOptions) at ./client.jl:262

   in _start() at ./client.jl:318

   in _start() at 
/Applications/Julia-0.5.app/Contents/Resources/julia/lib/julia/sys.dylib:?

Test Summary:   | Error  Total

  Multiply and marginalize factor |     1      1

On Wednesday, November 16, 2016 at 10:02:48 PM UTC-2, Yichao Yu wrote:
>
> On Wed, Nov 16, 2016 at 6:50 PM, Kevin Liu > 
> wrote: 
> > From this issue https://github.com/JuliaPy/PyPlot.jl/issues/157 I 
> understand 
>
> ^^ This is irrelevant unless you are using PyCall 
>
> > 
> > `Factor[:FactorMargin](A, Remove_var, Remain_var, Remove_dims, 
> Remain_dims)` 
> > (line 85 of FactorOperations.jl) should pass, as it does on Atom, but 
> not on 
> > the REPL, which throws 
> > 
> > Got an exception of type ErrorException outside of a @test 
> > 
> >   type DataType has no field FactorMargin 
> > 
> >in FactorDropMargin(::BN.Factor, ::Array{String,1}) at 
> > /Users/Corvus/.julia/v0.5/BN/src/FactorOperations.jl:85 
>
> Impossible to tell without code. 
>
> > 
> > 
> > Help, please. 
>


[julia-users] Got an exception of type ErrorException outside of a @test: type DataType has no field FactorMargin

2016-11-16 Thread Kevin Liu
From this issue https://github.com/JuliaPy/PyPlot.jl/issues/157 I 
understand 

`Factor[:FactorMargin](A, Remove_var, Remain_var, Remove_dims, 
Remain_dims)` (line 85 of FactorOperations.jl) should pass, as it does on 
Atom, but not on the REPL, which throws

Got an exception of type ErrorException outside of a @test

  type DataType has no field FactorMargin

   in FactorDropMargin(::BN.Factor, ::Array{String,1}) at 
/Users/Corvus/.julia/v0.5/BN/src/FactorOperations.jl:85


Help, please.


Re: [julia-users] Re: Reloading module doesn't redefine constant

2016-11-15 Thread Kevin Liu
For some reason, now everything just worked. 

On Tuesday, November 15, 2016 at 3:09:07 PM UTC-2, Kevin Liu wrote:
>
> Full code
>
> https://github.com/hpoit/MLN.jl/blob/master/Factor.jl
> https://github.com/hpoit/MLN.jl/blob/master/FactorOperations.jl
>
> On Tue, Nov 15, 2016 at 3:05 PM, Kevin Liu  wrote:
>
>> This is weird. Please observe include() in the two attachments I've made. 
>>
>> On Tue, Nov 15, 2016 at 11:37 AM, Yichao Yu  wrote:
>>
>>> On Mon, Nov 14, 2016 at 10:35 PM, Kevin Liu  wrote:
>>> > Does indentation affect `include("FactorOperations.jl")`? If I pull it 
>>> back
>>>
>>> No.
>>>
>>> > to where `module` starts, it says `incomplete module at ... requires 
>>> end`.
>>> > Then pushing it under `type` defines `module`.
>>>
>>> Unclear what you mean.
>>>
>>
>>
>

Re: [julia-users] Re: Reloading module doesn't redefine constant

2016-11-15 Thread Kevin Liu
Full code

https://github.com/hpoit/MLN.jl/blob/master/Factor.jl
https://github.com/hpoit/MLN.jl/blob/master/FactorOperations.jl

On Tue, Nov 15, 2016 at 3:05 PM, Kevin Liu  wrote:

> This is weird. Please observe include() in the two attachments I've made.
>
> On Tue, Nov 15, 2016 at 11:37 AM, Yichao Yu  wrote:
>
>> On Mon, Nov 14, 2016 at 10:35 PM, Kevin Liu  wrote:
>> > Does indentation affect `include("FactorOperations.jl")`? If I pull it
>> back
>>
>> No.
>>
>> > to where `module` starts, it says `incomplete module at ... requires
>> end`.
>> > Then pushing it under `type` defines `module`.
>>
>> Unclear what you mean.
>>
>
>


Re: [julia-users] Re: Reloading module doesn't redefine constant

2016-11-14 Thread Kevin Liu
Cool, thanks Yichao. I changed module Factor to module FactorNode. Now I 
got 


TypeError: is defined: expected Symbol, got 
Type{FactorNode.FactorNode.Factor}

(see attachment)

On Monday, November 14, 2016 at 10:41:43 PM UTC-2, Yichao Yu wrote:
>
> On Mon, Nov 14, 2016 at 7:39 PM, Kevin Liu > 
> wrote: 
> > 
> > 
> > On Monday, November 14, 2016 at 10:36:07 PM UTC-2, Kevin Liu wrote: 
> >> 
> >> Help! (see attachment) 
>
> This is not related to reloading. You can't have a global variable 
> with the same name of the module since that's already bound to the 
> module itself. 
>


[julia-users] Re: Reloading module doesn't redefine constant

2016-11-14 Thread Kevin Liu


On Monday, November 14, 2016 at 10:36:07 PM UTC-2, Kevin Liu wrote:
>
> Help! (see attachment)
>


[julia-users] Existential quantifier

2016-10-18 Thread Kevin Liu
What's the right way to do this? 

julia> parse("∃x(sister(x,Spot) & cat(x))")

LoadError: ParseError("invalid character \"∃\"")
while loading In[15], in expression starting on line 7

 in parse at parse.jl:180
 in parse at parse.jl:190

Thanks, Kevin
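
The parser rejects `∃` because it is not among the characters Julia accepts in identifiers or operators, so it cannot appear in an expression at all. One workaround is to spell the quantifier as an ordinary higher-order function over an explicit domain — a sketch, where `sister`, `cat`, and the domain are hypothetical stand-ins for the example:

```julia
# exists(pred, domain): true if pred holds for at least one element
exists(pred, domain) = any(pred, domain)

# toy relations for the sentence "some x is Spot's sister and a cat"
cat(x)       = x in ["Felix", "Tom"]
sister(x, y) = (x, y) == ("Felix", "Spot")

exists(x -> sister(x, "Spot") && cat(x), ["Felix", "Tom", "Rex"])  # true for this toy data
```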


[julia-users] Re: Representation of a material conditional (implication)

2016-10-08 Thread Kevin Liu
Thanks Fengyang

On Saturday, October 8, 2016 at 12:39:42 AM UTC-3, Fengyang Wang wrote:
>
> As Jussi Piitulainen noted, the ^ operator is backwards, so you need to 
> wrap it around a function.
>
> On Friday, October 7, 2016 at 10:05:34 AM UTC-4, Kevin Liu wrote:
>>
>> julia> @code_native(b^a)
>>
>>.section__TEXT,__text,regular,pure_instructions
>>
>> Filename: bool.jl
>>
>> Source line: 39
>>
>>pushq   %rbp
>>
>>movq%rsp, %rbp
>>
>> Source line: 39
>>
>>xorb$1, %sil
>>
>>orb %dil, %sil
>>
>>movb%sil, %al
>>
>>popq%rbp
>>
>>ret
>>
>> julia> @code_native(a<=b)
>>
>>.section__TEXT,__text,regular,pure_instructions
>>
>> Filename: bool.jl
>>
>> Source line: 29
>>
>>pushq   %rbp
>>
>>movq%rsp, %rbp
>>
>> Source line: 29
>>
>>xorb$1, %dil
>>
>>orb %sil, %dil
>>
>>movb%dil, %al
>>
>>popq%rbp
>>
>>ret
>>
>> julia> @code_native(ifelse(a,b,true))
>>
>>.section__TEXT,__text,regular,pure_instructions
>>
>> Filename: operators.jl
>>
>> Source line: 48
>>
>>pushq   %rbp
>>
>>movq%rsp, %rbp
>>
>>testb   $1, %dil
>>
>> Source line: 48
>>
>>jne L17
>>
>>movb%dl, %sil
>>
>> L17:   movb%sil, %al
>>
>>popq%rbp
>>
>>ret
>>
>>
>>
>> On Friday, October 7, 2016 at 10:58:34 AM UTC-3, Kevin Liu wrote:
>>>
>>> julia> @code_llvm(b^a)
>>>
>>>
>>> define i1 @"julia_^_21646"(i1, i1) {
>>>
>>> top:
>>>
>>>   %2 = xor i1 %1, true
>>>
>>>   %3 = or i1 %0, %2
>>>
>>>   ret i1 %3
>>>
>>> }
>>>
>>> On Friday, October 7, 2016 at 10:56:26 AM UTC-3, Kevin Liu wrote:
>>>>
>>>> Sorry, no need, I got this
>>>>
>>>> julia> @code_llvm(a<=b)
>>>>
>>>>
>>>> define i1 @"julia_<=_21637"(i1, i1) {
>>>>
>>>> top:
>>>>
>>>>   %2 = xor i1 %0, true
>>>>
>>>>   %3 = or i1 %1, %2
>>>>
>>>>   ret i1 %3
>>>>
>>>> }
>>>>
>>>>
>>>> julia> @code_llvm(ifelse(a,b,true))
>>>>
>>>>
>>>> define i1 @julia_ifelse_21636(i1, i1, i1) {
>>>>
>>>> top:
>>>>
>>>>   %3 = select i1 %0, i1 %1, i1 %2
>>>>
>>>>   ret i1 %3
>>>>
>>>> }
>>>>
>>>>
>>>> How do you read this output?
>>>>
>>>> On Friday, October 7, 2016 at 10:50:57 AM UTC-3, Kevin Liu wrote:
>>>>>
>>>>> Jeffrey, can you show the expression you put inside @code_llvm() and 
>>>>> @code_native() for evaluation? 
>>>>>
>>>>> On Friday, October 7, 2016 at 2:26:56 AM UTC-3, Jeffrey Sarnoff wrote:
>>>>>>
>>>>>> Hi Jussi,
>>>>>>
>>>>>> Your version compiles down more neatly than the ifelse version. On my 
>>>>>> system, BenchmarkTools gives nearly identical results; I don't know why, 
>>>>>> but the ifelse version is consistently a smidge faster (~2%, relative 
>>>>>> speed). Here is the llvm code and local native code for each, your 
>>>>>> version 
>>>>>> looks more tidy.  
>>>>>>
>>>>>>
>>>>>> ```
>>>>>> implies(p::Bool, q::Bool) = (p <= q)
>>>>>>
>>>>>> # llvm
>>>>>>   %2 = xor i8 %0, 1
>>>>>>   %3 = or i8 %2, %1
>>>>>>   ret i8 %3
>>>>>>
>>>>>> # native with some common code removed
>>>>>>   xorb   $1, %dil
>>>>>>   orb    %sil, %dil
>>>>>>   movb   %dil, %al
>>>>>>   popq   %rbp
>>>>>>   retq
>>>>>>
>>>>>> implies(p::Bool, q::Bool) = ifelse( p, q, true )
>>>>>>
>>>>>> # llvm
>>>>>>   %2 = and i8 %0, 1
>>>>>>   %3 = icmp eq i8 %2, 0
>>>>>>   %4 = select i1 %3, i8 1, i8 %1
>>>>>>   ret i8 %4
>>>>>>
>>>>>> # native with some common code removed
>>>>>>   testb  $1, %dil
>>>>>>   movb   $1, %al
>>>>>>   je     L15
>>>>>>   movb   %sil, %al
>>>>>> L15:  popq   %rbp
>>>>>>   retq
>>>>>> ```
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Friday, October 7, 2016 at 12:22:23 AM UTC-4, Jussi Piitulainen 
>>>>>> wrote:
>>>>>>>
>>>>>>>
>>>>>>> implies(p::Bool, q::Bool) = p <= q
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> torstai 6. lokakuuta 2016 19.10.51 UTC+3 Kevin Liu kirjoitti:
>>>>>>>>
>>>>>>>> How is an implication represented in Julia? 
>>>>>>>>
>>>>>>>>
>>>>>>>> https://en.wikipedia.org/wiki/Material_conditional#Definitions_of_the_material_conditional
>>>>>>>>
>>>>>>>

[julia-users] Re: Representation of a material conditional (implication)

2016-10-07 Thread Kevin Liu


julia> @code_native(b^a)

   .section__TEXT,__text,regular,pure_instructions

Filename: bool.jl

Source line: 39

   pushq   %rbp

   movq%rsp, %rbp

Source line: 39

   xorb$1, %sil

   orb %dil, %sil

   movb%sil, %al

   popq%rbp

   ret

julia> @code_native(a<=b)

   .section__TEXT,__text,regular,pure_instructions

Filename: bool.jl

Source line: 29

   pushq   %rbp

   movq%rsp, %rbp

Source line: 29

   xorb$1, %dil

   orb %sil, %dil

   movb%dil, %al

   popq%rbp

   ret

julia> @code_native(ifelse(a,b,true))

   .section__TEXT,__text,regular,pure_instructions

Filename: operators.jl

Source line: 48

   pushq   %rbp

   movq%rsp, %rbp

   testb   $1, %dil

Source line: 48

   jne L17

   movb%dl, %sil

L17:   movb%sil, %al

   popq%rbp

   ret



On Friday, October 7, 2016 at 10:58:34 AM UTC-3, Kevin Liu wrote:
>
> julia> @code_llvm(b^a)
>
>
> define i1 @"julia_^_21646"(i1, i1) {
>
> top:
>
>   %2 = xor i1 %1, true
>
>   %3 = or i1 %0, %2
>
>   ret i1 %3
>
> }
>
> On Friday, October 7, 2016 at 10:56:26 AM UTC-3, Kevin Liu wrote:
>>
>> Sorry, no need, I got this
>>
>> julia> @code_llvm(a<=b)
>>
>>
>> define i1 @"julia_<=_21637"(i1, i1) {
>>
>> top:
>>
>>   %2 = xor i1 %0, true
>>
>>   %3 = or i1 %1, %2
>>
>>   ret i1 %3
>>
>> }
>>
>>
>> julia> @code_llvm(ifelse(a,b,true))
>>
>>
>> define i1 @julia_ifelse_21636(i1, i1, i1) {
>>
>> top:
>>
>>   %3 = select i1 %0, i1 %1, i1 %2
>>
>>   ret i1 %3
>>
>> }
>>
>>
>> How do you read this output?
>>
>> On Friday, October 7, 2016 at 10:50:57 AM UTC-3, Kevin Liu wrote:
>>>
>>> Jeffrey, can you show the expression you put inside @code_llvm() and 
>>> @code_native() for evaluation? 
>>>
>>> On Friday, October 7, 2016 at 2:26:56 AM UTC-3, Jeffrey Sarnoff wrote:
>>>>
>>>> Hi Jussi,
>>>>
>>>> Your version compiles down more neatly than the ifelse version. On my 
>>>> system, BenchmarkTools gives nearly identical results; I don't know why, 
>>>> but the ifelse version is consistently a smidge faster (~2%, relative 
>>>> speed). Here is the llvm code and local native code for each, your version 
>>>> looks more tidy.  
>>>>
>>>>
>>>> ```
>>>> implies(p::Bool, q::Bool) = (p <= q)
>>>>
>>>> # llvm
>>>>   %2 = xor i8 %0, 1
>>>>   %3 = or i8 %2, %1
>>>>   ret i8 %3
>>>>
>>>> # native with some common code removed
>>>>   xorb   $1, %dil
>>>>   orb    %sil, %dil
>>>>   movb   %dil, %al
>>>>   popq   %rbp
>>>>   retq
>>>>
>>>> implies(p::Bool, q::Bool) = ifelse( p, q, true )
>>>>
>>>> # llvm
>>>>   %2 = and i8 %0, 1
>>>>   %3 = icmp eq i8 %2, 0
>>>>   %4 = select i1 %3, i8 1, i8 %1
>>>>   ret i8 %4
>>>>
>>>> # native with some common code removed
>>>>   testb  $1, %dil
>>>>   movb   $1, %al
>>>>   je     L15
>>>>   movb   %sil, %al
>>>> L15:  popq   %rbp
>>>>   retq
>>>> ```
>>>>
>>>>
>>>>
>>>>
>>>> On Friday, October 7, 2016 at 12:22:23 AM UTC-4, Jussi Piitulainen 
>>>> wrote:
>>>>>
>>>>>
>>>>> implies(p::Bool, q::Bool) = p <= q
>>>>>
>>>>>
>>>>>
>>>>> torstai 6. lokakuuta 2016 19.10.51 UTC+3 Kevin Liu kirjoitti:
>>>>>>
>>>>>> How is an implication represented in Julia? 
>>>>>>
>>>>>>
>>>>>> https://en.wikipedia.org/wiki/Material_conditional#Definitions_of_the_material_conditional
>>>>>>
>>>>>

[julia-users] Re: Representation of a material conditional (implication)

2016-10-07 Thread Kevin Liu
 

julia> @code_llvm(b^a)


define i1 @"julia_^_21646"(i1, i1) {

top:

  %2 = xor i1 %1, true

  %3 = or i1 %0, %2

  ret i1 %3

}

On Friday, October 7, 2016 at 10:56:26 AM UTC-3, Kevin Liu wrote:
>
> Sorry, no need, I got this
>
> julia> @code_llvm(a<=b)
>
>
> define i1 @"julia_<=_21637"(i1, i1) {
>
> top:
>
>   %2 = xor i1 %0, true
>
>   %3 = or i1 %1, %2
>
>   ret i1 %3
>
> }
>
>
> julia> @code_llvm(ifelse(a,b,true))
>
>
> define i1 @julia_ifelse_21636(i1, i1, i1) {
>
> top:
>
>   %3 = select i1 %0, i1 %1, i1 %2
>
>   ret i1 %3
>
> }
>
>
> How do you read this output?
>
> On Friday, October 7, 2016 at 10:50:57 AM UTC-3, Kevin Liu wrote:
>>
>> Jeffrey, can you show the expression you put inside @code_llvm() and 
>> @code_native() for evaluation? 
>>
>> On Friday, October 7, 2016 at 2:26:56 AM UTC-3, Jeffrey Sarnoff wrote:
>>>
>>> Hi Jussi,
>>>
>>> Your version compiles down more neatly than the ifelse version. On my 
>>> system, BenchmarkTools gives nearly identical results; I don't know why, 
>>> but the ifelse version is consistently a smidge faster (~2%, relative 
>>> speed). Here is the llvm code and local native code for each, your version 
>>> looks more tidy.  
>>>
>>>
>>> ```
>>> implies(p::Bool, q::Bool) = (p <= q)
>>>
>>> # llvm
>>>   %2 = xor i8 %0, 1
>>>   %3 = or i8 %2, %1
>>>   ret i8 %3
>>>
>>> # native with some common code removed
>>>   xorb   $1, %dil
>>>   orb    %sil, %dil
>>>   movb   %dil, %al
>>>   popq   %rbp
>>>   retq
>>>
>>> implies(p::Bool, q::Bool) = ifelse( p, q, true )
>>>
>>> # llvm
>>>   %2 = and i8 %0, 1
>>>   %3 = icmp eq i8 %2, 0
>>>   %4 = select i1 %3, i8 1, i8 %1
>>>   ret i8 %4
>>>
>>> # native with some common code removed
>>>   testb  $1, %dil
>>>   movb   $1, %al
>>>   je     L15
>>>   movb   %sil, %al
>>> L15:  popq   %rbp
>>>   retq
>>> ```
>>>
>>>
>>>
>>>
>>> On Friday, October 7, 2016 at 12:22:23 AM UTC-4, Jussi Piitulainen wrote:
>>>>
>>>>
>>>> implies(p::Bool, q::Bool) = p <= q
>>>>
>>>>
>>>>
>>>> torstai 6. lokakuuta 2016 19.10.51 UTC+3 Kevin Liu kirjoitti:
>>>>>
>>>>> How is an implication represented in Julia? 
>>>>>
>>>>>
>>>>> https://en.wikipedia.org/wiki/Material_conditional#Definitions_of_the_material_conditional
>>>>>
>>>>

[julia-users] Re: Representation of a material conditional (implication)

2016-10-07 Thread Kevin Liu
Sorry, no need, I got this

julia> @code_llvm(a<=b)


define i1 @"julia_<=_21637"(i1, i1) {

top:

  %2 = xor i1 %0, true

  %3 = or i1 %1, %2

  ret i1 %3

}


julia> @code_llvm(ifelse(a,b,true))


define i1 @julia_ifelse_21636(i1, i1, i1) {

top:

  %3 = select i1 %0, i1 %1, i1 %2

  ret i1 %3

}


How do you read this output?

On Friday, October 7, 2016 at 10:50:57 AM UTC-3, Kevin Liu wrote:
>
> Jeffrey, can you show the expression you put inside @code_llvm() and 
> @code_native() for evaluation? 
>
> On Friday, October 7, 2016 at 2:26:56 AM UTC-3, Jeffrey Sarnoff wrote:
>>
>> Hi Jussi,
>>
>> Your version compiles down more neatly than the ifelse version. On my 
>> system, BenchmarkTools gives nearly identical results; I don't know why, 
>> but the ifelse version is consistently a smidge faster (~2%, relative 
>> speed). Here is the llvm code and local native code for each, your version 
>> looks more tidy.  
>>
>>
>> ```
>> implies(p::Bool, q::Bool) = (p <= q)
>>
>> # llvm
>>   %2 = xor i8 %0, 1
>>   %3 = or i8 %2, %1
>>   ret i8 %3
>>
>> # native with some common code removed
>>   xorb   $1, %dil
>>   orb    %sil, %dil
>>   movb   %dil, %al
>>   popq   %rbp
>>   retq
>>
>> implies(p::Bool, q::Bool) = ifelse( p, q, true )
>>
>> # llvm
>>   %2 = and i8 %0, 1
>>   %3 = icmp eq i8 %2, 0
>>   %4 = select i1 %3, i8 1, i8 %1
>>   ret i8 %4
>>
>> # native with some common code removed
>>   testb  $1, %dil
>>   movb   $1, %al
>>   je     L15
>>   movb   %sil, %al
>> L15:  popq   %rbp
>>   retq
>> ```
>>
>>
>>
>>
>> On Friday, October 7, 2016 at 12:22:23 AM UTC-4, Jussi Piitulainen wrote:
>>>
>>>
>>> implies(p::Bool, q::Bool) = p <= q
>>>
>>>
>>>
>>> torstai 6. lokakuuta 2016 19.10.51 UTC+3 Kevin Liu kirjoitti:
>>>>
>>>> How is an implication represented in Julia? 
>>>>
>>>>
>>>> https://en.wikipedia.org/wiki/Material_conditional#Definitions_of_the_material_conditional
>>>>
>>>

[julia-users] Re: Representation of a material conditional (implication)

2016-10-07 Thread Kevin Liu
Jeffrey, can you show the expression you put inside @code_llvm() and 
@code_native() for evaluation? 

On Friday, October 7, 2016 at 2:26:56 AM UTC-3, Jeffrey Sarnoff wrote:
>
> Hi Jussi,
>
> Your version compiles down more neatly than the ifelse version. On my 
> system, BenchmarkTools gives nearly identical results; I don't know why, 
> but the ifelse version is consistently a smidge faster (~2%, relative 
> speed). Here is the llvm code and local native code for each, your version 
> looks more tidy.  
>
>
> ```
> implies(p::Bool, q::Bool) = (p <= q)
>
> # llvm
>   %2 = xor i8 %0, 1
>   %3 = or i8 %2, %1
>   ret i8 %3
>
> # native with some common code removed
>   xorb   $1, %dil
>   orb    %sil, %dil
>   movb   %dil, %al
>   popq   %rbp
>   retq
>
> implies(p::Bool, q::Bool) = ifelse( p, q, true )
>
> # llvm
>   %2 = and i8 %0, 1
>   %3 = icmp eq i8 %2, 0
>   %4 = select i1 %3, i8 1, i8 %1
>   ret i8 %4
>
> # native with some common code removed
>   testb  $1, %dil
>   movb   $1, %al
>   je     L15
>   movb   %sil, %al
> L15:  popq   %rbp
>   retq
> ```
>
>
>
>
> On Friday, October 7, 2016 at 12:22:23 AM UTC-4, Jussi Piitulainen wrote:
>>
>>
>> implies(p::Bool, q::Bool) = p <= q
>>
>>
>>
>> torstai 6. lokakuuta 2016 19.10.51 UTC+3 Kevin Liu kirjoitti:
>>>
>>> How is an implication represented in Julia? 
>>>
>>>
>>> https://en.wikipedia.org/wiki/Material_conditional#Definitions_of_the_material_conditional
>>>
>>

Re: [julia-users] Re: Representation of a material conditional (implication)

2016-10-06 Thread Kevin Liu
That's awesome, thank you. 

> On Oct 6, 2016, at 17:54, Jeffrey Sarnoff  wrote:
> 
> Peter Norvig's book+site is a very good learning tool.
> 
> by the way: if you are using OSX or Linux and have your terminal using a font 
> with decent unicode coverage,   
> `\Rightarrow` followed by TAB turns into `⇒`, which is the generally accepted 
> symbol for material implication.
> 
> ⇒(p::Bool, q::Bool) = ifelse(p, q, true)
> 
> true  ⇒  true, false  ⇒  true,  false  ⇒  false
> # (true, true, true)
> 
> true  ⇒ false
> # false
> 
> 
> 
> 
> 
> 
>> On Thursday, October 6, 2016 at 4:34:11 PM UTC-4, Kevin Liu wrote:
>> Thanks for the distinction, Jeffrey.
>> 
>> Also, look what I found https://github.com/aimacode. Julia is empty :-). Can 
>> we hire some Martians to fill it up as we have ran out of Julians on Earth? 
>> I'm happy I found this though. 
>> 
>>> On Thursday, October 6, 2016 at 5:26:43 PM UTC-3, Jeffrey Sarnoff wrote:
>>> you are welcome to use
>>> implies(p::Bool, q::Bool) = !p | q
>>> { !p, ~p likely compile to the same instructions -- they do for me; you 
>>> might prefer to use of !p here as that means 'logical_not(p)' where ~p 
>>> means 'flip_the_bits_of(p)' }
>>> 
>>> I find that this form is also 40% slower than the ifelse form.
>>> 
>>> 
>>> 
>>>> On Thursday, October 6, 2016 at 4:11:55 PM UTC-4, Kevin Liu wrote:
>>>> Is this why I couldn't find implication in Julia? 
>>>> 
>>>>> Maybe it was considered redundant because (1) it is less primitive than 
>>>>> "^", "v", "~", (2) it saves very little typing since "A => B" is 
>>>>> equivalent to "~A v B". – Giorgio Jan 18 '13 at 14:50
>>>> 
>>>> Wikipedia also says the implication table is identical to that of ~p | q. 
>>>> So instead just the below?
>>>> 
>>>> julia> ~p | q 
>>>> 
>>>> false
>>>> 
>>>> 
>>>> I'll take that.
>>>> 
>>>>> On Thursday, October 6, 2016 at 4:08:00 PM UTC-3, Jeffrey Sarnoff wrote:
>>>>> (the version using ifelse benchmarks faster on my system)
>>>>> 
>>>>>> On Thursday, October 6, 2016 at 3:05:50 PM UTC-4, Jeffrey Sarnoff wrote:
>>>>>> here are two ways
>>>>>> 
>>>>>> implies(p::Bool, q::Bool) = !(p & !q)
>>>>>> 
>>>>>> implies(p::Bool, q::Bool) = ifelse(p, q, true)
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>>> On Thursday, October 6, 2016 at 12:10:51 PM UTC-4, Kevin Liu wrote:
>>>>>>> How is an implication represented in Julia? 
>>>>>>> 
>>>>>>> https://en.wikipedia.org/wiki/Material_conditional#Definitions_of_the_material_conditional


[julia-users] Re: Representation of a material conditional (implication)

2016-10-06 Thread Kevin Liu
Thanks for the distinction, Jeffrey.

Also, look what I found https://github.com/aimacode. Julia is empty :-). 
Can we hire some Martians to fill it up, as we have run out of Julians on 
Earth? I'm happy I found this though. 

On Thursday, October 6, 2016 at 5:26:43 PM UTC-3, Jeffrey Sarnoff wrote:
>
> you are welcome to use
> implies(p::Bool, q::Bool) = !p | q
> { !p, ~p likely compile to the same instructions -- they do for me; you 
> might prefer to use of !p here as that means 'logical_not(p)' where ~p 
> means 'flip_the_bits_of(p)' }
>
> I find that this form is also 40% slower than the ifelse form.
>
>
>
> On Thursday, October 6, 2016 at 4:11:55 PM UTC-4, Kevin Liu wrote:
>>
>> Is this why I couldn't find implication in Julia? 
>>
>> Maybe it was considered redundant because (1) it is less primitive than 
>>> "^", "v", "~", (2) it saves very little typing since "A => B" is equivalent 
>>> to "~A v B". – Giorgio 
>>> <http://programmers.stackexchange.com/users/29020/giorgio> Jan 18 '13 
>>> at 14:50 
>>> <http://programmers.stackexchange.com/questions/184089/why-dont-languages-include-implication-as-a-logical-operator#comment353607_184089>
>>
>>
>> Wikipedia also says the implication table is identical to that of ~p | q. 
>> So instead just the below?
>>
>> julia> ~p | q 
>>
>> false
>>
>>
>> I'll take that.
>>
>> On Thursday, October 6, 2016 at 4:08:00 PM UTC-3, Jeffrey Sarnoff wrote:
>>>
>>> (the version using ifelse benchmarks faster on my system)
>>>
>>> On Thursday, October 6, 2016 at 3:05:50 PM UTC-4, Jeffrey Sarnoff wrote:
>>>>
>>>> here are two ways
>>>>
>>>> implies(p::Bool, q::Bool) = !(p & !q)
>>>>
>>>> implies(p::Bool, q::Bool) = ifelse(p, q, true)
>>>>
>>>>
>>>>
>>>>
>>>> On Thursday, October 6, 2016 at 12:10:51 PM UTC-4, Kevin Liu wrote:
>>>>>
>>>>> How is an implication represented in Julia? 
>>>>>
>>>>>
>>>>> https://en.wikipedia.org/wiki/Material_conditional#Definitions_of_the_material_conditional
>>>>>
>>>>

[julia-users] Re: Representation of a material conditional (implication)

2016-10-06 Thread Kevin Liu
Sorry, 

julia> ~p|q

true



On Thursday, October 6, 2016 at 5:11:55 PM UTC-3, Kevin Liu wrote:
>
> Is this why I couldn't find implication in Julia? 
>
> Maybe it was considered redundant because (1) it is less primitive than 
>> "^", "v", "~", (2) it saves very little typing since "A => B" is equivalent 
>> to "~A v B". – Giorgio 
>> <http://programmers.stackexchange.com/users/29020/giorgio> Jan 18 '13 at 
>> 14:50 
>> <http://programmers.stackexchange.com/questions/184089/why-dont-languages-include-implication-as-a-logical-operator#comment353607_184089>
>
>
> Wikipedia also says the implication table is identical to that of ~p | q. 
> So instead just the below?
>
> julia> ~p | q 
>
> false
>
>
> I'll take that.
>
> On Thursday, October 6, 2016 at 4:08:00 PM UTC-3, Jeffrey Sarnoff wrote:
>>
>> (the version using ifelse benchmarks faster on my system)
>>
>> On Thursday, October 6, 2016 at 3:05:50 PM UTC-4, Jeffrey Sarnoff wrote:
>>>
>>> here are two ways
>>>
>>> implies(p::Bool, q::Bool) = !(p & !q)
>>>
>>> implies(p::Bool, q::Bool) = ifelse(p, q, true)
>>>
>>>
>>>
>>>
>>> On Thursday, October 6, 2016 at 12:10:51 PM UTC-4, Kevin Liu wrote:
>>>>
>>>> How is an implication represented in Julia? 
>>>>
>>>>
>>>> https://en.wikipedia.org/wiki/Material_conditional#Definitions_of_the_material_conditional
>>>>
>>>

[julia-users] Re: Representation of a material conditional (implication)

2016-10-06 Thread Kevin Liu
Is this why I couldn't find implication in Julia? 

Maybe it was considered redundant because (1) it is less primitive than 
> "^", "v", "~", (2) it saves very little typing since "A => B" is equivalent 
> to "~A v B". – Giorgio 
> <http://programmers.stackexchange.com/users/29020/giorgio> Jan 18 '13 at 
> 14:50 
> <http://programmers.stackexchange.com/questions/184089/why-dont-languages-include-implication-as-a-logical-operator#comment353607_184089>


Wikipedia also says the implication table is identical to that of ~p | q. 
So instead just the below?

julia> ~p | q 

false


I'll take that.

On Thursday, October 6, 2016 at 4:08:00 PM UTC-3, Jeffrey Sarnoff wrote:
>
> (the version using ifelse benchmarks faster on my system)
>
> On Thursday, October 6, 2016 at 3:05:50 PM UTC-4, Jeffrey Sarnoff wrote:
>>
>> here are two ways
>>
>> implies(p::Bool, q::Bool) = !(p & !q)
>>
>> implies(p::Bool, q::Bool) = ifelse(p, q, true)
>>
>>
>>
>>
>> On Thursday, October 6, 2016 at 12:10:51 PM UTC-4, Kevin Liu wrote:
>>>
>>> How is an implication represented in Julia? 
>>>
>>>
>>> https://en.wikipedia.org/wiki/Material_conditional#Definitions_of_the_material_conditional
>>>
>>

[julia-users] Representation of a material conditional (implication)

2016-10-06 Thread Kevin Liu
How is an implication represented in Julia? 

https://en.wikipedia.org/wiki/Material_conditional#Definitions_of_the_material_conditional
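
Summarizing the definitions that came up in the replies — `!p | q`, `p <= q`, and `ifelse(p, q, true)` — all three match the material-conditional truth table, which a quick check confirms:

```julia
implies1(p::Bool, q::Bool) = !p | q              # ~p ∨ q form
implies2(p::Bool, q::Bool) = p <= q              # Bool ordering encodes implication
implies3(p::Bool, q::Bool) = ifelse(p, q, true)  # branch-free conditional form

for p in (false, true), q in (false, true)
    expected = !(p && !q)                        # textbook definition
    @assert implies1(p, q) == implies2(p, q) == implies3(p, q) == expected
end
```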


Re: [julia-users] Re: Definition of true for validity

2016-10-01 Thread Kevin Liu
Thanks a lot, Steve. 

> On Oct 1, 2016, at 17:13, Steven G. Johnson  wrote:
> 
> https://github.com/JuliaLang/julia/blob/c4ebf7c8bbdfaba034a08fa795b93a5732514c64/src/jltypes.c#L3724


[julia-users] Re: Definition of true for validity

2016-10-01 Thread Kevin Liu
Would anyone know in which file of https://github.com/JuliaLang/julia 
the definition of {true} is located? Thanks

On Saturday, October 1, 2016 at 3:35:10 PM UTC-3, Kevin Liu wrote:
>
> Hello. Would anyone know in which file of 
> https://github.com/JuliaLang/julia would the definition of {true} be 
> defined? Thanks
>


[julia-users] Definition of true for validity

2016-10-01 Thread Kevin Liu
Hello. Would anyone know in which file 
of https://github.com/JuliaLang/julia would the definition of {true} be 
defined? Thanks


Re: [julia-users] Horn clauses

2016-09-20 Thread Kevin Liu
Thanks a lot Cedric. I am just getting to know about Russell and Norvig's 
importance thanks to you. I came across articles of theirs some time ago 
but didn't know they were pretty much the holy grail. I just came out of a 
330 page book from Domingos (even though not mathy) so am catching some 
air. I want to implement logical reasoning in Julia. I hope Julia will be 
it, and the bible you recommended me. 

On Tuesday, September 20, 2016 at 12:50:44 PM UTC-3, Cedric St-Jean wrote:
>
> It depends what you want to do. Julia's base language doesn't include 
> logical reasoning. If you want to translate a Prolog program into Julia, 
> you can get away with `if x then y`, but if you want to implement logical 
> reasoning, then you need to build some machinery yourself. I would use 
> types 
>
> immutable Implication
>     precondition
>     consequence
> end
>
> immutable And
>     condition1
>     condition2
> end
>
> ...
>
> then (a && b) => c becomes Implication(And(:a, :b), :c)
>
> Then I would define functions to perform backward-chaining. Julia is quite 
> nice for this kind of thing, because of multiple dispatch. 
>
> Then of course, you'll need unification/matching for variables, but that's 
> not too hard to write either. Read carefully Artificial Intelligence: A 
> modern approach, and try to implement its pseudocode in Julia. Write a lot 
> of tests to make sure that it works correctly, then slowly move up from 
> there. It takes a lot of time to learn mathematics; there are no short-cuts.
>
> Cédric
>
> On Tue, Sep 20, 2016 at 11:25 AM, Kevin Liu  > wrote:
>
>> The negation I'm guessing would be x = false, iff an equivalence of type 
>> and value (but in Julia?), implication a combination of if x = true then y 
>> = false? Is it as simple as this? 
>>
>>
>> On Tuesday, September 20, 2016 at 11:56:46 AM UTC-3, Kevin Liu wrote:
>>>
>>> Would anyone know how to represent logical connectives (e.g. negation ¬, 
>>> conjunction ∧, disjunction ∨, material implication ⇒, biconditional iff ⇔) 
>>> and quantifiers (e.g. all ∀, exists ∃) in Julia? 
>>>
>>> I understand 'all' can be a for loop. Is the conjunction a comma like in 
>>> Prolog? Disjunction the 'else' of an if statement? 'Exists' an x = true? 
>>>
>>> On Tuesday, September 20, 2016 at 12:31:23 AM UTC-3, Kevin Liu wrote:
>>>>
>>>> Thanks Cedric, read some of that and LilKanren.jl and this is where I 
>>>> am with the code (attached). Will continue tomorrow. Feel a bit lost, 
>>>> nothing out of the usual. 
>>>>
>>>> On Monday, September 19, 2016 at 9:15:43 PM UTC-3, Cedric St-Jean wrote:
>>>>>
>>>>> You might want to roll your own, too. It's instructive, and not 
>>>>> particularly hard. Russell and Norvig's textbook has a good section on it.
>>>>>
>>>>> On Monday, September 19, 2016 at 5:44:04 PM UTC-4, Kevin Liu wrote:
>>>>>>
>>>>>> Thanks for the direction, Stefan.
>>>>>>
>>>>>> On Monday, September 19, 2016 at 3:10:19 PM UTC-3, Stefan Karpinski 
>>>>>> wrote:
>>>>>>>
>>>>>>> You might try LilKanren.jl <https://github.com/lilinjn/LilKanren.jl>
>>>>>>> .
>>>>>>>
>>>>>>> On Mon, Sep 19, 2016 at 10:21 AM, Kevin Liu  
>>>>>>> wrote:
>>>>>>>
>>>>>>>> Hello. What would be the long-term solution for using Horn clauses 
>>>>>>>> in Julia? Is the present solution to call Prolog from C and C from 
>>>>>>>> Julia? 
>>>>>>>> Thanks
>>>>>>>>
>>>>>>>
>>>>>>>
>
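
Cédric's suggestion can be fleshed out into a tiny backward-chaining sketch. This uses current Julia syntax (`struct` replaces the pre-1.0 `immutable` in his example); the knowledge-base representation and the `holds`/`proves` helpers are assumptions, not anything from the thread:

```julia
struct Implication
    precondition
    consequence
end

struct And
    condition1
    condition2
end

# Naive backward chaining: a goal holds if it is a known fact, or if some
# rule concludes it and that rule's precondition can itself be proved.
holds(goal, facts, rules) = goal in facts ||
    any(r -> r.consequence == goal && proves(r.precondition, facts, rules), rules)

proves(c::And, facts, rules) =
    proves(c.condition1, facts, rules) && proves(c.condition2, facts, rules)
proves(c, facts, rules) = holds(c, facts, rules)

# (a && b) => c becomes:
rules = [Implication(And(:a, :b), :c)]
facts = [:a, :b]
holds(:c, facts, rules)  # true
```

Adding unification for variables, as Cédric notes, is the next step up from this propositional version.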

Re: [julia-users] Horn clauses

2016-09-20 Thread Kevin Liu
The negation I'm guessing would be x = false, iff an equivalence of type 
and value (but in Julia?), implication a combination of if x = true then y 
= false? Is it as simple as this? 

On Tuesday, September 20, 2016 at 11:56:46 AM UTC-3, Kevin Liu wrote:
>
> Would anyone know how to represent logical connectives (e.g. negation ¬, 
> conjunction ∧, disjunction ∨, material implication ⇒, biconditional iff ⇔) 
> and quantifiers (e.g. all ∀, exists ∃) in Julia? 
>
> I understand 'all' can be a for loop. Is the conjunction a comma like in 
> Prolog? Disjunction the 'else' of an if statement? 'Exists' an x = true? 
>
> On Tuesday, September 20, 2016 at 12:31:23 AM UTC-3, Kevin Liu wrote:
>>
>> Thanks Cedric, read some of that and LilKanren.jl and this is where I am 
>> with the code (attached). Will continue tomorrow. Feel a bit lost, nothing 
>> out of the usual. 
>>
>> On Monday, September 19, 2016 at 9:15:43 PM UTC-3, Cedric St-Jean wrote:
>>>
>>> You might want to roll your own, too. It's instructive, and not 
>>> particularly hard. Russell and Norvig's textbook has a good section on it.
>>>
>>> On Monday, September 19, 2016 at 5:44:04 PM UTC-4, Kevin Liu wrote:
>>>>
>>>> Thanks for the direction, Stefan.
>>>>
>>>> On Monday, September 19, 2016 at 3:10:19 PM UTC-3, Stefan Karpinski 
>>>> wrote:
>>>>>
>>>>> You might try LilKanren.jl <https://github.com/lilinjn/LilKanren.jl>.
>>>>>
>>>>> On Mon, Sep 19, 2016 at 10:21 AM, Kevin Liu  wrote:
>>>>>
>>>>>> Hello. What would be the long-term solution for using Horn clauses in 
>>>>>> Julia? Is the present solution to call Prolog from C and C from Julia? 
>>>>>> Thanks
>>>>>>
>>>>>
>>>>>

Re: [julia-users] Horn clauses

2016-09-20 Thread Kevin Liu
Would anyone know how to represent logical connectives (e.g. negation ¬, 
conjunction ∧, disjunction ∨, material implication ⇒, biconditional iff ⇔) 
and quantifiers (e.g. all ∀, exists ∃) in Julia? 

I understand 'all' can be a for loop. Is the conjunction a comma like in 
Prolog? Disjunction the 'else' of an if statement? 'Exists' an x = true? 
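
For reference, each connective in the question maps to a short Julia expression over `Bool`s; a sketch (the quantifier encodings assume a finite domain represented as a collection):

```julia
p, q = true, false
@assert !p == false          # negation ¬p
@assert (p && q) == false    # conjunction p ∧ q
@assert (p || q) == true     # disjunction p ∨ q
@assert (!p || q) == false   # material implication p ⇒ q
@assert (p == q) == false    # biconditional p ⇔ q

# Quantifiers over a finite domain:
xs = 1:5
@assert all(x -> x > 0, xs)   # ∀x ∈ xs: x > 0
@assert any(x -> x == 3, xs)  # ∃x ∈ xs: x == 3
```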

On Tuesday, September 20, 2016 at 12:31:23 AM UTC-3, Kevin Liu wrote:
>
> Thanks Cedric, read some of that and LilKanren.jl and this is where I am 
> with the code (attached). Will continue tomorrow. Feel a bit lost, nothing 
> out of the usual. 
>
> On Monday, September 19, 2016 at 9:15:43 PM UTC-3, Cedric St-Jean wrote:
>>
>> You might want to roll your own, too. It's instructive, and not 
>> particularly hard. Russell and Norvig's textbook has a good section on it.
>>
>> On Monday, September 19, 2016 at 5:44:04 PM UTC-4, Kevin Liu wrote:
>>>
>>> Thanks for the direction, Stefan.
>>>
>>> On Monday, September 19, 2016 at 3:10:19 PM UTC-3, Stefan Karpinski 
>>> wrote:
>>>>
>>>> You might try LilKanren.jl <https://github.com/lilinjn/LilKanren.jl>.
>>>>
>>>> On Mon, Sep 19, 2016 at 10:21 AM, Kevin Liu  wrote:
>>>>
>>>>> Hello. What would be the long-term solution for using Horn clauses in 
>>>>> Julia? Is the present solution to call Prolog from C and C from Julia? 
>>>>> Thanks
>>>>>
>>>>
>>>>

Re: [julia-users] Horn clauses

2016-09-19 Thread Kevin Liu
Thanks for the direction, Stefan.

On Monday, September 19, 2016 at 3:10:19 PM UTC-3, Stefan Karpinski wrote:
>
> You might try LilKanren.jl <https://github.com/lilinjn/LilKanren.jl>.
>
> On Mon, Sep 19, 2016 at 10:21 AM, Kevin Liu  > wrote:
>
>> Hello. What would be the long-term solution for using Horn clauses in 
>> Julia? Is the present solution to call Prolog from C and C from Julia? 
>> Thanks
>>
>
>

[julia-users] Horn clauses

2016-09-19 Thread Kevin Liu
Hello. What would be the long-term solution for using Horn clauses in 
Julia? Is the present solution to call Prolog from C and C from Julia? 
Thanks


Re: [julia-users] Is the master algorithm on the roadmap?

2016-09-02 Thread Kevin Liu
Thanks Tim. It's frustrating to see the community has very little 
experience with MLN, after all, this is the smartest group of people I know 
in computer science. Okay, the focus here will be on code. 

On Friday, September 2, 2016 at 1:16:56 PM UTC-3, Viral Shah wrote:
>
> I agree with John here. This is totally unacceptable, and is making the 
> experience poorer for others.
>
> -viral
>
> On Friday, September 2, 2016 at 8:48:44 PM UTC+5:30, John Myles White 
> wrote:
>>
>> May I also point out to the My settings button on your top right corner > 
>>> My topic email subscriptions > Unsubscribe from this thread, which would've 
>>> spared you the message.
>>
>>
>> I'm sorry, but this kind of attitude is totally unacceptable, Kevin. I've 
>> tolerated your misuse of the mailing list, but it is not acceptable for you 
>> to imply that others are behaving inappropriately when they complain about 
>> your unequivocal misuse of the mailing list.
>>
>>  --John 
>>
>> On Friday, September 2, 2016 at 7:23:27 AM UTC-7, Kevin Liu wrote:
>>>
>>> May I also point out to the My settings button on your top right corner 
>>> > My topic email subscriptions > Unsubscribe from this thread, which 
>>> would've spared you the message.
>>>
>>> On Friday, September 2, 2016 at 11:19:42 AM UTC-3, Kevin Liu wrote:
>>>>
>>>> Hello Chris. Have you been applying relational learning to your Neural 
>>>> Crest Migration Patterns in Craniofacial Development research project? It 
>>>> could enhance your insights. 
>>>>
>>>> On Friday, September 2, 2016 at 6:18:15 AM UTC-3, Chris Rackauckas 
>>>> wrote:
>>>>>
>>>>> This entire thread is a trip... a trip which is not really relevant to 
>>>>> julia-users. You may want to share these musings in the form of a blog 
>>>>> instead of posting them here.
>>>>>
>>>>> On Friday, September 2, 2016 at 1:41:03 AM UTC-7, Kevin Liu wrote:
>>>>>>
>>>>>> Princeton's post: 
>>>>>> http://www.nytimes.com/2016/08/28/world/europe/france-burkini-bikini-ban.html?_r=1
>>>>>>
>>>>>> Only logic saves us from paradox. - Minsky
>>>>>>
>>>>>> On Thursday, August 25, 2016 at 10:18:27 PM UTC-3, Kevin Liu wrote:
>>>>>>>
>>>>>>> Tim Holy, I am watching your keynote speech at JuliaCon 2016 where 
>>>>>>> you mention the best optimization is not doing the computation at all. 
>>>>>>>
>>>>>>> Domingos talks about that in his book, where an efficient kind of 
>>>>>>> learning is by analogy, with no model at all, and how numerous 
>>>>>>> scientific 
>>>>>>> discoveries have been made that way, e.g. Bohr's analogy of the solar 
>>>>>>> system to the atom. Analogizers learn by hypothesizing that entities 
>>>>>>> with 
>>>>>>> similar known properties have similar unknown ones. 
>>>>>>>
>>>>>>> MLN can reproduce structure mapping, which is the more powerful type 
>>>>>>> of analogy, that can make inferences from one domain (solar system) to 
>>>>>>> another (atom). This can be done by learning formulas that don't refer 
>>>>>>> to 
>>>>>>> any of the specific relations in the source domain (general formulas). 
>>>>>>>
>>>>>>> Seth and Tim have been helping me a lot with putting the pieces 
>>>>>>> together for MLN in the repo I created 
>>>>>>> <https://github.com/hpoit/Kenya.jl/issues/2>, and more help is 
>>>>>>> always welcome. I would like to write MLN in idiomatic Julia. My 
>>>>>>> question 
>>>>>>> at the moment to you and the community is how to keep mappings of 
>>>>>>> first-order harmonic functions type-stable in Julia? I am just 
>>>>>>> getting acquainted with the type field. 
>>>>>>>
>>>>>>> On Tuesday, August 9, 2016 at 9:02:25 AM UTC-3, Kevin Liu wrote:
>>>>>>>>
>>>>>>>> Helping me separate the process in parts and priorities would be a 
>>>>>>>> lot of help. 
>>>>>>>>
>>>>>>>

Re: [julia-users] Is the master algorithm on the roadmap?

2016-09-02 Thread Kevin Liu
The explicit message here with no implicit one is that the unsubscribe option 
exists for a reason. 

On Sep 2, 2016, at 12:18, John Myles White  wrote:

>> May I also point out to the My settings button on your top right corner > My 
>> topic email subscriptions > Unsubscribe from this thread, which would've 
>> spared you the message.
> 
> I'm sorry, but this kind of attitude is totally unacceptable, Kevin. I've 
> tolerated your misuse of the mailing list, but it is not acceptable for you 
> to imply that others are behaving inappropriately when they complain about 
> your unequivocal misuse of the mailing list.
> 
>  --John 
> 
>> On Friday, September 2, 2016 at 7:23:27 AM UTC-7, Kevin Liu wrote:
>> May I also point out to the My settings button on your top right corner > My 
>> topic email subscriptions > Unsubscribe from this thread, which would've 
>> spared you the message.
>> 
>>> On Friday, September 2, 2016 at 11:19:42 AM UTC-3, Kevin Liu wrote:
>>> Hello Chris. Have you been applying relational learning to your Neural 
>>> Crest Migration Patterns in Craniofacial Development research project? It 
>>> could enhance your insights. 
>>> 
>>>> On Friday, September 2, 2016 at 6:18:15 AM UTC-3, Chris Rackauckas wrote:
>>>> This entire thread is a trip... a trip which is not really relevant to 
>>>> julia-users. You may want to share these musings in the form of a blog 
>>>> instead of posting them here.
>>>> 
>>>>> On Friday, September 2, 2016 at 1:41:03 AM UTC-7, Kevin Liu wrote:
>>>>> Princeton's post: 
>>>>> http://www.nytimes.com/2016/08/28/world/europe/france-burkini-bikini-ban.html?_r=1
>>>>> 
>>>>> Only logic saves us from paradox. - Minsky
>>>>> 
>>>>>> On Thursday, August 25, 2016 at 10:18:27 PM UTC-3, Kevin Liu wrote:
>>>>>> Tim Holy, I am watching your keynote speech at JuliaCon 2016 where you 
>>>>>> mention the best optimization is not doing the computation at all. 
>>>>>> 
>>>>>> Domingos talks about that in his book, where an efficient kind of 
>>>>>> learning is by analogy, with no model at all, and how numerous 
>>>>>> scientific discoveries have been made that way, e.g. Bohr's analogy of 
>>>>>> the solar system to the atom. Analogizers learn by hypothesizing that 
>>>>>> entities with similar known properties have similar unknown ones. 
>>>>>> 
>>>>>> MLN can reproduce structure mapping, which is the more powerful type of 
>>>>>> analogy, that can make inferences from one domain (solar system) to 
>>>>>> another (atom). This can be done by learning formulas that don't refer 
>>>>>> to any of the specific relations in the source domain (general 
>>>>>> formulas). 
>>>>>> 
>>>>>> Seth and Tim have been helping me a lot with putting the pieces together 
>>>>>> for MLN in the repo I created, and more help is always welcome. I would 
>>>>>> like to write MLN in idiomatic Julia. My question at the moment to you 
>>>>>> and the community is how to keep mappings of first-order harmonic 
>>>>>> functions type-stable in Julia? I am just getting acquainted with the 
>>>>>> type field. 
>>>>>> 
>>>>>>> On Tuesday, August 9, 2016 at 9:02:25 AM UTC-3, Kevin Liu wrote:
>>>>>>> Helping me separate the process in parts and priorities would be a lot 
>>>>>>> of help. 
>>>>>>> 
>>>>>>>> On Tuesday, August 9, 2016 at 8:41:03 AM UTC-3, Kevin Liu wrote:
>>>>>>>> Tim Holy, what if I could tap into the well of knowledge that you are 
>>>>>>>> to speed up things? Can you imagine if every learner had to start 
>>>>>>>> without priors? 
>>>>>>>> 
>>>>>>>> > On Aug 9, 2016, at 07:06, Tim Holy  wrote: 
>>>>>>>> > 
>>>>>>>> > I'd recommend starting by picking a very small project. For example, 
>>>>>>>> > fix a bug 
>>>>>>>> > or implement a small improvement in a package that you already find 
>>>>>>>> > useful or 
>>>>>>>> > interesting. That way you'll get some guidance while making a 
>>>

Re: [julia-users] Is the master algorithm on the roadmap?

2016-09-02 Thread Kevin Liu
May I also point out to the My settings button on your top right corner > 
My topic email subscriptions > Unsubscribe from this thread, which would've 
spared you the message.

On Friday, September 2, 2016 at 11:19:42 AM UTC-3, Kevin Liu wrote:
>
> Hello Chris. Have you been applying relational learning to your Neural 
> Crest Migration Patterns in Craniofacial Development research project? It 
> could enhance your insights. 
>
> On Friday, September 2, 2016 at 6:18:15 AM UTC-3, Chris Rackauckas wrote:
>>
>> This entire thread is a trip... a trip which is not really relevant to 
>> julia-users. You may want to share these musings in the form of a blog 
>> instead of posting them here.
>>
>> On Friday, September 2, 2016 at 1:41:03 AM UTC-7, Kevin Liu wrote:
>>>
>>> Princeton's post: 
>>> http://www.nytimes.com/2016/08/28/world/europe/france-burkini-bikini-ban.html?_r=1
>>>
>>> Only logic saves us from paradox. - Minsky
>>>
>>> On Thursday, August 25, 2016 at 10:18:27 PM UTC-3, Kevin Liu wrote:
>>>>
>>>> Tim Holy, I am watching your keynote speech at JuliaCon 2016 where you 
>>>> mention the best optimization is not doing the computation at all. 
>>>>
>>>> Domingos talks about that in his book, where an efficient kind of 
>>>> learning is by analogy, with no model at all, and how numerous scientific 
>>>> discoveries have been made that way, e.g. Bohr's analogy of the solar 
>>>> system to the atom. Analogizers learn by hypothesizing that entities with 
>>>> similar known properties have similar unknown ones. 
>>>>
>>>> MLN can reproduce structure mapping, which is the more powerful type of 
>>>> analogy, that can make inferences from one domain (solar system) to 
>>>> another 
>>>> (atom). This can be done by learning formulas that don't refer to any of 
>>>> the specific relations in the source domain (general formulas). 
>>>>
>>>> Seth and Tim have been helping me a lot with putting the pieces 
>>>> together for MLN in the repo I created 
>>>> <https://github.com/hpoit/Kenya.jl/issues/2>, and more help is always 
>>>> welcome. I would like to write MLN in idiomatic Julia. My question at the 
>>>> moment to you and the community is how to keep mappings of first-order 
>>>> harmonic functions type-stable in Julia? I am just getting acquainted with 
>>>> the type field. 
>>>>
>>>> On Tuesday, August 9, 2016 at 9:02:25 AM UTC-3, Kevin Liu wrote:
>>>>>
>>>>> Helping me separate the process in parts and priorities would be a lot 
>>>>> of help. 
>>>>>
>>>>> On Tuesday, August 9, 2016 at 8:41:03 AM UTC-3, Kevin Liu wrote:
>>>>>>
>>>>>> Tim Holy, what if I could tap into the well of knowledge that you are 
>>>>>> to speed up things? Can you imagine if every learner had to start 
>>>>>> without 
>>>>>> priors? 
>>>>>>
>>>>>> > On Aug 9, 2016, at 07:06, Tim Holy  wrote: 
>>>>>> > 
>>>>>> > I'd recommend starting by picking a very small project. For 
>>>>>> example, fix a bug 
>>>>>> > or implement a small improvement in a package that you already find 
>>>>>> useful or 
>>>>>> > interesting. That way you'll get some guidance while making a 
>>>>>> positive 
>>>>>> > contribution; once you know more about julia, it will be easier to 
>>>>>> see your 
>>>>>> > way forward. 
>>>>>> > 
>>>>>> > Best, 
>>>>>> > --Tim 
>>>>>> > 
>>>>>> >> On Monday, August 8, 2016 8:22:01 PM CDT Kevin Liu wrote: 
>>>>>> >> I have no idea where to start and where to finish. Founders' help 
>>>>>> would be 
>>>>>> >> wonderful. 
>>>>>> >> 
>>>>>> >>> On Tuesday, August 9, 2016 at 12:19:26 AM UTC-3, Kevin Liu wrote: 
>>>>>> >>> After which I have to code Felix into Julia, a relational 
>>>>>> optimizer for 
>>>>>> >>> statistical inference with Tuffy <
>>>>>> http://i.stanford.edu/hazy/tuffy/> 
>>>>>> >>> inside, for enterprise settings. 
>

Re: [julia-users] Is the master algorithm on the roadmap?

2016-09-02 Thread Kevin Liu
Hello Chris. Have you been applying relational learning to your Neural 
Crest Migration Patterns in Craniofacial Development research project? It 
could enhance your insights. 

On Friday, September 2, 2016 at 6:18:15 AM UTC-3, Chris Rackauckas wrote:
>
> This entire thread is a trip... a trip which is not really relevant to 
> julia-users. You may want to share these musings in the form of a blog 
> instead of posting them here.
>
> On Friday, September 2, 2016 at 1:41:03 AM UTC-7, Kevin Liu wrote:
>>
>> Princeton's post: 
>> http://www.nytimes.com/2016/08/28/world/europe/france-burkini-bikini-ban.html?_r=1
>>
>> Only logic saves us from paradox. - Minsky
>>
>> On Thursday, August 25, 2016 at 10:18:27 PM UTC-3, Kevin Liu wrote:
>>>
>>> Tim Holy, I am watching your keynote speech at JuliaCon 2016 where you 
>>> mention the best optimization is not doing the computation at all. 
>>>
>>> Domingos talks about that in his book, where an efficient kind of 
>>> learning is by analogy, with no model at all, and how numerous scientific 
>>> discoveries have been made that way, e.g. Bohr's analogy of the solar 
>>> system to the atom. Analogizers learn by hypothesizing that entities with 
>>> similar known properties have similar unknown ones. 
>>>
>>> MLN can reproduce structure mapping, which is the more powerful type of 
>>> analogy, that can make inferences from one domain (solar system) to another 
>>> (atom). This can be done by learning formulas that don't refer to any of 
>>> the specific relations in the source domain (general formulas). 
>>>
>>> Seth and Tim have been helping me a lot with putting the pieces together 
>>> for MLN in the repo I created 
>>> <https://github.com/hpoit/Kenya.jl/issues/2>, and more help is always 
>>> welcome. I would like to write MLN in idiomatic Julia. My question at the 
>>> moment to you and the community is how to keep mappings of first-order 
>>> harmonic functions type-stable in Julia? I am just getting acquainted with 
>>> the type field. 
>>>
>>> On Tuesday, August 9, 2016 at 9:02:25 AM UTC-3, Kevin Liu wrote:
>>>>
>>>> Helping me separate the process in parts and priorities would be a lot 
>>>> of help. 
>>>>
>>>> On Tuesday, August 9, 2016 at 8:41:03 AM UTC-3, Kevin Liu wrote:
>>>>>
>>>>> Tim Holy, what if I could tap into the well of knowledge that you are 
>>>>> to speed up things? Can you imagine if every learner had to start without 
>>>>> priors? 
>>>>>
>>>>> > On Aug 9, 2016, at 07:06, Tim Holy  wrote: 
>>>>> > 
>>>>> > I'd recommend starting by picking a very small project. For example, 
>>>>> fix a bug 
>>>>> > or implement a small improvement in a package that you already find 
>>>>> useful or 
>>>>> > interesting. That way you'll get some guidance while making a 
>>>>> positive 
>>>>> > contribution; once you know more about julia, it will be easier to 
>>>>> see your 
>>>>> > way forward. 
>>>>> > 
>>>>> > Best, 
>>>>> > --Tim 
>>>>> > 
>>>>> >> On Monday, August 8, 2016 8:22:01 PM CDT Kevin Liu wrote: 
>>>>> >> I have no idea where to start and where to finish. Founders' help 
>>>>> would be 
>>>>> >> wonderful. 
>>>>> >> 
>>>>> >>> On Tuesday, August 9, 2016 at 12:19:26 AM UTC-3, Kevin Liu wrote: 
>>>>> >>> After which I have to code Felix into Julia, a relational 
>>>>> optimizer for 
>>>>> >>> statistical inference with Tuffy <
>>>>> http://i.stanford.edu/hazy/tuffy/> 
>>>>> >>> inside, for enterprise settings. 
>>>>> >>> 
>>>>> >>>> On Tuesday, August 9, 2016 at 12:07:32 AM UTC-3, Kevin Liu wrote: 
>>>>> >>>> Can I get tips on bringing Alchemy's optimized Tuffy 
>>>>> >>>> <http://i.stanford.edu/hazy/tuffy/> in Java to Julia while 
>>>>> showing the 
>>>>> >>>> best of Julia? I am going for the most correct way, even if it 
>>>>> means 
>>>>> >>>> coding 
>>>>> >>>> Tuffy into C and Julia. 
>>>>

Re: [julia-users] Is the master algorithm on the roadmap?

2016-09-02 Thread Kevin Liu
Princeton's post: 
http://www.nytimes.com/2016/08/28/world/europe/france-burkini-bikini-ban.html?_r=1

Only logic saves us from paradox. - Minsky

On Thursday, August 25, 2016 at 10:18:27 PM UTC-3, Kevin Liu wrote:
>
> Tim Holy, I am watching your keynote speech at JuliaCon 2016 where you 
> mention the best optimization is not doing the computation at all. 
>
> Domingos talks about that in his book, where an efficient kind of learning 
> is by analogy, with no model at all, and how numerous scientific 
> discoveries have been made that way, e.g. Bohr's analogy of the solar 
> system to the atom. Analogizers learn by hypothesizing that entities with 
> similar known properties have similar unknown ones. 
>
> MLN can reproduce structure mapping, which is the more powerful type of 
> analogy, that can make inferences from one domain (solar system) to another 
> (atom). This can be done by learning formulas that don't refer to any of 
> the specific relations in the source domain (general formulas). 
>
> Seth and Tim have been helping me a lot with putting the pieces together 
> for MLN in the repo I created <https://github.com/hpoit/Kenya.jl/issues/2>, 
> and 
> more help is always welcome. I would like to write MLN in idiomatic Julia. 
> My question at the moment to you and the community is how to keep mappings 
> of first-order harmonic functions type-stable in Julia? I am just 
> getting acquainted with the type field. 
>
> On Tuesday, August 9, 2016 at 9:02:25 AM UTC-3, Kevin Liu wrote:
>>
>> Helping me separate the process in parts and priorities would be a lot of 
>> help. 
>>
>> On Tuesday, August 9, 2016 at 8:41:03 AM UTC-3, Kevin Liu wrote:
>>>
>>> Tim Holy, what if I could tap into the well of knowledge that you are to 
>>> speed up things? Can you imagine if every learner had to start without 
>>> priors? 
>>>
>>> > On Aug 9, 2016, at 07:06, Tim Holy  wrote: 
>>> > 
>>> > I'd recommend starting by picking a very small project. For example, 
>>> fix a bug 
>>> > or implement a small improvement in a package that you already find 
>>> useful or 
>>> > interesting. That way you'll get some guidance while making a positive 
>>> > contribution; once you know more about julia, it will be easier to see 
>>> your 
>>> > way forward. 
>>> > 
>>> > Best, 
>>> > --Tim 
>>> > 
>>> >> On Monday, August 8, 2016 8:22:01 PM CDT Kevin Liu wrote: 
>>> >> I have no idea where to start and where to finish. Founders' help 
>>> would be 
>>> >> wonderful. 
>>> >> 
>>> >>> On Tuesday, August 9, 2016 at 12:19:26 AM UTC-3, Kevin Liu wrote: 
>>> >>> After which I have to code Felix into Julia, a relational optimizer 
>>> for 
>>> >>> statistical inference with Tuffy <http://i.stanford.edu/hazy/tuffy/> 
>>>
>>> >>> inside, for enterprise settings. 
>>> >>> 
>>> >>>> On Tuesday, August 9, 2016 at 12:07:32 AM UTC-3, Kevin Liu wrote: 
>>> >>>> Can I get tips on bringing Alchemy's optimized Tuffy 
>>> >>>> <http://i.stanford.edu/hazy/tuffy/> in Java to Julia while showing 
>>> the 
>>> >>>> best of Julia? I am going for the most correct way, even if it 
>>> means 
>>> >>>> coding 
>>> >>>> Tuffy into C and Julia. 
>>> >>>> 
>>> >>>>> On Sunday, August 7, 2016 at 8:34:37 PM UTC-3, Kevin Liu wrote: 
>>> >>>>> I'll try to build it, compare it, and show it to you guys. I 
>>> offered to 
>>> >>>>> do this as work. I am waiting to see if they will accept it. 
>>> >>>>> 
>>> >>>>>> On Sunday, August 7, 2016 at 6:15:50 PM UTC-3, Stefan Karpinski 
>>> wrote: 
>>> >>>>>> Kevin, as previously requested by Isaiah, please take this to 
>>> some 
>>> >>>>>> other forum or maybe start a blog. 
>>> >>>>>> 
>>> >>>>>>> On Sat, Aug 6, 2016 at 10:53 PM, Kevin Liu  
>>> wrote: 
>>> >>>>>>> Symmetry-based learning, Domingos, 2014 
>>> >>>>>>> 
>>> https://www.microsoft.com/en-us/research/video/symmetry-based-learning 
>>> >>>>>>> / 
>>> >>>>>>> 
>&

[julia-users] Re: Package development - best practices

2016-08-30 Thread Kevin Liu
Just make an alias of, say, ~/.julia/v0.4/mypackage (hidden) and paste the 
alias in the directory you want, e.g. Google Drive/mypackage_alias (not 
hidden)

Whatever changes you make to mypackage_alias will change mypackage, which 
can then be loaded in the REPL with `using`. 

This way you don't have to deal with hidden folders or with having your only 
backup on GitHub. 

I couldn't find a clear alternative in JuliaLang's manual, which is 
understandable given their resource constraints and priorities. 

If my suggestion here is bollocks, please comment. 
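
The alias idea can also be done with a symbolic link created from Julia itself. A self-contained sketch using temporary directories (the `mypackage` names are illustrative, matching the suggestion above):

```julia
# Demonstrate that edits made through the link change the linked directory:
tmp  = mktempdir()
pkg  = joinpath(tmp, "mypackage");       mkdir(pkg)      # stands in for ~/.julia/v0.4/mypackage
link = joinpath(tmp, "mypackage_alias"); symlink(pkg, link)  # stands in for the Google Drive alias

write(joinpath(link, "src.jl"), "f() = 1")               # edit through the alias...
@assert read(joinpath(pkg, "src.jl"), String) == "f() = 1"  # ...and the package sees it
```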

On Friday, December 19, 2014 at 10:47:31 PM UTC-2, David van Leeuwen wrote:
>
> Hi, 
>
> On Wednesday, December 17, 2014 11:21:44 PM UTC+1, Seth wrote:
>>
>> I'm wondering whether folks are actually basing their repos in ~/.julia 
>> and doing their development and commits from there, or whether there's some 
>> other process that allows development to happen in a more standard location 
>> than a hidden directory off of ~ while still allowing use of Pkg. I'm 
>> probably overlooking something trivial.
>>
>> What's a recommended setup/process for creating packages using Pkg to 
>> manage them?
>>
>
> I'm not sure what the right answer would be, but I have been struggling 
> with it a lot.  This is my set-up for most of the stuff I do.  
>
>  - I develop packages in their own directories on the user filesystem. 
>  For me, that would be `.../julia//` .  
>  - from there, in `./src/` I have the various source files.  One, 
> specifically is called "nomodule.jl".  This contains a 
>- "require()"
>- "include()"
>- ...
>  - Each time I change something in the code, I do a `reload("nomodule.jl")` 
> from the REPL.  This replaces all functions, except for the type 
> definitions---they can't be updated
>  - When I need to change the types, I need to do a `ctrl-D; julia; 
> reload("nomodule.jl")` which can be a bit of a pain
>  - When I am happy with the functionality, I make this code into a module, 
> using a file ".jl", very similar to the "nomodule.jl" but with the 
> "require" replaced by "include"
>  - I sync this then with github
>  - If I need the functionality from another working directory, I do a 
> `Pkg.clone()` to load a sort-of-stable version into ~/.julia
>  - When I am really happy with the code, I do a pull request to 
> METADATA.jl for my package to be included in the standard set of Julia 
> packages. 
>
>  - If I need functionality on some other machines, I sync using git, 
> either through github or directly.
>
>  Cheers, 
>
> ---david
>


Re: [julia-users] Package installation directory: dealing with multiple Julia version

2016-08-29 Thread Kevin Liu
Hi Tim, I changed my package directory on OSX to be in Google Drive, but 
*using* the package on the REPL calls it from the default location, 
/.julia/v0.4... How do I change *using* to call the package from Google 
Drive? Thanks, Kevin
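
One way to make `using` look outside ~/.julia is to add the directory to `LOAD_PATH`; a sketch (the path is illustrative). In the 0.4 era this line would typically go in ~/.juliarc.jl; in current Julia the equivalent is ~/.julia/config/startup.jl or the `JULIA_LOAD_PATH` environment variable:

```julia
# Add a non-default package location to the load path:
push!(LOAD_PATH, joinpath(homedir(), "Google Drive"))
# using mypackage   # `using` now also searches the added directory
```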

On Tuesday, September 30, 2014 at 7:14:41 AM UTC-3, Tim Holy wrote:
>
> On Tuesday, September 30, 2014 02:48:23 AM Giulio Valentino Dalla Riva 
> wrote: 
> > Wouldn't this also change the behavior of my Julia? 
>
> Not if you have different aliases, `myjulia` and `centraljulia`. You can 
> define 
> each one to do something different. 
>
> --Tim 
>


Re: [julia-users] dispatch slowdown when iterating over array with abstract values

2016-08-26 Thread Kevin Liu
Thanks Tim for leading me to this thread.

On Friday, August 26, 2016 at 11:06:33 AM UTC-3, Kevin Liu wrote:
>
> Nice video recommendation, Yichao. Thanks.
>
> On Saturday, April 2, 2016 at 1:16:07 PM UTC-3, Yichao Yu wrote:
>>
>> On Sat, Apr 2, 2016 at 10:26 AM, Cedric St-Jean  
>> wrote: 
>> > 
>> >> Therefore there's no way the compiler can rewrite the slow version to 
>> the 
>> >> fast version. 
>> > 
>> > 
>> > It knows that the element type is a Feature, so it could produce: 
>> > 
>> > if isa(features[i], A) 
>> > retval += evaluate(features[i]::A) 
>> > elseif isa(features[i], B) 
>> > retval += evaluate(features[i]::B) 
>> > else 
>> > retval += evaluate(features[i]) 
>> > end 
>>
>> This is kind of the optimization I mentioned but no this will still be 
>> much slower than the other version. 
>> The compiler has no idea what the return type of the third one so this 
>> version is still type unstable and you get dynamic dispatch at every 
>> iteration for the floating point add. Of course there's more 
>> sophisticated transformation that can keep you in the fast path as 
>> long as possible and create extra code to check and handle the slow 
>> cases but it will still be slower. 
>>
>> I also recommand Jeff's talk[1] for a better explaination of the general 
>> idea. 
>>
>> [1] https://www.youtube.com/watch?v=cjzcYM9YhwA 
>>
>> > 
>> > and it would make sense for abstract types that have few subtypes. I 
>> didn't 
>> > realize that dispatch was an order of magnitude slower than type 
>> checking. 
>> > It's easy enough to write a macro generating this expansion, too. 
>> > 
>> > On Saturday, April 2, 2016 at 2:05:20 AM UTC-4, Yichao Yu wrote: 
>> >> 
>> >> On Fri, Apr 1, 2016 at 9:56 PM, Tim Wheeler  
>> wrote: 
>> >> > Hello Julia Users. 
>> >> > 
>> >> > I ran into a weird slowdown issue and reproduced a minimal working 
>> >> > example. 
>> >> > Maybe someone can help shed some light. 
>> >> > 
>> >> > abstract Feature 
>> >> > 
>> >> > type A <: Feature end 
>> >> > evaluate(f::A) = 1.0 
>> >> > 
>> >> > type B <: Feature end 
>> >> > evaluate(f::B) = 0.0 
>> >> > 
>> >> > function slow(features::Vector{Feature}) 
>> >> > retval = 0.0 
>> >> > for i in 1 : length(features) 
>> >> > retval += evaluate(features[i]) 
>> >> > end 
>> >> > retval 
>> >> > end 
>> >> > 
>> >> > function fast(features::Vector{Feature}) 
>> >> > retval = 0.0 
>> >> > for i in 1 : length(features) 
>> >> > if isa(features[i], A) 
>> >> > retval += evaluate(features[i]::A) 
>> >> > else 
>> >> > retval += evaluate(features[i]::B) 
>> >> > end 
>> >> > end 
>> >> > retval 
>> >> > end 
>> >> > 
>> >> > using ProfileView 
>> >> > 
>> >> > features = Feature[] 
>> >> > for i in 1 : 1 
>> >> > push!(features, A()) 
>> >> > end 
>> >> > 
>> >> > slow(features) 
>> >> > @time slow(features) 
>> >> > fast(features) 
>> >> > @time fast(features) 
>> >> > 
>> >> > The output is: 
>> >> > 
>> >> > 0.000136 seconds (10.15 k allocations: 166.417 KB) 
>> >> > 0.12 seconds (5 allocations: 176 bytes) 
>> >> > 
>> >> > 
>> >> > This is a HUGE difference! Am I missing something big? Is there a 
>> good 
>> >> > way 
>> >> > to inspect code to figure out where I am going wrong? 
>> >> 
>> >> This is because of type instability as you will find in the 
>> performance 
>> >> tips. 
>> >> Note that slow and fast are not equivalent since the fast version only 
>> >> accepts `A` or `B` but the slow version accepts any subtype of Feature 
>> >> that you may ever define. Therefore there's no way the compiler can 
>> >> rewrite the slow version to the fast version. 
>> >> There are optimizations that can be applied to bring down the gap but 
>> >> there'll always be a large difference between the two. 
>> >> 
>> >> > 
>> >> > 
>> >> > Thank you in advance for any guidance. 
>> >> > 
>> >> > 
>> >> > -Tim 
>> >> > 
>> >> > 
>> >> > 
>> >> > 
>> >> > 
>>
>
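
The manual union-splitting Cedric sketches above can be collected into a single runnable snippet (syntax updated to Julia 1.x: `abstract type`/`struct` instead of the 0.x `abstract`/`type` in the thread):

```julia
abstract type Feature end

struct A <: Feature end
evaluate(f::A) = 1.0

struct B <: Feature end
evaluate(f::B) = 0.0

# Type-unstable: the vector's element type is abstract, so every call to
# `evaluate` dispatches dynamically and the return type is unknown.
function slow(features::Vector{Feature})
    retval = 0.0
    for f in features
        retval += evaluate(f)
    end
    retval
end

# Manual union splitting: the isa branches hand the compiler concrete
# types, so each branch compiles to a direct, inlinable call.
function fast(features::Vector{Feature})
    retval = 0.0
    for f in features
        if isa(f, A)
            retval += evaluate(f::A)
        else
            retval += evaluate(f::B)
        end
    end
    retval
end

features = Feature[A() for _ in 1:10_000]
slow(features) == fast(features)  # both return 10000.0
```

Running `@code_warntype slow(features)` highlights the `Any`-typed accumulator that causes the per-iteration dynamic dispatch Yichao describes.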

Re: [julia-users] dispatch slowdown when iterating over array with abstract values

2016-08-26 Thread Kevin Liu
Nice video recommendation, Yichao. Thanks.

On Saturday, April 2, 2016 at 1:16:07 PM UTC-3, Yichao Yu wrote:
>
> On Sat, Apr 2, 2016 at 10:26 AM, Cedric St-Jean wrote: 
> > 
> >> Therefore there's no way the compiler can rewrite the slow version to 
> the 
> >> fast version. 
> > 
> > 
> > It knows that the element type is a Feature, so it could produce: 
> > 
> > if isa(features[i], A) 
> > retval += evaluate(features[i]::A) 
> > elseif isa(features[i], B) 
> > retval += evaluate(features[i]::B) 
> > else 
> > retval += evaluate(features[i]) 
> > end 
>
> This is kind of the optimization I mentioned but no this will still be 
> much slower than the other version. 
> The compiler has no idea what the return type of the third one so this 
> version is still type unstable and you get dynamic dispatch at every 
> iteration for the floating point add. Of course there's more 
> sophisticated transformation that can keep you in the fast path as 
> long as possible and create extra code to check and handle the slow 
> cases but it will still be slower. 
>
> I also recommend Jeff's talk[1] for a better explanation of the general 
> idea. 
>
> [1] https://www.youtube.com/watch?v=cjzcYM9YhwA 
>
> > 
> > and it would make sense for abstract types that have few subtypes. I 
> didn't 
> > realize that dispatch was an order of magnitude slower than type 
> checking. 
> > It's easy enough to write a macro generating this expansion, too. 
> > 
> > On Saturday, April 2, 2016 at 2:05:20 AM UTC-4, Yichao Yu wrote: 
> >> 
> >> On Fri, Apr 1, 2016 at 9:56 PM, Tim Wheeler  
> wrote: 
> >> > Hello Julia Users. 
> >> > 
> >> > I ran into a weird slowdown issue and reproduced a minimal working 
> >> > example. 
> >> > Maybe someone can help shed some light. 
> >> > 
> >> > abstract Feature 
> >> > 
> >> > type A <: Feature end 
> >> > evaluate(f::A) = 1.0 
> >> > 
> >> > type B <: Feature end 
> >> > evaluate(f::B) = 0.0 
> >> > 
> >> > function slow(features::Vector{Feature}) 
> >> > retval = 0.0 
> >> > for i in 1 : length(features) 
> >> > retval += evaluate(features[i]) 
> >> > end 
> >> > retval 
> >> > end 
> >> > 
> >> > function fast(features::Vector{Feature}) 
> >> > retval = 0.0 
> >> > for i in 1 : length(features) 
> >> > if isa(features[i], A) 
> >> > retval += evaluate(features[i]::A) 
> >> > else 
> >> > retval += evaluate(features[i]::B) 
> >> > end 
> >> > end 
> >> > retval 
> >> > end 
> >> > 
> >> > using ProfileView 
> >> > 
> >> > features = Feature[] 
> >> > for i in 1 : 1 
> >> > push!(features, A()) 
> >> > end 
> >> > 
> >> > slow(features) 
> >> > @time slow(features) 
> >> > fast(features) 
> >> > @time fast(features) 
> >> > 
> >> > The output is: 
> >> > 
> >> > 0.000136 seconds (10.15 k allocations: 166.417 KB) 
> >> > 0.12 seconds (5 allocations: 176 bytes) 
> >> > 
> >> > 
> >> > This is a HUGE difference! Am I missing something big? Is there a 
> good 
> >> > way 
> >> > to inspect code to figure out where I am going wrong? 
> >> 
> >> This is because of type instability as you will find in the performance 
> >> tips. 
> >> Note that slow and fast are not equivalent since the fast version only 
> >> accepts `A` or `B` but the slow version accepts any subtype of Feature 
> >> that you may ever define. Therefore there's no way the compiler can 
> >> rewrite the slow version to the fast version. 
> >> There are optimizations that can be applied to bring down the gap but 
> >> there'll always be a large difference between the two. 
> >> 
> >> > 
> >> > 
> >> > Thank you in advance for any guidance. 
> >> > 
> >> > 
> >> > -Tim 
> >> > 
> >> > 
> >> > 
> >> > 
> >> > 
>


Re: [julia-users] Is the master algorithm on the roadmap?

2016-08-25 Thread Kevin Liu
Tim Holy, I am watching your keynote speech at JuliaCon 2016 where you 
mention the best optimization is not doing the computation at all. 

Domingos talks about that in his book, where an efficient kind of learning 
is by analogy, with no model at all, and how numerous scientific 
discoveries have been made that way, e.g. Bohr's analogy of the solar 
system to the atom. Analogizers learn by hypothesizing that entities with 
similar known properties have similar unknown ones. 

MLNs can reproduce structure mapping, the more powerful type of 
analogy, which can make inferences from one domain (solar system) to another 
(atom). This can be done by learning formulas that don't refer to any of 
the specific relations in the source domain (general formulas). 

Seth and Tim have been helping me a lot with putting the pieces together 
for MLN in the repo I created <https://github.com/hpoit/Kenya.jl/issues/2>, and 
more help is always welcome. I would like to write MLN in idiomatic Julia. 
My question to you and the community at the moment is: how do I keep mappings 
of first-order harmonic functions type-stable in Julia? I am just 
getting acquainted with the type system. 
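
A general sketch of the idiom (the `Weighted` type and names here are hypothetical, not from any MLN package): keep a mapping type-stable by making its value type fully concrete, so lookups infer to a known type instead of `Any`:

```julia
# Hypothetical example: a concretely-parameterized value type keeps
# Dict lookups type-stable.
struct Weighted{T}
    value::T
    weight::Float64
end

# Dict{String, Weighted{Float64}} has concrete key and value types,
# so d[k] infers as Weighted{Float64} and d[k].weight as Float64.
weights = Dict{String, Weighted{Float64}}()
weights["f1"] = Weighted(0.5, 1.2)

getweight(d, k) = d[k].weight
```

Storing the values as an abstract type (e.g. `Dict{String, Any}`) would reintroduce exactly the dynamic-dispatch penalty discussed in the dispatch-slowdown thread above.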

On Tuesday, August 9, 2016 at 9:02:25 AM UTC-3, Kevin Liu wrote:
>
> Helping me separate the process in parts and priorities would be a lot of 
> help. 
>
> On Tuesday, August 9, 2016 at 8:41:03 AM UTC-3, Kevin Liu wrote:
>>
>> Tim Holy, what if I could tap into the well of knowledge that you are to 
>> speed up things? Can you imagine if every learner had to start without 
>> priors? 
>>
>> > On Aug 9, 2016, at 07:06, Tim Holy  wrote: 
>> > 
>> > I'd recommend starting by picking a very small project. For example, 
>> fix a bug 
>> > or implement a small improvement in a package that you already find 
>> useful or 
>> > interesting. That way you'll get some guidance while making a positive 
>> > contribution; once you know more about julia, it will be easier to see 
>> your 
>> > way forward. 
>> > 
>> > Best, 
>> > --Tim 
>> > 
>> >> On Monday, August 8, 2016 8:22:01 PM CDT Kevin Liu wrote: 
>> >> I have no idea where to start and where to finish. Founders' help 
>> would be 
>> >> wonderful. 
>> >> 
>> >>> On Tuesday, August 9, 2016 at 12:19:26 AM UTC-3, Kevin Liu wrote: 
>> >>> After which I have to code Felix into Julia, a relational optimizer 
>> for 
>> >>> statistical inference with Tuffy <http://i.stanford.edu/hazy/tuffy/> 
>> >>> inside, for enterprise settings. 
>> >>> 
>> >>>> On Tuesday, August 9, 2016 at 12:07:32 AM UTC-3, Kevin Liu wrote: 
>> >>>> Can I get tips on bringing Alchemy's optimized Tuffy 
>> >>>> <http://i.stanford.edu/hazy/tuffy/> in Java to Julia while showing 
>> the 
>> >>>> best of Julia? I am going for the most correct way, even if it means 
>> >>>> coding 
>> >>>> Tuffy into C and Julia. 
>> >>>> 
>> >>>>> On Sunday, August 7, 2016 at 8:34:37 PM UTC-3, Kevin Liu wrote: 
>> >>>>> I'll try to build it, compare it, and show it to you guys. I 
>> offered to 
>> >>>>> do this as work. I am waiting to see if they will accept it. 
>> >>>>> 
>> >>>>>> On Sunday, August 7, 2016 at 6:15:50 PM UTC-3, Stefan Karpinski 
>> wrote: 
>> >>>>>> Kevin, as previously requested by Isaiah, please take this to some 
>> >>>>>> other forum or maybe start a blog. 
>> >>>>>> 
>> >>>>>>> On Sat, Aug 6, 2016 at 10:53 PM, Kevin Liu  
>> wrote: 
>> >>>>>>> Symmetry-based learning, Domingos, 2014 
>> >>>>>>> 
>> https://www.microsoft.com/en-us/research/video/symmetry-based-learning 
>> >>>>>>> / 
>> >>>>>>> 
>> >>>>>>> Approach 2: Deep symmetry networks generalize convolutional 
>> neural 
>> >>>>>>> networks by tying parameters and pooling over an arbitrary 
>> symmetry 
>> >>>>>>> group, 
>> >>>>>>> not just the translation group. In preliminary experiments, they 
>> >>>>>>> outperformed convnets on a digit recognition task. 
>> >>>>>>> 
>> >>>>>>>> On Friday, August 5, 2016 at 4:56:45 PM UTC-3, Kevin Liu wrote: 
>> >

Re: [julia-users] Is the master algorithm on the roadmap?

2016-08-09 Thread Kevin Liu
Helping me separate the process in parts and priorities would be a lot of 
help. 

On Tuesday, August 9, 2016 at 8:41:03 AM UTC-3, Kevin Liu wrote:
>
> Tim Holy, what if I could tap into the well of knowledge that you are to 
> speed up things? Can you imagine if every learner had to start without 
> priors? 
>
> > On Aug 9, 2016, at 07:06, Tim Holy  wrote: 
> > 
> > I'd recommend starting by picking a very small project. For example, fix 
> a bug 
> > or implement a small improvement in a package that you already find 
> useful or 
> > interesting. That way you'll get some guidance while making a positive 
> > contribution; once you know more about julia, it will be easier to see 
> your 
> > way forward. 
> > 
> > Best, 
> > --Tim 
> > 
> >> On Monday, August 8, 2016 8:22:01 PM CDT Kevin Liu wrote: 
> >> I have no idea where to start and where to finish. Founders' help would 
> be 
> >> wonderful. 
> >> 
> >>> On Tuesday, August 9, 2016 at 12:19:26 AM UTC-3, Kevin Liu wrote: 
> >>> After which I have to code Felix into Julia, a relational optimizer 
> for 
> >>> statistical inference with Tuffy <http://i.stanford.edu/hazy/tuffy/> 
> >>> inside, for enterprise settings. 
> >>> 
> >>>> On Tuesday, August 9, 2016 at 12:07:32 AM UTC-3, Kevin Liu wrote: 
> >>>> Can I get tips on bringing Alchemy's optimized Tuffy 
> >>>> <http://i.stanford.edu/hazy/tuffy/> in Java to Julia while showing 
> the 
> >>>> best of Julia? I am going for the most correct way, even if it means 
> >>>> coding 
> >>>> Tuffy into C and Julia. 
> >>>> 
> >>>>> On Sunday, August 7, 2016 at 8:34:37 PM UTC-3, Kevin Liu wrote: 
> >>>>> I'll try to build it, compare it, and show it to you guys. I offered 
> to 
> >>>>> do this as work. I am waiting to see if they will accept it. 
> >>>>> 
> >>>>>> On Sunday, August 7, 2016 at 6:15:50 PM UTC-3, Stefan Karpinski 
> wrote: 
> >>>>>> Kevin, as previously requested by Isaiah, please take this to some 
> >>>>>> other forum or maybe start a blog. 
> >>>>>> 
> >>>>>>> On Sat, Aug 6, 2016 at 10:53 PM, Kevin Liu  
> wrote: 
> >>>>>>> Symmetry-based learning, Domingos, 2014 
> >>>>>>> 
> https://www.microsoft.com/en-us/research/video/symmetry-based-learning 
> >>>>>>> / 
> >>>>>>> 
> >>>>>>> Approach 2: Deep symmetry networks generalize convolutional neural 
> >>>>>>> networks by tying parameters and pooling over an arbitrary 
> symmetry 
> >>>>>>> group, 
> >>>>>>> not just the translation group. In preliminary experiments, they 
> >>>>>>> outperformed convnets on a digit recognition task. 
> >>>>>>> 
> >>>>>>>> On Friday, August 5, 2016 at 4:56:45 PM UTC-3, Kevin Liu wrote: 
> >>>>>>>> Minsky died of a cerebral hemorrhage at the age of 88.[40] 
> >>>>>>>> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-40> Ray 
> >>>>>>>> Kurzweil <https://en.wikipedia.org/wiki/Ray_Kurzweil> says he 
> was 
> >>>>>>>> contacted by the cryonics organization Alcor Life Extension 
> >>>>>>>> Foundation 
> >>>>>>>> <https://en.wikipedia.org/wiki/Alcor_Life_Extension_Foundation> 
> >>>>>>>> seeking 
> >>>>>>>> Minsky's body.[41] 
> >>>>>>>> <
> https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-Kurzweil-41> 
> >>>>>>>> Kurzweil believes that Minsky was cryonically preserved by Alcor 
> and 
> >>>>>>>> will be revived by 2045.[41] 
> >>>>>>>> <
> https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-Kurzweil-41> 
> >>>>>>>> Minsky 
> >>>>>>>> was a member of Alcor's Scientific Advisory Board 
> >>>>>>>> <https://en.wikipedia.org/wiki/Advisory_Board>.[42] 
> >>>>>>>> <
> https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-AlcorBoard-42> 
> >>>>>>>> In 

Re: [julia-users] Is the master algorithm on the roadmap?

2016-08-09 Thread Kevin Liu
Tim Holy, what if I could tap into the well of knowledge that you are to speed 
up things? Can you imagine if every learner had to start without priors? 

> On Aug 9, 2016, at 07:06, Tim Holy  wrote:
> 
> I'd recommend starting by picking a very small project. For example, fix a 
> bug 
> or implement a small improvement in a package that you already find useful or 
> interesting. That way you'll get some guidance while making a positive 
> contribution; once you know more about julia, it will be easier to see your 
> way forward.
> 
> Best,
> --Tim
> 
>> On Monday, August 8, 2016 8:22:01 PM CDT Kevin Liu wrote:
>> I have no idea where to start and where to finish. Founders' help would be
>> wonderful.
>> 
>>> On Tuesday, August 9, 2016 at 12:19:26 AM UTC-3, Kevin Liu wrote:
>>> After which I have to code Felix into Julia, a relational optimizer for
>>> statistical inference with Tuffy <http://i.stanford.edu/hazy/tuffy/>
>>> inside, for enterprise settings.
>>> 
>>>> On Tuesday, August 9, 2016 at 12:07:32 AM UTC-3, Kevin Liu wrote:
>>>> Can I get tips on bringing Alchemy's optimized Tuffy
>>>> <http://i.stanford.edu/hazy/tuffy/> in Java to Julia while showing the
>>>> best of Julia? I am going for the most correct way, even if it means
>>>> coding
>>>> Tuffy into C and Julia.
>>>> 
>>>>> On Sunday, August 7, 2016 at 8:34:37 PM UTC-3, Kevin Liu wrote:
>>>>> I'll try to build it, compare it, and show it to you guys. I offered to
>>>>> do this as work. I am waiting to see if they will accept it.
>>>>> 
>>>>>> On Sunday, August 7, 2016 at 6:15:50 PM UTC-3, Stefan Karpinski wrote:
>>>>>> Kevin, as previously requested by Isaiah, please take this to some
>>>>>> other forum or maybe start a blog.
>>>>>> 
>>>>>>> On Sat, Aug 6, 2016 at 10:53 PM, Kevin Liu  wrote:
>>>>>>> Symmetry-based learning, Domingos, 2014
>>>>>>> https://www.microsoft.com/en-us/research/video/symmetry-based-learning
>>>>>>> /
>>>>>>> 
>>>>>>> Approach 2: Deep symmetry networks generalize convolutional neural
>>>>>>> networks by tying parameters and pooling over an arbitrary symmetry
>>>>>>> group,
>>>>>>> not just the translation group. In preliminary experiments, they
>>>>>>> outperformed convnets on a digit recognition task.
>>>>>>> 
>>>>>>>> On Friday, August 5, 2016 at 4:56:45 PM UTC-3, Kevin Liu wrote:
>>>>>>>> Minsky died of a cerebral hemorrhage at the age of 88.[40]
>>>>>>>> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-40> Ray
>>>>>>>> Kurzweil <https://en.wikipedia.org/wiki/Ray_Kurzweil> says he was
>>>>>>>> contacted by the cryonics organization Alcor Life Extension
>>>>>>>> Foundation
>>>>>>>> <https://en.wikipedia.org/wiki/Alcor_Life_Extension_Foundation>
>>>>>>>> seeking
>>>>>>>> Minsky's body.[41]
>>>>>>>> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-Kurzweil-41>
>>>>>>>> Kurzweil believes that Minsky was cryonically preserved by Alcor and
>>>>>>>> will be revived by 2045.[41]
>>>>>>>> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-Kurzweil-41>
>>>>>>>> Minsky
>>>>>>>> was a member of Alcor's Scientific Advisory Board
>>>>>>>> <https://en.wikipedia.org/wiki/Advisory_Board>.[42]
>>>>>>>> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-AlcorBoard-42>
>>>>>>>> In
>>>>>>>> keeping with their policy of protecting privacy, Alcor will neither
>>>>>>>> confirm
>>>>>>>> nor deny that Alcor has cryonically preserved Minsky.[43]
>>>>>>>> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-43>
>>>>>>>> 
>>>>>>>> We better do a good job.
>>>>>>>> 
>>>>>>>>> On Friday, August 5, 2016 at 4:45:42 PM UTC-3, Kevin Liu wrote:
>>>>>>>>> *So, I think in the next 20 years (2003), if we can get rid of all
>>>>>>>>

Re: [julia-users] Is the master algorithm on the roadmap?

2016-08-08 Thread Kevin Liu
I have no idea where to start and where to finish. Founders' help would be 
wonderful. 

On Tuesday, August 9, 2016 at 12:19:26 AM UTC-3, Kevin Liu wrote:
>
> After which I have to code Felix into Julia, a relational optimizer for 
> statistical inference with Tuffy <http://i.stanford.edu/hazy/tuffy/> 
> inside, for enterprise settings.
>
> On Tuesday, August 9, 2016 at 12:07:32 AM UTC-3, Kevin Liu wrote:
>>
>> Can I get tips on bringing Alchemy's optimized Tuffy 
>> <http://i.stanford.edu/hazy/tuffy/> in Java to Julia while showing the 
>> best of Julia? I am going for the most correct way, even if it means coding 
>> Tuffy into C and Julia.
>>
>> On Sunday, August 7, 2016 at 8:34:37 PM UTC-3, Kevin Liu wrote:
>>>
>>> I'll try to build it, compare it, and show it to you guys. I offered to 
>>> do this as work. I am waiting to see if they will accept it. 
>>>
>>> On Sunday, August 7, 2016 at 6:15:50 PM UTC-3, Stefan Karpinski wrote:
>>>>
>>>> Kevin, as previously requested by Isaiah, please take this to some 
>>>> other forum or maybe start a blog.
>>>>
>>>> On Sat, Aug 6, 2016 at 10:53 PM, Kevin Liu  wrote:
>>>>
>>>>> Symmetry-based learning, Domingos, 2014 
>>>>> https://www.microsoft.com/en-us/research/video/symmetry-based-learning/
>>>>>
>>>>> Approach 2: Deep symmetry networks generalize convolutional neural 
>>>>> networks by tying parameters and pooling over an arbitrary symmetry 
>>>>> group, 
>>>>> not just the translation group. In preliminary experiments, they 
>>>>> outperformed convnets on a digit recognition task. 
>>>>>
>>>>> On Friday, August 5, 2016 at 4:56:45 PM UTC-3, Kevin Liu wrote:
>>>>>>
>>>>>> Minsky died of a cerebral hemorrhage at the age of 88.[40] 
>>>>>> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-40> Ray 
>>>>>> Kurzweil <https://en.wikipedia.org/wiki/Ray_Kurzweil> says he was 
>>>>>> contacted by the cryonics organization Alcor Life Extension 
>>>>>> Foundation 
>>>>>> <https://en.wikipedia.org/wiki/Alcor_Life_Extension_Foundation> seeking 
>>>>>> Minsky's body.[41] 
>>>>>> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-Kurzweil-41> 
>>>>>> Kurzweil 
>>>>>> believes that Minsky was cryonically preserved by Alcor and will be 
>>>>>> revived 
>>>>>> by 2045.[41] 
>>>>>> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-Kurzweil-41> 
>>>>>> Minsky 
>>>>>> was a member of Alcor's Scientific Advisory Board 
>>>>>> <https://en.wikipedia.org/wiki/Advisory_Board>.[42] 
>>>>>> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-AlcorBoard-42> In 
>>>>>> keeping with their policy of protecting privacy, Alcor will neither 
>>>>>> confirm 
>>>>>> nor deny that Alcor has cryonically preserved Minsky.[43] 
>>>>>> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-43> 
>>>>>>
>>>>>> We better do a good job. 
>>>>>>
>>>>>> On Friday, August 5, 2016 at 4:45:42 PM UTC-3, Kevin Liu wrote:
>>>>>>>
>>>>>>> *So, I think in the next 20 years (2003), if we can get rid of all 
>>>>>>> of the traditional approaches to artificial intelligence, like neural 
>>>>>>> nets 
>>>>>>> and genetic algorithms and rule-based systems, and just turn our sights 
>>>>>>> a 
>>>>>>> little bit higher to say, can we make a system that can use all those 
>>>>>>> things for the right kind of problem? Some problems are good for neural 
>>>>>>> nets; we know that others, neural nets are hopeless on them. Genetic 
>>>>>>> algorithms are great for certain things; I suspect I know what they're 
>>>>>>> bad 
>>>>>>> at, and I won't tell you. (Laughter)*  - Minsky, founder of CSAIL 
>>>>>>> MIT
>>>>>>>
>>>>>>> *Those programmers tried to find the single best way to represent 
>>>>>>> knowledge - Only Logic protects us from paradox.* - Minsky (see 
>>>>>>> attachment fr

Re: [julia-users] Is the master algorithm on the roadmap?

2016-08-08 Thread Kevin Liu
After which I have to code Felix into Julia, a relational optimizer for 
statistical inference with Tuffy <http://i.stanford.edu/hazy/tuffy/> 
inside, for enterprise settings.

On Tuesday, August 9, 2016 at 12:07:32 AM UTC-3, Kevin Liu wrote:
>
> Can I get tips on bringing Alchemy's optimized Tuffy 
> <http://i.stanford.edu/hazy/tuffy/> in Java to Julia while showing the 
> best of Julia? I am going for the most correct way, even if it means coding 
> Tuffy into C and Julia.
>
> On Sunday, August 7, 2016 at 8:34:37 PM UTC-3, Kevin Liu wrote:
>>
>> I'll try to build it, compare it, and show it to you guys. I offered to 
>> do this as work. I am waiting to see if they will accept it. 
>>
>> On Sunday, August 7, 2016 at 6:15:50 PM UTC-3, Stefan Karpinski wrote:
>>>
>>> Kevin, as previously requested by Isaiah, please take this to some other 
>>> forum or maybe start a blog.
>>>
>>> On Sat, Aug 6, 2016 at 10:53 PM, Kevin Liu  wrote:
>>>
>>>> Symmetry-based learning, Domingos, 2014 
>>>> https://www.microsoft.com/en-us/research/video/symmetry-based-learning/
>>>>
>>>> Approach 2: Deep symmetry networks generalize convolutional neural 
>>>> networks by tying parameters and pooling over an arbitrary symmetry group, 
>>>> not just the translation group. In preliminary experiments, they 
>>>> outperformed convnets on a digit recognition task. 
>>>>
>>>> On Friday, August 5, 2016 at 4:56:45 PM UTC-3, Kevin Liu wrote:
>>>>>
>>>>> Minsky died of a cerebral hemorrhage at the age of 88.[40] 
>>>>> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-40> Ray 
>>>>> Kurzweil <https://en.wikipedia.org/wiki/Ray_Kurzweil> says he was 
>>>>> contacted by the cryonics organization Alcor Life Extension Foundation 
>>>>> <https://en.wikipedia.org/wiki/Alcor_Life_Extension_Foundation> seeking 
>>>>> Minsky's body.[41] 
>>>>> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-Kurzweil-41> 
>>>>> Kurzweil 
>>>>> believes that Minsky was cryonically preserved by Alcor and will be 
>>>>> revived 
>>>>> by 2045.[41] 
>>>>> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-Kurzweil-41> 
>>>>> Minsky 
>>>>> was a member of Alcor's Scientific Advisory Board 
>>>>> <https://en.wikipedia.org/wiki/Advisory_Board>.[42] 
>>>>> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-AlcorBoard-42> In 
>>>>> keeping with their policy of protecting privacy, Alcor will neither 
>>>>> confirm 
>>>>> nor deny that Alcor has cryonically preserved Minsky.[43] 
>>>>> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-43> 
>>>>>
>>>>> We better do a good job. 
>>>>>
>>>>> On Friday, August 5, 2016 at 4:45:42 PM UTC-3, Kevin Liu wrote:
>>>>>>
>>>>>> *So, I think in the next 20 years (2003), if we can get rid of all of 
>>>>>> the traditional approaches to artificial intelligence, like neural nets 
>>>>>> and 
>>>>>> genetic algorithms and rule-based systems, and just turn our sights a 
>>>>>> little bit higher to say, can we make a system that can use all those 
>>>>>> things for the right kind of problem? Some problems are good for neural 
>>>>>> nets; we know that others, neural nets are hopeless on them. Genetic 
>>>>>> algorithms are great for certain things; I suspect I know what they're 
>>>>>> bad 
>>>>>> at, and I won't tell you. (Laughter)*  - Minsky, founder of CSAIL MIT
>>>>>>
>>>>>> *Those programmers tried to find the single best way to represent 
>>>>>> knowledge - Only Logic protects us from paradox.* - Minsky (see 
>>>>>> attachment from his lecture)
>>>>>>
>>>>>> On Friday, August 5, 2016 at 8:12:03 AM UTC-3, Kevin Liu wrote:
>>>>>>>
>>>>>>> Markov Logic Network is being used for the continuous development of 
>>>>>>> drugs to cure cancer at MIT's CanceRX <http://cancerx.mit.edu/>, on 
>>>>>>> DARPA's largest AI project to date, Personalized Assistant that 
>>>>>>> Learns (PAL) <https://pal.sri.com/>, progenitor of Siri. One of 

Re: [julia-users] Is the master algorithm on the roadmap?

2016-08-08 Thread Kevin Liu
Can I get tips on bringing Alchemy's optimized Tuffy 
<http://i.stanford.edu/hazy/tuffy/> in Java to Julia while showing the best 
of Julia? I am going for the most correct way, even if it means coding 
Tuffy into C and Julia.

On Sunday, August 7, 2016 at 8:34:37 PM UTC-3, Kevin Liu wrote:
>
> I'll try to build it, compare it, and show it to you guys. I offered to do 
> this as work. I am waiting to see if they will accept it. 
>
> On Sunday, August 7, 2016 at 6:15:50 PM UTC-3, Stefan Karpinski wrote:
>>
>> Kevin, as previously requested by Isaiah, please take this to some other 
>> forum or maybe start a blog.
>>
>> On Sat, Aug 6, 2016 at 10:53 PM, Kevin Liu  wrote:
>>
>>> Symmetry-based learning, Domingos, 2014 
>>> https://www.microsoft.com/en-us/research/video/symmetry-based-learning/
>>>
>>> Approach 2: Deep symmetry networks generalize convolutional neural 
>>> networks by tying parameters and pooling over an arbitrary symmetry group, 
>>> not just the translation group. In preliminary experiments, they 
>>> outperformed convnets on a digit recognition task. 
>>>
>>> On Friday, August 5, 2016 at 4:56:45 PM UTC-3, Kevin Liu wrote:
>>>>
>>>> Minsky died of a cerebral hemorrhage at the age of 88.[40] 
>>>> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-40> Ray Kurzweil 
>>>> <https://en.wikipedia.org/wiki/Ray_Kurzweil> says he was contacted by 
>>>> the cryonics organization Alcor Life Extension Foundation 
>>>> <https://en.wikipedia.org/wiki/Alcor_Life_Extension_Foundation> seeking 
>>>> Minsky's body.[41] 
>>>> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-Kurzweil-41> 
>>>> Kurzweil 
>>>> believes that Minsky was cryonically preserved by Alcor and will be 
>>>> revived 
>>>> by 2045.[41] 
>>>> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-Kurzweil-41> Minsky 
>>>> was a member of Alcor's Scientific Advisory Board 
>>>> <https://en.wikipedia.org/wiki/Advisory_Board>.[42] 
>>>> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-AlcorBoard-42> In 
>>>> keeping with their policy of protecting privacy, Alcor will neither 
>>>> confirm 
>>>> nor deny that Alcor has cryonically preserved Minsky.[43] 
>>>> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-43> 
>>>>
>>>> We better do a good job. 
>>>>
>>>> On Friday, August 5, 2016 at 4:45:42 PM UTC-3, Kevin Liu wrote:
>>>>>
>>>>> *So, I think in the next 20 years (2003), if we can get rid of all of 
>>>>> the traditional approaches to artificial intelligence, like neural nets 
>>>>> and 
>>>>> genetic algorithms and rule-based systems, and just turn our sights a 
>>>>> little bit higher to say, can we make a system that can use all those 
>>>>> things for the right kind of problem? Some problems are good for neural 
>>>>> nets; we know that others, neural nets are hopeless on them. Genetic 
>>>>> algorithms are great for certain things; I suspect I know what they're 
>>>>> bad 
>>>>> at, and I won't tell you. (Laughter)*  - Minsky, founder of CSAIL MIT
>>>>>
>>>>> *Those programmers tried to find the single best way to represent 
>>>>> knowledge - Only Logic protects us from paradox.* - Minsky (see 
>>>>> attachment from his lecture)
>>>>>
>>>>> On Friday, August 5, 2016 at 8:12:03 AM UTC-3, Kevin Liu wrote:
>>>>>>
>>>>>> Markov Logic Network is being used for the continuous development of 
>>>>>> drugs to cure cancer at MIT's CanceRX <http://cancerx.mit.edu/>, on 
>>>>>> DARPA's largest AI project to date, Personalized Assistant that 
>>>>>> Learns (PAL) <https://pal.sri.com/>, progenitor of Siri. One of 
>>>>>> Alchemy's largest applications to date was to learn a semantic network 
>>>>>> (knowledge graph as Google calls it) from the web. 
>>>>>>
>>>>>> Some on Probabilistic Inductive Logic Programming / Probabilistic 
>>>>>> Logic Programming / Statistical Relational Learning from CSAIL 
>>>>>> <http://people.csail.mit.edu/kersting/ecmlpkdd05_pilp/pilp_ida2005_tut.pdf>
>>>>>>  (my 
>>>>>> understanding is Alc

Re: [julia-users] Is the master algorithm on the roadmap?

2016-08-07 Thread Kevin Liu
I'll try to build it, compare it, and show it to you guys. I offered to do 
this as work. I am waiting to see if they will accept it. 

On Sunday, August 7, 2016 at 6:15:50 PM UTC-3, Stefan Karpinski wrote:
>
> Kevin, as previously requested by Isaiah, please take this to some other 
> forum or maybe start a blog.
>
> On Sat, Aug 6, 2016 at 10:53 PM, Kevin Liu 
> > wrote:
>
>> Symmetry-based learning, Domingos, 2014 
>> https://www.microsoft.com/en-us/research/video/symmetry-based-learning/
>>
>> Approach 2: Deep symmetry networks generalize convolutional neural 
>> networks by tying parameters and pooling over an arbitrary symmetry group, 
>> not just the translation group. In preliminary experiments, they 
>> outperformed convnets on a digit recognition task. 
>>
>> On Friday, August 5, 2016 at 4:56:45 PM UTC-3, Kevin Liu wrote:
>>>
>>> Minsky died of a cerebral hemorrhage at the age of 88.[40] 
>>> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-40> Ray Kurzweil 
>>> <https://en.wikipedia.org/wiki/Ray_Kurzweil> says he was contacted by 
>>> the cryonics organization Alcor Life Extension Foundation 
>>> <https://en.wikipedia.org/wiki/Alcor_Life_Extension_Foundation> seeking 
>>> Minsky's body.[41] 
>>> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-Kurzweil-41> 
>>> Kurzweil 
>>> believes that Minsky was cryonically preserved by Alcor and will be revived 
>>> by 2045.[41] 
>>> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-Kurzweil-41> Minsky 
>>> was a member of Alcor's Scientific Advisory Board 
>>> <https://en.wikipedia.org/wiki/Advisory_Board>.[42] 
>>> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-AlcorBoard-42> In 
>>> keeping with their policy of protecting privacy, Alcor will neither confirm 
>>> nor deny that Alcor has cryonically preserved Minsky.[43] 
>>> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-43> 
>>>
>>> We better do a good job. 
>>>
>>> On Friday, August 5, 2016 at 4:45:42 PM UTC-3, Kevin Liu wrote:
>>>>
>>>> *So, I think in the next 20 years (2003), if we can get rid of all of 
>>>> the traditional approaches to artificial intelligence, like neural nets 
>>>> and 
>>>> genetic algorithms and rule-based systems, and just turn our sights a 
>>>> little bit higher to say, can we make a system that can use all those 
>>>> things for the right kind of problem? Some problems are good for neural 
>>>> nets; we know that others, neural nets are hopeless on them. Genetic 
>>>> algorithms are great for certain things; I suspect I know what they're bad 
>>>> at, and I won't tell you. (Laughter)*  - Minsky, co-founder of MIT's AI Lab (now CSAIL)
>>>>
>>>> *Those programmers tried to find the single best way to represent 
>>>> knowledge - Only Logic protects us from paradox.* - Minsky (see 
>>>> attachment from his lecture)
>>>>
>>>> On Friday, August 5, 2016 at 8:12:03 AM UTC-3, Kevin Liu wrote:
>>>>>
>>>>> Markov Logic Network is being used for the continuous development of 
>>>>> drugs to cure cancer at MIT's CanceRX <http://cancerx.mit.edu/>, on 
>>>>> DARPA's largest AI project to date, Personalized Assistant that 
>>>>> Learns (PAL) <https://pal.sri.com/>, progenitor of Siri. One of 
>>>>> Alchemy's largest applications to date was to learn a semantic network 
>>>>> (knowledge graph as Google calls it) from the web. 
>>>>>
>>>>> Some on Probabilistic Inductive Logic Programming / Probabilistic 
>>>>> Logic Programming / Statistical Relational Learning from CSAIL 
>>>>> <http://people.csail.mit.edu/kersting/ecmlpkdd05_pilp/pilp_ida2005_tut.pdf>
>>>>>  (my 
>>>>> understanding is Alchemy does PILP from entailment, proofs, and 
>>>>> interpretation)
>>>>>
>>>>> The MIT Probabilistic Computing Project (where there is Picture, an 
>>>>> extension of Julia, for computer vision; Watch the video from Vikash) 
>>>>> <http://probcomp.csail.mit.edu/index.html>
>>>>>
>>>>> Probabilistic programming could do for Bayesian ML what Theano has 
>>>>> done for neural networks. 
>>>>> <http://www.inference.vc/deep-learning-is-easy/> - Ferenc Huszár
>>>>>

Re: [julia-users] Is the master algorithm on the roadmap?

2016-08-06 Thread Kevin Liu
Symmetry-based learning, Domingos, 2014 
https://www.microsoft.com/en-us/research/video/symmetry-based-learning/

Approach 2: Deep symmetry networks generalize convolutional neural networks 
by tying parameters and pooling over an arbitrary symmetry group, not just 
the translation group. In preliminary experiments, they outperformed 
convnets on a digit recognition task. 

On Friday, August 5, 2016 at 4:56:45 PM UTC-3, Kevin Liu wrote:
>
> Minsky died of a cerebral hemorrhage at the age of 88.[40] 
> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-40> Ray Kurzweil 
> <https://en.wikipedia.org/wiki/Ray_Kurzweil> says he was contacted by the 
> cryonics organization Alcor Life Extension Foundation 
> <https://en.wikipedia.org/wiki/Alcor_Life_Extension_Foundation> seeking 
> Minsky's body.[41] 
> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-Kurzweil-41> Kurzweil 
> believes that Minsky was cryonically preserved by Alcor and will be revived 
> by 2045.[41] 
> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-Kurzweil-41> Minsky 
> was a member of Alcor's Scientific Advisory Board 
> <https://en.wikipedia.org/wiki/Advisory_Board>.[42] 
> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-AlcorBoard-42> In 
> keeping with their policy of protecting privacy, Alcor will neither confirm 
> nor deny that Alcor has cryonically preserved Minsky.[43] 
> <https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-43> 
>
> We better do a good job. 
>
> On Friday, August 5, 2016 at 4:45:42 PM UTC-3, Kevin Liu wrote:
>>
>> *So, I think in the next 20 years (2003), if we can get rid of all of the 
>> traditional approaches to artificial intelligence, like neural nets and 
>> genetic algorithms and rule-based systems, and just turn our sights a 
>> little bit higher to say, can we make a system that can use all those 
>> things for the right kind of problem? Some problems are good for neural 
>> nets; we know that others, neural nets are hopeless on them. Genetic 
>> algorithms are great for certain things; I suspect I know what they're bad 
>> at, and I won't tell you. (Laughter)*  - Minsky, co-founder of MIT's AI Lab (now CSAIL)
>>
>> *Those programmers tried to find the single best way to represent 
>> knowledge - Only Logic protects us from paradox.* - Minsky (see 
>> attachment from his lecture)
>>
>> On Friday, August 5, 2016 at 8:12:03 AM UTC-3, Kevin Liu wrote:
>>>
>>> Markov Logic Network is being used for the continuous development of 
>>> drugs to cure cancer at MIT's CanceRX <http://cancerx.mit.edu/>, on 
>>> DARPA's largest AI project to date, Personalized Assistant that Learns 
>>> (PAL) <https://pal.sri.com/>, progenitor of Siri. One of Alchemy's 
>>> largest applications to date was to learn a semantic network (knowledge 
>>> graph as Google calls it) from the web. 
>>>
>>> Some on Probabilistic Inductive Logic Programming / Probabilistic Logic 
>>> Programming / Statistical Relational Learning from CSAIL 
>>> <http://people.csail.mit.edu/kersting/ecmlpkdd05_pilp/pilp_ida2005_tut.pdf> 
>>> (my 
>>> understanding is Alchemy does PILP from entailment, proofs, and 
>>> interpretation)
>>>
>>> The MIT Probabilistic Computing Project (where there is Picture, an 
>>> extension of Julia, for computer vision; Watch the video from Vikash) 
>>> <http://probcomp.csail.mit.edu/index.html>
>>>
>>> Probabilistic programming could do for Bayesian ML what Theano has done 
>>> for neural networks. <http://www.inference.vc/deep-learning-is-easy/> - 
>>> Ferenc Huszár
>>>
>>> Picture doesn't appear to be open-source, even though its Paper is 
>>> available. 
>>>
>>> I'm in the process of comparing the Picture Paper and Alchemy code and 
>>> would like to have an open-source PILP from Julia that combines the best of 
>>> both. 
>>>
>>> On Wednesday, August 3, 2016 at 5:01:02 PM UTC-3, Christof Stocker wrote:
>>>>
>>>> This sounds like it could be a great contribution. I shall keep a 
>>>> curious eye on your progress
>>>>
>>>> Am Mittwoch, 3. August 2016 21:53:54 UTC+2 schrieb Kevin Liu:
>>>>>
>>>>> Thanks for the advice Cristof. I am only interested in people wanting 
>>>>> to code it in Julia, from R by Domingos. The algo has been successfully 
>>>>> applied in many areas, even though there are many other areas remaining. 
>>>>>

Re: [julia-users] Is the master algorithm on the roadmap?

2016-08-05 Thread Kevin Liu
Minsky died of a cerebral hemorrhage at the age of 88.[40] 
<https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-40> Ray Kurzweil 
<https://en.wikipedia.org/wiki/Ray_Kurzweil> says he was contacted by the 
cryonics organization Alcor Life Extension Foundation 
<https://en.wikipedia.org/wiki/Alcor_Life_Extension_Foundation> seeking 
Minsky's body.[41] 
<https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-Kurzweil-41> Kurzweil 
believes that Minsky was cryonically preserved by Alcor and will be revived 
by 2045.[41] 
<https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-Kurzweil-41> Minsky 
was a member of Alcor's Scientific Advisory Board 
<https://en.wikipedia.org/wiki/Advisory_Board>.[42] 
<https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-AlcorBoard-42> In 
keeping with their policy of protecting privacy, Alcor will neither confirm 
nor deny that Alcor has cryonically preserved Minsky.[43] 
<https://en.wikipedia.org/wiki/Marvin_Minsky#cite_note-43> 

We better do a good job. 

On Friday, August 5, 2016 at 4:45:42 PM UTC-3, Kevin Liu wrote:
>
> *So, I think in the next 20 years (2003), if we can get rid of all of the 
> traditional approaches to artificial intelligence, like neural nets and 
> genetic algorithms and rule-based systems, and just turn our sights a 
> little bit higher to say, can we make a system that can use all those 
> things for the right kind of problem? Some problems are good for neural 
> nets; we know that others, neural nets are hopeless on them. Genetic 
> algorithms are great for certain things; I suspect I know what they're bad 
> at, and I won't tell you. (Laughter)*  - Minsky, co-founder of MIT's AI Lab (now CSAIL)
>
> *Those programmers tried to find the single best way to represent 
> knowledge - Only Logic protects us from paradox.* - Minsky (see 
> attachment from his lecture)
>
> On Friday, August 5, 2016 at 8:12:03 AM UTC-3, Kevin Liu wrote:
>>
>> Markov Logic Network is being used for the continuous development of 
>> drugs to cure cancer at MIT's CanceRX <http://cancerx.mit.edu/>, on 
>> DARPA's largest AI project to date, Personalized Assistant that Learns 
>> (PAL) <https://pal.sri.com/>, progenitor of Siri. One of Alchemy's 
>> largest applications to date was to learn a semantic network (knowledge 
>> graph as Google calls it) from the web. 
>>
>> Some on Probabilistic Inductive Logic Programming / Probabilistic Logic 
>> Programming / Statistical Relational Learning from CSAIL 
>> <http://people.csail.mit.edu/kersting/ecmlpkdd05_pilp/pilp_ida2005_tut.pdf> 
>> (my 
>> understanding is Alchemy does PILP from entailment, proofs, and 
>> interpretation)
>>
>> The MIT Probabilistic Computing Project (where there is Picture, an 
>> extension of Julia, for computer vision; Watch the video from Vikash) 
>> <http://probcomp.csail.mit.edu/index.html>
>>
>> Probabilistic programming could do for Bayesian ML what Theano has done 
>> for neural networks. <http://www.inference.vc/deep-learning-is-easy/> - 
>> Ferenc Huszár
>>
>> Picture doesn't appear to be open-source, even though its Paper is 
>> available. 
>>
>> I'm in the process of comparing the Picture Paper and Alchemy code and 
>> would like to have an open-source PILP from Julia that combines the best of 
>> both. 
>>
>> On Wednesday, August 3, 2016 at 5:01:02 PM UTC-3, Christof Stocker wrote:
>>>
>>> This sounds like it could be a great contribution. I shall keep a 
>>> curious eye on your progress
>>>
>>> Am Mittwoch, 3. August 2016 21:53:54 UTC+2 schrieb Kevin Liu:
>>>>
>>>>> Thanks for the advice, Christof. I am only interested in people wanting 
>>>>> to port Domingos' code from R to Julia. The algorithm has been successfully 
>>>>> applied in many areas, even though many other areas remain. 
>>>>
>>>> On Wed, Aug 3, 2016 at 4:45 PM, Christof Stocker wrote:
>>>>
>>>>> Hello Kevin,
>>>>>
>>>>> Enthusiasm is a good thing and you should hold on to that. But to save 
>>>>> yourself some headache or disappointment down the road, I advise a level 
>>>>> head. Nothing is really as obviously solved as it may seem at first 
>>>>> glance after listening to brilliant people explain things. A physics 
>>>>> professor of mine once told me that one of the (he thinks) most malicious 
>>>>> factors in his past students' progress was overstated 
>>>>> results/conclusions

Re: [julia-users] Is the master algorithm on the roadmap?

2016-08-05 Thread Kevin Liu
Markov Logic Network is being used for the continuous development of drugs 
to cure cancer at MIT's CanceRX <http://cancerx.mit.edu/>, on DARPA's 
largest AI project to date, Personalized Assistant that Learns (PAL) 
<https://pal.sri.com/>, progenitor of Siri. One of Alchemy's largest 
applications to date was to learn a semantic network (knowledge graph as 
Google calls it) from the web. 

Some material on Probabilistic Inductive Logic Programming / Probabilistic Logic 
Programming / Statistical Relational Learning from CSAIL 
<http://people.csail.mit.edu/kersting/ecmlpkdd05_pilp/pilp_ida2005_tut.pdf> (my 
understanding is Alchemy does PILP from entailment, proofs, and 
interpretation)

The MIT Probabilistic Computing Project (where there is Picture, an 
extension of Julia, for computer vision; Watch the video from Vikash) 
<http://probcomp.csail.mit.edu/index.html>

Probabilistic programming could do for Bayesian ML what Theano has done for 
neural networks. <http://www.inference.vc/deep-learning-is-easy/> - Ferenc 
Huszár

Picture doesn't appear to be open-source, even though its paper is 
available. 

I'm in the process of comparing the Picture paper and the Alchemy code and 
would like to build an open-source PILP in Julia that combines the best of 
both. 

On Wednesday, August 3, 2016 at 5:01:02 PM UTC-3, Christof Stocker wrote:
>
> This sounds like it could be a great contribution. I shall keep a curious 
> eye on your progress
>
> Am Mittwoch, 3. August 2016 21:53:54 UTC+2 schrieb Kevin Liu:
>>
>> Thanks for the advice, Christof. I am only interested in people wanting to 
>> port Domingos' code from R to Julia. The algorithm has been successfully 
>> applied in many areas, even though many other areas remain. 
>>
>> On Wed, Aug 3, 2016 at 4:45 PM, Christof Stocker  
>> wrote:
>>
>>> Hello Kevin,
>>>
>>> Enthusiasm is a good thing and you should hold on to that. But to save 
>>> yourself some headache or disappointment down the road, I advise a level 
>>> head. Nothing is really as obviously solved as it may seem at first glance 
>>> after listening to brilliant people explain things. A physics professor of 
>>> mine once told me that one of the (he thinks) most malicious factors in his 
>>> past students' progress was overstated results/conclusions by other 
>>> researchers (such as premature announcements from CERN). I am no 
>>> mathematician, but as far as I can judge, the no free lunch theorem is of a 
>>> purely mathematical nature, not something induced empirically. Results of 
>>> that kind are not easy to get rid of. If someone (especially an expert) 
>>> states that such a theorem will prove wrong, I would be inclined to believe 
>>> that he is not speaking literally, but is instead just trying to make a 
>>> point about a more or less practical implication.
>>>
>>>
>>> Am Mittwoch, 3. August 2016 21:27:05 UTC+2 schrieb Kevin Liu:
>>>>
>>>> The Markov logic network represents a probability distribution over the 
>>>> states of a complex system (e.g. a cell) composed of entities, where 
>>>> logic formulas encode the dependencies between them. 
>>>>
>>>> On Wednesday, August 3, 2016 at 4:19:09 PM UTC-3, Kevin Liu wrote:
>>>>>
>>>>> Alchemy is like an inductive Turing machine, to be programmed to learn 
>>>>> broadly or restrictedly.
>>>>>
>>>>> The logic formulas from rules through which it represents can be 
>>>>> inconsistent, incomplete, or even incorrect-- the learning and 
>>>>> probabilistic reasoning will correct them. The key point is that Alchemy 
>>>>> doesn't have to learn from scratch, proving Wolpert and Macready's no 
>>>>> free 
>>>>> lunch theorem wrong by performing well on a variety of classes of 
>>>>> problems, 
>>>>> not just some.
>>>>>
>>>>> On Wednesday, August 3, 2016 at 4:01:15 PM UTC-3, Kevin Liu wrote:
>>>>>>
>>>>>> Hello Community, 
>>>>>>
>>>>>> I'm in the last pages of Pedro Domingos' book, the Master Algo, one 
>>>>>> of two recommended by Bill Gates to learn about AI. 
>>>>>>
>>>>>> From the book, I understand all learners have to represent, evaluate, 
>>>>>> and optimize. There are many types of learners that do this. What 
>>>>>> Domingos 
>>>>>> does is generalize these three parts, (1) using Ma

Re: [julia-users] Is the master algorithm on the roadmap?

2016-08-03 Thread Kevin Liu
Thanks for the advice, Christof. I am only interested in people wanting to
port Domingos' code from R to Julia. The algorithm has been successfully
applied in many areas, even though many other areas remain.

On Wed, Aug 3, 2016 at 4:45 PM, Christof Stocker  wrote:

> Hello Kevin,
>
> Enthusiasm is a good thing and you should hold on to that. But to save
> yourself some headache or disappointment down the road, I advise a level
> head. Nothing is really as obviously solved as it may seem at first glance
> after listening to brilliant people explain things. A physics professor of
> mine once told me that one of the (he thinks) most malicious factors in his
> past students' progress was overstated results/conclusions by other
> researchers (such as premature announcements from CERN). I am no
> mathematician, but as far as I can judge, the no free lunch theorem is of a
> purely mathematical nature, not something induced empirically. Results of
> that kind are not easy to get rid of. If someone (especially an expert)
> states that such a theorem will prove wrong, I would be inclined to believe
> that he is not speaking literally, but is instead just trying to make a
> point about a more or less practical implication.
>
>
> Am Mittwoch, 3. August 2016 21:27:05 UTC+2 schrieb Kevin Liu:
>>
>> The Markov logic network represents a probability distribution over the
>> states of a complex system (e.g. a cell) composed of entities, where
>> logic formulas encode the dependencies between them.
>>
>> On Wednesday, August 3, 2016 at 4:19:09 PM UTC-3, Kevin Liu wrote:
>>>
>>> Alchemy is like an inductive Turing machine, to be programmed to learn
>>> broadly or restrictedly.
>>>
>>> The logic formulas from rules through which it represents can be
>>> inconsistent, incomplete, or even incorrect-- the learning and
>>> probabilistic reasoning will correct them. The key point is that Alchemy
>>> doesn't have to learn from scratch, proving Wolpert and Macready's no free
>>> lunch theorem wrong by performing well on a variety of classes of problems,
>>> not just some.
>>>
>>> On Wednesday, August 3, 2016 at 4:01:15 PM UTC-3, Kevin Liu wrote:
>>>>
>>>> Hello Community,
>>>>
>>>> I'm in the last pages of Pedro Domingos' book, the Master Algo, one of
>>>> two recommended by Bill Gates to learn about AI.
>>>>
>>>> From the book, I understand all learners have to represent, evaluate,
>>>> and optimize. There are many types of learners that do this. What Domingos
>>>> does is generalize these three parts, (1) using Markov Logic Network to
>>>> represent, (2) posterior probability to evaluate, and (3) genetic search
>>>> with gradient descent to optimize. The posterior can be replaced with
>>>> another accuracy measure when that is easier, just as genetic search can be
>>>> replaced by hill climbing. Where there are 15 popular options for representing,
>>>> evaluating, and optimizing, Domingos generalized them into three options.
>>>> The idea is to have one unified learner for any application.
>>>>
>>>> There is code already done in R https://alchemy.cs.washington.edu/. My
>>>> question: is anybody in the community interested in coding it in Julia?
>>>>
>>>> Thanks. Kevin
>>>>
>>>> On Friday, June 3, 2016 at 3:44:09 PM UTC-3, Kevin Liu wrote:
>>>>>
>>>>> https://github.com/tbreloff/OnlineAI.jl/issues/5
>>>>>
>>>>> On Friday, June 3, 2016 at 11:17:28 AM UTC-3, Kevin Liu wrote:
>>>>>>
>>>>>> I plan to write Julia for the rest of my life... given it remains
>>>>>> suitable. I am still reading all of Colah's material on nets. I ran
>>>>>> Mocha.jl a couple weeks ago and was very happy to see it work. Thanks for
>>>>>> jumping in and telling me about OnlineAI.jl, I will look into it once I 
>>>>>> am
>>>>>> ready. From a quick look, perhaps I could help and learn by building a 
>>>>>> very
>>>>>> clear documentation of it. Would really like to see Julia a leap ahead of
>>>>>> other languages, and plan to contribute heavily to it, but at the moment 
>>>>>> am
>>>>>> still getting introduced to CS, programming, and nets at the basic level.
>>>>>>
>>>>>> On Friday, June 3, 2016 at 10:48:15 AM UTC-3, Tom Breloff wrote:
>>>>>>>
>

Re: [julia-users] Is the master algorithm on the roadmap?

2016-08-03 Thread Kevin Liu
Alchemy is also less expensive and less opaque than Watson's meta-learning 
<http://researcher.watson.ibm.com/researcher/files/il-DAVIDBO/multiobjectiveSOMMOSoptimization_c.pdf>:
 
'I believe you have prostate cancer because the decision tree, the genetic 
algorithm, and Naïve Bayes say so, although the multilayer perceptron and 
the SVM disagree.'

On Wednesday, August 3, 2016 at 4:36:52 PM UTC-3, Kevin Liu wrote:
>
> Another important cool thing I think is worth noting: he added the 
> possibility of weights to rules (attachment). Each line is equivalent to a 
> desired conclusion. 
>
> On Wednesday, August 3, 2016 at 4:27:05 PM UTC-3, Kevin Liu wrote:
>>
>> The Markov logic network represents a probability distribution over the 
>> states of a complex system (e.g. a cell) composed of entities, where 
>> logic formulas encode the dependencies between them. 
>>
>> On Wednesday, August 3, 2016 at 4:19:09 PM UTC-3, Kevin Liu wrote:
>>>
>>> Alchemy is like an inductive Turing machine, to be programmed to learn 
>>> broadly or restrictedly.
>>>
>>> The logic formulas from rules through which it represents can be 
>>> inconsistent, incomplete, or even incorrect-- the learning and 
>>> probabilistic reasoning will correct them. The key point is that Alchemy 
>>> doesn't have to learn from scratch, proving Wolpert and Macready's no free 
>>> lunch theorem wrong by performing well on a variety of classes of problems, 
>>> not just some.
>>>
>>> On Wednesday, August 3, 2016 at 4:01:15 PM UTC-3, Kevin Liu wrote:
>>>>
>>>> Hello Community, 
>>>>
>>>> I'm in the last pages of Pedro Domingos' book, the Master Algo, one of 
>>>> two recommended by Bill Gates to learn about AI. 
>>>>
>>>> From the book, I understand all learners have to represent, evaluate, 
>>>> and optimize. There are many types of learners that do this. What Domingos 
>>>> does is generalize these three parts, (1) using Markov Logic Network to 
>>>> represent, (2) posterior probability to evaluate, and (3) genetic search 
>>>> with gradient descent to optimize. The posterior can be replaced with 
>>>> another accuracy measure when that is easier, just as genetic search can be 
>>>> replaced by hill climbing. Where there are 15 popular options for representing, 
>>>> evaluating, and optimizing, Domingos generalized them into three options. 
>>>> The idea is to have one unified learner for any application. 
>>>>
>>>> There is code already done in R https://alchemy.cs.washington.edu/. My 
>>>> question: is anybody in the community interested in coding it in Julia?
>>>>
>>>> Thanks. Kevin
>>>>
>>>> On Friday, June 3, 2016 at 3:44:09 PM UTC-3, Kevin Liu wrote:
>>>>>
>>>>> https://github.com/tbreloff/OnlineAI.jl/issues/5
>>>>>
>>>>> On Friday, June 3, 2016 at 11:17:28 AM UTC-3, Kevin Liu wrote:
>>>>>>
>>>>>> I plan to write Julia for the rest of my life... given it remains 
>>>>>> suitable. I am still reading all of Colah's material on nets. I ran 
>>>>>> Mocha.jl a couple weeks ago and was very happy to see it work. Thanks 
>>>>>> for 
>>>>>> jumping in and telling me about OnlineAI.jl, I will look into it once I 
>>>>>> am 
>>>>>> ready. From a quick look, perhaps I could help and learn by building a 
>>>>>> very 
>>>>>> clear documentation of it. Would really like to see Julia a leap ahead 
>>>>>> of 
>>>>>> other languages, and plan to contribute heavily to it, but at the moment 
>>>>>> am 
>>>>>> still getting introduced to CS, programming, and nets at the basic 
>>>>>> level. 
>>>>>>
>>>>>> On Friday, June 3, 2016 at 10:48:15 AM UTC-3, Tom Breloff wrote:
>>>>>>>
>>>>>>> Kevin: computers that program themselves is a concept which is much 
>>>>>>> closer to reality than most would believe, but julia-users isn't really 
>>>>>>> the 
>>>>>>> best place for this speculation. If you're actually interested in 
>>>>>>> writing 
>>>>>>> code, I'm happy to discuss in OnlineAI.jl. I was thinking about how we 
>>>>>> might tackle code generation using a neural framework I'm working on. 

Re: [julia-users] Is the master algorithm on the roadmap?

2016-08-03 Thread Kevin Liu
Another important cool thing I think is worth noting: he added the 
possibility of weights to rules (attachment). Each line is equivalent to a 
desired conclusion. 

On Wednesday, August 3, 2016 at 4:27:05 PM UTC-3, Kevin Liu wrote:
>
> The Markov logic network represents a probability distribution over the 
> states of a complex system (e.g. a cell) composed of entities, where 
> logic formulas encode the dependencies between them. 
>
> On Wednesday, August 3, 2016 at 4:19:09 PM UTC-3, Kevin Liu wrote:
>>
>> Alchemy is like an inductive Turing machine, to be programmed to learn 
>> broadly or restrictedly.
>>
>> The logic formulas from rules through which it represents can be 
>> inconsistent, incomplete, or even incorrect-- the learning and 
>> probabilistic reasoning will correct them. The key point is that Alchemy 
>> doesn't have to learn from scratch, proving Wolpert and Macready's no free 
>> lunch theorem wrong by performing well on a variety of classes of problems, 
>> not just some.
>>
>> On Wednesday, August 3, 2016 at 4:01:15 PM UTC-3, Kevin Liu wrote:
>>>
>>> Hello Community, 
>>>
>>> I'm in the last pages of Pedro Domingos' book, the Master Algo, one of 
>>> two recommended by Bill Gates to learn about AI. 
>>>
>>> From the book, I understand all learners have to represent, evaluate, 
>>> and optimize. There are many types of learners that do this. What Domingos 
>>> does is generalize these three parts, (1) using Markov Logic Network to 
>>> represent, (2) posterior probability to evaluate, and (3) genetic search 
>>> with gradient descent to optimize. The posterior can be replaced with 
>>> another accuracy measure when that is easier, just as genetic search can be 
>>> replaced by hill climbing. Where there are 15 popular options for representing, 
>>> evaluating, and optimizing, Domingos generalized them into three options. 
>>> The idea is to have one unified learner for any application. 
>>>
>>> There is code already done in R https://alchemy.cs.washington.edu/. My 
>>> question: is anybody in the community interested in coding it in Julia?
>>>
>>> Thanks. Kevin
>>>
>>> On Friday, June 3, 2016 at 3:44:09 PM UTC-3, Kevin Liu wrote:
>>>>
>>>> https://github.com/tbreloff/OnlineAI.jl/issues/5
>>>>
>>>> On Friday, June 3, 2016 at 11:17:28 AM UTC-3, Kevin Liu wrote:
>>>>>
>>>>> I plan to write Julia for the rest of my life... given it remains 
>>>>> suitable. I am still reading all of Colah's material on nets. I ran 
>>>>> Mocha.jl a couple weeks ago and was very happy to see it work. Thanks for 
>>>>> jumping in and telling me about OnlineAI.jl, I will look into it once I 
>>>>> am 
>>>>> ready. From a quick look, perhaps I could help and learn by building a 
>>>>> very 
>>>>> clear documentation of it. Would really like to see Julia a leap ahead of 
>>>>> other languages, and plan to contribute heavily to it, but at the moment 
>>>>> am 
>>>>> still getting introduced to CS, programming, and nets at the basic level. 
>>>>>
>>>>> On Friday, June 3, 2016 at 10:48:15 AM UTC-3, Tom Breloff wrote:
>>>>>>
>>>>>> Kevin: computers that program themselves is a concept which is much 
>>>>>> closer to reality than most would believe, but julia-users isn't really 
>>>>>> the 
>>>>>> best place for this speculation. If you're actually interested in 
>>>>>> writing 
>>>>>> code, I'm happy to discuss in OnlineAI.jl. I was thinking about how we 
>>>>>> might tackle code generation using a neural framework I'm working on. 
>>>>>>
>>>>>> On Friday, June 3, 2016, Kevin Liu  wrote:
>>>>>>
>>>>>>> If Andrew Ng who cited Gates, and Gates who cited Domingos (who did 
>>>>>>> not lecture at Google with a TensorFlow question in the end), were 
>>>>>>> unsuccessful penny traders, Julia was a language for web design, and 
>>>>>>> the 
>>>>>>> tribes in the video didn't actually solve problems, perhaps this would 
>>>>>>> be a 
>>>>>>> wildly off-topic, speculative discussion. But these statements couldn't 
>>>>>>> be 
>>>>>>> farther from the truth. In fact, i

Re: [julia-users] Is the master algorithm on the roadmap?

2016-08-03 Thread Kevin Liu
The Markov logic network represents a probability distribution over the 
states of a complex system (e.g. a cell) composed of entities, where 
logic formulas encode the dependencies between them. 
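
For reference, the distribution described above is the standard Markov logic
network definition (as in Richardson and Domingos' original formulation),
where each formula i carries a weight w_i and n_i(x) counts the true
groundings of formula i in state x:

```latex
P(X = x) = \frac{1}{Z} \exp\left( \sum_i w_i \, n_i(x) \right),
\qquad
Z = \sum_{x'} \exp\left( \sum_i w_i \, n_i(x') \right)
```

A hard logical constraint corresponds to the limit w_i -> infinity, which is
why inconsistent or incorrect formulas with finite weights merely lower the
probability of worlds that violate them instead of ruling those worlds out.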

On Wednesday, August 3, 2016 at 4:19:09 PM UTC-3, Kevin Liu wrote:
>
> Alchemy is like an inductive Turing machine, to be programmed to learn 
> broadly or restrictedly.
>
> The logic formulas from rules through which it represents can be 
> inconsistent, incomplete, or even incorrect-- the learning and 
> probabilistic reasoning will correct them. The key point is that Alchemy 
> doesn't have to learn from scratch, proving Wolpert and Macready's no free 
> lunch theorem wrong by performing well on a variety of classes of problems, 
> not just some.
>
> On Wednesday, August 3, 2016 at 4:01:15 PM UTC-3, Kevin Liu wrote:
>>
>> Hello Community, 
>>
>> I'm in the last pages of Pedro Domingos' book, the Master Algo, one of 
>> two recommended by Bill Gates to learn about AI. 
>>
>> From the book, I understand all learners have to represent, evaluate, and 
>> optimize. There are many types of learners that do this. What Domingos does 
>> is generalize these three parts, (1) using Markov Logic Network to 
>> represent, (2) posterior probability to evaluate, and (3) genetic search 
>> with gradient descent to optimize. The posterior can be replaced with 
>> another accuracy measure when that is easier, just as genetic search can be 
>> replaced by hill climbing. Where there are 15 popular options for representing, 
>> evaluating, and optimizing, Domingos generalized them into three options. 
>> The idea is to have one unified learner for any application. 
>>
>> There is code already done in R https://alchemy.cs.washington.edu/. My 
>> question: is anybody in the community interested in coding it in Julia?
>>
>> Thanks. Kevin
>>
>> On Friday, June 3, 2016 at 3:44:09 PM UTC-3, Kevin Liu wrote:
>>>
>>> https://github.com/tbreloff/OnlineAI.jl/issues/5
>>>
>>> On Friday, June 3, 2016 at 11:17:28 AM UTC-3, Kevin Liu wrote:
>>>>
>>>> I plan to write Julia for the rest of my life... given it remains 
>>>> suitable. I am still reading all of Colah's material on nets. I ran 
>>>> Mocha.jl a couple weeks ago and was very happy to see it work. Thanks for 
>>>> jumping in and telling me about OnlineAI.jl, I will look into it once I am 
>>>> ready. From a quick look, perhaps I could help and learn by building a 
>>>> very 
>>>> clear documentation of it. Would really like to see Julia a leap ahead of 
>>>> other languages, and plan to contribute heavily to it, but at the moment 
>>>> am 
>>>> still getting introduced to CS, programming, and nets at the basic level. 
>>>>
>>>> On Friday, June 3, 2016 at 10:48:15 AM UTC-3, Tom Breloff wrote:
>>>>>
>>>>> Kevin: computers that program themselves is a concept which is much 
>>>>> closer to reality than most would believe, but julia-users isn't really 
>>>>> the 
>>>>> best place for this speculation. If you're actually interested in writing 
>>>>> code, I'm happy to discuss in OnlineAI.jl. I was thinking about how we 
>>>>> might tackle code generation using a neural framework I'm working on. 
>>>>>
>>>>> On Friday, June 3, 2016, Kevin Liu  wrote:
>>>>>
>>>>>> If Andrew Ng who cited Gates, and Gates who cited Domingos (who did 
>>>>>> not lecture at Google with a TensorFlow question in the end), were 
>>>>>> unsuccessful penny traders, Julia was a language for web design, and the 
>>>>>> tribes in the video didn't actually solve problems, perhaps this would 
>>>>>> be a 
>>>>>> wildly off-topic, speculative discussion. But these statements couldn't 
>>>>>> be 
>>>>>> farther from the truth. In fact, if I had known about this video some 
>>>>>> months ago I would've understood better on how to solve a problem I was 
>>>>>> working on.  
>>>>>>
>>>>>> For the founders of Julia: I understand your tribe is mainly CS. This 
>>>>>> master algorithm, as you are aware, would require collaboration with 
>>>>>> other 
>>>>>> tribes. Just citing the obvious. 
>>>>>>
>>>>>> On Friday, June 3, 2016 at 10:21:25 AM UTC-3, Kevin Liu wrote:
>>>

Re: [julia-users] Is the master algorithm on the roadmap?

2016-08-03 Thread Kevin Liu
Alchemy is like an inductive Turing machine, to be programmed to learn 
broadly or restrictedly.

The logic formulas for the rules it represents can be inconsistent, 
incomplete, or even incorrect-- the learning and probabilistic reasoning 
will correct them. The key point is that Alchemy doesn't have to learn from 
scratch, proving Wolpert and Macready's no-free-lunch theorem wrong by 
performing well on a variety of classes of problems, not just some.

On Wednesday, August 3, 2016 at 4:01:15 PM UTC-3, Kevin Liu wrote:
>
> Hello Community, 
>
> I'm in the last pages of Pedro Domingos' book, the Master Algo, one of two 
> recommended by Bill Gates to learn about AI. 
>
> From the book, I understand all learners have to represent, evaluate, and 
> optimize. There are many types of learners that do this. What Domingos does 
> is generalize these three parts, (1) using Markov Logic Network to 
> represent, (2) posterior probability to evaluate, and (3) genetic search 
> with gradient descent to optimize. The posterior can be replaced with 
> another accuracy measure when that is easier, just as genetic search can 
> be replaced by hill climbing. Whereas there are 15 popular options for 
> representing, evaluating, and optimizing, Domingos generalizes them into 
> three. 
> The idea is to have one unified learner for any application. 
>
> There is code already done in R https://alchemy.cs.washington.edu/. My 
> question: anybody in the community vested in coding it into Julia?
>
> Thanks. Kevin
>
> On Friday, June 3, 2016 at 3:44:09 PM UTC-3, Kevin Liu wrote:
>>
>> https://github.com/tbreloff/OnlineAI.jl/issues/5
>>
>> On Friday, June 3, 2016 at 11:17:28 AM UTC-3, Kevin Liu wrote:
>>>
>>> I plan to write Julia for the rest of my life... given it remains 
>>> suitable. I am still reading all of Colah's material on nets. I ran 
>>> Mocha.jl a couple weeks ago and was very happy to see it work. Thanks for 
>>> jumping in and telling me about OnlineAI.jl, I will look into it once I am 
>>> ready. From a quick look, perhaps I could help and learn by building a very 
>>> clear documentation of it. Would really like to see Julia a leap ahead of 
>>> other languages, and plan to contribute heavily to it, but at the moment am 
>>> still getting introduced to CS, programming, and nets at the basic level. 
>>>
>>> On Friday, June 3, 2016 at 10:48:15 AM UTC-3, Tom Breloff wrote:
>>>>
>>>> Kevin: computers that program themselves is a concept which is much 
>>>> closer to reality than most would believe, but julia-users isn't really 
>>>> the 
>>>> best place for this speculation. If you're actually interested in writing 
>>>> code, I'm happy to discuss in OnlineAI.jl. I was thinking about how we 
>>>> might tackle code generation using a neural framework I'm working on. 
>>>>
>>>> On Friday, June 3, 2016, Kevin Liu  wrote:
>>>>
>>>>> If Andrew Ng who cited Gates, and Gates who cited Domingos (who did 
>>>>> not lecture at Google with a TensorFlow question in the end), were 
>>>>> unsuccessful penny traders, Julia was a language for web design, and the 
>>>>> tribes in the video didn't actually solve problems, perhaps this would be 
>>>>> a 
>>>>> wildly off-topic, speculative discussion. But these statements couldn't 
>>>>> be 
>>>>> farther from the truth. In fact, if I had known about this video some 
>>>>> months ago I would've understood better on how to solve a problem I was 
>>>>> working on.  
>>>>>
>>>>> For the founders of Julia: I understand your tribe is mainly CS. This 
>>>>> master algorithm, as you are aware, would require collaboration with 
>>>>> other 
>>>>> tribes. Just citing the obvious. 
>>>>>
>>>>> On Friday, June 3, 2016 at 10:21:25 AM UTC-3, Kevin Liu wrote:
>>>>>>
>>>>>> There could be parts missing as Domingos mentions, but induction, 
>>>>>> backpropagation, genetic programming, probabilistic inference, and SVMs 
>>>>>> working together-- what's speculative about the improved versions of 
>>>>>> these? 
>>>>>>
>>>>>> Julia was made for AI. Isn't it time for a consolidated view on how 
>>>>>> to reach it? 
>>>>>>
>>>>>> On Thursday, June 2, 2016 at 11:20:35 PM UTC-3, Isaiah wrote:
>>>>>>>
>>>>>>> This is not a forum for wildly off-topic, speculative discussion.
>>>>>>>
>>>>>>> Take this to Reddit, Hacker News, etc.
>>>>>>>
>>>>>>>
>>>>>>> On Thu, Jun 2, 2016 at 10:01 PM, Kevin Liu  wrote:
>>>>>>>
>>>>>>>> I am wondering how Julia fits in with the unified tribes
>>>>>>>>
>>>>>>>> mashable.com/2016/06/01/bill-gates-ai-code-conference/#8VmBFjIiYOqJ
>>>>>>>>
>>>>>>>> https://www.youtube.com/watch?v=B8J4uefCQMc
>>>>>>>>
>>>>>>>
>>>>>>>

Re: [julia-users] Is the master algorithm on the roadmap?

2016-08-03 Thread Kevin Liu
Hello Community, 

I'm in the last pages of Pedro Domingos' book, the Master Algo, one of two 
recommended by Bill Gates to learn about AI. 

From the book, I understand all learners have to represent, evaluate, and 
optimize. There are many types of learners that do this. What Domingos does 
is generalize these three parts: (1) a Markov Logic Network to represent, 
(2) posterior probability to evaluate, and (3) genetic search with gradient 
descent to optimize. The posterior can be replaced with another accuracy 
measure when that is easier, just as genetic search can be replaced by hill 
climbing. Whereas there are 15 popular options for representing, evaluating, 
and optimizing, Domingos generalizes them into three. The idea is to have 
one unified learner for any application. 

There is code already done in R at https://alchemy.cs.washington.edu/. My 
question: is anybody in the community vested in coding it into Julia?

Thanks. Kevin
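
The represent/evaluate/optimize decomposition described above can be 
sketched in Julia. This is a hypothetical illustration of the idea only -- 
every name below is invented and none of it is Alchemy's or any package's 
actual API:

```julia
# Hedged sketch of Domingos' three-part learner decomposition.
# All names here are invented for illustration.
struct Learner
    represent::Function   # e.g. build a Markov Logic Network from data
    evaluate::Function    # e.g. posterior probability of a candidate model
    optimize::Function    # e.g. genetic search, hill climbing, etc.
end

function learn(l::Learner, data)
    model = l.represent(data)
    score(m) = l.evaluate(m, data)   # higher is better
    return l.optimize(model, score)
end

# A trivial instantiation: represent data by its mean, evaluate by negative
# squared error, "optimize" by keeping the best of three small perturbations.
toy = Learner(
    data -> sum(data) / length(data),
    (m, data) -> -sum((x - m)^2 for x in data),
    (m, score) -> last(sort([m - 0.1, m, m + 0.1], by = score)),
)

learn(toy, [1.0, 2.0, 3.0])
```

The point of the sketch is only that swapping one component (say, hill 
climbing for genetic search) leaves the other two untouched.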

On Friday, June 3, 2016 at 3:44:09 PM UTC-3, Kevin Liu wrote:
>
> https://github.com/tbreloff/OnlineAI.jl/issues/5
>
> On Friday, June 3, 2016 at 11:17:28 AM UTC-3, Kevin Liu wrote:
>>
>> I plan to write Julia for the rest of my life... given it remains 
>> suitable. I am still reading all of Colah's material on nets. I ran 
>> Mocha.jl a couple weeks ago and was very happy to see it work. Thanks for 
>> jumping in and telling me about OnlineAI.jl, I will look into it once I am 
>> ready. From a quick look, perhaps I could help and learn by building a very 
>> clear documentation of it. Would really like to see Julia a leap ahead of 
>> other languages, and plan to contribute heavily to it, but at the moment am 
>> still getting introduced to CS, programming, and nets at the basic level. 
>>
>> On Friday, June 3, 2016 at 10:48:15 AM UTC-3, Tom Breloff wrote:
>>>
>>> Kevin: computers that program themselves is a concept which is much 
>>> closer to reality than most would believe, but julia-users isn't really the 
>>> best place for this speculation. If you're actually interested in writing 
>>> code, I'm happy to discuss in OnlineAI.jl. I was thinking about how we 
>>> might tackle code generation using a neural framework I'm working on. 
>>>
>>> On Friday, June 3, 2016, Kevin Liu  wrote:
>>>
>>>> If Andrew Ng who cited Gates, and Gates who cited Domingos (who did not 
>>>> lecture at Google with a TensorFlow question in the end), were 
>>>> unsuccessful 
>>>> penny traders, Julia was a language for web design, and the tribes in the 
>>>> video didn't actually solve problems, perhaps this would be a wildly 
>>>> off-topic, speculative discussion. But these statements couldn't be 
>>>> farther 
>>>> from the truth. In fact, if I had known about this video some months ago I 
>>>> would've understood better on how to solve a problem I was working on.  
>>>>
>>>> For the founders of Julia: I understand your tribe is mainly CS. This 
>>>> master algorithm, as you are aware, would require collaboration with other 
>>>> tribes. Just citing the obvious. 
>>>>
>>>> On Friday, June 3, 2016 at 10:21:25 AM UTC-3, Kevin Liu wrote:
>>>>>
>>>>> There could be parts missing as Domingos mentions, but induction, 
>>>>> backpropagation, genetic programming, probabilistic inference, and SVMs 
>>>>> working together-- what's speculative about the improved versions of 
>>>>> these? 
>>>>>
>>>>> Julia was made for AI. Isn't it time for a consolidated view on how to 
>>>>> reach it? 
>>>>>
>>>>> On Thursday, June 2, 2016 at 11:20:35 PM UTC-3, Isaiah wrote:
>>>>>>
>>>>>> This is not a forum for wildly off-topic, speculative discussion.
>>>>>>
>>>>>> Take this to Reddit, Hacker News, etc.
>>>>>>
>>>>>>
>>>>>> On Thu, Jun 2, 2016 at 10:01 PM, Kevin Liu  wrote:
>>>>>>
>>>>>>> I am wondering how Julia fits in with the unified tribes
>>>>>>>
>>>>>>> mashable.com/2016/06/01/bill-gates-ai-code-conference/#8VmBFjIiYOqJ
>>>>>>>
>>>>>>> https://www.youtube.com/watch?v=B8J4uefCQMc
>>>>>>>
>>>>>>
>>>>>>

Re: [julia-users] Is the master algorithm on the roadmap?

2016-06-03 Thread Kevin Liu
https://github.com/tbreloff/OnlineAI.jl/issues/5

On Friday, June 3, 2016 at 11:17:28 AM UTC-3, Kevin Liu wrote:
>
> I plan to write Julia for the rest of my life... given it remains 
> suitable. I am still reading all of Colah's material on nets. I ran 
> Mocha.jl a couple weeks ago and was very happy to see it work. Thanks for 
> jumping in and telling me about OnlineAI.jl, I will look into it once I am 
> ready. From a quick look, perhaps I could help and learn by building a very 
> clear documentation of it. Would really like to see Julia a leap ahead of 
> other languages, and plan to contribute heavily to it, but at the moment am 
> still getting introduced to CS, programming, and nets at the basic level. 
>
> On Friday, June 3, 2016 at 10:48:15 AM UTC-3, Tom Breloff wrote:
>>
>> Kevin: computers that program themselves is a concept which is much 
>> closer to reality than most would believe, but julia-users isn't really the 
>> best place for this speculation. If you're actually interested in writing 
>> code, I'm happy to discuss in OnlineAI.jl. I was thinking about how we 
>> might tackle code generation using a neural framework I'm working on. 
>>
>> On Friday, June 3, 2016, Kevin Liu  wrote:
>>
>>> If Andrew Ng who cited Gates, and Gates who cited Domingos (who did not 
>>> lecture at Google with a TensorFlow question in the end), were unsuccessful 
>>> penny traders, Julia was a language for web design, and the tribes in the 
>>> video didn't actually solve problems, perhaps this would be a wildly 
>>> off-topic, speculative discussion. But these statements couldn't be farther 
>>> from the truth. In fact, if I had known about this video some months ago I 
>>> would've understood better on how to solve a problem I was working on.  
>>>
>>> For the founders of Julia: I understand your tribe is mainly CS. This 
>>> master algorithm, as you are aware, would require collaboration with other 
>>> tribes. Just citing the obvious. 
>>>
>>> On Friday, June 3, 2016 at 10:21:25 AM UTC-3, Kevin Liu wrote:
>>>>
>>>> There could be parts missing as Domingos mentions, but induction, 
>>>> backpropagation, genetic programming, probabilistic inference, and SVMs 
>>>> working together-- what's speculative about the improved versions of 
>>>> these? 
>>>>
>>>> Julia was made for AI. Isn't it time for a consolidated view on how to 
>>>> reach it? 
>>>>
>>>> On Thursday, June 2, 2016 at 11:20:35 PM UTC-3, Isaiah wrote:
>>>>>
>>>>> This is not a forum for wildly off-topic, speculative discussion.
>>>>>
>>>>> Take this to Reddit, Hacker News, etc.
>>>>>
>>>>>
>>>>> On Thu, Jun 2, 2016 at 10:01 PM, Kevin Liu  wrote:
>>>>>
>>>>>> I am wondering how Julia fits in with the unified tribes
>>>>>>
>>>>>> mashable.com/2016/06/01/bill-gates-ai-code-conference/#8VmBFjIiYOqJ
>>>>>>
>>>>>> https://www.youtube.com/watch?v=B8J4uefCQMc
>>>>>>
>>>>>
>>>>>

Re: [julia-users] Is the master algorithm on the roadmap?

2016-06-03 Thread Kevin Liu
I plan to write Julia for the rest of my life... given it remains suitable. 
I am still reading all of Colah's material on nets. I ran Mocha.jl a couple 
weeks ago and was very happy to see it work. Thanks for jumping in and 
telling me about OnlineAI.jl, I will look into it once I am ready. From a 
quick look, perhaps I could help and learn by building a very clear 
documentation of it. Would really like to see Julia a leap ahead of other 
languages, and plan to contribute heavily to it, but at the moment am still 
getting introduced to CS, programming, and nets at the basic level. 

On Friday, June 3, 2016 at 10:48:15 AM UTC-3, Tom Breloff wrote:
>
> Kevin: computers that program themselves is a concept which is much closer 
> to reality than most would believe, but julia-users isn't really the best 
> place for this speculation. If you're actually interested in writing code, 
> I'm happy to discuss in OnlineAI.jl. I was thinking about how we might 
> tackle code generation using a neural framework I'm working on. 
>
> On Friday, June 3, 2016, Kevin Liu wrote:
>
>> If Andrew Ng who cited Gates, and Gates who cited Domingos (who did not 
>> lecture at Google with a TensorFlow question in the end), were unsuccessful 
>> penny traders, Julia was a language for web design, and the tribes in the 
>> video didn't actually solve problems, perhaps this would be a wildly 
>> off-topic, speculative discussion. But these statements couldn't be farther 
>> from the truth. In fact, if I had known about this video some months ago I 
>> would've understood better on how to solve a problem I was working on.  
>>
>> For the founders of Julia: I understand your tribe is mainly CS. This 
>> master algorithm, as you are aware, would require collaboration with other 
>> tribes. Just citing the obvious. 
>>
>> On Friday, June 3, 2016 at 10:21:25 AM UTC-3, Kevin Liu wrote:
>>>
>>> There could be parts missing as Domingos mentions, but induction, 
>>> backpropagation, genetic programming, probabilistic inference, and SVMs 
>>> working together-- what's speculative about the improved versions of these? 
>>>
>>> Julia was made for AI. Isn't it time for a consolidated view on how to 
>>> reach it? 
>>>
>>> On Thursday, June 2, 2016 at 11:20:35 PM UTC-3, Isaiah wrote:
>>>>
>>>> This is not a forum for wildly off-topic, speculative discussion.
>>>>
>>>> Take this to Reddit, Hacker News, etc.
>>>>
>>>>
>>>> On Thu, Jun 2, 2016 at 10:01 PM, Kevin Liu  wrote:
>>>>
>>>>> I am wondering how Julia fits in with the unified tribes
>>>>>
>>>>> mashable.com/2016/06/01/bill-gates-ai-code-conference/#8VmBFjIiYOqJ
>>>>>
>>>>> https://www.youtube.com/watch?v=B8J4uefCQMc
>>>>>
>>>>
>>>>

Re: [julia-users] Is the master algorithm on the roadmap?

2016-06-03 Thread Kevin Liu
I plan to write Julia for the rest of my life... given it remains suitable. 
I am still reading all of Colah's material on nets. I ran Mocha.jl a couple 
weeks ago and was very happy to see it work. Thanks for jumping in and 
telling me about OnlineAI.jl, I will look into it once I am ready. From a 
quick look, perhaps I could help and learn by building a very clear 
documentation for it. Would really like to see Julia a leap ahead of other 
languages, and plan to contribute heavily to it, but at the moment am still 
getting introduced to CS, programming, and nets at the basic level. 

On Friday, June 3, 2016 at 10:48:15 AM UTC-3, Tom Breloff wrote:
>
> Kevin: computers that program themselves is a concept which is much closer 
> to reality than most would believe, but julia-users isn't really the best 
> place for this speculation. If you're actually interested in writing code, 
> I'm happy to discuss in OnlineAI.jl. I was thinking about how we might 
> tackle code generation using a neural framework I'm working on. 
>
> On Friday, June 3, 2016, Kevin Liu wrote:
>
>> If Andrew Ng who cited Gates, and Gates who cited Domingos (who did not 
>> lecture at Google with a TensorFlow question in the end), were unsuccessful 
>> penny traders, Julia was a language for web design, and the tribes in the 
>> video didn't actually solve problems, perhaps this would be a wildly 
>> off-topic, speculative discussion. But these statements couldn't be farther 
>> from the truth. In fact, if I had known about this video some months ago I 
>> would've understood better on how to solve a problem I was working on.  
>>
>> For the founders of Julia: I understand your tribe is mainly CS. This 
>> master algorithm, as you are aware, would require collaboration with other 
>> tribes. Just citing the obvious. 
>>
>> On Friday, June 3, 2016 at 10:21:25 AM UTC-3, Kevin Liu wrote:
>>>
>>> There could be parts missing as Domingos mentions, but induction, 
>>> backpropagation, genetic programming, probabilistic inference, and SVMs 
>>> working together-- what's speculative about the improved versions of these? 
>>>
>>> Julia was made for AI. Isn't it time for a consolidated view on how to 
>>> reach it? 
>>>
>>> On Thursday, June 2, 2016 at 11:20:35 PM UTC-3, Isaiah wrote:
>>>>
>>>> This is not a forum for wildly off-topic, speculative discussion.
>>>>
>>>> Take this to Reddit, Hacker News, etc.
>>>>
>>>>
>>>> On Thu, Jun 2, 2016 at 10:01 PM, Kevin Liu  wrote:
>>>>
>>>>> I am wondering how Julia fits in with the unified tribes
>>>>>
>>>>> mashable.com/2016/06/01/bill-gates-ai-code-conference/#8VmBFjIiYOqJ
>>>>>
>>>>> https://www.youtube.com/watch?v=B8J4uefCQMc
>>>>>
>>>>
>>>>

Re: [julia-users] Is the master algorithm on the roadmap?

2016-06-03 Thread Kevin Liu
Correction, for the founders: your tribe is mainly of Symbolists?

On Friday, June 3, 2016 at 10:36:01 AM UTC-3, Kevin Liu wrote:
>
> If Andrew Ng who cited Gates, and Gates who cited Domingos (who did not 
> lecture at Google with a TensorFlow question in the end), were unsuccessful 
> penny traders, Julia was a language for web design, and the tribes in the 
> video didn't actually solve problems, perhaps this would be a wildly 
> off-topic, speculative discussion. But these statements couldn't be farther 
> from the truth. In fact, if I had known about this video some months ago I 
> would've understood better on how to solve a problem I was working on.  
>
> For the founders of Julia: I understand your tribe is mainly CS. This 
> master algorithm, as you are aware, would require collaboration with other 
> tribes. Just citing the obvious. 
>
> On Friday, June 3, 2016 at 10:21:25 AM UTC-3, Kevin Liu wrote:
>>
>> There could be parts missing as Domingos mentions, but induction, 
>> backpropagation, genetic programming, probabilistic inference, and SVMs 
>> working together-- what's speculative about the improved versions of these? 
>>
>> Julia was made for AI. Isn't it time for a consolidated view on how to 
>> reach it? 
>>
>> On Thursday, June 2, 2016 at 11:20:35 PM UTC-3, Isaiah wrote:
>>>
>>> This is not a forum for wildly off-topic, speculative discussion.
>>>
>>> Take this to Reddit, Hacker News, etc.
>>>
>>>
>>> On Thu, Jun 2, 2016 at 10:01 PM, Kevin Liu  wrote:
>>>
>>>> I am wondering how Julia fits in with the unified tribes
>>>>
>>>> mashable.com/2016/06/01/bill-gates-ai-code-conference/#8VmBFjIiYOqJ
>>>>
>>>> https://www.youtube.com/watch?v=B8J4uefCQMc
>>>>
>>>
>>>

Re: [julia-users] Is the master algorithm on the roadmap?

2016-06-03 Thread Kevin Liu
If Andrew Ng who cited Gates, and Gates who cited Domingos (who did not 
lecture at Google with a TensorFlow question in the end), were unsuccessful 
penny traders, Julia was a language for web design, and the tribes in the 
video didn't actually solve problems, perhaps this would be a wildly 
off-topic, speculative discussion. But these statements couldn't be farther 
from the truth. In fact, if I had known about this video some months ago I 
would've understood better on how to solve a problem I was working on.  

For the founders of Julia: I understand your tribe is mainly CS. This 
master algorithm, as you are aware, would require collaboration with other 
tribes. Just citing the obvious. 

On Friday, June 3, 2016 at 10:21:25 AM UTC-3, Kevin Liu wrote:
>
> There could be parts missing as Domingos mentions, but induction, 
> backpropagation, genetic programming, probabilistic inference, and SVMs 
> working together-- what's speculative about the improved versions of these? 
>
> Julia was made for AI. Isn't it time for a consolidated view on how to 
> reach it? 
>
> On Thursday, June 2, 2016 at 11:20:35 PM UTC-3, Isaiah wrote:
>>
>> This is not a forum for wildly off-topic, speculative discussion.
>>
>> Take this to Reddit, Hacker News, etc.
>>
>>
>> On Thu, Jun 2, 2016 at 10:01 PM, Kevin Liu  wrote:
>>
>>> I am wondering how Julia fits in with the unified tribes
>>>
>>> mashable.com/2016/06/01/bill-gates-ai-code-conference/#8VmBFjIiYOqJ
>>>
>>> https://www.youtube.com/watch?v=B8J4uefCQMc
>>>
>>
>>

Re: [julia-users] Is the master algorithm on the roadmap?

2016-06-03 Thread Kevin Liu
There could be parts missing as Domingos mentions, but induction, 
backpropagation, genetic programming, probabilistic inference, and SVMs 
working together-- what's speculative about the improved versions of these? 

Julia was made for AI. Isn't it time for a consolidated view on how to 
reach it? 

On Thursday, June 2, 2016 at 11:20:35 PM UTC-3, Isaiah wrote:
>
> This is not a forum for wildly off-topic, speculative discussion.
>
> Take this to Reddit, Hacker News, etc.
>
>
> On Thu, Jun 2, 2016 at 10:01 PM, Kevin Liu wrote:
>
>> I am wondering how Julia fits in with the unified tribes
>>
>> mashable.com/2016/06/01/bill-gates-ai-code-conference/#8VmBFjIiYOqJ
>>
>> https://www.youtube.com/watch?v=B8J4uefCQMc
>>
>
>

[julia-users] Is the master algorithm on the roadmap?

2016-06-02 Thread Kevin Liu
I am wondering how Julia fits in with the unified tribes

mashable.com/2016/06/01/bill-gates-ai-code-conference/#8VmBFjIiYOqJ

https://www.youtube.com/watch?v=B8J4uefCQMc


Re: [julia-users] Can we make Julia develop itself?

2016-06-02 Thread Kevin Liu
Got it. Thanks.

On Thursday, June 2, 2016 at 8:24:54 PM UTC-3, Isaiah wrote:
>
> I'm going to assume that there is a terminology barrier here...
>
> My interpretation of the question is: can we directly translate programs 
> from other languages?
>
> This is called source to source translation or compilation (
> https://en.m.wikipedia.org/wiki/Source-to-source_compiler), and the 
> answer is "it's complicated".
>
> Kevin, hopefully that terminology will give you a better start on looking 
> for answers -- stackoverflow probably has better treatments than I can give 
> regarding why this is difficult.
>
> On Thursday, June 2, 2016, Kevin Liu wrote:
>
>> Is there an answer somewhere here? 
>>
>> On Thursday, June 2, 2016 at 7:54:04 PM UTC-3, Kristoffer Carlsson wrote:
>>>
>>> Someone taught you wrong.
>>
>>

Re: [julia-users] Can we make Julia develop itself?

2016-06-02 Thread Kevin Liu
Look who jumped on the wiseass train. An honest, short answer would be 
appreciated.

On Thursday, June 2, 2016 at 7:54:04 PM UTC-3, Kristoffer Carlsson wrote:
>
> Someone taught you wrong.



Re: [julia-users] Can we make Julia develop itself?

2016-06-02 Thread Kevin Liu
Is there an answer here somewhere?

On Thursday, June 2, 2016 at 7:54:04 PM UTC-3, Kristoffer Carlsson wrote:
>
> Someone taught you wrong.



Re: [julia-users] Can we make Julia develop itself?

2016-06-02 Thread Kevin Liu
Is there an answer somewhere here? 

On Thursday, June 2, 2016 at 7:54:04 PM UTC-3, Kristoffer Carlsson wrote:
>
> Someone taught you wrong.



Re: [julia-users] Can we make Julia develop itself?

2016-06-02 Thread Kevin Liu
I'm sorry. I was taught no questions were inane.

Assuming we know what the packages ready in other languages do, can't the 
missing Julia packages be done through induction?

On Thursday, June 2, 2016 at 2:45:03 PM UTC-3, Stefan Karpinski wrote:
>
> Please don't spam the list with inane questions like this.
>
> On Thu, Jun 2, 2016 at 12:06 PM, Kevin Liu wrote:
>
>> Isn't package development mechanical? 
>>
>
>

[julia-users] Can we make Julia develop itself?

2016-06-02 Thread Kevin Liu
Isn't package development mechanical? 


[julia-users] Can we make Julia make itself complete?

2016-06-02 Thread Kevin Liu
Instead of developing packages, make Julia develop on its own? Isn't 
package development mechanical? 


Re: [julia-users] Re: Coefficient of determination/R2/r-squared of model and accuracy of R2 estimate

2016-05-20 Thread Kevin Liu
If accuracy is 'the nearness of a calculation to the true value', and the 
assumption is that the relation between all variables remains linear, then I 
don't see why accuracy wouldn't be useful in GLM. 

On Friday, May 20, 2016 at 6:24:06 PM UTC-3, Kevin Liu wrote:
>
> Pkg.update() updated all packages, did the job, thanks. So suppose I have 
> a dataset called train. R2(train), r^2(train), r2(train) didn't work. 
>
> I meant predictive accuracy. Does it apply to GLM? 
>
> On Friday, May 20, 2016 at 1:26:47 PM UTC-3, Milan Bouchet-Valat wrote:
>>
>> Le vendredi 20 mai 2016 à 08:59 -0700, Kevin Liu a écrit : 
>> > I think accuracy doesn't make sense for a linear model whose purpose 
>> > isn't to predict. Do you agree?  
>> Sorry, I don't know what you mean by "accuracy". Anyway, only users 
>> know the purpose of their models. All we can do is provide the support 
>> for indicators and let them choose the appropriate ones. 
>>
>>
>> Regards 
>>
>> > > Pkg.update("GLM") 
>> > > ERROR: MethodError: `update` has no method matching 
>> > > update(::ASCIIString) 
>> > > 
>> > > 
>> > > 
>> > > > Le jeudi 19 mai 2016 à 19:08 -0700, Kevin Liu a écrit :  
>> > > > > Thanks. I might need some help if I encounter problems on this 
>> > > > pseudo  
>> > > > > version.   
>> > > > I've just tagged a new 0.5.2 release, so this shouldn't be 
>> > > > necessary  
>> > > > now (just run Pkg.update()).  
>> > > > 
>> > > > 
>> > > > Regards  
>> > > > 
>> > > > > > Le jeudi 19 mai 2016 à 09:30 -0700, Kevin Liu a écrit :   
>> > > > > > > It seems the pkg owners are still deciding   
>> > > > > > >   
>> > > > > > > Funcs to evaluate fit   
>> > > > 
>> > > > > > >https://github.com/JuliaStats/GLM.jl/issues/74   
>> > > > > > > Add fit statistics functions and document existing ones  ht 
>> > > > tps://gith   
>> > > > > > > ub.com/JuliaStats/StatsBase.jl/pull/146   
>> > > > > > > Implement fit statistics functions 
>> > > > 
>> > > > > > >   https://github.com/JuliaStats/GLM.jl/pull/115   
>> > > > > > These PRs have been merged, so we just need to tag a new 
>> > > > release. Until   
>> > > > > > then, you can use Pkg.checkout() to use the development 
>> > > > version   
>> > > > > > (function is called R² or R2).   
>> > > > > >  
>> > > > > >  
>> > > > > > Regards   
>> > > > > >  
>> > > > > > >   
>> > > > > > > > I looked in GLM.jl but couldn't find a function for 
>> > > > calculating the   
>> > > > > > > > R2 or the accuracy of the R2 estimate.   
>> > > > > > > >   
>> > > > > > > > My understanding is that both should appear with the 
>> > > > glm()   
>> > > > > > > > function. Help would be appreciated.
>> > > > > > > >   
>> > > > > > > > Kevin   
>> > > > > > > >   
>>
>

Re: [julia-users] Re: Coefficient of determination/R2/r-squared of model and accuracy of R2 estimate

2016-05-20 Thread Kevin Liu
Pkg.update() updated all packages, did the job, thanks. So suppose I have a 
dataset called train. R2(train), r^2(train), r2(train) didn't work. 

I meant predictive accuracy. Does it apply to GLM? 
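
For what it's worth, `r2` (also exported as `R²`) in GLM.jl is defined on a 
fitted model, not on a raw dataset, which would explain why calling it on 
`train` failed. A hedged sketch, using current GLM.jl conventions and 
invented column names `x` and `y`:

```julia
using DataFrames, GLM

# Hypothetical data; the columns :x and :y are invented for illustration.
train = DataFrame(x = [1.0, 2.0, 3.0, 4.0], y = [2.1, 3.9, 6.2, 8.1])

model = lm(@formula(y ~ x), train)  # fit an ordinary least squares model
r2(model)                            # coefficient of determination of the fit
```

On the Julia 0.4-era GLM of this thread, the formula was written `y ~ x` 
without the `@formula` macro.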

On Friday, May 20, 2016 at 1:26:47 PM UTC-3, Milan Bouchet-Valat wrote:
>
> Le vendredi 20 mai 2016 à 08:59 -0700, Kevin Liu a écrit : 
> > I think accuracy doesn't make sense for a linear model whose purpose 
> > isn't to predict. Do you agree?  
> Sorry, I don't know what you mean by "accuracy". Anyway, only users 
> know the purpose of their models. All we can do is provide the support 
> for indicators and let them choose the appropriate ones. 
>
>
> Regards 
>
> > > Pkg.update("GLM") 
> > > ERROR: MethodError: `update` has no method matching 
> > > update(::ASCIIString) 
> > > 
> > > 
> > > 
> > > > Le jeudi 19 mai 2016 à 19:08 -0700, Kevin Liu a écrit :  
> > > > > Thanks. I might need some help if I encounter problems on this 
> > > > pseudo  
> > > > > version.   
> > > > I've just tagged a new 0.5.2 release, so this shouldn't be 
> > > > necessary  
> > > > now (just run Pkg.update()).  
> > > > 
> > > > 
> > > > Regards  
> > > > 
> > > > > > Le jeudi 19 mai 2016 à 09:30 -0700, Kevin Liu a écrit :   
> > > > > > > It seems the pkg owners are still deciding   
> > > > > > >   
> > > > > > > Funcs to evaluate fit   
> > > > 
> > > > > > >https://github.com/JuliaStats/GLM.jl/issues/74   
> > > > > > > Add fit statistics functions and document existing ones  ht 
> > > > tps://gith   
> > > > > > > ub.com/JuliaStats/StatsBase.jl/pull/146   
> > > > > > > Implement fit statistics functions 
> > > > 
> > > > > > >   https://github.com/JuliaStats/GLM.jl/pull/115   
> > > > > > These PRs have been merged, so we just need to tag a new 
> > > > release. Until   
> > > > > > then, you can use Pkg.checkout() to use the development 
> > > > version   
> > > > > > (function is called R² or R2).   
> > > > > >  
> > > > > >  
> > > > > > Regards   
> > > > > >  
> > > > > > >   
> > > > > > > > I looked in GLM.jl but couldn't find a function for 
> > > > calculating the   
> > > > > > > > R2 or the accuracy of the R2 estimate.   
> > > > > > > >   
> > > > > > > > My understanding is that both should appear with the 
> > > > glm()   
> > > > > > > > function. Help would be appreciated.
> > > > > > > >   
> > > > > > > > Kevin   
> > > > > > > >   
>


Re: [julia-users] Re: Coefficient of determination/R2/r-squared of model and accuracy of R2 estimate

2016-05-20 Thread Kevin Liu
I think accuracy doesn't make sense for a linear model whose purpose isn't 
to predict. Do you agree? 

On Friday, May 20, 2016 at 12:37:44 PM UTC-3, Kevin Liu wrote:
>
> Pkg.update("GLM")
>
> *ERROR: MethodError: `update` has no method matching update(::ASCIIString)*
>
>
>
> On Friday, May 20, 2016 at 4:29:20 AM UTC-3, Milan Bouchet-Valat wrote:
>>
>> Le jeudi 19 mai 2016 à 19:08 -0700, Kevin Liu a écrit : 
>> > Thanks. I might need some help if I encounter problems on this pseudo 
>> > version.  
>> I've just tagged a new 0.5.2 release, so this shouldn't be necessary 
>> now (just run Pkg.update()). 
>>
>>
>> Regards 
>>
>> > > Le jeudi 19 mai 2016 à 09:30 -0700, Kevin Liu a écrit :  
>> > > > It seems the pkg owners are still deciding  
>> > > >  
>> > > > Funcs to evaluate fit   
>>
>> > > >https://github.com/JuliaStats/GLM.jl/issues/74  
>> > > > Add fit statistics functions and document existing ones  
>> https://gith  
>> > > > ub.com/JuliaStats/StatsBase.jl/pull/146  
>> > > > Implement fit statistics functions 
>>
>> > > >   https://github.com/JuliaStats/GLM.jl/pull/115  
>> > > These PRs have been merged, so we just need to tag a new release. 
>> Until  
>> > > then, you can use Pkg.checkout() to use the development version  
>> > > (function is called R² or R2).  
>> > > 
>> > > 
>> > > Regards  
>> > > 
>> > > >  
>> > > > > I looked in GLM.jl but couldn't find a function for calculating 
>> the  
>> > > > > R2 or the accuracy of the R2 estimate.  
>> > > > >  
>> > > > > My understanding is that both should appear with the glm()  
>> > > > > function. Help would be appreciated.   
>> > > > >  
>> > > > > Kevin  
>> > > > >  
>>
>

Re: [julia-users] Re: Coefficient of determination/R2/r-squared of model and accuracy of R2 estimate

2016-05-20 Thread Kevin Liu
Pkg.update("GLM")

*ERROR: MethodError: `update` has no method matching update(::ASCIIString)*
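
That MethodError is expected on the Julia 0.4-era package manager: 
`Pkg.update` took no arguments there, so it cannot be pointed at a single 
package. A sketch of the old-Pkg workflow being discussed in this thread:

```julia
# Old (Julia 0.4/0.5) Pkg API, as used on this list:
Pkg.update()           # update ALL installed packages; takes no arguments
Pkg.checkout("GLM")    # switch GLM to its development (master) branch
Pkg.free("GLM")        # later, return GLM to the latest tagged release
```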



On Friday, May 20, 2016 at 4:29:20 AM UTC-3, Milan Bouchet-Valat wrote:
>
> Le jeudi 19 mai 2016 à 19:08 -0700, Kevin Liu a écrit : 
> > Thanks. I might need some help if I encounter problems on this pseudo 
> > version.  
> I've just tagged a new 0.5.2 release, so this shouldn't be necessary 
> now (just run Pkg.update()). 
>
>
> Regards 
>
> > > Le jeudi 19 mai 2016 à 09:30 -0700, Kevin Liu a écrit :  
> > > > It seems the pkg owners are still deciding  
> > > >  
> > > > Funcs to evaluate fit   
>
> > > >https://github.com/JuliaStats/GLM.jl/issues/74  
> > > > Add fit statistics functions and document existing ones  
> https://gith  
> > > > ub.com/JuliaStats/StatsBase.jl/pull/146  
> > > > Implement fit statistics functions 
>
> > > >   https://github.com/JuliaStats/GLM.jl/pull/115  
> > > These PRs have been merged, so we just need to tag a new release. 
> Until  
> > > then, you can use Pkg.checkout() to use the development version  
> > > (function is called R² or R2).  
> > > 
> > > 
> > > Regards  
> > > 
> > > >  
> > > > > I looked in GLM.jl but couldn't find a function for calculating 
> the  
> > > > > R2 or the accuracy of the R2 estimate.  
> > > > >  
> > > > > My understanding is that both should appear with the glm()  
> > > > > function. Help would be appreciated.   
> > > > >  
> > > > > Kevin  
> > > > >  
>


Re: [julia-users] Re: Coefficient of determination/R2/r-squared of model and accuracy of R2 estimate

2016-05-19 Thread Kevin Liu
Thanks. I might need some help if I encounter problems on this pseudo 
version. 

On Thursday, May 19, 2016 at 1:54:37 PM UTC-3, Milan Bouchet-Valat wrote:
>
> Le jeudi 19 mai 2016 à 09:30 -0700, Kevin Liu a écrit : 
> > It seems the pkg owners are still deciding 
> > 
> > Funcs to evaluate fit 
> >https://github.com/JuliaStats/GLM.jl/issues/74 
> > Add fit statistics functions and document existing ones 
> > https://github.com/JuliaStats/StatsBase.jl/pull/146 
> > Implement fit statistics functions   
> >   https://github.com/JuliaStats/GLM.jl/pull/115 
> These PRs have been merged, so we just need to tag a new release. Until 
> then, you can use Pkg.checkout() to use the development version 
> (function is called R² or R2). 
>
>
> Regards 
>
> > 
> > > I looked in GLM.jl but couldn't find a function for calculating the 
> > > R2 or the accuracy of the R2 estimate. 
> > > 
> > > My understanding is that both should appear with the glm() 
> > > function. Help would be appreciated.  
> > > 
> > > Kevin 
> > > 
>
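Once the new release was tagged (or after `Pkg.checkout("GLM")`), the fit statistics mentioned above could be used roughly as follows. This is a sketch with made-up data, assuming the pre-`@formula` GLM syntax of the time:

```julia
using DataFrames, GLM

# Illustrative data, not from the thread
df = DataFrame(x = collect(1.0:10.0), y = collect(2.0:2.0:20.0))

model = lm(y ~ x, df)   # old formula syntax: response on the left of ~
r2(model)               # coefficient of determination (also spelled r²)
adjr2(model)            # adjusted R², where defined for the model
```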


[julia-users] Re: Coefficient of determination/R2/r-squared of model and accuracy of R2 estimate

2016-05-19 Thread Kevin Liu
It seems the pkg owners are still deciding

Funcs to evaluate fit   
 https://github.com/JuliaStats/GLM.jl/issues/74
Add fit statistics functions and document existing ones 
 https://github.com/JuliaStats/StatsBase.jl/pull/146
Implement fit statistics functions 
https://github.com/JuliaStats/GLM.jl/pull/115

On Thursday, May 19, 2016 at 1:15:17 PM UTC-3, Kevin Liu wrote:
>
> I looked in GLM.jl but couldn't find a function for calculating the R2 or 
> the accuracy of the R2 estimate.
>
> My understanding is that both should appear with the glm() function. Help 
> would be appreciated. 
>
> Kevin
>


[julia-users] Coefficient of determination/R2/r-squared of model and accuracy of R2 estimate

2016-05-19 Thread Kevin Liu
I looked in GLM.jl but couldn't find a function for calculating the R2 or 
the accuracy of the R2 estimate.

My understanding is that both should appear with the glm() function. Help 
would be appreciated. 

Kevin


Re: [julia-users] Using Juno/LT to run Julia Code, Error, `getindex` has no method matching getindex(::DataFrame, ::ASCIIString

2016-05-18 Thread Kevin Liu
Ok

On Wed, May 18, 2016 at 4:17 PM, Stefan Karpinski 
wrote:

> In the future, please start a new thread for a new question/topic.
>
> On Wed, May 18, 2016 at 1:29 PM, Kevin Liu  wrote:
>
>> Stefan, since you helped aar...@udel.edu I thought maybe you'd help me.
>>
>>
>> On Wednesday, May 18, 2016 at 2:26:34 PM UTC-3, Kevin Liu wrote:
>>>
>>> Thanks Tom, that worked.
>>>
>>> Stefan, I'm sorry, I got this thread from
>>> https://github.com/JuliaLang/julia/issues/13782
>>>
>>> On Wednesday, May 18, 2016 at 12:20:39 PM UTC-3, Tom Breloff wrote:
>>>>
>>>> I think you might need target on the other side of the tilde:  "target
>>>> ~ x"?
>>>>
>>>> On Wed, May 18, 2016 at 11:16 AM, Kevin Liu  wrote:
>>>>
>>>>> Hi Stefan, what's wrong with my code? These are symbols for columns.
>>>>>
>>>>>
>>>>> *OLS=glm(x016399044a+x0342faceb5+x04e7268385+x06888ceac9+x072b7e8f27+x087235d61e+x0b846350ef+x0e2ab0831c+x12eda2d982+x136c1727c3+x173b6590ae+x174825d438+x1f222e3669+x1f3058af83+x1fa099bb01+x20f1afc5c7+x253eb5ef11+x25bbf0e7e7+x2719b72c0d+x298ed82b22+x29bbd86997+x2a457d15d9+x2bc6ab42f7+x2d7fe4693a+x2e874bc151+x384bec5dd1+x3df2300fa2+x3e200bf766+x3eb53ae932+x435dec85e2+x4468394575+x49756d8e0f+x4fc17427c8+x55907cc1de+x55cf3f7627+x56371466d7+x5b862c0a8f+x5f360995ef+x60ec1426ce+x63bcf89b1d+x6516422788+x65aed7dc1f+x6db53d265a+x7734c0c22f+x7743f273c2+x779d13189e+x77b3b41efa+x7841b6a5b1+x789b5244a9+x7925993f42+x7cb7913148+x7fe6cb4c98+x8311343404+x87b982928b+x8a21502326+x8c2e088a3d+x8de0382f02+x8f5f7c556a+x96c30c7eef+x96e6f0be58+x98475257f7+x99d44111c9+x9a575e82a4+x9b6e0b36c2+a14fd026ce+a24802caa5+aa69c802b6+abca7a848f+ac826f0013+ae08d2297e+aee1e4fc85+b4112a94a6+b709f75447+b9a487ac3c+ba54a2a637+bdf934caa7+beb6e17af1+c0c3df65b1+c1b8ce2354+c58f611921+d035af6ffa+d2c775fa99+d4d6566f9c+dcfcbc2ea1+e0a0772df0+e5efa4d39a+e7ee22effb+e86a2190c1+ea0f4a32e3+ed7e658a27+ee2ac696ff+f013b60e50+f0a0febd35+f66b98dd69+fbf66c8021+fdf8628ca7+fe0318e273+fe8cdd80ba+ffd1cdcfc1~target,train,Normal(),IdentityLink())*
>>>>>
>>>>> On Tuesday, October 27, 2015 at 12:35:12 PM UTC-2, Stefan Karpinski
>>>>> wrote:
>>>>>>
>>>>>> For quite some time, the way to access columns of DataFrames is to
>>>>>> index with symbols like :ID rather than strings like "ID".
>>>>>>
>>>>>> On Sun, Oct 25, 2015 at 2:01 PM,  wrote:
>>>>>>
>>>>>>> As you can see in the screen shot below this is the chunk of code I
>>>>>>> am trying to run. Error seems to occur with the `getindex` and it seems 
>>>>>>> to
>>>>>>> be related to the line of code:
>>>>>>>
>>>>>>> for (index, idImage) in enumerate(labelsInfoTrain["ID"], as both are
>>>>>>> highlighted in pink,
>>>>>>>
>>>>>>> Very confused how to fix, look forward to the help
>>>>>>>
>>>>>>>
>>>>>>> <https://lh3.googleusercontent.com/-vR_H0qTvRCo/Vi0Y2lBJ5QI/AAU/gEn4eNmZKQ4/s1600/Screenshot%2B%252830%2529.png>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>
>
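Tom's fix can be sketched as below: the response variable belongs on the left of the tilde, while the original call had the predictors there. Only two of the many predictor columns are shown, and the pre-`@formula` GLM syntax of the time is assumed:

```julia
using DataFrames, GLM

# Wrong: x016399044a + ... + ffd1cdcfc1 ~ target  (what the original call did)
# Right: the response goes on the LEFT of the ~
OLS = glm(target ~ x016399044a + x0342faceb5,  # ... remaining columns elided
          train, Normal(), IdentityLink())
```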


Re: [julia-users] Using Juno/LT to run Julia Code, Error, `getindex` has no method matching getindex(::DataFrame, ::ASCIIString

2016-05-18 Thread Kevin Liu
Stefan, since you helped aar...@udel.edu I thought maybe you'd help me. 

On Wednesday, May 18, 2016 at 2:26:34 PM UTC-3, Kevin Liu wrote:
>
> Thanks Tom, that worked. 
>
> Stefan, I'm sorry, I got this thread from 
> https://github.com/JuliaLang/julia/issues/13782
>
> On Wednesday, May 18, 2016 at 12:20:39 PM UTC-3, Tom Breloff wrote:
>>
>> I think you might need target on the other side of the tilde:  "target ~ 
>> x...."?
>>
>> On Wed, May 18, 2016 at 11:16 AM, Kevin Liu  wrote:
>>
>>> Hi Stefan, what's wrong with my code? These are symbols for columns.
>>>
>>>
>>> *OLS=glm(x016399044a+x0342faceb5+x04e7268385+x06888ceac9+x072b7e8f27+x087235d61e+x0b846350ef+x0e2ab0831c+x12eda2d982+x136c1727c3+x173b6590ae+x174825d438+x1f222e3669+x1f3058af83+x1fa099bb01+x20f1afc5c7+x253eb5ef11+x25bbf0e7e7+x2719b72c0d+x298ed82b22+x29bbd86997+x2a457d15d9+x2bc6ab42f7+x2d7fe4693a+x2e874bc151+x384bec5dd1+x3df2300fa2+x3e200bf766+x3eb53ae932+x435dec85e2+x4468394575+x49756d8e0f+x4fc17427c8+x55907cc1de+x55cf3f7627+x56371466d7+x5b862c0a8f+x5f360995ef+x60ec1426ce+x63bcf89b1d+x6516422788+x65aed7dc1f+x6db53d265a+x7734c0c22f+x7743f273c2+x779d13189e+x77b3b41efa+x7841b6a5b1+x789b5244a9+x7925993f42+x7cb7913148+x7fe6cb4c98+x8311343404+x87b982928b+x8a21502326+x8c2e088a3d+x8de0382f02+x8f5f7c556a+x96c30c7eef+x96e6f0be58+x98475257f7+x99d44111c9+x9a575e82a4+x9b6e0b36c2+a14fd026ce+a24802caa5+aa69c802b6+abca7a848f+ac826f0013+ae08d2297e+aee1e4fc85+b4112a94a6+b709f75447+b9a487ac3c+ba54a2a637+bdf934caa7+beb6e17af1+c0c3df65b1+c1b8ce2354+c58f611921+d035af6ffa+d2c775fa99+d4d6566f9c+dcfcbc2ea1+e0a0772df0+e5efa4d39a+e7ee22effb+e86a2190c1+ea0f4a32e3+ed7e658a27+ee2ac696ff+f013b60e50+f0a0febd35+f66b98dd69+fbf66c8021+fdf8628ca7+fe0318e273+fe8cdd80ba+ffd1cdcfc1~target,train,Normal(),IdentityLink())*
>>>
>>> On Tuesday, October 27, 2015 at 12:35:12 PM UTC-2, Stefan Karpinski 
>>> wrote:
>>>>
>>>> For quite some time, the way to access columns of DataFrames is to 
>>>> index with symbols like :ID rather than strings like "ID".
>>>>
>>>> On Sun, Oct 25, 2015 at 2:01 PM,  wrote:
>>>>
>>>>> As you can see in the screen shot below this is the chunk of code I am 
>>>>> trying to run. Error seems to occur with the `getindex` and it seems to 
>>>>> be 
>>>>> related to the line of code:
>>>>>
>>>>> for (index, idImage) in enumerate(labelsInfoTrain["ID"], as both are 
>>>>> highlighted in pink,
>>>>>
>>>>> Very confused how to fix, look forward to the help
>>>>>
>>>>>
>>>>> <https://lh3.googleusercontent.com/-vR_H0qTvRCo/Vi0Y2lBJ5QI/AAU/gEn4eNmZKQ4/s1600/Screenshot%2B%252830%2529.png>
>>>>>
>>>>>
>>>>>
>>>>
>>

Re: [julia-users] Using Juno/LT to run Julia Code, Error, `getindex` has no method matching getindex(::DataFrame, ::ASCIIString

2016-05-18 Thread Kevin Liu
Thanks Tom, that worked. 

Stefan, I'm sorry, I got this thread 
from https://github.com/JuliaLang/julia/issues/13782

On Wednesday, May 18, 2016 at 12:20:39 PM UTC-3, Tom Breloff wrote:
>
> I think you might need target on the other side of the tilde:  "target ~ 
> x"?
>
> On Wed, May 18, 2016 at 11:16 AM, Kevin Liu  > wrote:
>
>> Hi Stefan, what's wrong with my code? These are symbols for columns.
>>
>>
>> *OLS=glm(x016399044a+x0342faceb5+x04e7268385+x06888ceac9+x072b7e8f27+x087235d61e+x0b846350ef+x0e2ab0831c+x12eda2d982+x136c1727c3+x173b6590ae+x174825d438+x1f222e3669+x1f3058af83+x1fa099bb01+x20f1afc5c7+x253eb5ef11+x25bbf0e7e7+x2719b72c0d+x298ed82b22+x29bbd86997+x2a457d15d9+x2bc6ab42f7+x2d7fe4693a+x2e874bc151+x384bec5dd1+x3df2300fa2+x3e200bf766+x3eb53ae932+x435dec85e2+x4468394575+x49756d8e0f+x4fc17427c8+x55907cc1de+x55cf3f7627+x56371466d7+x5b862c0a8f+x5f360995ef+x60ec1426ce+x63bcf89b1d+x6516422788+x65aed7dc1f+x6db53d265a+x7734c0c22f+x7743f273c2+x779d13189e+x77b3b41efa+x7841b6a5b1+x789b5244a9+x7925993f42+x7cb7913148+x7fe6cb4c98+x8311343404+x87b982928b+x8a21502326+x8c2e088a3d+x8de0382f02+x8f5f7c556a+x96c30c7eef+x96e6f0be58+x98475257f7+x99d44111c9+x9a575e82a4+x9b6e0b36c2+a14fd026ce+a24802caa5+aa69c802b6+abca7a848f+ac826f0013+ae08d2297e+aee1e4fc85+b4112a94a6+b709f75447+b9a487ac3c+ba54a2a637+bdf934caa7+beb6e17af1+c0c3df65b1+c1b8ce2354+c58f611921+d035af6ffa+d2c775fa99+d4d6566f9c+dcfcbc2ea1+e0a0772df0+e5efa4d39a+e7ee22effb+e86a2190c1+ea0f4a32e3+ed7e658a27+ee2ac696ff+f013b60e50+f0a0febd35+f66b98dd69+fbf66c8021+fdf8628ca7+fe0318e273+fe8cdd80ba+ffd1cdcfc1~target,train,Normal(),IdentityLink())*
>>
>> On Tuesday, October 27, 2015 at 12:35:12 PM UTC-2, Stefan Karpinski wrote:
>>>
>>> For quite some time, the way to access columns of DataFrames is to index 
>>> with symbols like :ID rather than strings like "ID".
>>>
>>> On Sun, Oct 25, 2015 at 2:01 PM,  wrote:
>>>
>>>> As you can see in the screen shot below this is the chunk of code I am 
>>>> trying to run. Error seems to occur with the `getindex` and it seems to be 
>>>> related to the line of code:
>>>>
>>>> for (index, idImage) in enumerate(labelsInfoTrain["ID"], as both are 
>>>> highlighted in pink,
>>>>
>>>> Very confused how to fix, look forward to the help
>>>>
>>>>
>>>> <https://lh3.googleusercontent.com/-vR_H0qTvRCo/Vi0Y2lBJ5QI/AAU/gEn4eNmZKQ4/s1600/Screenshot%2B%252830%2529.png>
>>>>
>>>>
>>>>
>>>
>

Re: [julia-users] Using Juno/LT to run Julia Code, Error, `getindex` has no method matching getindex(::DataFrame, ::ASCIIString

2016-05-18 Thread Kevin Liu
the dataset https://github.com/kzahedi/RNN.jl/issues/1

On Wednesday, May 18, 2016 at 12:16:49 PM UTC-3, Kevin Liu wrote:
>
> Hi Stefan, what's wrong with my code? These are symbols for columns.
>
>
> *OLS=glm(x016399044a+x0342faceb5+x04e7268385+x06888ceac9+x072b7e8f27+x087235d61e+x0b846350ef+x0e2ab0831c+x12eda2d982+x136c1727c3+x173b6590ae+x174825d438+x1f222e3669+x1f3058af83+x1fa099bb01+x20f1afc5c7+x253eb5ef11+x25bbf0e7e7+x2719b72c0d+x298ed82b22+x29bbd86997+x2a457d15d9+x2bc6ab42f7+x2d7fe4693a+x2e874bc151+x384bec5dd1+x3df2300fa2+x3e200bf766+x3eb53ae932+x435dec85e2+x4468394575+x49756d8e0f+x4fc17427c8+x55907cc1de+x55cf3f7627+x56371466d7+x5b862c0a8f+x5f360995ef+x60ec1426ce+x63bcf89b1d+x6516422788+x65aed7dc1f+x6db53d265a+x7734c0c22f+x7743f273c2+x779d13189e+x77b3b41efa+x7841b6a5b1+x789b5244a9+x7925993f42+x7cb7913148+x7fe6cb4c98+x8311343404+x87b982928b+x8a21502326+x8c2e088a3d+x8de0382f02+x8f5f7c556a+x96c30c7eef+x96e6f0be58+x98475257f7+x99d44111c9+x9a575e82a4+x9b6e0b36c2+a14fd026ce+a24802caa5+aa69c802b6+abca7a848f+ac826f0013+ae08d2297e+aee1e4fc85+b4112a94a6+b709f75447+b9a487ac3c+ba54a2a637+bdf934caa7+beb6e17af1+c0c3df65b1+c1b8ce2354+c58f611921+d035af6ffa+d2c775fa99+d4d6566f9c+dcfcbc2ea1+e0a0772df0+e5efa4d39a+e7ee22effb+e86a2190c1+ea0f4a32e3+ed7e658a27+ee2ac696ff+f013b60e50+f0a0febd35+f66b98dd69+fbf66c8021+fdf8628ca7+fe0318e273+fe8cdd80ba+ffd1cdcfc1~target,train,Normal(),IdentityLink())*
>
> On Tuesday, October 27, 2015 at 12:35:12 PM UTC-2, Stefan Karpinski wrote:
>>
>> For quite some time, the way to access columns of DataFrames is to index 
>> with symbols like :ID rather than strings like "ID".
>>
>> On Sun, Oct 25, 2015 at 2:01 PM,  wrote:
>>
>>> As you can see in the screen shot below this is the chunk of code I am 
>>> trying to run. Error seems to occur with the `getindex` and it seems to be 
>>> related to the line of code:
>>>
>>> for (index, idImage) in enumerate(labelsInfoTrain["ID"], as both are 
>>> highlighted in pink,
>>>
>>> Very confused how to fix, look forward to the help
>>>
>>>
>>> <https://lh3.googleusercontent.com/-vR_H0qTvRCo/Vi0Y2lBJ5QI/AAU/gEn4eNmZKQ4/s1600/Screenshot%2B%252830%2529.png>
>>>
>>>
>>>
>>

Re: [julia-users] Using Juno/LT to run Julia Code, Error, `getindex` has no method matching getindex(::DataFrame, ::ASCIIString

2016-05-18 Thread Kevin Liu
Hi Stefan, what's wrong with my code? These are symbols for columns.

*OLS=glm(x016399044a+x0342faceb5+x04e7268385+x06888ceac9+x072b7e8f27+x087235d61e+x0b846350ef+x0e2ab0831c+x12eda2d982+x136c1727c3+x173b6590ae+x174825d438+x1f222e3669+x1f3058af83+x1fa099bb01+x20f1afc5c7+x253eb5ef11+x25bbf0e7e7+x2719b72c0d+x298ed82b22+x29bbd86997+x2a457d15d9+x2bc6ab42f7+x2d7fe4693a+x2e874bc151+x384bec5dd1+x3df2300fa2+x3e200bf766+x3eb53ae932+x435dec85e2+x4468394575+x49756d8e0f+x4fc17427c8+x55907cc1de+x55cf3f7627+x56371466d7+x5b862c0a8f+x5f360995ef+x60ec1426ce+x63bcf89b1d+x6516422788+x65aed7dc1f+x6db53d265a+x7734c0c22f+x7743f273c2+x779d13189e+x77b3b41efa+x7841b6a5b1+x789b5244a9+x7925993f42+x7cb7913148+x7fe6cb4c98+x8311343404+x87b982928b+x8a21502326+x8c2e088a3d+x8de0382f02+x8f5f7c556a+x96c30c7eef+x96e6f0be58+x98475257f7+x99d44111c9+x9a575e82a4+x9b6e0b36c2+a14fd026ce+a24802caa5+aa69c802b6+abca7a848f+ac826f0013+ae08d2297e+aee1e4fc85+b4112a94a6+b709f75447+b9a487ac3c+ba54a2a637+bdf934caa7+beb6e17af1+c0c3df65b1+c1b8ce2354+c58f611921+d035af6ffa+d2c775fa99+d4d6566f9c+dcfcbc2ea1+e0a0772df0+e5efa4d39a+e7ee22effb+e86a2190c1+ea0f4a32e3+ed7e658a27+ee2ac696ff+f013b60e50+f0a0febd35+f66b98dd69+fbf66c8021+fdf8628ca7+fe0318e273+fe8cdd80ba+ffd1cdcfc1~target,train,Normal(),IdentityLink())*

On Tuesday, October 27, 2015 at 12:35:12 PM UTC-2, Stefan Karpinski wrote:
>
> For quite some time, the way to access columns of DataFrames is to index 
> with symbols like :ID rather than strings like "ID".
>
> On Sun, Oct 25, 2015 at 2:01 PM, > wrote:
>
>> As you can see in the screen shot below this is the chunk of code I am 
>> trying to run. Error seems to occur with the `getindex` and it seems to be 
>> related to the line of code:
>>
>> for (index, idImage) in enumerate(labelsInfoTrain["ID"], as both are 
>> highlighted in pink,
>>
>> Very confused how to fix, look forward to the help
>>
>>
>> 
>>
>>
>>
>
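Stefan's point about symbol indexing, sketched with illustrative data (the real `labelsInfoTrain` comes from the linked dataset):

```julia
using DataFrames

labelsInfoTrain = DataFrame(ID = [101, 102, 103])  # illustrative data

# labelsInfoTrain["ID"] raised the MethodError above; at the time,
# DataFrame columns were accessed with a symbol instead:
for (index, idImage) in enumerate(labelsInfoTrain[:ID])
    println(index, " => ", idImage)
end
```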

[julia-users] Re: Venturebeat 11/2015: Open-source DL Frameworks? Take your pick

2016-05-03 Thread Kevin Liu
The right link http://venturebeat.com/2015/11/14/deep-learning-frameworks/

On Tuesday, May 3, 2016 at 6:44:13 PM UTC-3, Kevin Liu wrote:
>
>
> http://venturebeat.com/2016/05/03/microsoft-launches-new-website-for-developer-documentation-with-user-friendly-urls-estimated-reading-time/
>
> Would like to see Julia here!
>
> Kevin
>


[julia-users] Re: Venturebeat 11/2015: Open-source DL Frameworks? Take your pick

2016-05-03 Thread Kevin Liu
Google moves to TF 
http://googleresearch.blogspot.com.br/2016/04/deepmind-moves-to-tensorflow.html

Would like for Julia to be the World's research-to-mass-scale engine. 

On Tuesday, May 3, 2016 at 6:44:13 PM UTC-3, Kevin Liu wrote:
>
>
> http://venturebeat.com/2016/05/03/microsoft-launches-new-website-for-developer-documentation-with-user-friendly-urls-estimated-reading-time/
>
> Would like to see Julia here!
>
> Kevin
>


[julia-users] Venturebeat 11/2015: Open-source DL Frameworks? Take your pick

2016-05-03 Thread Kevin Liu
http://venturebeat.com/2016/05/03/microsoft-launches-new-website-for-developer-documentation-with-user-friendly-urls-estimated-reading-time/

Would like to see Julia here!

Kevin


[julia-users] Will Julia adapt to new models of computation?

2016-03-21 Thread Kevin Liu
http://www.cs.cmu.edu/~lblum/PAPERS/TuringMeetsNewton.pdf

Thanks. Kevin


Re: [julia-users] Re: MongoDB and Julia

2015-09-02 Thread Kevin Liu
Hi Ferenc. Thanks for posting this. I won't be able to get back to you until 
later. Hope you get help from the community, which is great. Take care.



> On Sep 1, 2015, at 13:14, Ferenc Szalma  wrote:
> 
> Kevin,
> 
> I also managed to get Pzion's Mongo.jl to work in Julia v0.3. Now, I am 
> trying to make it work in v0.4 but getting an error message while trying to 
> insert:
> 
> oid = insert(collection, {"name"=>"time series"})
> 
> 
> 
> WARNING: deprecated syntax "{a=>b, ...}" at In[36]:1. Use 
> "Dict{Any,Any}(a=>b, ...)" instead. 
> 
> LoadError: MethodError: `convert` has no method matching 
> convert(::Type{Ptr{Void}}, ::Array{UInt8,1})
> This may have arisen from a call to the constructor Ptr{Void}(...),
> since type constructors fall back to convert methods.
> Closest candidates are:
>  call{T}(::Type{T}, ::Any)
>  convert{T}(::Type{Ptr{T}}, !Matched::UInt64)
>  convert{T}(::Type{Ptr{T}}, !Matched::Int64)
>  ...
> while loading In[36], in expression starting on line 1
>  
> 
>  in insert at /Users/szalmaf/.julia/v0.4/Mongo/src/MongoCollection.jl:42
> 
> 
> Did you try Mongo.jl in Julia v0.4? Do you have any suggestions as to how to 
> go about getting rid of the LoadError above? It seems like a generic problem 
> when switching from v0.3 to v0.4.
> 
> Cheers
> 
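The deprecation warning in the output already names the fix: Julia 0.4 dropped the `{...}` literal in favor of an explicit `Dict`. A sketch of the corrected call (the subsequent `Ptr{Void}` convert error is a separate v0.3-to-v0.4 issue inside the package's ccalls):

```julia
# Julia 0.4: the {"a"=>b, ...} literal is gone; construct a Dict explicitly.
oid = insert(collection, Dict("name" => "time series"))
```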


Re: [julia-users] Re: MongoDB and Julia

2015-08-05 Thread Kevin Liu
Hi Kevin! Thank you immensely for the advice. I read about libclang's 
limitations 
<http://eli.thegreenplace.net/2011/07/03/parsing-c-in-python-with-clang/> and 
I decided I will work on CMongo.jl and make it right and complete. Will 
keep the community posted. Cheers 

On Saturday, August 1, 2015 at 10:00:07 AM UTC-3, Kevin Squire wrote:
>
> Hi, Kevin,
>
> Great that you got Mongo.jl to work!  My suggestion would be that you play 
> with that package a little, and try extending it by adding a method or 
> two, fixing warnings, etc. 
>
> Regarding modeling language, I wouldn't recommend it here. It's great that 
> you want a complete package, and it's fine if you want to learn some 
> modeling language. But my opinion is that modeling languages like UML are 
> targeted at projects in OO languages with much more overhead than Julia 
> (e.g., Java or C++) or projects which need a strong specification (e.g., 
> government projects), and where that spec needs to be shared among people 
> who may not need, want, or be able to talk with one another.  I don't have 
> the impression that this project fits any of those requirements, and most 
> people reading this list won't be familiar with them.  (If you're doing it 
> just to learn that tool, fine.)
>
> That doesn't mean you shouldn't be organized--it really means you 
> shouldn't worry about designing and implementing the project all at once. 
>
> I recommend incremental implementation. That either means 1. continue 
> working with the Mongo.jl code, adding functionality (and 
> tests) incrementally, or 2. wrap the library using Clang.jl, and add tests 
> for each function incrementally. 
>
> Anyway, just my opinion. 
>
> Cheers,
>Kevin 
>
> On Wed, Jul 29, 2015 at 6:51 PM, Kevin Liu  > wrote:
>
>> Hey Kevin and Community, I got Pzion's Mongo.jl to work (see picture). I 
>> had to Pkg.add("Mongo") manually for it to enter the required packages list 
>> in Pkg.status(). 
>>
>> Question now: i plan to sketch the technical specifications of the stack 
>> design on https://en.wikipedia.org/wiki/Unified_Modeling_Language for 
>> other developers to use. Is there any other gp modeling language you or the 
>> Julia users community would recommend instead? 
>>
>> Thanks!
>>
>>
>> On Wednesday, July 29, 2015 at 8:23:56 PM UTC-3, Kevin Squire wrote:
>>>
>>> Good luck!
>>>
>>> On Thu, Jul 30, 2015 at 1:01 AM, Kevin Liu  wrote:
>>>
>>>> Haha this is my first major project, period! Thanks a lot for putting 
>>>> in the time and effort into guiding. I come from the finance world but 
>>>> became interested in Julia and MongoDB for what they can do with science. 
>>>> It's a hand into understanding so much. Focused on Mongo.jl and will 
>>>> digest 
>>>> your comments after that. I got ahead of myself from Mongo.jl after seeing 
>>>> there was so much more to be done to make the stack fully functioning. 
>>>> Thank you! Will keep you and the community posted. Cheers mate!
>>>>
>>>> On Wed, Jul 29, 2015 at 6:50 PM, Kevin Squire  
>>>> wrote:
>>>>
>>>>> Hi Kevin,
>>>>>
>>>>> I'm sorry to hear (and see) that you haven't gotten much help on the 
>>>>> Julia mailing list.  It's probably just that your request hasn't fallen 
>>>>> on 
>>>>> the ears of anyone who has interest in both Mongo and Julia and the time 
>>>>> to 
>>>>> help.  I'm almost in that category, in that I have a small interest in 
>>>>> Mongo (and a lot of interest in Julia), but only so much time.
>>>>>
>>>>> I will say that once you get something working, the mailing list will 
>>>>> usually be a good resource, but it works best if you have specific 
>>>>> questions or problems (e.g., I'm trying to do this with the following 
>>>>> code, 
>>>>> but it's not working--what am I doing wrong) vs. general requests (e.g., 
>>>>> asking for help wrapping mongo with a mostly empty repo).
>>>>>
>>>>> I'm assuming this is your first major Julia project?  If so, my first 
>>>>> suggestion is to ignore everything written below (for now), and try to 
>>>>> get 
>>>>> the Mongo.jl library running on a modern Julia first, and make sure you 
>>>>> understand everything that it's doing.  Only after that would I try 
>>>>> anything below.

Re: [julia-users] Re: MongoDB and Julia

2015-07-29 Thread Kevin Liu
Thanks! Will need it!

On Wed, Jul 29, 2015 at 8:23 PM, Kevin Squire 
wrote:

> Good luck!
>
> On Thu, Jul 30, 2015 at 1:01 AM, Kevin Liu  wrote:
>
>> Haha this is my first major project, period! Thanks a lot for putting in
>> the time and effort into guiding. I come from the finance world but became
>> interested in Julia and MongoDB for what they can do with science. It's a
>> hand into understanding so much. Focused on Mongo.jl and will digest your
>> comments after that. I got ahead of myself from Mongo.jl after seeing there
>> was so much more to be done to make the stack fully functioning. Thank you!
>> Will keep you and the community posted. Cheers mate!
>>
>> On Wed, Jul 29, 2015 at 6:50 PM, Kevin Squire 
>> wrote:
>>
>>> Hi Kevin,
>>>
>>> I'm sorry to hear (and see) that you haven't gotten much help on the
>>> Julia mailing list.  It's probably just that your request hasn't fallen on
>>> the ears of anyone who has interest in both Mongo and Julia and the time to
>>> help.  I'm almost in that category, in that I have a small interest in
>>> Mongo (and a lot of interest in Julia), but only so much time.
>>>
>>> I will say that once you get something working, the mailing list will
>>> usually be a good resource, but it works best if you have specific
>>> questions or problems (e.g., I'm trying to do this with the following code,
>>> but it's not working--what am I doing wrong) vs. general requests (e.g.,
>>> asking for help wrapping mongo with a mostly empty repo).
>>>
>>> I'm assuming this is your first major Julia project?  If so, my first
>>> suggestion is to ignore everything written below (for now), and try to get
>>> the Mongo.jl library running on a modern Julia first, and make sure you
>>> understand everything that it's doing.  Only after that would I try
>>> anything below.
>>>
>>> 
>>>
>>> In truth, I don't know the best way forward, but I can give you a little
>>> more information, and maybe it will help you decide.
>>>
>>> Clang.jl is a general framework for wrapping C libraries.  It's never
>>> necessary--any C library can be called directly using ccall.  But ccalls
>>> can be kind of clunky, so Clang.jl provides an API which very closely
>>> mimics the C API provided in some header file: functions have very similar
>>> signatures, and structs become julia types.  For a large library with lots
>>> of functions, or complicated structs with lots of members, this can make
>>> wrapping relatively easy.
>>>
>>> While the output of Clang.jl is usually reasonably nice, it's still a
>>> very low level of abstraction.  Actually, very little abstraction, because
>>> it matches the C library, which is usually very low level, and still
>>> somewhat inconvenient in Julia (but more convenient than ccalls).  To be
>>> useful, you'll often want to add a higher level API on top of that, which
>>> adds functions and/or types that encompass or simplify the lower level
>>> calls.
>>>
>>> It's also often the case that Clang.jl doesn't give you exactly what you
>>> need--e.g., it doesn't know how to wrap certain things, such as unions and
>>> C macros.  In those cases, you'll have to edit the output by hand, or spend
>>> some time programmatically filtering/modifying the results.  I do this a
>>> bit in the wrapper for VideoIO
>>> <https://github.com/kmsquire/VideoIO.jl/blob/master/util/wrap_libav_split.jl>
>>>  (and
>>> I still have to edit some files by hand at the end).
>>>
>>> The main alternative is to wrap a subset of useful functions by hand (or
>>> simply use ccall directly, which amounts to pretty much the same thing).
>>> This is probably closer to what the Mongo folks had in mind when they
>>> directed you to the Lua driver.  If you only need access to a few
>>> functions, or if your code is highly specialized in a way that Clang.jl has
>>> trouble with, wrapping by hand can be the way to go.  For this, you're
>>> basically writing Julia functions which ccall out to external library
>>> functions (such as those in mongo), and returns the result (or some
>>> modification thereof that matches what would normally be done in Julia).
>>>
>>> Hopefully this was useful.  Please feel free to post back here with
>>> questions, and I (and maybe others, if the question is right) will try to 
>>> answer as we have time.
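The hand-wrapping alternative Kevin Squire describes boils down to Julia functions built around `ccall`. A rough sketch under assumed names; the symbol, library name, and error handling here are illustrative, not the actual mongo-c binding:

```julia
# Hand-wrapped ccall in the style described above. The symbol
# :mongoc_client_new and the library name are assumptions for illustration.
function client_new(uri::AbstractString)
    handle = ccall((:mongoc_client_new, "libmongoc-1.0"),
                   Ptr{Void}, (Ptr{UInt8},), uri)
    handle == C_NULL && error("mongoc_client_new failed for $uri")
    return handle
end
```

Clang.jl automates generating many such signatures from a C header; writing them by hand, as here, suits a small subset of functions.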

Re: [julia-users] Re: MongoDB and Julia

2015-07-29 Thread Kevin Liu
I'm also sharing the stack design in the hopes of speeding the process. 
Comments are welcome. 

https://docs.google.com/spreadsheets/d/1rgqtCay8HhnVuYR4UCZi9IGa0bzjbc3VupPxRUT9DS0/edit?usp=sharing

On Wednesday, July 29, 2015 at 8:01:11 PM UTC-3, Kevin Liu wrote:
>
> Haha this is my first major project, period! Thanks a lot for putting in 
> the time and effort into guiding. I come from the finance world but became 
> interested in Julia and MongoDB for what they can do with science. It's a 
> hand into understanding so much. Focused on Mongo.jl and will digest your 
> comments after that. I got ahead of myself from Mongo.jl after seeing there 
> was so much more to be done to make the stack fully functioning. Thank you! 
> Will keep you and the community posted. Cheers mate!
>
> On Wed, Jul 29, 2015 at 6:50 PM, Kevin Squire  
> wrote:
>
>> Hi Kevin,
>>
>> I'm sorry to hear (and see) that you haven't gotten much help on the 
>> Julia mailing list.  It's probably just that your request hasn't fallen on 
>> the ears of anyone who has interest in both Mongo and Julia and the time to 
>> help.  I'm almost in that category, in that I have a small interest in 
>> Mongo (and a lot of interest in Julia), but only so much time.
>>
>> I will say that once you get something working, the mailing list will 
>> usually be a good resource, but it works best if you have specific 
>> questions or problems (e.g., I'm trying to do this with the following code, 
>> but it's not working--what am I doing wrong) vs. general requests (e.g., 
>> asking for help wrapping mongo with a mostly empty repo).
>>
>> I'm assuming this is your first major Julia project?  If so, my first 
>> suggestion is to ignore everything written below (for now), and try to get 
>> the Mongo.jl library running on a modern Julia first, and make sure you 
>> understand everything that it's doing.  Only after that would I try 
>> anything below.
>>
>> 
>>
>> In truth, I don't know the best way forward, but I can give you a little 
>> more information, and maybe it will help you decide. 
>>
>> Clang.jl is a general framework for wrapping C libraries.  It's never 
>> necessary--any C library can be called directly using ccall.  But ccalls 
>> can be kind of clunky, so Clang.jl provides an API which very closely 
>> mimics the C API provided in some header file: functions have very similar 
>> signatures, and structs become julia types.  For a large library with lots 
>> of functions, or complicated structs with lots of members, this can make 
>> wrapping relatively easy.
>>
>> While the output of Clang.jl is usually reasonably nice, it's still a 
>> very low level of abstraction.  Actually, very little abstraction, because 
>> it matches the C library, which is usually very low level, and still 
>> somewhat inconvenient in Julia (but more convenient than ccalls).  To be 
>> useful, you'll often want to add a higher level API on top of that, which 
>> adds functions and/or types that encompass or simplify the lower level 
>> calls.
>>
>> It's also often the case that Clang.jl doesn't give you exactly what you 
>> need--e.g., it doesn't know how to wrap certain things, such as unions and 
>> C macros.  In those cases, you'll have to edit the output by hand, or spend 
>> some time programmatically filtering/modifying the results.  I do this a 
>> bit in the wrapper for VideoIO 
>> <https://github.com/kmsquire/VideoIO.jl/blob/master/util/wrap_libav_split.jl>
>>  (and 
>> I still have to edit some files by hand at the end).
>>
>> The main alternative is to wrap a subset of useful functions by hand (or 
>> simply use ccall directly, which amounts to pretty much the same thing).  
>> This is probably closer to what the Mongo folks had in mind when they 
>> directed you to the Lua driver.  If you only need access to a few 
>> functions, or if your code is highly specialized in a way that Clang.jl has 
>> trouble with, wrapping by hand can be the way to go.  For this, you're 
>> basically writing Julia functions which ccall out to external library 
>> functions (such as those in mongo), and returns the result (or some 
>> modification thereof that matches what would normally be done in Julia).
>>
>> Hopefully this was useful.  Please feel free to post back here with 
>> questions, and I (and maybe others, if the question is right) will try to 
>> answer as we have time.
>>
>> Cheers!
>>Kevin
>>
>> On Wed, Jul 29, 201

Re: [julia-users] Re: MongoDB and Julia

2015-07-29 Thread Kevin Liu
Haha this is my first major project, period! Thanks a lot for putting in
the time and effort into guiding. I come from the finance world but became
interested in Julia and MongoDB for what they can do with science. It's a
hand into understanding so much. Focused on Mongo.jl and will digest your
comments after that. I got ahead of myself from Mongo.jl after seeing there
was so much more to be done to make the stack fully functioning. Thank you!
Will keep you and the community posted. Cheers mate!

On Wed, Jul 29, 2015 at 6:50 PM, Kevin Squire 
wrote:

> Hi Kevin,
>
> I'm sorry to hear (and see) that you haven't gotten much help on the Julia
> mailing list.  It's probably just that your request hasn't fallen on the
> ears of anyone who has interest in both Mongo and Julia and the time to
> help.  I'm almost in that category, in that I have a small interest in
> Mongo (and a lot of interest in Julia), but only so much time.
>
> I will say that once you get something working, the mailing list will
> usually be a good resource, but it works best if you have specific
> questions or problems (e.g., I'm trying to do this with the following code,
> but it's not working--what am I doing wrong) vs. general requests (e.g.,
> asking for help wrapping mongo with a mostly empty repo).
>
> I'm assuming this is your first major Julia project?  If so, my first
> suggestion is to ignore everything written below (for now), and try to get
> the Mongo.jl library running on a modern Julia first, and make sure you
> understand everything that it's doing.  Only after that would I try
> anything below.
>
> 
>
> In truth, I don't know the best way forward, but I can give you a little
> more information, and maybe it will help you decide.
>
> Clang.jl is a general framework for wrapping C libraries.  It's never
> necessary--any C library can be called directly using ccall.  But ccalls
> can be kind of clunky, so Clang.jl provides an API which very closely
> mimics the C API provided in some header file: functions have very similar
> signatures, and structs become julia types.  For a large library with lots
> of functions, or complicated structs with lots of members, this can make
> wrapping relatively easy.
>
> While the output of Clang.jl is usually reasonably nice, it's still a very
> low level of abstraction.  Actually, very little abstraction, because it
> matches the C library, which is usually very low level, and still somewhat
> inconvenient in Julia (but more convenient than ccalls).  To be useful,
> you'll often want to add a higher level API on top of that, which adds
> functions and/or types that encompass or simplify the lower level calls.
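[Editor's note: as a loose illustration of the kind of higher-level layer described above (not actual Mongo.jl code), a Julia type can own a raw C handle and release it via a finalizer; `Libc.malloc`/`Libc.free` stand in here for a real driver's create/destroy pair such as `mongoc_client_new`/`mongoc_client_destroy`:]

```julia
# Illustrative only: a Julia type that owns a raw C handle and frees it
# automatically when garbage-collected. Libc.malloc/Libc.free stand in
# for a real driver pair like mongoc_client_new/mongoc_client_destroy.
mutable struct CHandle
    ptr::Ptr{Cvoid}
    function CHandle(nbytes::Integer)
        p = Libc.malloc(nbytes)
        p == C_NULL && error("allocation failed")
        h = new(p)
        # Release the C resource when the Julia object is collected.
        finalizer(x -> Libc.free(x.ptr), h)
        return h
    end
end

h = CHandle(64)
```

A wrapper package would add methods on such a type (e.g. `insert!`, `find`) that forward to the low-level calls, which is the "higher level API" being described.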
>
> It's also often the case that Clang.jl doesn't give you exactly what you
> need--e.g., it doesn't know how to wrap certain things, such as unions and
> C macros.  In those cases, you'll have to edit the output by hand, or spend
> some time programmatically filtering/modifying the results.  I do this a
> bit in the wrapper for VideoIO
> <https://github.com/kmsquire/VideoIO.jl/blob/master/util/wrap_libav_split.jl> 
> (and
> I still have to edit some files by hand at the end).
>
> The main alternative is to wrap a subset of useful functions by hand (or
> simply use ccall directly, which amounts to pretty much the same thing).
> This is probably closer to what the Mongo folks had in mind when they
> directed you to the Lua driver.  If you only need access to a few
> functions, or if your code is highly specialized in a way that Clang.jl has
> trouble with, wrapping by hand can be the way to go.  For this, you're
> basically writing Julia functions which ccall out to external library
> functions (such as those in mongo), and return the result (or some
> modification thereof that matches what would normally be done in Julia).
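[Editor's note: for a concrete (if trivial) sketch of hand-wrapping with `ccall`, here is a Julia function wrapping libc's `strlen`; it keeps the example self-contained, and a mongo wrapper would target symbols in libmongoc the same way:]

```julia
# Hand-wrapped C call: a Julia function that ccalls a C library symbol.
# strlen from libc is used so the example runs anywhere; a mongo binding
# would point at libmongoc functions instead.
c_strlen(s::AbstractString) = ccall(:strlen, Csize_t, (Cstring,), s)

c_strlen("mongo")  # returns 5 (as a Csize_t)
```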
>
> Hopefully this was useful.  Please feel free to post back here with
> questions, and I (and maybe others, if the question is right) will try to
> answer as we have time.
>
> Cheers!
>Kevin
>
> On Wed, Jul 29, 2015 at 3:00 PM, Kevin Liu  wrote:
>
>> Thanks for the valuable advice and modesty; there is no code in the repos.
>> I'm learning how to do this properly. From the Mongo side I have received a
>> little bit of guidance but from the Julia side, very little. Changed it to
>> CMongo.jl. Clang.jl will be of great help (I had bookmarked it, but didn't
>> realize it was a wrapper).
>>
>> On Wed, Jul 29, 2015 at 3:47 AM, Kevin Squire 
>> wrote:
>>
>>> I would suggest CMongo.jl (when renaming this, be careful if you're on a
>>> case-insensitive filesystem).  Of the

Re: [julia-users] Re: MongoDB and Julia

2015-07-29 Thread Kevin Liu
Thanks for the valuable advice and modesty; there is no code in the repos.
I'm learning how to do this properly. From the Mongo side I have received a
little bit of guidance but from the Julia side, very little. Changed it to
CMongo.jl. Clang.jl will be of great help (I had bookmarked it, but didn't
realize it was a wrapper).

On Wed, Jul 29, 2015 at 3:47 AM, Kevin Squire 
wrote:

> I would suggest CMongo.jl (when renaming this, be careful if you're on a
> case-insensitive filesystem).  Of the fully capitalized names in Julia,
> most imply that they are acronyms.
>
> Suggestion: at this point, there isn't much code in the repo.  Since not
> many people have (publicly) responded to your posts, IMHO, it would be good
> to try to get a basic working system in place, and post your progress when
> something basic is working--this will make it easier for people to
> contribute.
>
> One more thing you should look at is Clang.jl, which makes wrapping C
> libraries easier.
>
> Cheers!
>Kevin
>
> On Tue, Jul 28, 2015 at 5:21 PM, Kevin Liu  wrote:
>
>> I'll name it CMONGO.jl
>>
>> On Tue, Jul 28, 2015 at 9:13 PM, Kevin Liu 
>> wrote:
>>
>>> Any suggestions for the name? I just want to remember this will be a
>>> wrapper around C Mongo.
>>>
>>> On Tue, Jul 28, 2015 at 9:10 PM, Kevin Liu 
>>> wrote:
>>>
>>>> Hey Kevin,
>>>>
>>>> That's great. Thanks for the advice. On it right now.
>>>>
>>>> Cheers!
>>>>
>>>> On Tue, Jul 28, 2015 at 7:50 PM, Kevin Squire 
>>>> wrote:
>>>>
>>>>> Hi Kevin,
>>>>>
>>>>> If you plan to make this a Julia package (and I encourage you to do
>>>>> so), it would be good to look at the Julia package naming conventions
>>>>> <http://julia.readthedocs.org/en/latest/manual/packages/#guidelines-for-naming-a-package>.
>>>>> You might consider choosing a different name, generating a package
>>>>> skeleton, and moving the files in this repo there.  Alternatively, 
>>>>> renaming
>>>>> that repo shouldn't be hard.
>>>>>
>>>>> (This isn't mentioned explicitly there, but dashes also won't work for
>>>>> Julia package names.)
>>>>>
>>>>> Cheers!
>>>>>Kevin
>>>>>
>>>>> On Mon, Jul 27, 2015 at 8:28 PM, Kevin Liu 
>>>>> wrote:
>>>>>
>>>>>> Hi Julia Users, feel free to contribute to the Julia wrapper of the C
>>>>>> Mongo Driver, maintained by Mongo
>>>>>>
>>>>>> https://github.com/tenthdimension/Julia-C-Mongo
>>>>>>
>>>>>> This Julia wrapper is based on the Lua wrapper of the C Mongo Driver.
>>>>>> Jesse Davis from MongoDB recommended I use it as a reference.
>>>>>>
>>>>>> On Thursday, July 23, 2015 at 8:26:14 PM UTC-3, Kevin Liu wrote:
>>>>>>>
>>>>>>> Thanks
>>>>>>>
>>>>>>> On Thursday, July 23, 2015 at 8:24:12 PM UTC-3,
>>>>>>> tim@multiscalehn.com wrote:
>>>>>>>>
>>>>>>>> https://github.com/pzion/LibBSON.jl/pull/4
>>>>>>>> https://github.com/pzion/Mongo.jl/pull/6
>>>>>>>>
>>>>>>>> On Thursday, July 23, 2015 at 3:28:17 PM UTC-7, Kevin Liu wrote:
>>>>>>>>>
>>>>>>>>> I'm sorry Tim, check this out
>>>>>>>>> https://github.com/10gen-labs/mongorover/issues/16
>>>>>>>>>
>>>>>>>>> Could you share how you made it work properly?
>>>>>>>>>
>>>>>>>>> On Wednesday, July 22, 2015 at 5:20:42 PM UTC-3, Kevin Liu wrote:
>>>>>>>>>>
>>>>>>>>>> Hi Tim, did it pass specs 1, 2, and part of 3? 4 hasn't been
>>>>>>>>>> started yet.
>>>>>>>>>>
>>>>>>>>>> These new MongoDB drivers conform to published specifications.
>>>>>>>>>>
>>>>>>>>>> 1. Server Selection - Deciding which server to send database
>>>>>>>>>> operations to in a MongoDB deployment.
>>>>>>>>>> 2. Server Discovery and Monitoring

Re: [julia-users] Re: MongoDB and Julia

2015-07-28 Thread Kevin Liu
I'll name it CMONGO.jl

On Tue, Jul 28, 2015 at 9:13 PM, Kevin Liu  wrote:

> Any suggestions for the name? I just want to remember this will be a
> wrapper around C Mongo.
>
> On Tue, Jul 28, 2015 at 9:10 PM, Kevin Liu  wrote:
>
>> Hey Kevin,
>>
>> That's great. Thanks for the advice. On it right now.
>>
>> Cheers!
>>
>> On Tue, Jul 28, 2015 at 7:50 PM, Kevin Squire 
>> wrote:
>>
>>> Hi Kevin,
>>>
>>> If you plan to make this a Julia package (and I encourage you to do so),
>>> it would be good to look at the Julia package naming conventions
>>> <http://julia.readthedocs.org/en/latest/manual/packages/#guidelines-for-naming-a-package>.
>>> You might consider choosing a different name, generating a package
>>> skeleton, and moving the files in this repo there.  Alternatively, renaming
>>> that repo shouldn't be hard.
>>>
>>> (This isn't mentioned explicitly there, but dashes also won't work for
>>> Julia package names.)
>>>
>>> Cheers!
>>>Kevin
>>>
>>> On Mon, Jul 27, 2015 at 8:28 PM, Kevin Liu 
>>> wrote:
>>>
>>>> Hi Julia Users, feel free to contribute to the Julia wrapper of the C
>>>> Mongo Driver, maintained by Mongo
>>>>
>>>> https://github.com/tenthdimension/Julia-C-Mongo
>>>>
>>>> This Julia wrapper is based on the Lua wrapper of the C Mongo Driver.
>>>> Jesse Davis from MongoDB recommended I use it as a reference.
>>>>
>>>> On Thursday, July 23, 2015 at 8:26:14 PM UTC-3, Kevin Liu wrote:
>>>>>
>>>>> Thanks
>>>>>
>>>>> On Thursday, July 23, 2015 at 8:24:12 PM UTC-3,
>>>>> tim....@multiscalehn.com wrote:
>>>>>>
>>>>>> https://github.com/pzion/LibBSON.jl/pull/4
>>>>>> https://github.com/pzion/Mongo.jl/pull/6
>>>>>>
>>>>>> On Thursday, July 23, 2015 at 3:28:17 PM UTC-7, Kevin Liu wrote:
>>>>>>>
>>>>>>> I'm sorry Tim, check this out
>>>>>>> https://github.com/10gen-labs/mongorover/issues/16
>>>>>>>
>>>>>>> Could you share how you made it work properly?
>>>>>>>
>>>>>>> On Wednesday, July 22, 2015 at 5:20:42 PM UTC-3, Kevin Liu wrote:
>>>>>>>>
>>>>>>>> Hi Tim, did it pass specs 1, 2, and part of 3? 4 hasn't been
>>>>>>>> started yet.
>>>>>>>>
>>>>>>>> These new MongoDB drivers conform to published specifications.
>>>>>>>>
>>>>>>>> 1. Server Selection - Deciding which server to send database
>>>>>>>> operations to in a MongoDB deployment.
>>>>>>>> 2. Server Discovery and Monitoring - All the logic required to make
>>>>>>>> a MongoDB application highly available.
>>>>>>>> 3. CRUD API - The API for how we Create, Read, Update and Delete
>>>>>>>> data from MongoDB.
>>>>>>>> 4. Authentication - The rules for how to authenticate to MongoDB
>>>>>>>> servers.
>>>>>>>>
>>>>>>>>
>>>>>>>> https://www.mongodb.com/blog/post/announcing-next-generation-drivers-mongodb
>>>>>>>>
>>>>>>>> On Wednesday, July 22, 2015 at 4:30:00 PM UTC-3,
>>>>>>>> tim@multiscalehn.com wrote:
>>>>>>>>>
>>>>>>>>> I have just made pull requests to pzion/LibBSON.jl and
>>>>>>>>> pzion/Mongo.jl to fix the driver in v0.4. Works fine for me, after 
>>>>>>>>> adding
>>>>>>>>> @compats
>>>>>>>>>
>>>>>>>>> On Sunday, July 12, 2015 at 12:17:44 AM UTC-7, Kevin Liu wrote:
>>>>>>>>>>
>>>>>>>>>> Hi,
>>>>>>>>>>
>>>>>>>>>> I have Julia 0.3, Mongodb-osx-x86_64-3.0.4,
>>>>>>>>>> and Mongo-c-driver-1.1.9 installed, but can't get Julia to access 
>>>>>>>>>> the Mongo
>>>>>>>>>> Client through this 'untestable' package
>>>>>>>>>> https://github.com/pzion/Mongo.jl, according to
>>>>>>>>>> http://pkg.julialang.org/.
>>>>>>>>>>
>>>>>>>>>> I have tried Lytol/Mongo.jl and the command require("Mongo.jl")
>>>>>>>>>> can't open file Mongo.jl, or the auto-generated deps.jl.
>>>>>>>>>>
>>>>>>>>>> Is anyone having similar problems trying to make Julia work with
>>>>>>>>>> Mongo?
>>>>>>>>>>
>>>>>>>>>> Thank you
>>>>>>>>>>
>>>>>>>>>> Kevin
>>>>>>>>>>
>>>>>>>>>
>>>
>>
>


Re: [julia-users] Re: MongoDB and Julia

2015-07-28 Thread Kevin Liu
Any suggestions for the name? I just want to remember this will be a
wrapper around C Mongo.

On Tue, Jul 28, 2015 at 9:10 PM, Kevin Liu  wrote:

> Hey Kevin,
>
> That's great. Thanks for the advice. On it right now.
>
> Cheers!
>
> On Tue, Jul 28, 2015 at 7:50 PM, Kevin Squire 
> wrote:
>
>> Hi Kevin,
>>
>> If you plan to make this a Julia package (and I encourage you to do so),
>> it would be good to look at the Julia package naming conventions
>> <http://julia.readthedocs.org/en/latest/manual/packages/#guidelines-for-naming-a-package>.
>> You might consider choosing a different name, generating a package
>> skeleton, and moving the files in this repo there.  Alternatively, renaming
>> that repo shouldn't be hard.
>>
>> (This isn't mentioned explicitly there, but dashes also won't work for
>> Julia package names.)
>>
>> Cheers!
>>Kevin
>>
>> On Mon, Jul 27, 2015 at 8:28 PM, Kevin Liu 
>> wrote:
>>
>>> Hi Julia Users, feel free to contribute to the Julia wrapper of the C
>>> Mongo Driver, maintained by Mongo
>>>
>>> https://github.com/tenthdimension/Julia-C-Mongo
>>>
>>> This Julia wrapper is based on the Lua wrapper of the C Mongo Driver.
>>> Jesse Davis from MongoDB recommended I use it as a reference.
>>>
>>> On Thursday, July 23, 2015 at 8:26:14 PM UTC-3, Kevin Liu wrote:
>>>>
>>>> Thanks
>>>>
>>>> On Thursday, July 23, 2015 at 8:24:12 PM UTC-3,
>>>> tim@multiscalehn.com wrote:
>>>>>
>>>>> https://github.com/pzion/LibBSON.jl/pull/4
>>>>> https://github.com/pzion/Mongo.jl/pull/6
>>>>>
>>>>> On Thursday, July 23, 2015 at 3:28:17 PM UTC-7, Kevin Liu wrote:
>>>>>>
>>>>>> I'm sorry Tim, check this out
>>>>>> https://github.com/10gen-labs/mongorover/issues/16
>>>>>>
>>>>>> Could you share how you made it work properly?
>>>>>>
>>>>>> On Wednesday, July 22, 2015 at 5:20:42 PM UTC-3, Kevin Liu wrote:
>>>>>>>
>>>>>>> Hi Tim, did it pass specs 1, 2, and part of 3? 4 hasn't been started
>>>>>>> yet.
>>>>>>>
>>>>>>> These new MongoDB drivers conform to published specifications.
>>>>>>>
>>>>>>> 1. Server Selection - Deciding which server to send database
>>>>>>> operations to in a MongoDB deployment.
>>>>>>> 2. Server Discovery and Monitoring - All the logic required to make
>>>>>>> a MongoDB application highly available.
>>>>>>> 3. CRUD API - The API for how we Create, Read, Update and Delete
>>>>>>> data from MongoDB.
>>>>>>> 4. Authentication - The rules for how to authenticate to MongoDB
>>>>>>> servers.
>>>>>>>
>>>>>>>
>>>>>>> https://www.mongodb.com/blog/post/announcing-next-generation-drivers-mongodb
>>>>>>>
>>>>>>> On Wednesday, July 22, 2015 at 4:30:00 PM UTC-3,
>>>>>>> tim@multiscalehn.com wrote:
>>>>>>>>
>>>>>>>> I have just made pull requests to pzion/LibBSON.jl and
>>>>>>>> pzion/Mongo.jl to fix the driver in v0.4. Works fine for me, after 
>>>>>>>> adding
>>>>>>>> @compats
>>>>>>>>
>>>>>>>> On Sunday, July 12, 2015 at 12:17:44 AM UTC-7, Kevin Liu wrote:
>>>>>>>>>
>>>>>>>>> Hi,
>>>>>>>>>
>>>>>>>>> I have Julia 0.3, Mongodb-osx-x86_64-3.0.4,
>>>>>>>>> and Mongo-c-driver-1.1.9 installed, but can't get Julia to access the 
>>>>>>>>> Mongo
>>>>>>>>> Client through this 'untestable' package
>>>>>>>>> https://github.com/pzion/Mongo.jl, according to
>>>>>>>>> http://pkg.julialang.org/.
>>>>>>>>>
>>>>>>>>> I have tried Lytol/Mongo.jl and the command require("Mongo.jl")
>>>>>>>>> can't open file Mongo.jl, or the auto-generated deps.jl.
>>>>>>>>>
>>>>>>>>> Is anyone having similar problems trying to make Julia work with
>>>>>>>>> Mongo?
>>>>>>>>>
>>>>>>>>> Thank you
>>>>>>>>>
>>>>>>>>> Kevin
>>>>>>>>>
>>>>>>>>
>>
>

