Question: Why do you need Theano?
Aside from the benefits of symbolic graph optimization, what does Theano
provide that Julia doesn't? With Julia you can write normal imperative
code that is easier to read and write than Theano, and then do autodiff on it.
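The "autodiff on ordinary imperative code" idea can be sketched with forward-mode automatic differentiation via dual numbers. This is an illustrative toy in Python, not how Julia's autodiff packages are actually implemented; the names here are made up for the example.

```python
# Minimal forward-mode autodiff via dual numbers: each value carries its
# derivative along, so plain imperative code is differentiable as-is,
# with no symbolic graph construction as in Theano.

class Dual:
    """A number paired with its derivative (a 'dual number')."""
    def __init__(self, value, deriv=0.0):
        self.value = value
        self.deriv = deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f'(x) by seeding the derivative slot with 1."""
    return f(Dual(x, 1.0)).deriv

# Ordinary imperative code, written step by step, differentiates as-is:
def poly(x):
    acc = x * x * 3      # 3x^2
    acc = acc + 2 * x    # + 2x
    return acc + 7       # + 7

print(derivative(poly, 4.0))  # d/dx (3x^2 + 2x + 7) = 6x + 2, so 26.0 at x=4
```

Real implementations generalize this to vectors and add reverse mode, but the point stands: nothing in the user's code needs to know a graph exists.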
On Tuesday, November 24, 2015 at
Hi Viral,
I want to be a part of JuliaML.
~ Ravish
On Wednesday, November 11, 2015 at 4:48:07 PM UTC+5:30, Viral Shah wrote:
>
> I think TensorFlow.jl is a great idea. Also, their distributed computation
> framework is the kind that we want to have in Julia.
>
> I have created JuliaML.
JuliaML is a collection of repos, not people. If you create a package that
ends up in JuliaML or make significant contributions to one of them, then
the owner of some JuliaML package may give you commit access to that
package.
On Mon, Nov 16, 2015 at 4:24 AM, Ravish Mishra
Thanks Stefan. Just starting on Julia. Hope to start contributing soon.
~ Ravish
On Mon, Nov 16, 2015 at 11:16 PM, Stefan Karpinski
wrote:
> JuliaML is a collection of repos, not people. If you create a package that
> ends up in JuliaML or make significant contributions
On Monday, November 16, 2015 at 11:46:14 AM UTC-8, George Coles wrote:
>
> Does MXNet provide features that are analogous with Theano? I would rather
> do machine learning in one language, than a mix of python + c + a DSL like
> Theano.
MXNet.jl is a wrapper around libmxnet, so there is C code under the hood.
Does MXNet provide features that are analogous with Theano? I would rather do
machine learning in one language, than a mix of python + c + a DSL like Theano.
It is always cool to be able to quickly wrap native libraries, but Julia would
really gain momentum if it could obviate Theano et al (as
Den torsdag 12 november 2015 kl. 06:36:28 UTC+1 skrev Alireza Nejati
>
> Anyway, the problem I'm facing right now is that even though TensorFlow's
> Python interface works fine, I can't get TensorFlow's C library to build!
> Has anyone else had any luck with this? I've had to update Java AND gcc
Good to know that.
On Wednesday, November 11, 2015 at 12:18:07 PM UTC+1, Viral Shah wrote:
>
> I think TensorFlow.jl is a great idea. Also, their distributed computation
> framework is the kind that we want to have in Julia.
>
> I have created JuliaML. Send me email if you want to be part
I think TensorFlow.jl is a great idea. Also, their distributed computation
framework is the kind that we want to have in Julia.
I have created JuliaML. Send me email if you want to be part of it, and I
will make you an owner. Perhaps we can even move some of the JuliaStats ML
projects to
Sure. I'm not against anyone doing anything, just that it seems like Julia
suffers from an "expert/edge case" problem right now. For me, it'd be
awesome if there was just a scikit-learn (Python) or caret (R) type
mega-interface that ties together the packages that have already been
written.
This is definitely already in progress, but we've a ways to go before it's
as easy as scikit-learn. I suspect that having an organization will be more
effective at coordinating the various efforts than people might expect.
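As a toy illustration of the scikit-learn-style interface being discussed (class and function names here are hypothetical, not from any existing package), the value is that any model exposing the same fit/predict pair can be swapped in unchanged:

```python
# Two trivial 'models' sharing one interface: fit(X, y) then predict(X).

class MeanRegressor:
    """Predicts the mean of the training targets."""
    def fit(self, X, y):
        self.mean_ = sum(y) / len(y)
        return self
    def predict(self, X):
        return [self.mean_ for _ in X]

class LastValueRegressor:
    """Predicts the last training target."""
    def fit(self, X, y):
        self.last_ = y[-1]
        return self
    def predict(self, X):
        return [self.last_ for _ in X]

def evaluate(model, X, y):
    # Generic driver code: it never needs to know which model it got.
    preds = model.fit(X, y).predict(X)
    return sum((p - t) ** 2 for p, t in zip(preds, y)) / len(y)

X, y = [[1], [2], [3]], [1.0, 2.0, 3.0]
print(evaluate(MeanRegressor(), X, y))       # training MSE of the mean model
print(evaluate(LastValueRegressor(), X, y))  # training MSE of the last-value model
```

A Julia version would presumably lean on multiple dispatch rather than classes, but the contract is the same: agree on the verbs, and tooling like cross-validation comes for free.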
On Wed, Nov 11, 2015 at 9:46 AM, Tom Breloff wrote:
>
I have the same philosophy: "An end user should never have to type a
Unicode character."
On 2015-11-11 17:11, Cedric St-Jean wrote:
scikit-learn uses greek letters in its implementation, which I'm fine
with since domain experts work on those, but I wish that in the
visible interface they had
+1 to consistent interfaces for machine learning algorithms.
On Wed, Nov 11, 2015 at 9:29 AM, Randy Zwitch
wrote:
> Sure. I'm not against anyone doing anything, just that it seems like Julia
> suffers from an "expert/edge case" problem right now. For me, it'd be
>
I agree. I personally think the ML efforts should follow the StatsBase
and Optim conventions where it makes sense.
The notational differences are inconvenient, but they are manageable. I
think readability should be the goal there. For example if you implement
some algorithm one should use the
Randy, see LearnBase.jl, MachineLearning.jl, Learn.jl (just a readme for
now), Orchestra.jl, and many others. Many people have the same goal, and
wrapping TensorFlow isn't going to change the need for a high level
interface. I do agree that a good high level interface is higher on the
priority
One of the tricky things to figure out is how to separate statistics from
machine learning, as they overlap heavily (completely?) but with different
terminology and goals. I think it's really important that JuliaStats and
JuliaML/JuliaLearn play nicely together, and this probably means that any
>
> if you implement some algorithm one should use the notation from the
> referenced paper
This can be easier to implement (essentially just copy from the paper) but
will make for a mess and a maintenance nightmare. I don't want to have to
read a paper just to understand what someone's code
I'm afraid it is not as easy as simply wrapping "existing"
functionality, unless one is ok with a lot of wrapper packages for C
backends. I do realize that a lot of people might be ok with this, but
to some (me included) that would defeat the purpose of using Julia in
the first place. I really
I understand that. But that would imply that a group of people who are
used to different notation would need to reach a consensus. Also there
would be an ugliness to it. For example, SVMs have a pretty standardized
notation for most things. I think it would not help anyone if we
would
scikit-learn uses greek letters in its implementation, which I'm fine with
since domain experts work on those, but I wish that in the visible
interface they had consistently used more descriptive names (e.g.
regularization_strength instead of alpha).
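The pattern being suggested — terse paper notation internally, descriptive names at the public boundary — is cheap to do. A hypothetical sketch (these function names are invented for the example; the inner one mimics a ridge-style penalty weighted by alpha):

```python
# Internal implementation in paper notation: alpha * ||w||^2
def _ridge_penalty(w, alpha):
    return alpha * sum(wi * wi for wi in w)

# Public wrapper with a descriptive keyword that forwards to the terse one,
# so domain experts keep their notation and end users keep readability.
def ridge_penalty(weights, regularization_strength=1.0):
    return _ridge_penalty(weights, alpha=regularization_strength)

print(ridge_penalty([1.0, 2.0], regularization_strength=0.5))  # 0.5 * (1 + 4) = 2.5
```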
On Wednesday, November 11, 2015 at 11:00:56
Awesome. Feel free to open up a LightGraphs issue to track.
On Wednesday, November 11, 2015 at 2:24:13 PM UTC-8, Alireza Nejati wrote:
>
> Both! :)
On Tuesday, November 10, 2015 at 9:57:21 PM UTC-8, Valentin Churavy wrote:
>
> It fits in the same niche that Mocha.jl and MXNet.jl are filling right
> now. MXNet is an ML library that shares many of the same design ideas as
> TensorFlow and has great Julia support
From reading through some of the TensorFlow docs, it seems to currently
only run on one machine. This is where MXNet has an advantage (and
MXNet.jl) as it can run across multiple machines/gpus
I think it's fair to assume that Google will soon release a distributed
version.
> problem is,
On Tuesday, November 10, 2015 at 8:28:32 PM UTC-8, Alireza Nejati wrote:
>
> Randy: To answer your question, I'd reckon that the two major gaps in
> julia that TensorFlow could fill are:
>
> 1. Lack of automatic differentiation on arbitrary graph structures.
> 2. Lack of ability to map
I think rather than always matching papers we should endeavor to use
consistent and standard terminology and notation. When there is
disagreement, we need to have a discussion and come to some kind of
agreement within our own community at least. So far that's gone quite well
in StatsBase (and
Sounds fine to me... are you volunteering to do it, or just suggesting a
plan?
On Wed, Nov 11, 2015 at 5:09 PM, Alireza Nejati
wrote:
> So I had a look at the C api. Seems simple enough. I propose a basic
> TensorFlow.jl package that does the following:
>
>- Defines
Both! :)
I'm interested as well. Who wants to claim TensorFlow.jl?
On Tue, Nov 10, 2015 at 9:11 AM, Ben Moran wrote:
> I'm very interested in this. I haven't gone through the details yet but
> they say that C++ API currently only supports a subset of the Python API
> (weird!).
>
>
I'm very interested in this. I haven't gone through the details yet but
they say that C++ API currently only supports a subset of the Python API
(weird!).
One possibility is to use PyCall to wrap the Python version, as was done
for PyPlot and SymPy, and as I began tentatively for Theano here
For me, the bigger question is how does TensorFlow fit in/fill in gaps in
currently available Julia libraries? I'm not saying that someone who is
sufficiently interested shouldn't wrap the library, but it'd be great to
identify what major gaps remain in ML for Julia and figure out if
It fits in the same niche that Mocha.jl and MXNet.jl are filling right now.
MXNet is an ML library that shares many of the same design ideas as
TensorFlow and has great Julia support https://github.com/dmlc/MXNet.jl
On Wednesday, 11 November 2015 01:04:00 UTC+9, Randy Zwitch wrote:
>
> For me,
If anyone draws up an initial implementation (or pathway to implementation,
even), I'd gladly contribute. I think it's highly strategically important
to have a julia interface to TensorFlow.
Randy: To answer your question, I'd reckon that the two major gaps in julia
that TensorFlow could fill are:
1. Lack of automatic differentiation on arbitrary graph structures.
2. Lack of ability to map computations across cpus and clusters.
Funny enough, I was thinking about (1) for the past
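Gap (2) — mapping a computation across workers — can at least be sketched with an executor-style map API. This is a deliberately simplified stand-in: a thread pool on one machine, where TensorFlow or MXNet would schedule shards across GPUs or cluster nodes, and `heavy_op` is a placeholder name for the example.

```python
# Executor-style 'map this function over shards of work' pattern,
# here across a local thread pool rather than devices or machines.
from concurrent.futures import ThreadPoolExecutor

def heavy_op(x):
    # Placeholder for an expensive per-shard computation.
    return x * x

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(heavy_op, range(8)))

print(results)  # squares of 0..7, in input order despite parallel execution
```

The hard part the thread is pointing at is not this API shape but the scheduling, placement, and fault tolerance behind it, which is exactly what a distributed TensorFlow or MXNet backend provides.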
Looks like they used SWIG to create the Python bindings. I don't see Julia
listed as an output target for SWIG.
On Monday, November 9, 2015 at 1:02:36 PM UTC-8, Phil Tomson wrote:
>
> Google has released its deep learning library called TensorFlow as open
> source code:
>
>