Re: Spark ANN

2015-09-15 Thread Ruslan Dautkhanov
My 2 cents:
* There is frequency-domain processing available already (e.g. the spark.ml DCT transformer) but no FFT transformer yet, because complex numbers are not currently a Spark SQL datatype …
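Not part of the thread, but a small NumPy sketch of the datatype issue raised above: the FFT of a real signal is complex-valued, while a DCT of the same signal stays real, which is why a DCT transformer fits Spark SQL's existing numeric types and an FFT transformer would not. (The direct DCT-II loop below is only for illustration; it is not Spark's implementation.)

```python
import numpy as np

# An 8-sample real signal.
signal = np.array([1.0, 2.0, 3.0, 4.0, 4.0, 3.0, 2.0, 1.0])

# The FFT of a real signal is complex-valued, which is why an FFT
# transformer would need a complex-number Spark SQL datatype.
spectrum = np.fft.rfft(signal)
print(spectrum.dtype)  # complex128

# A DCT-II of the same signal stays real, so it fits the existing
# numeric column types. Direct O(n^2) formula, illustration only:
n = len(signal)
k = np.arange(n)
dct = np.array([np.sum(signal * np.cos(np.pi * (2 * k + 1) * j / (2 * n)))
                for j in range(n)])
print(dct.dtype)  # float64
```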

Re: Spark ANN

2015-09-09 Thread Feynman Liang
…ered as well: http://arxiv.org/pdf/1312.5851.pdf …

RE: Spark ANN

2015-09-09 Thread Ulanov, Alexander
My 2 cents:
* There is frequency-domain processing available already (e.g. the spark.ml DCT transformer) but no FFT transformer yet, because complex numbers are not currently a Spark SQL datatype.
* We shouldn't assume signals are even, …

Re: Spark ANN

2015-09-08 Thread Feynman Liang
> Found a dropout commit from avulanov: https://g…

RE: Spark ANN

2015-09-08 Thread Ulanov, Alexander
Just wondering, why do we need tensors? Is the implementation of convnets using im2col (see http://cs231n.github.io/convolutional-networks/) insufficient? …
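For readers unfamiliar with the im2col trick referenced above, here is a minimal NumPy sketch (my own illustration, not Spark code): every k×k patch of the image is unrolled into a column, so the convolution reduces to one matrix multiply and no tensor type is needed.

```python
import numpy as np

def im2col(image, k):
    """Unroll every k x k patch of a 2-D image into a column, so a valid
    convolution becomes a single matrix multiply: filter_row @ cols."""
    h, w = image.shape
    cols = np.empty((k * k, (h - k + 1) * (w - k + 1)))
    idx = 0
    for i in range(h - k + 1):
        for j in range(w - k + 1):
            cols[:, idx] = image[i:i + k, j:j + k].ravel()
            idx += 1
    return cols

image = np.arange(16.0).reshape(4, 4)
filt = np.ones((3, 3)) / 9.0                      # 3x3 mean filter
out = (filt.ravel() @ im2col(image, 3)).reshape(2, 2)
# out is the 2x2 valid convolution: [[5., 6.], [9., 10.]]
```

The same unrolling generalizes to multi-channel images and filter banks by stacking channels into the columns and filters into the rows, which is the cs231n formulation cited in the message.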

Spark ANN

2015-09-07 Thread Ruslan Dautkhanov
http://people.apache.org/~pwendell/spark-releases/latest/ml-ann.html The implementation seems to be missing backpropagation? Was there a good reason to omit BP? What are the drawbacks of a pure feedforward-only ANN? Thanks! -- Ruslan Dautkhanov

Re: Spark ANN

2015-09-07 Thread Ruslan Dautkhanov
Thanks! It does not look like Spark ANN yet supports dropout/DropConnect or any other techniques that help avoid overfitting? http://www.cs.toronto.edu/~rsalakhu/papers/srivastava14a.pdf https://cs.nyu.edu/~wanli/dropc/dropc.pdf P.S. There is a small copy-paste typo in https://github.com/apache…
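As background on what the message is asking for, a minimal NumPy sketch of inverted dropout in the style of the Srivastava et al. paper linked above (an illustration only, not a Spark or avulanov implementation): each hidden unit is zeroed with probability `p_drop` during training and the survivors are rescaled, so inference needs no change.

```python
import numpy as np

def dropout(activations, p_drop, rng, train=True):
    """Inverted dropout: zero each unit with probability p_drop during
    training and rescale survivors by 1/(1 - p_drop), so the expected
    activation is unchanged and test time needs no rescaling."""
    if not train:
        return activations
    mask = rng.random(activations.shape) >= p_drop
    return activations * mask / (1.0 - p_drop)

rng = np.random.default_rng(0)
h = np.ones((4, 5))                         # hidden-layer activations
h_train = dropout(h, 0.5, rng)              # units are either 0.0 or 2.0
h_test = dropout(h, 0.5, rng, train=False)  # identity at inference time
```

DropConnect (the second paper) differs only in that the mask is applied to the weights rather than the activations.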

Re: Spark ANN

2015-09-07 Thread Ruslan Dautkhanov
…c regression.

> Thanks! It does not look like Spark ANN yet supports dropout/DropConnect or any other techniques that help avoid overfitting? http:…

Re: Spark ANN

2015-09-07 Thread Feynman Liang
…c regression.

On Mon, Sep 7, 2015 at 3:15 PM, Ruslan Dautkhanov <dautkha...@gmail.com> wrote:
> Thanks! It does not look like Spark ANN yet supports dropout/DropConnect or any other techniques that help avoid overfitting? http://www.cs.toronto.edu/~rsalakhu/papers/sriva…

Re: Spark ANN

2015-09-07 Thread Feynman Liang
Ruslan Dautkhanov <dautkha...@gmail.com> wrote:
> Thanks! It does not look like Spark ANN yet supports dropout/DropConnect or any other techniques that help avoid overfitting? http://www.cs.toronto.edu/~rsalakhu/papers/srivastava14a.p…

Re: Spark ANN

2015-09-07 Thread Debasish Das
…MLlib-dropout> for dropout-regularized logistic regression.

> Thanks! It does not look like Spark ANN yet supports dropout/dr…

Re: Spark ANN

2015-09-07 Thread Nick Pentreath
Haven't checked the actual code, but that doc says "MLPC employs backpropagation for learning the model…"?

On Mon, Sep 7, 2015 at 8:18 PM, Ruslan Dautkhanov wrote:
> http://people.apache.org/~pwendell/spark-releases/latest/ml-ann.html …

Re: Spark ANN

2015-09-07 Thread Feynman Liang
Backprop is used to compute the gradient here, which is then handed to SGD or LBFGS for optimization here
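The split described in this last message can be sketched end-to-end in a few lines of NumPy (my own toy illustration of backprop + gradient descent on a one-hidden-layer MLP, not Spark's code; Spark's MLP uses a different loss and LBFGS by default):

```python
import numpy as np

def mlp_grad(W1, W2, x, y):
    """Backprop: gradient of squared error for a one-hidden-layer MLP
    with a sigmoid hidden layer and a linear output."""
    h = 1.0 / (1.0 + np.exp(-(W1 @ x)))     # forward: hidden activations
    yhat = W2 @ h                           # forward: linear output
    delta2 = yhat - y                       # backward: output error
    dW2 = np.outer(delta2, h)
    delta1 = (W2.T @ delta2) * h * (1 - h)  # backward through the sigmoid
    dW1 = np.outer(delta1, x)
    return dW1, dW2, 0.5 * float(delta2 @ delta2)

rng = np.random.default_rng(1)
W1, W2 = rng.normal(size=(3, 2)), rng.normal(size=(1, 3))
x, y = np.array([0.5, -1.0]), np.array([1.0])
lr, losses = 0.1, []
for _ in range(50):                         # optimizer: plain SGD steps
    dW1, dW2, loss = mlp_grad(W1, W2, x, y)
    W1 -= lr * dW1
    W2 -= lr * dW2
    losses.append(loss)
```

The point of the separation is that `mlp_grad` (the backprop part) is independent of the update rule, so the same gradient function can be driven by SGD or by LBFGS.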