Re: [MLLib]:choosing the Loss function

2014-08-11 Thread SK
… of GradientDescent as the optimization algorithm, it would be helpful if you can post it. Thanks. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/MLLib-choosing-the-Loss-function-tp11738p11913.html Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: [MLLib]:choosing the Loss function

2014-08-11 Thread Burak Yavuz
----- Original Message -----
From: SK skrishna...@gmail.com
To: u...@spark.incubator.apache.org
Sent: Monday, August 11, 2014 11:52:04 AM
Subject: Re: [MLLib]:choosing the Loss function

Hi,

Thanks for the reference to the LBFGS optimizer. I tried to use the LBFGS optimizer, but I am not able to pass it as an input …
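[Editor's note: the likely stumbling block here is that in the Spark 1.x MLlib API, `LBFGS` is not a drop-in value for the `optimizer` member of `LogisticRegressionWithSGD`; instead it is invoked directly via its static `runLBFGS` method. The following is a minimal Scala sketch based on the Spark 1.x MLlib optimization API; the data path is illustrative and `sc` is assumed to be an existing SparkContext.]

```scala
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.optimization.{LBFGS, LogisticGradient, SquaredL2Updater}
import org.apache.spark.mllib.util.MLUtils

// Load LIBSVM-format data; path is a placeholder.
val raw = MLUtils.loadLibSVMFile(sc, "data/sample_libsvm_data.txt")
val numFeatures = raw.take(1)(0).features.size

// runLBFGS expects (label, features) pairs; append a bias term
// so the last weight acts as the intercept.
val training = raw.map(p => (p.label, MLUtils.appendBias(p.features))).cache()

// Run L-BFGS with logistic loss and L2 regularization.
val (weightsWithIntercept, lossHistory) = LBFGS.runLBFGS(
  training,
  new LogisticGradient(),
  new SquaredL2Updater(),
  numCorrections = 10,
  convergenceTol = 1e-4,
  maxNumIterations = 50,
  regParam = 0.1,
  initialWeights = Vectors.dense(new Array[Double](numFeatures + 1)))
```

The returned `lossHistory` array is useful for checking convergence of the optimizer run.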

[MLLib]:choosing the Loss function

2014-08-07 Thread SK

Re: [MLLib]:choosing the Loss function

2014-08-07 Thread Evan R. Sparks
…? thanks

Re: [MLLib]:choosing the Loss function

2014-08-07 Thread Burak Yavuz
… both much faster and more accurate.

Burak

----- Original Message -----
From: SK skrishna...@gmail.com
To: u...@spark.incubator.apache.org
Sent: Thursday, August 7, 2014 6:31:14 PM
Subject: [MLLib]:choosing the Loss function

Hi,

According to the MLLib guide, there seems to be support for different …
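[Editor's note: in the Spark 1.x MLlib API the loss function is chosen implicitly by picking the algorithm class, since each `*WithSGD` class wires in a fixed `Gradient`. A minimal Scala sketch, assuming `sc` is an existing SparkContext and an illustrative LIBSVM data path:]

```scala
import org.apache.spark.mllib.classification.{LogisticRegressionWithSGD, SVMWithSGD}
import org.apache.spark.mllib.regression.LinearRegressionWithSGD
import org.apache.spark.mllib.util.MLUtils

// Load LIBSVM-format data; path is a placeholder.
val data = MLUtils.loadLibSVMFile(sc, "data/sample_libsvm_data.txt")

// Each algorithm class fixes its loss function:
val logisticModel = LogisticRegressionWithSGD.train(data, 100) // logistic loss
val hingeModel    = SVMWithSGD.train(data, 100)                // hinge loss
val squaredModel  = LinearRegressionWithSGD.train(data, 100)   // squared loss
```

Finer control (step size, regularization) is available through each instance's `optimizer` member, e.g. `new SVMWithSGD().optimizer.setNumIterations(200).setRegParam(0.1)`.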