Hi Theodore,
I'm currently working on elastic-net regression in the ML framework, and I
decided not to add any extra layer of abstraction for now but to focus
on accuracy and performance. We may come up with a proper solution
later. Any ideas are welcome.
Sincerely,
DB Tsai
1) Norm(weights, N) will return (w_1^N + w_2^N + ...)^(1/N), so norm
* norm is required to get the squared L2 penalty.
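A small numerical sketch of that point, using NumPy's `norm` to stand in for the framework's (the function name and values are illustrative, not from MLlib itself):

```python
import numpy as np

w = np.array([3.0, 4.0])

# norm(weights, 2) returns (w_1^2 + w_2^2)^(1/2), i.e. the root,
# not the sum of squares the L2 penalty needs.
n = np.linalg.norm(w, 2)      # (3^2 + 4^2)^(1/2) = 5.0

# So the norm must be multiplied by itself to recover the penalty term.
squared_l2_penalty = n * n    # 25.0, equal to sum(w_i^2)

assert np.isclose(squared_l2_penalty, np.sum(w ** 2))
```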
2) This is a bug, as you said. I intend to fix it using weighted
regularization, where the intercept term will be regularized with weight
zero. https://github.com/apache/spark/pull/1518 But I never actually
have
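A hypothetical sketch of the weighted-regularization idea described above (the function and values are my own illustration, not code from the linked PR): each coefficient carries its own regularization weight, and giving the intercept a weight of zero leaves it unpenalized.

```python
import numpy as np

def weighted_l2_penalty(coefficients, reg_weights, lam):
    # Per-coefficient weighted L2 penalty: lam * sum_i reg_w_i * w_i^2.
    # A reg weight of 0 removes that coefficient from the penalty.
    return lam * np.sum(reg_weights * coefficients ** 2)

# Layout assumed here: [intercept, w_1, w_2]; intercept's reg weight is 0,
# so a large intercept contributes nothing to the penalty.
coefs = np.array([10.0, 3.0, 4.0])
reg_w = np.array([0.0, 1.0, 1.0])

penalty = weighted_l2_penalty(coefs, reg_w, lam=0.1)
# Only w_1 and w_2 contribute: 0.1 * (9 + 16) = 2.5
```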
Sent: Tuesday, April 07, 2015 3:28 PM
To: Ulanov, Alexander
Cc: dev@spark.apache.org
Subject: Re: Regularization in MLlib