Thanks dbtsai for the info.
Are you using the case class for:
case (response, vec) => ?
Also, what library do I need to import to use .toBreeze ?
Thanks,
tri
-----Original Message-----
From: dbt...@dbtsai.com [mailto:dbt...@dbtsai.com]
Sent: Friday, December 12, 2014 3:27 PM
To: Bui,
Hi,
Trying to use LBFGS as the optimizer, do I need to implement feature scaling
via StandardScaler, or does LBFGS do it by default?
The following code generated the error "Failure again! Giving up and returning.
Maybe the objective is just poorly behaved ?".
val data =
You need to apply the StandardScaler yourself to help the convergence.
LBFGS just takes whatever objective function you provide without doing
any scaling. I would like to provide a LinearRegressionWithLBFGS which
does the scaling internally in the near future.
Sincerely,
DB Tsai
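To make the scaling step concrete, here is a plain-Scala sketch of what StandardScaler(withMean = true, withStd = true) computes per feature: subtract the mean and divide by the standard deviation. This is illustrative only (no Spark dependency; the object and method names are mine, not MLlib's):

```scala
object ScalingSketch {
  // Standardize one feature column: zero mean, unit variance.
  def standardize(column: Seq[Double]): Seq[Double] = {
    val mean = column.sum / column.size
    // Sample variance (n - 1), matching MLlib's summary statistics.
    val variance = column.map(x => math.pow(x - mean, 2)).sum / (column.size - 1)
    val std = math.sqrt(variance)
    column.map(x => (x - mean) / std)
  }

  def main(args: Array[String]): Unit = {
    // Feature values like 14111, 14112, ... are far from zero; after
    // standardization they are centered and unit-scaled, which is the
    // conditioning LBFGS benefits from.
    val raw = Seq(14111.0, 14112.0, 14113.0, 14114.0, 14115.0)
    println(standardize(raw).map(v => f"$v%.3f").mkString(", "))
  }
}
```

The same arithmetic is what `new StandardScaler(true, true).fit(...)` applies per column, just distributed over an RDD.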
Thanks for the confirmation.
FYI, the code below works for a similar dataset, but with the feature magnitude
changed, LBFGS converged to the right weights.
For example, the time-sequential feature values 1, 2, 3, 4, 5 would generate the error,
while the sequential feature values 14111, 14112, 14113, 14115 would converge.
It seems that your response is not scaled, which will cause issues in
LBFGS. Typically, people train linear regression with
zero-mean/unit-variance features and response, without training the
intercept. Since the response is zero-mean, the intercept will
always be zero. When you convert the
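The point about the intercept can be checked in a few lines of plain Scala (no Spark; the names below are mine). With one feature, ordinary least squares gives slope = cov(x, y) / var(x) and intercept = mean(y) - slope * mean(x), so once both x and y are centered to zero mean the intercept is exactly zero and there is nothing to train for it:

```scala
object InterceptSketch {
  // Closed-form OLS fit for a single feature: returns (slope, intercept).
  def fit(xs: Seq[Double], ys: Seq[Double]): (Double, Double) = {
    val mx = xs.sum / xs.size
    val my = ys.sum / ys.size
    val slope = xs.zip(ys).map { case (x, y) => (x - mx) * (y - my) }.sum /
      xs.map(x => math.pow(x - mx, 2)).sum
    (slope, my - slope * mx)
  }

  // Subtract the mean so the sequence has zero mean.
  def center(v: Seq[Double]): Seq[Double] = {
    val m = v.sum / v.size
    v.map(_ - m)
  }

  def main(args: Array[String]): Unit = {
    val xs = Seq(1.0, 2.0, 3.0, 4.0, 5.0)
    val ys = Seq(10246.0, 10248.0, 10250.0, 10252.0, 10254.0)
    val (_, rawIntercept) = fit(xs, ys)
    val (_, centeredIntercept) = fit(center(xs), center(ys))
    println(f"raw intercept = $rawIntercept%.1f, centered intercept = $centeredIntercept%.6f")
  }
}
```

On centered data the intercept term drops out, which is why the scaled formulation does not need to train it.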
Thanks for the info.
How do I use StandardScaler() to scale example data (10246.0,[14111.0,1.0]) ?
Thx
tri
-----Original Message-----
From: dbt...@dbtsai.com [mailto:dbt...@dbtsai.com]
Sent: Friday, December 12, 2014 1:26 PM
To: Bui, Tri
Cc: user@spark.apache.org
Subject: Re: Do I need to
You can do something like the following.
val rddVector = input.map({
  case (response, vec) => {
    val newVec = MLUtils.appendBias(vec)
    newVec.toBreeze(newVec.size - 1) = response
    newVec
  }
})
val scalerWithResponse = new StandardScaler(true, true).fit(rddVector)
val trainingData =
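The appendBias trick in the snippet above can be sketched without Spark: appendBias adds a 1.0 slot to the end of each feature vector, and overwriting that slot with the response packs (features, response) into one vector, so a single StandardScaler pass standardizes both together. A minimal sketch with plain arrays (the object name is mine, not MLlib's):

```scala
object AppendBiasSketch {
  // Equivalent in spirit to MLUtils.appendBias(vec) followed by
  // newVec.toBreeze(newVec.size - 1) = response.
  def packResponse(features: Array[Double], response: Double): Array[Double] = {
    val out = features :+ 1.0       // appendBias: features ++ [1.0]
    out(out.length - 1) = response  // overwrite the bias slot with the response
    out
  }

  def main(args: Array[String]): Unit = {
    // The example point (10246.0, [14111.0, 1.0]) from earlier in the thread.
    val packed = packResponse(Array(14111.0, 1.0), 10246.0)
    println(packed.mkString(", "))  // prints 14111.0, 1.0, 10246.0
  }
}
```

After scaling, the last slot of each vector is the standardized response and the rest are the standardized features, which is what the truncated trainingData line presumably splits back apart.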