You need to apply StandardScaler yourself to help convergence. LBFGS just optimizes whatever objective function you provide, without doing any scaling. I would like to provide a LinearRegressionWithLBFGS that does the scaling internally, in the near future.
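In Spark this is done with org.apache.spark.mllib.feature.StandardScaler, fit on the feature vectors before training. Below is a minimal self-contained sketch of the per-feature z-score transform it applies (z = (x - mean) / stddev); the ScaleSketch object and its standardize helper are hypothetical names for illustration, not MLlib API:

```scala
// Sketch of the per-feature standardization StandardScaler performs:
// each feature column is shifted to zero mean and rescaled to unit variance.
object ScaleSketch {
  def standardize(col: Array[Double]): Array[Double] = {
    val n = col.length
    val mean = col.sum / n
    // population variance, as used for illustration here
    val variance = col.map(v => (v - mean) * (v - mean)).sum / n
    val std = math.sqrt(variance)
    // a constant column carries no information; map it to zeros
    if (std == 0.0) col.map(_ => 0.0)
    else col.map(v => (v - mean) / std)
  }

  def main(args: Array[String]): Unit = {
    val feature = Array(100.0, 200.0, 300.0)
    val scaled = standardize(feature)
    println(scaled.mkString(", "))
  }
}
```

With features on comparable scales like this, the quadratic approximation LBFGS builds is much better conditioned, which is why scaling often fixes the "Failure again! Giving up and returning" behavior.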
Sincerely,

DB Tsai
-------------------------------------------------------
My Blog: https://www.dbtsai.com
LinkedIn: https://www.linkedin.com/in/dbtsai

On Fri, Dec 12, 2014 at 8:49 AM, Bui, Tri <tri....@verizonwireless.com.invalid> wrote:
> Hi,
>
> Trying to use LBFGS as the optimizer, do I need to implement feature scaling
> via StandardScaler, or does LBFGS do it by default?
>
> The following code generated the error "Failure again! Giving up and
> returning. Maybe the objective is just poorly behaved?".
>
> val data = sc.textFile("file:///data/Train/final2.train")
> val parsedata = data.map { line =>
>   val partsdata = line.split(',')
>   LabeledPoint(partsdata(0).toDouble,
>     Vectors.dense(partsdata(1).split(' ').map(_.toDouble)))
> }
>
> val train = parsedata.map(x => (x.label,
>   MLUtils.appendBias(x.features))).cache()
>
> val numCorrections = 10
> val convergenceTol = 1e-4
> val maxNumIterations = 50
> val regParam = 0.1
> val initialWeightsWithIntercept = Vectors.dense(new Array[Double](2))
>
> val (weightsWithIntercept, loss) = LBFGS.runLBFGS(train,
>   new LeastSquaresGradient(),
>   new SquaredL2Updater(),
>   numCorrections,
>   convergenceTol,
>   maxNumIterations,
>   regParam,
>   initialWeightsWithIntercept)
>
> Did I implement LBFGS for linear regression via LeastSquaresGradient()
> correctly?
>
> Thanks,
> Tri