Error while saving plots

2016-05-24 Thread njoshi
formance: AuROC").saveAsPNG(outputpath + "/plots/global.png") However, I am getting the following exception. Does anyone have an idea of the cause? Exception in thread "main" java.io.FileNotFoundException: file:/home/njoshi/dev/outputs/test_/plots/glob
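The stack trace suggests a "file:/..." URI string being handed to a plain-file API, which treats it as a literal (and nonexistent) path. A minimal sketch of the likely fix, assuming the plotting call ultimately writes through java.io — the helper name toLocalPath is hypothetical:

```scala
import java.io.File
import java.net.URI

// A "file:/..." string is a URI, not a filesystem path; java.io APIs
// treat it as a literal path and fail. Converting through java.net.URI
// yields the plain path that file-writing APIs accept.
def toLocalPath(pathOrUri: String): String =
  if (pathOrUri.startsWith("file:")) new File(new URI(pathOrUri)).getPath
  else pathOrUri

// Mirrors the path from the stack trace (illustrative)
println(toLocalPath("file:/home/njoshi/dev/outputs/test_/plots/global.png"))
```

If outputpath was obtained from a Hadoop/Spark configuration, it often carries the "file:" scheme, so stripping it before the saveAsPNG call is one plausible workaround.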

Classification model method not found

2015-12-22 Thread njoshi
Hi, I have a custom extended LogisticRegression model which I want to test against a parameter grid search. I am running as follows: val exLR = new ExtendedLR() .setMaxIter(100) .setFitIntercept(true) /* * Cross Validator parameter grid */ val paramGrid = new
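For context, a sketch of how the grid search described above is typically wired up. ExtendedLR from the post is replaced here by the stock LogisticRegression; a custom subclass would plug in the same way, provided it keeps the standard Params and a working copy() implementation (an assumption, since the subclass itself isn't shown):

```scala
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.evaluation.BinaryClassificationEvaluator
import org.apache.spark.ml.tuning.{CrossValidator, ParamGridBuilder}

// Stand-in for the ExtendedLR estimator from the post
val lr = new LogisticRegression()
  .setMaxIter(100)
  .setFitIntercept(true)

// Grid over regularization settings (illustrative values)
val paramGrid = new ParamGridBuilder()
  .addGrid(lr.regParam, Array(0.1, 0.01))
  .addGrid(lr.elasticNetParam, Array(0.0, 0.5))
  .build()

val cv = new CrossValidator()
  .setEstimator(lr)
  .setEvaluator(new BinaryClassificationEvaluator())
  .setEstimatorParamMaps(paramGrid)
  .setNumFolds(3)

// val cvModel = cv.fit(trainingDF)  // trainingDF: DataFrame with label/features columns
```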

Unable to import SharedSparkContext

2015-11-18 Thread njoshi
Hi, Doesn't *SharedSparkContext* come with spark-core? Do I need to include any special package in the library dependencies for using SharedSparkContext? I am trying to write a testSuite similar to the *LogisticRegressionSuite* test in Spark-ml. Unfortunately, I am unable to import any of
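SharedSparkContext lives in spark-core's test sources, which are published as a separate "tests" artifact rather than in the main jar, so it has to be pulled in explicitly. A hedged sbt sketch (the version number is illustrative; match it to the Spark version actually in use):

```scala
// build.sbt — SharedSparkContext ships in spark-core's *test* jar,
// so the tests classifier must be requested explicitly.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.5.2",
  "org.apache.spark" %% "spark-core" % "1.5.2" % "test" classifier "tests"
)
```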

Spark LogisticRegression returns scaled coefficients

2015-11-17 Thread njoshi
I am testing the LogisticRegression performance on synthetically generated data. The weights I have as input are w = [2, 3, 4] with no intercept and three features. After training on 1000 synthetically generated datapoints, assuming a random normal distribution for each, the Spark
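One plausible source of scaled coefficients is that spark.ml's LogisticRegression standardizes features before optimization by default, and when regularization is on this changes the solution even though coefficients are mapped back to the original scale. A hedged diagnostic sketch, assuming a DataFrame syntheticDF built from the generated data:

```scala
import org.apache.spark.ml.classification.LogisticRegression

// Rule out standardization/regularization as the source of scaling:
// with regParam = 0 the fit should recover the generating weights
// (up to sampling noise), with or without standardization.
val lr = new LogisticRegression()
  .setFitIntercept(false)     // the data was generated without an intercept
  .setRegParam(0.0)           // regularization interacts with standardization
  .setStandardization(false)  // fit in the original feature space

// val model = lr.fit(syntheticDF)
// println(model.coefficients)  // expect roughly [2.0, 3.0, 4.0]
//                              // (the field is named `weights` in Spark 1.5)
```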

Extending Spark ML LogisticRegression Object

2015-10-30 Thread njoshi
Hi, I am extending the Spark ML package locally to include one of the specialized models I need to try. In particular, I am trying to extend the LogisticRegression model with one which takes a custom object Weights as weights, and I am getting the following compilation error: could not find implicit

Does Spark.ml LogisticRegression assume only Double-valued features?

2015-09-03 Thread njoshi
Hi, I was looking at the `Spark 1.5` dataframe/row api and the implementation for the logistic regression
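For reference, spark.ml models consume a single features column of type Vector, whose values are stored as Doubles; other numeric columns are typically assembled into it first. A hedged sketch with hypothetical column names:

```scala
import org.apache.spark.ml.feature.VectorAssembler

// spark.ml estimators read one Vector-typed "features" column, which
// holds Double values; integer or other numeric columns are assembled
// (and implicitly widened) into it.
val assembler = new VectorAssembler()
  .setInputCols(Array("age", "income", "score"))  // hypothetical columns
  .setOutputCol("features")

// val prepared = assembler.transform(rawDF)  // rawDF: DataFrame of numeric columns
```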

Spark.ml vs Spark.mllib

2015-08-26 Thread njoshi
Hi, We are in the process of developing a new product/Spark application. While the official Spark 1.4.1 page http://spark.apache.org/docs/latest/ml-guide.html invites users and developers to use *Spark.mllib* and optionally contribute to *Spark.ml*, this