[ https://issues.apache.org/jira/browse/SPARK-10182?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Vyacheslav Baranov updated SPARK-10182:
---------------------------------------
    Description: 
The problem can be reproduced in spark-shell with the following code snippet:

{code}
import org.apache.spark.SparkContext
import org.apache.spark.mllib.classification.LogisticRegressionWithLBFGS
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint

val samples = Seq[LabeledPoint](
  LabeledPoint(1.0, Vectors.dense(1.0, 0.0)),
  LabeledPoint(1.0, Vectors.dense(0.0, 1.0)),
  LabeledPoint(0.0, Vectors.dense(1.0, 1.0)),
  LabeledPoint(0.0, Vectors.dense(0.0, 0.0))
)

val rdd = sc.parallelize(samples)

// Train repeatedly; each call to run() internally caches an RDD
// that is never unpersisted.
for (i <- 0 until 10) {
  val model = new LogisticRegressionWithLBFGS()
    .setNumClasses(2)
    .run(rdd)
    .clearThreshold()
}
{code}

After the code executes, ten {{MapPartitionsRDD}} objects remain on the "Storage" tab
of the Spark application UI, one for each training run: the data cached inside
{{run()}} is never unpersisted.
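
The leak can also be confirmed without the UI. The sketch below assumes only the public {{SparkContext.getPersistentRDDs}} API; the variable name {{cached}} is illustrative:

{code}
// Run after the loop above. getPersistentRDDs returns a Map[Int, RDD[_]]
// of every RDD currently marked as persisted in this SparkContext.
val cached = sc.getPersistentRDDs
println(s"Persisted RDDs: ${cached.size}")  // expected: 10, one per run()

// Blunt workaround until the training code cleans up after itself:
// unpersist everything still cached. Note this also drops any RDDs
// the application cached deliberately.
cached.values.foreach(_.unpersist())
{code}

Presumably the proper fix is for the training code to unpersist its internally cached RDD once training finishes.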

  was:
The problem can be reproduced in spark-shell with the following code snippet:

{code}
import org.apache.spark.SparkContext
import org.apache.spark.mllib.classification.LogisticRegressionWithLBFGS
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint

val samples = Seq[LabeledPoint](
  LabeledPoint(1.0, Vectors.dense(1.0, 0.0)),
  LabeledPoint(1.0, Vectors.dense(0.0, 1.0)),
  LabeledPoint(0.0, Vectors.dense(1.0, 1.0)),
  LabeledPoint(0.0, Vectors.dense(0.0, 0.0))
)

val rdd = sc.parallelize(samples)

for (i <- 0 until 10) {
  val model = {
    new LogisticRegressionWithLBFGS()
      .setNumClasses(2)
      .run(rdd)
      .clearThreshold()
  }
}
{code}

After the code executes, there are 10 {{MapPartitionsRDD}} objects.


> GeneralizedLinearModel doesn't unpersist cached data
> ----------------------------------------------------
>
>                 Key: SPARK-10182
>                 URL: https://issues.apache.org/jira/browse/SPARK-10182
>             Project: Spark
>          Issue Type: Bug
>          Components: MLlib
>    Affects Versions: 1.4.1
>            Reporter: Vyacheslav Baranov
>
> The problem can be reproduced in spark-shell with the following code snippet:
> {code}
> import org.apache.spark.SparkContext
> import org.apache.spark.mllib.classification.LogisticRegressionWithLBFGS
> import org.apache.spark.mllib.linalg.Vectors
> import org.apache.spark.mllib.regression.LabeledPoint
> val samples = Seq[LabeledPoint](
>   LabeledPoint(1.0, Vectors.dense(1.0, 0.0)),
>   LabeledPoint(1.0, Vectors.dense(0.0, 1.0)),
>   LabeledPoint(0.0, Vectors.dense(1.0, 1.0)),
>   LabeledPoint(0.0, Vectors.dense(0.0, 0.0))
> )
> val rdd = sc.parallelize(samples)
> for (i <- 0 until 10) {
>   val model = {
>     new LogisticRegressionWithLBFGS()
>       .setNumClasses(2)
>       .run(rdd)
>       .clearThreshold()
>   }
> }
> {code}
> After the code executes, ten {{MapPartitionsRDD}} objects remain on the "Storage" 
> tab of the Spark application UI.


