[jira] [Updated] (SPARK-18469) Cannot make MLlib model predictions in Spark streaming with checkpointing

2019-05-20 Thread Hyukjin Kwon (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-18469?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hyukjin Kwon updated SPARK-18469:
-
Labels: bulk-closed  (was: )

> Cannot make MLlib model predictions in Spark streaming with checkpointing
> -
>
> Key: SPARK-18469
> URL: https://issues.apache.org/jira/browse/SPARK-18469
> Project: Spark
>  Issue Type: Bug
>  Components: DStreams, MLlib
>Affects Versions: 1.6.2
>Reporter: Alex Jarman
>Priority: Major
>  Labels: bulk-closed
>
> Enabling checkpointing while trying to produce predictions with an offline 
> MLlib model in Spark Streaming throws the following error: 
> "Exception: It appears that you are attempting to reference SparkContext from 
> a broadcast variable, action, or transformation. SparkContext can only be 
> used on the driver, not in code that it run on workers. For more information, 
> see SPARK-5063." 
> The line in my code that appears to cause this is the transform 
> transformation on the DStream that calls model.predictAll, i.e.: 
> predictions = ratings.map(lambda r: (int(r[1]), int(r[2]))) \ 
> .transform(lambda rdd: model.predictAll(rdd)) \ 
> .map(lambda r: (r[0], r[1], r[2])) 
> See 
> http://stackoverflow.com/questions/40566492/spark-streaming-model-predictions-with-checkpointing-reference-sparkcontext-fr
>  for a fuller description. 
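The underlying mechanism: checkpointing serializes the DStream lineage, including the transform closure, and that closure captures `model`, whose backing SparkContext cannot be serialized. A minimal pure-Python sketch of this failure mode, with no Spark required (`FakeContext`, `Model`, and `TransformClosure` are hypothetical stand-ins, not Spark classes):

```python
import pickle


class FakeContext:
    """Stand-in for SparkContext: refuses serialization, as the real one does."""
    def __reduce__(self):
        raise RuntimeError(
            "It appears that you are attempting to reference SparkContext "
            "from a broadcast variable, action, or transformation."
        )


class Model:
    """Stand-in for an MLlib model that holds a reference to its context."""
    def __init__(self, ctx):
        self.ctx = ctx

    def predict_all(self, rdd):
        return rdd


class TransformClosure:
    """What checkpointing must serialize: a callable capturing the model."""
    def __init__(self, model):
        self.model = model

    def __call__(self, rdd):
        return self.model.predict_all(rdd)


closure = TransformClosure(Model(FakeContext()))
try:
    pickle.dumps(closure)  # checkpointing does the equivalent of this
    failed = False
except RuntimeError:
    failed = True
# failed is True: pickling walks closure -> model -> ctx and raises,
# mirroring the SPARK-5063 error seen when checkpointing is enabled
```

Without checkpointing the closure is never serialized as part of the DStream graph, which is why the same pipeline runs fine until checkpointing is turned on.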



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-18469) Cannot make MLlib model predictions in Spark streaming with checkpointing

2016-11-19 Thread Sean Owen (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-18469?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sean Owen updated SPARK-18469:
--
Target Version/s:   (was: 1.6.2)

