[ https://issues.apache.org/jira/browse/SPARK-6752?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Patrick Wendell reopened SPARK-6752:
I had to revert this because it caused test failures with the Hadoop 1.0 build.
To reproduce them, use:
build/sbt -Dhadoop.version=1.0.4 -Pkinesis-asl -Phive -Phive-thriftserver -Phive-0.12.0 streaming/test:compile
The errors are:
{code}
[error] /Users/pwendell/Documents/spark/streaming/src/test/java/org/apache/spark/streaming/JavaAPISuite.java:1740: error: cannot find symbol
[error] Assert.assertTrue("new context not created", newContextCreated.isTrue());
[error] ^
[error] symbol: method isTrue()
[error] location: variable newContextCreated of type MutableBoolean
[error] /Users/pwendell/Documents/spark/streaming/src/test/java/org/apache/spark/streaming/JavaAPISuite.java:1746: error: cannot find symbol
[error] Assert.assertTrue("new context not created", newContextCreated.isTrue());
[error] ^
[error] symbol: method isTrue()
[error] location: variable newContextCreated of type MutableBoolean
[error] /Users/pwendell/Documents/spark/streaming/src/test/java/org/apache/spark/streaming/JavaAPISuite.java:1752: error: cannot find symbol
[error] Assert.assertTrue("old context not recovered", newContextCreated.isFalse());
[error] ^
[error] symbol: method isFalse()
[error] location: variable newContextCreated of type MutableBoolean
[error] /Users/pwendell/Documents/spark/streaming/src/test/java/org/apache/spark/streaming/JavaAPISuite.java:1768: error: cannot find symbol
[error] Assert.assertTrue("new context not created", newContextCreated.isTrue());
[error] ^
[error] symbol: method isTrue()
[error] location: variable newContextCreated of type MutableBoolean
[error] /Users/pwendell/Documents/spark/streaming/src/test/java/org/apache/spark/streaming/JavaAPISuite.java:1773: error: cannot find symbol
[error] Assert.assertTrue("new context not created", newContextCreated.isTrue());
[error] ^
[error] symbol: method isTrue()
[error] location: variable newContextCreated of type MutableBoolean
[error] /Users/pwendell/Documents/spark/streaming/src/test/java/org/apache/spark/streaming/JavaAPISuite.java:1778: error: cannot find symbol
[error] Assert.assertTrue("old context not recovered", newContextCreated.isFalse());
[error] ^
[error] symbol: method isFalse()
[error] location: variable newContextCreated of type MutableBoolean
[error] 6 errors
[error] (streaming/test:compile) javac returned nonzero exit code
[error] Total time: 94 s, completed Apr 25, 2015 10:30:20 AM
pwendell @ admins-mbp : ~/Documents/spark (detached HEAD|REBASE 9/11)
{code}
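The missing symbols look like a dependency-version issue rather than anything streaming-specific: {{MutableBoolean.isTrue()}} and {{isFalse()}} were only added in commons-lang 2.5, and the Hadoop 1.0.x dependency tree appears to resolve commons-lang 2.4. Assuming that is the actual cause, the test could stick to methods that exist in both versions, e.g. {{booleanValue()}}. A minimal, self-contained sketch (the class name and the {{setValue}} calls are placeholders, not the real test flow):
{code}
import org.apache.commons.lang.mutable.MutableBoolean;
import org.junit.Assert;

// Sketch only: booleanValue() exists in commons-lang 2.4 as well as 2.5+, so these
// assertions should compile against either version pulled in by the build.
public class MutableBooleanCompatSketch {
  public static void main(String[] args) {
    MutableBoolean newContextCreated = new MutableBoolean(false);

    newContextCreated.setValue(true);   // stands in for the "new context created" path
    Assert.assertTrue("new context not created", newContextCreated.booleanValue());

    newContextCreated.setValue(false);  // stands in for the recovery path
    Assert.assertFalse("old context not recovered", newContextCreated.booleanValue());
  }
}
{code}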
> Allow StreamingContext to be recreated from checkpoint and existing SparkContext
>
>
> Key: SPARK-6752
> URL: https://issues.apache.org/jira/browse/SPARK-6752
> Project: Spark
> Issue Type: Improvement
> Components: Streaming
> Affects Versions: 1.1.1, 1.2.1, 1.3.1
> Reporter: Tathagata Das
> Assignee: Tathagata Das
> Priority: Critical
> Fix For: 1.4.0
>
>
> Currently, if you want to create a StreamingContext from checkpoint
> information, the system will create a new SparkContext. This prevents the
> StreamingContext from being recreated from checkpoints in managed environments
> where the SparkContext is pre-created.
> Proposed solution: Introduce the following methods on StreamingContext
> 1. {{new StreamingContext(checkpointDirectory, sparkContext)}}
> - Recreate StreamingContext from checkpoint using the provided SparkContext
> 2. {{new StreamingContext(checkpointDirectory, hadoopConf, sparkContext)}}
> - Recreate StreamingContext from checkpoint using the provided SparkContext
> and the given Hadoop conf to read the checkpoint
> 3. {{StreamingContext.getOrCreate(checkpointDirectory, sparkContext,
> createFunction: SparkContext => StreamingContext)}}
> - If the checkpoint file exists, recreate the StreamingContext using the
> provided SparkContext (that is, call 1.); otherwise, create the StreamingContext
> using the provided createFunction
> The corresponding Java and Python APIs have to be added as well.
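Since the description notes that a Java counterpart is needed as well, here is a rough sketch of how proposal item 3 might look from the Java side. The {{JavaStreamingContext.getOrCreate}} overload taking an existing context shown below is hypothetical; it only mirrors the Scala signature proposed above and is not an existing API.
{code}
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

// Hypothetical usage sketch of proposal item 3: reuse a pre-created SparkContext
// when recovering from a checkpoint, and only build a fresh streaming context
// when no checkpoint exists.
public class RecoverWithExistingContextSketch {
  public static JavaStreamingContext recover(final String checkpointDirectory,
                                             final JavaSparkContext preCreatedSc) {
    return JavaStreamingContext.getOrCreate(          // hypothetical overload
        checkpointDirectory,
        preCreatedSc,                                 // the pre-created context
        new Function<JavaSparkContext, JavaStreamingContext>() {
          @Override
          public JavaStreamingContext call(JavaSparkContext sc) {
            // Only reached when no checkpoint exists: build a fresh streaming
            // context from the provided SparkContext and register checkpointing.
            JavaStreamingContext jssc = new JavaStreamingContext(sc, new Duration(1000));
            jssc.checkpoint(checkpointDirectory);
            // ... define the DStream graph here before returning ...
            return jssc;
          }
        });
  }
}
{code}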