Hi Todd,
Thanks for the hint.
As it happens, this works:
// Create the SparkConf for streaming as usual
val sparkConf = new SparkConf().
  setAppName(sparkAppName).
  set("spark.driver.allowMultipleContexts", "true")
Hi Mich,
Perhaps the issue is having multiple SparkContexts in the same JVM (
https://issues.apache.org/jira/browse/SPARK-2243).
While it is possible, I don't think it is encouraged.
As you know, the call you're currently invoking to create the StreamingContext also creates a SparkContext.
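One way to avoid a second SparkContext is to create the SparkContext yourself and pass it to the StreamingContext constructor, rather than letting the StreamingContext build one from the conf. A minimal sketch (the app name and the 2-second batch interval here are just placeholders):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Build the one SparkContext up front...
val sparkConf = new SparkConf().setAppName("myStreamingApp")
val sc = new SparkContext(sparkConf)

// ...and hand it to the StreamingContext, instead of using the
// StreamingContext(conf, batchDuration) constructor, which would
// create its own SparkContext internally.
val ssc = new StreamingContext(sc, Seconds(2))
```

With this shape there is only ever one SparkContext in the JVM, so spark.driver.allowMultipleContexts is not needed.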
Ok, I managed to sort that one out.
This is what I am facing:
val sparkConf = new SparkConf().
  setAppName(sparkAppName).
  set("spark.driver.allowMultipleContexts", "true").
  set("spark.hadoop.validateOutputSpecs", "false")
// change the values
Hi,
This may not be feasible in Spark Streaming.
I am trying to create a HiveContext in Spark Streaming, within the streaming context:
// Create a local StreamingContext with two working threads and a batch
// interval of 2 seconds
val sparkConf = new SparkConf().
  setMaster("local[2]").
  setAppName(sparkAppName)
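For the HiveContext part, the pattern from the Spark Streaming programming guide's SQL example is to keep a single, lazily instantiated context and build it from the SparkContext that the streaming RDDs already carry, rather than constructing a new one per batch. A sketch of that pattern adapted for HiveContext (`lines` here is a placeholder for whatever DStream you are processing):

```scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.hive.HiveContext

// Lazily create one HiveContext per JVM, reusing the existing
// SparkContext instead of spinning up a second one.
object HiveContextSingleton {
  @transient private var instance: HiveContext = _
  def getInstance(sc: SparkContext): HiveContext = synchronized {
    if (instance == null) instance = new HiveContext(sc)
    instance
  }
}

// Inside the streaming job, fetch it from the RDD's own SparkContext:
lines.foreachRDD { rdd =>
  val hiveContext = HiveContextSingleton.getInstance(rdd.sparkContext)
  // ... run Hive queries with hiveContext here ...
}
```

This keeps everything on the one SparkContext the StreamingContext already created, so there is no need for spark.driver.allowMultipleContexts.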