hi all,

I have been experimenting with the following flow: create a SparkContext ->
create a StreamingContext -> add a few streams -> start -> stop -> create new
streams -> start a new (or the existing) StreamingContext with the new streams

(I need to keep the existing SparkContext alive, as it also runs other Spark
jobs)
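
For reference, here is roughly what I am doing (a minimal sketch; the socket
sources, host/ports, and 5-second batch interval are just placeholders I
picked for illustration):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.streaming.{Seconds, StreamingContext}

val sc = new SparkContext(new SparkConf().setAppName("Spark shell"))

// first StreamingContext with a stream, started and later stopped
val ssc = new StreamingContext(sc, Seconds(5))
ssc.socketTextStream("localhost", 9999).print()
ssc.start()
// ... some time later: stop streaming, but keep the SparkContext alive
ssc.stop(stopSparkContext = false)

// second attempt: a fresh StreamingContext with new streams on the same
// SparkContext -- this is where I hit the metrics error below
val ssc2 = new StreamingContext(sc, Seconds(5))
ssc2.socketTextStream("localhost", 9998).print()
ssc2.start()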

I ran into a few problems:

- I cannot seem to create a new StreamingContext after another one was shut
down. I get this error:
15/01/21 12:43:16 INFO MetricsSystem: Metrics already registered
java.lang.IllegalArgumentException: A metric named app-20150121123832-0008.driver.Spark shell.StreamingMetrics.streaming.lastCompletedBatch_processStartTime already exists

- Or, if I try to start the one that was stopped (see the snippet after this
list), I get this:
org.apache.spark.SparkException: StreamingContext has already been started

- It seems that even after the StreamingContext is stopped, it still shows up
in the job info on the Spark web UI.
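
For the second bullet, the restart attempt is literally just calling start()
again on the already-stopped context from the sketch above:

ssc.start()  // throws SparkException: StreamingContext has already been started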

Is there a better way to do this?

thanks,