It means what it says: you should not have multiple SparkContexts running in one JVM. It was always the wrong thing to do, but now it is an explicit error. When you run spark-shell you already have a SparkContext (sc), so there is no need to make another one. Just don't do that.
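For example, inside spark-shell you can simply reuse the sc the shell created for you, and if you really need different settings, stop it before creating a replacement. A minimal sketch; the HDFS path and app name below are only placeholders, and new SparkConf() is assumed to pick up the master set by the shell launcher:

  // Reuse the shell's existing context instead of constructing a new one.
  val lines = sc.textFile("hdfs:///tmp/input.txt")          // placeholder path
  val errorCount = lines.filter(_.contains("ERROR")).count()

  // Only if different settings are truly needed: stop the shell's context first,
  // so that at most one SparkContext exists in the JVM at any time.
  sc.stop()
  import org.apache.spark.{SparkConf, SparkContext}
  val conf = new SparkConf().setAppName("my-app")           // placeholder app name
  val sc2 = new SparkContext(conf)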
On Wed, Feb 4, 2015 at 12:20 AM, gavin zhang <[email protected]> wrote:
> I have a cluster running CDH5.1.0 with the Spark component.
> Because the default version of Spark in CDH5.1.0 is 1.0.0 and I want to
> use some features of Spark 1.2.0, I compiled another Spark with Maven.
> But when I launched spark-shell and created a new SparkContext, I hit the
> error below:
>
> 15/02/04 14:08:19 WARN SparkContext: Multiple running SparkContexts detected
> in the same JVM!
> org.apache.spark.SparkException: Only one SparkContext may be running in
> this JVM (see SPARK-2243). To ignore this error, set
> spark.driver.allowMultipleContexts = true. The currently running
> SparkContext was created at
> ...
>
> I tried deleting the default Spark and setting the
> set("spark.driver.allowMultipleContexts", "true") option, but it didn't
> work.
>
> How can I fix it?
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Multiple-running-SparkContexts-detected-in-the-same-JVM-tp21492.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
