Re: No active SparkContext

2016-03-31 Thread Max Schmidt
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> java.lang.Thread.run(Thread.java:745)
>
> The currently active SparkContext was created at:
>
> (No active SparkContext.)
>
> at …

Re: No active SparkContext

2016-03-24 Thread Max Schmidt
On 2016-03-24 18:00, Mark Hamstra wrote:
> You seem to be confusing the concepts of Job and Application. A Spark
> Application has a SparkContext. A Spark Application is capable of running
> multiple Jobs, each with its own ID, visible in the webUI.

Obviously I mixed it up, but then I would like …

Re: No active SparkContext

2016-03-24 Thread Mark Hamstra
You seem to be confusing the concepts of Job and Application. A Spark Application has a SparkContext. A Spark Application is capable of running multiple Jobs, each with its own ID, visible in the webUI.

On Thu, Mar 24, 2016 at 6:11 AM, Max Schmidt wrote:
> On 24.03.2016 at …
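[To illustrate the Job/Application distinction Mark describes, here is a minimal Java sketch. The app name and master URL are illustrative assumptions, not taken from the thread: one Application owns one SparkContext, and each action submits a separate Job with its own ID in the web UI.]

    import java.util.Arrays;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class MultiJobApp {
        public static void main(String[] args) {
            // One Application: one SparkContext for its whole lifetime.
            SparkConf conf = new SparkConf()
                    .setAppName("multi-job-app")           // assumed name
                    .setMaster("spark://master:7077");     // assumed standalone master
            JavaSparkContext sc = new JavaSparkContext(conf);

            // Each action triggers a separate Job, visible with its own ID in the web UI.
            long first = sc.parallelize(Arrays.asList(1, 2, 3)).count();   // Job 0
            long second = sc.parallelize(Arrays.asList(4, 5, 6)).count();  // Job 1

            sc.stop();  // ends the Application after both Jobs have run
        }
    }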

Re: No active SparkContext

2016-03-24 Thread Max Schmidt
On 24.03.2016 at 10:34, Simon Hafner wrote:
> 2016-03-24 9:54 GMT+01:00 Max Schmidt:
>> we're using, with the Java API (1.6.0), a ScheduledExecutor that
>> continuously executes a Spark job against a standalone cluster.
> I'd recommend Scala.

Why should …

Re: No active SparkContext

2016-03-24 Thread Simon Hafner
2016-03-24 9:54 GMT+01:00 Max Schmidt:
> we're using, with the Java API (1.6.0), a ScheduledExecutor that
> continuously executes a Spark job against a standalone cluster.
I'd recommend Scala.
> After each job we close the JavaSparkContext and create a new one.
Why do that? You can …
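[A hedged sketch of the reuse Simon suggests, in Java since that is what the original poster uses. The schedule interval, class names, and master URL are illustrative assumptions: one long-lived JavaSparkContext serves every scheduled run, and stop() is called only once, at shutdown, instead of after each job.]

    import java.util.Arrays;
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class ScheduledSparkJob {
        public static void main(String[] args) {
            // One long-lived context, created once and shared by every run.
            JavaSparkContext sc = new JavaSparkContext(
                    new SparkConf().setAppName("scheduled-job")        // assumed
                                   .setMaster("spark://master:7077")); // assumed

            ScheduledExecutorService scheduler =
                    Executors.newSingleThreadScheduledExecutor();
            scheduler.scheduleAtFixedRate(() -> {
                // Each run submits new Jobs against the same context.
                long n = sc.parallelize(Arrays.asList(1, 2, 3)).count();
                System.out.println("run finished, count = " + n);
            }, 0, 1, TimeUnit.MINUTES);

            // Stop the context exactly once, when the JVM shuts down.
            Runtime.getRuntime().addShutdownHook(new Thread(() -> {
                scheduler.shutdown();
                sc.stop();
            }));
        }
    }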

No active SparkContext

2016-03-24 Thread Max Schmidt
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
java.lang.Thread.run(Thread.java:745)

The currently active SparkContext was created at:

(No active SparkContext.)

at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped …
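[For context, a minimal sketch of how this error typically arises; the local master is an assumption for the example. Any method called on a SparkContext after stop() fails the assertNotStopped() check in the stack trace above, and in Spark 1.6 this surfaces as an IllegalStateException.]

    import java.util.Arrays;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class StoppedContext {
        public static void main(String[] args) {
            JavaSparkContext sc = new JavaSparkContext(
                    new SparkConf().setAppName("stopped-context")
                                   .setMaster("local[*]"));  // assumed local master
            sc.stop();

            // assertNotStopped() now throws IllegalStateException
            // ("Cannot call methods on a stopped SparkContext").
            sc.parallelize(Arrays.asList(1, 2, 3)).count();
        }
    }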