Re: Using sparkContext.stop()

2016-09-09 Thread Mich Talebzadeh
Hi

Are we talking about Spark Streaming here?

Depending on what is streamed, you can work out an exit strategy based on the
total number of messages streamed in, or based on a time window: monitor the
running duration and exit once it exceeds the allocated window (not to be
confused with the streaming window interval, etc.).
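
A minimal sketch of that idea using Spark Streaming's awaitTerminationOrTimeout
(the source, port, batch interval and one-hour window below are illustrative
assumptions, not taken from this thread):

    import java.util.concurrent.atomic.AtomicLong

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object BoundedStreamingJob {
      def main(args: Array[String]): Unit = {
        val ssc = new StreamingContext(
          new SparkConf().setAppName("BoundedStreamingJob"), Seconds(10))

        val totalMessages = new AtomicLong(0L)              // running total, kept on the driver
        val lines = ssc.socketTextStream("localhost", 9999) // illustrative source
        lines.foreachRDD(rdd => totalMessages.addAndGet(rdd.count()))

        ssc.start()
        // Exit strategy: give the stream a fixed wall-clock window (here 1 hour),
        // then stop; totalMessages could instead be polled to stop after N messages.
        val stoppedOnItsOwn = ssc.awaitTerminationOrTimeout(60 * 60 * 1000L)
        if (!stoppedOnItsOwn) {
          ssc.stop(stopSparkContext = true, stopGracefully = true) // drain in-flight batches
        }
      }
    }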

HTH

Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


Disclaimer: Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.



On 9 September 2016 at 18:45, Bruno Faria  wrote:

> Hey all,
>
> I have created a Spark job that runs successfully, but if I do not use
> sc.stop() at the end, the job hangs. It shows some "cleaned accumulator 0"
> messages but never finishes.
>
> I intend to use these jobs in production via spark-submit and schedule them
> in cron.
>
> Is it best practice to use sc.stop(), or is there something else I am
> missing?
>
> One interesting point is, if I run the job on 100 lines, the job finishes
> completely (without using sc.stop()), but when running with the actual data
> (millions of records) the hang happens.
>
> I've waited for more than 24 hours, but it never releases the prompt and in
> the UI it appears as RUNNING.
>
> Appreciate any help
>
> Thanks
>
>


Using sparkContext.stop()

2016-09-09 Thread Bruno Faria
Hey all,

I have created a Spark job that runs successfully, but if I do not use sc.stop()
at the end, the job hangs. It shows some "cleaned accumulator 0" messages but
never finishes.

I intend to use these jobs in production via spark-submit and schedule them in
cron.

Is it best practice to use sc.stop(), or is there something else I am missing?

One interesting point is, if I run the job on 100 lines, the job finishes
completely (without using sc.stop()), but when running with the actual data
(millions of records) the hang happens.

I've waited for more than 24 hours, but it never releases the prompt and in the
UI it appears as RUNNING.
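
A common pattern for such a spark-submit batch job is to stop the context in a
finally block so the driver can always exit cleanly; a minimal sketch, with
illustrative job logic (word count, input/output paths taken from args):

    import org.apache.spark.{SparkConf, SparkContext}

    object MyBatchJob {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("MyBatchJob"))
        try {
          // illustrative job body: word count from args(0) to args(1)
          sc.textFile(args(0))
            .flatMap(_.split("\\s+"))
            .map(word => (word, 1L))
            .reduceByKey(_ + _)
            .saveAsTextFile(args(1))
        } finally {
          sc.stop() // release cluster resources and let the driver JVM exit
        }
      }
    }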

Appreciate any help

Thanks



Re: SparkContext.stop() takes too long to complete

2016-03-18 Thread Nezih Yigitbasi
Hadoop 2.4.0. Here are the relevant logs from executor 1136:

16/03/18 21:26:58 INFO mapred.SparkHadoopMapRedUtil: attempt_201603182126_0276_m_000484_0: Committed
16/03/18 21:26:58 INFO executor.Executor: Finished task 484.0 in stage 276.0 (TID 59663). 1080 bytes result sent to driver
16/03/18 21:38:18 ERROR executor.CoarseGrainedExecutorBackend: RECEIVED SIGNAL 15: SIGTERM
16/03/18 21:38:18 INFO storage.DiskBlockManager: Shutdown hook called
16/03/18 21:38:18 INFO util.ShutdownHookManager: Shutdown hook called

On Fri, Mar 18, 2016 at 4:21 PM Ted Yu  wrote:

Which version of Hadoop do you use?
>
> bq. Requesting to kill executor(s) 1136
>
> Can you find more information on executor 1136?
>
> Thanks
>
> On Fri, Mar 18, 2016 at 4:16 PM, Nezih Yigitbasi <
> nyigitb...@netflix.com.invalid> wrote:
>
>> Hi Spark experts,
>> I am using Spark 1.5.2 on YARN with dynamic allocation enabled. I see in
>> the driver/application master logs that the app is marked as SUCCEEDED and
>> then SparkContext stop is called. However, this stop sequence takes > 10
>> minutes to complete, and the YARN resource manager then kills the
>> application master because it did not receive a heartbeat within the last
>> 10 minutes. Any ideas about what may be going on?

Re: SparkContext.stop() takes too long to complete

2016-03-18 Thread Ted Yu
Which version of Hadoop do you use?

bq. Requesting to kill executor(s) 1136

Can you find more information on executor 1136?

Thanks

On Fri, Mar 18, 2016 at 4:16 PM, Nezih Yigitbasi <
nyigitb...@netflix.com.invalid> wrote:

> Hi Spark experts,
> I am using Spark 1.5.2 on YARN with dynamic allocation enabled. I see in
> the driver/application master logs that the app is marked as SUCCEEDED and
> then SparkContext stop is called. However, this stop sequence takes > 10
> minutes to complete, and the YARN resource manager then kills the
> application master because it did not receive a heartbeat within the last
> 10 minutes. Any ideas about what may be going on?

SparkContext.stop() takes too long to complete

2016-03-18 Thread Nezih Yigitbasi
Hi Spark experts,
I am using Spark 1.5.2 on YARN with dynamic allocation enabled. I see in
the driver/application master logs that the app is marked as SUCCEEDED and
then SparkContext stop is called. However, this stop sequence takes > 10
minutes to complete, and the YARN resource manager then kills the application
master because it did not receive a heartbeat within the last 10 minutes.
Any ideas about what may be going on?

Here are the relevant logs:

16/03/18 21:26:58 INFO yarn.ApplicationMaster: Final app status: SUCCEEDED, exitCode: 0
16/03/18 21:26:58 INFO spark.SparkContext: Invoking stop() from shutdown hook
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/static/sql,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/execution/json,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/execution,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/json,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/static/sql,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/execution/json,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/execution,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/json,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
16/03/18 21:26:58 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
16/03/18 21:26:58 INFO ui.SparkUI: Stopped Spark web UI at http://10.143.240.240:52706
16/03/18 21:27:58 INFO cluster.YarnClusterSchedulerBackend: Requesting to kill executor(s) 1135
16/03/18 21:27:58 INFO yarn.YarnAllocator: Driver requested a total number of 208 executor(s).
16/03/18 21:27:58 INFO yarn.ApplicationMaster$AMEndpoint: Driver requested to kill executor(s) 1135.
16/03/18 21:27:58 INFO spark.ExecutorAllocationManager: Removing executor 1135 because it has been idle for 60 seconds (new desired total will be 208)
16/03/18 21:27:58 INFO cluster.YarnClusterSchedulerBackend: Requesting to kill executor(s) 1123
16/03/18 21:27:58 INFO yarn.YarnAllocator: Driver requested a total number of 207 executor(s).
16/03/18
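
Not a confirmed fix, but since the stall above coincides with dynamic-allocation
executor-kill requests while YARN's AM liveness expiry
(yarn.am.liveness-monitor.expiry-interval-ms, typically 10 minutes by default)
runs out, the knobs usually experimented with are dynamic allocation itself and
its idle timeout. A sketch with illustrative values only:

    import org.apache.spark.{SparkConf, SparkContext}

    object SlowStopExperiment {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("SlowStopExperiment")
          // Option A: run this job without dynamic allocation, so stop() has no
          // outstanding executor-kill requests left to drain (executor count illustrative).
          .set("spark.dynamicAllocation.enabled", "false")
          .set("spark.executor.instances", "200")
          // Option B: keep dynamic allocation but release idle executors sooner, so
          // fewer removals are still pending when stop() runs.
          // .set("spark.dynamicAllocation.executorIdleTimeout", "30s")

        val sc = new SparkContext(conf)
        try {
          sc.parallelize(1 to 1000000).map(_ * 2).count() // placeholder workload
        } finally {
          sc.stop()
        }
      }
    }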

[spark1.4] sparkContext.stop causes exception on Mesos

2015-07-03 Thread Ayoub
Hello Spark developers, 

After upgrading to Spark 1.4 on Mesos 0.22.1, existing code started to throw
this exception when calling sparkContext.stop():

(SparkListenerBus) [ERROR - org.apache.spark.Logging$class.logError(Logging.scala:96)] Listener EventLoggingListener threw an exception
java.lang.reflect.InvocationTargetException
    at sun.reflect.GeneratedMethodAccessor18.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.scheduler.EventLoggingListener$$anonfun$logEvent$3.apply(EventLoggingListener.scala:146)
    at org.apache.spark.scheduler.EventLoggingListener$$anonfun$logEvent$3.apply(EventLoggingListener.scala:146)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.scheduler.EventLoggingListener.logEvent(EventLoggingListener.scala:146)
    at org.apache.spark.scheduler.EventLoggingListener.onApplicationEnd(EventLoggingListener.scala:190)
    at org.apache.spark.scheduler.SparkListenerBus$class.onPostEvent(SparkListenerBus.scala:54)
    at org.apache.spark.scheduler.LiveListenerBus.onPostEvent(LiveListenerBus.scala:31)
    at org.apache.spark.scheduler.LiveListenerBus.onPostEvent(LiveListenerBus.scala:31)
    at org.apache.spark.util.ListenerBus$class.postToAll(ListenerBus.scala:56)
    at org.apache.spark.util.AsynchronousListenerBus.postToAll(AsynchronousListenerBus.scala:37)
    at org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(AsynchronousListenerBus.scala:79)
    at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1215)
    at org.apache.spark.util.AsynchronousListenerBus$$anon$1.run(AsynchronousListenerBus.scala:63)
Caused by: java.io.IOException: Filesystem closed
    at org.apache.hadoop.hdfs.DFSClient.checkOpen(DFSClient.java:730)
    at org.apache.hadoop.hdfs.DFSOutputStream.flushOrSync(DFSOutputStream.java:1855)
    at org.apache.hadoop.hdfs.DFSOutputStream.hflush(DFSOutputStream.java:1816)
    at org.apache.hadoop.fs.FSDataOutputStream.hflush(FSDataOutputStream.java:130)
    ... 16 more
I0701 15:03:46.101809  1612 sched.cpp:1589] Asked to stop the driver
I0701 15:03:46.101971  1355 sched.cpp:831] Stopping framework '20150629-132734-1224736778-5050-6126-0028'


This problem happens only when the spark.eventLog.enabled flag is set to true.
It also happens if sparkContext.stop is omitted in the code, I think because
Spark then shuts the context down indirectly.

Does anyone know what could cause this problem?

Thanks,
Ayoub.
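
One workaround sometimes suggested for "java.io.IOException: Filesystem closed"
during shutdown is to disable the shared Hadoop FileSystem cache for the scheme
the event log is written to, so the EventLoggingListener's final flush does not
go through an instance that a shutdown hook has already closed. A sketch only,
not a confirmed fix for the report above; the event-log path and workload are
illustrative:

    import org.apache.spark.{SparkConf, SparkContext}

    object EventLogJob {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("EventLogJob")
          .set("spark.eventLog.enabled", "true")
          .set("spark.eventLog.dir", "hdfs:///spark-history") // illustrative path
          // Give the event-log writer its own (uncached) FileSystem instance so a
          // hook closing the shared cached one cannot break the final flush.
          .set("spark.hadoop.fs.hdfs.impl.disable.cache", "true")

        val sc = new SparkContext(conf)
        try {
          sc.parallelize(1 to 100).count() // illustrative work
        } finally {
          sc.stop() // stop explicitly, before JVM shutdown hooks run
        }
      }
    }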







SparkContext.stop() ?

2014-10-31 Thread ll
what is it for?  when do we call it?

thanks!






Re: SparkContext.stop() ?

2014-10-31 Thread Daniel Siegmann
It is used to shut down the context when you're done with it, but if you're
using a context for the lifetime of your application I don't think it
matters.

I use this in my unit tests, because they start up local contexts, and you
can't have multiple local contexts open at once, so each test must stop its
context when it's done.
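
A sketch of that pattern with ScalaTest (the framework choice, suite name and
assertion are illustrative; the point is a single local context, stopped once
the tests are done so the next suite can create its own):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.scalatest.{BeforeAndAfterAll, FunSuite}

    class WordCountSuite extends FunSuite with BeforeAndAfterAll {
      private var sc: SparkContext = _

      override def beforeAll(): Unit = {
        // Only one local context can be active at a time.
        sc = new SparkContext(
          new SparkConf().setMaster("local[2]").setAppName("WordCountSuite"))
      }

      override def afterAll(): Unit = {
        sc.stop() // release the context so other suites can start their own
      }

      test("counts words") {
        val counts = sc.parallelize(Seq("a", "b", "a"))
          .map((_, 1))
          .reduceByKey(_ + _)
          .collectAsMap()
        assert(counts("a") === 2)
      }
    }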

On Fri, Oct 31, 2014 at 11:12 AM, ll duy.huynh@gmail.com wrote:

 what is it for?  when do we call it?

 thanks!







-- 
Daniel Siegmann, Software Developer
Velos
Accelerating Machine Learning

440 NINTH AVENUE, 11TH FLOOR, NEW YORK, NY 10001
E: daniel.siegm...@velos.io W: www.velos.io


Re: SparkContext.stop() ?

2014-10-31 Thread Matei Zaharia
You don't have to call it if you just exit your application, but it's useful 
for example in unit tests if you want to create and shut down a separate 
SparkContext for each test.

Matei

 On Oct 31, 2014, at 10:39 AM, Evan R. Sparks evan.spa...@gmail.com wrote:
 
 In cluster settings, if you don't explicitly call sc.stop(), your application
 may hang. Like closing files, network connections, etc., when you're done with
 them, it's a good idea to call sc.stop(), which lets the Spark master know
 that your application is finished consuming resources.
 
 On Fri, Oct 31, 2014 at 10:13 AM, Daniel Siegmann daniel.siegm...@velos.io wrote:
 It is used to shut down the context when you're done with it, but if you're 
 using a context for the lifetime of your application I don't think it matters.
 
 I use this in my unit tests, because they start up local contexts and you 
 can't have multiple local contexts open so each test must stop its context 
 when it's done.
 
 On Fri, Oct 31, 2014 at 11:12 AM, ll duy.huynh@gmail.com wrote:
 what is it for?  when do we call it?
 
 thanks!
 
 
 
 
 
 
 
 -- 
 Daniel Siegmann, Software Developer
 Velos
 Accelerating Machine Learning
 
 440 NINTH AVENUE, 11TH FLOOR, NEW YORK, NY 10001
 E: daniel.siegm...@velos.io W: www.velos.io



Re: SparkContext.stop() ?

2014-10-31 Thread Marcelo Vanzin
Actually, if you don't call SparkContext.stop(), the event log
information that is used by the history server will be incomplete, and
your application will never show up in the history server's UI.

If you don't use that functionality, then you're probably ok not
calling it as long as your application exits after it's done using the
context.
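
A small sketch of the scenario Marcelo describes (the event-log directory is
illustrative and must match what the history server reads): with event logging
enabled, it is the explicit stop() that lets the event log be finalized so the
application shows up as completed.

    import org.apache.spark.{SparkConf, SparkContext}

    object HistoryFriendlyJob {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("HistoryFriendlyJob")
          .set("spark.eventLog.enabled", "true")
          .set("spark.eventLog.dir", "hdfs:///spark-events") // illustrative log directory

        val sc = new SparkContext(conf)
        sc.parallelize(1 to 1000).sum() // placeholder work
        // Stopping the context records the application-end event and finalizes
        // the event log, so the history server can list the app as completed.
        sc.stop()
      }
    }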

On Fri, Oct 31, 2014 at 8:12 AM, ll duy.huynh@gmail.com wrote:
 what is it for?  when do we call it?

 thanks!







-- 
Marcelo
