Problem with Spark Master shutting down when the ZooKeeper leader is shut down

2018-05-09 Thread agateaaa
Dear Spark community, I just wanted to bring up this issue, which was filed for Spark 1.6.1 (https://issues.apache.org/jira/browse/SPARK-15544) but also exists in Spark 2.3.0 (https://issues.apache.org/jira/browse/SPARK-23530). We have run into this in production, where the Spark Master shuts down if the ZooKeeper leader is shut down.
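For context, this failure mode concerns standalone masters configured for ZooKeeper-based recovery. A minimal sketch of the usual high-availability settings, assuming hypothetical ZooKeeper hosts zk1, zk2, zk3 (not taken from the thread):

    # spark-env.sh on each master node (host names and directory are placeholders)
    export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER \
      -Dspark.deploy.zookeeper.url=zk1:2181,zk2:2181,zk3:2181 \
      -Dspark.deploy.zookeeper.dir=/spark"

With this in place a standby master is expected to take over on leader loss; the JIRAs above track the master process instead exiting when the ZooKeeper leader it is connected to is shut down.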

Re: Limit pyspark.daemon threads

2016-06-17 Thread agateaaa
To run Spark on Alluxio, this documentation can help: http://www.alluxio.org/documentation/master/en/Running-Spark-on-Alluxio.html Thanks, Gene. On Tue, Jun 14, 2016 at 12:44 AM, agateaaa <agate...@gmail.com> wrote:
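The Alluxio page linked above mostly comes down to putting the Alluxio client jar on Spark's classpath and then reading alluxio:// URIs like any other filesystem. A minimal PySpark sketch, assuming a hypothetical master at alluxio-master:19998 and a file that already exists there:

    from pyspark import SparkContext

    sc = SparkContext(appName="alluxio-read-sketch")
    # 19998 is the default Alluxio master RPC port; host and path are placeholders
    lines = sc.textFile("alluxio://alluxio-master:19998/data/input.txt")
    print(lines.count())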

Re: Limit pyspark.daemon threads

2016-06-15 Thread agateaaa
…cluster (not from each machine). If not set, the default will be spark.deploy.defaultCores on Spark's standalone cluster manager, or infinite (all available cores) on Mesos." David Newberger. From: agateaaa [mailto:agate...@gmail.com] Sent: Wednesday, June 15, 2016 4:39
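The property being quoted is spark.cores.max, which caps the total cores an application takes across the whole cluster; concurrency per executor (and hence per machine) is capped separately by spark.executor.cores. A hedged PySpark sketch with illustrative values, not taken from the thread:

    from pyspark import SparkConf, SparkContext

    conf = (SparkConf()
            .setAppName("capped-cores-sketch")
            .set("spark.cores.max", "8")        # total cores for the app across the cluster
            .set("spark.executor.cores", "2"))  # cores (task slots) per executor
    sc = SparkContext(conf=conf)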

Re: Limit pyspark.daemon threads

2016-06-15 Thread agateaaa
…Spark on Alluxio, this documentation can help: http://www.alluxio.org/documentation/master/en/Running-Spark-on-Alluxio.html Thanks, Gene. On Tue, Jun 14, 2016 at 12:44 AM, agateaaa <agate...@gmail.com> wrote: Hi, I am seeing

Re: Limit pyspark.daemon threads

2016-06-14 Thread agateaaa
…92 S 0.0 0.0 0:00.38 python -m  <-- pyspark.daemon. Is there any way to control the number of pyspark.daemon processes that get spawned? Thank you, Agateaaa. On Sun, Mar 27, 2016 at 1:08 AM, Sven Krasser <kras...@gmail.com> wrote: Hey Ken, 1. You're correct,
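pyspark.daemon forks one Python worker per task slot actually running a task on the executor, so in practice the number of daemon children is bounded by the executor's task slots rather than by a pyspark.daemon-specific setting. A sketch of the knobs usually tuned for this, with illustrative values:

    from pyspark import SparkConf, SparkContext

    # Fewer executor cores means fewer concurrent Python workers per executor
    conf = (SparkConf()
            .set("spark.executor.cores", "2")
            .set("spark.python.worker.memory", "512m")  # per-worker memory before spilling
            .set("spark.python.worker.reuse", "true"))  # reuse workers instead of re-forking
    sc = SparkContext(conf=conf)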

Re: stopping spark stream app

2016-01-29 Thread agateaaa
Hi, We recently started working on using Spark Streaming to fetch and process data from Kafka (direct streaming, not receiver-based; Spark 1.5.2). We want to be able to stop the streaming application, and tried implementing the approach suggested above, using a stopping thread and calling
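For context, the "stopping thread" approach generally ends with a call to StreamingContext.stop with graceful shutdown enabled, so in-flight batches finish before the context goes away. A minimal sketch against the Spark 1.5-era PySpark direct Kafka API; the broker and topic names are placeholders:

    from pyspark import SparkContext
    from pyspark.streaming import StreamingContext
    from pyspark.streaming.kafka import KafkaUtils

    sc = SparkContext(appName="kafka-direct-stop-sketch")
    ssc = StreamingContext(sc, 10)  # 10-second batches

    # Direct (receiver-less) stream; topic and broker list are placeholders
    stream = KafkaUtils.createDirectStream(
        ssc, ["my-topic"], {"metadata.broker.list": "broker1:9092"})
    stream.map(lambda kv: kv[1]).pprint()

    ssc.start()
    # ...later, from a separate stopping thread, once a shutdown flag is seen:
    ssc.stop(True, True)  # stop the SparkContext too, and stop gracefully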