Re: Limiting Pyspark.daemons

2016-07-04 Thread Ashwin Raaghav
Thanks. I'll try that. Hopefully that should work.

On Mon, Jul 4, 2016 at 9:12 PM, Mathieu Longtin <math...@closetwork.org> wrote:

> I started with a download of 1.6.0. These days, we use a self compiled
> 1.6.2.
>
> On Mon, Jul 4, 2016 at 11:39 AM Ashwin Raaghav <ashraag.

Re: Limiting Pyspark.daemons

2016-07-04 Thread Ashwin Raaghav
Longtin <math...@closetwork.org> wrote:

> 1.6.1.
>
> I have no idea. SPARK_WORKER_CORES should do the same.
>
> On Mon, Jul 4, 2016 at 11:24 AM Ashwin Raaghav <ashraag...@gmail.com>
> wrote:
>
>> Which version of Spark are you using? 1.6.1?
>>
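For context, SPARK_WORKER_CORES is set in conf/spark-env.sh on each standalone worker and caps how many cores that worker offers to applications; changing it requires restarting the worker. A rough application-side sketch of the same cap, assuming a standalone cluster (the app name and core counts below are illustrative only):

    # spark-env.sh on each worker (standalone mode):
    #   export SPARK_WORKER_CORES=4
    #
    # Rough per-application equivalent via SparkConf (illustrative values):
    from pyspark import SparkConf, SparkContext

    conf = (SparkConf()
            .setAppName("limit-cores-sketch")
            .set("spark.cores.max", "4")        # total cores the app may take
            .set("spark.executor.cores", "1"))  # cores (task slots) per executor
    sc = SparkContext(conf=conf)

Unlike SPARK_WORKER_CORES, these conf values apply per application rather than per worker.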

Re: Limiting Pyspark.daemons

2016-07-04 Thread Ashwin Raaghav
Which version of Spark are you using? 1.6.1? Any ideas as to why it is not working in ours?

On Mon, Jul 4, 2016 at 8:51 PM, Mathieu Longtin <math...@closetwork.org> wrote:

> 16.
>
> On Mon, Jul 4, 2016 at 11:16 AM Ashwin Raaghav <ashraag...@gmail.com>
> wrote:
>

Re: Limiting Pyspark.daemons

2016-07-04 Thread Ashwin Raaghav
> se more than 1 core per server. However, it seems it will
> start as many pyspark as there are cores, but maybe not use them.
>
> On Mon, Jul 4, 2016 at 10:44 AM Ashwin Raaghav <ashraag...@gmail.com>
> wrote:
>
>> Hi Mathieu,
>>
>> Isn't that the same as setting
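A quick way to check what the running application actually got is to inspect the driver's view of the settings. A small sketch (the app name is illustrative; the conf keys are standard Spark settings, shown with a "not set" fallback):

    from pyspark import SparkConf, SparkContext

    conf = SparkConf().setAppName("inspect-parallelism")
    sc = SparkContext(conf=conf)

    # defaultParallelism follows the total cores available to the app,
    # unless spark.default.parallelism overrides it.
    print("defaultParallelism:", sc.defaultParallelism)
    # Per-executor task slots, which bound the Python workers per node.
    print("spark.executor.cores:", conf.get("spark.executor.cores", "not set"))
    print("spark.cores.max:", conf.get("spark.cores.max", "not set"))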

Re: Limiting Pyspark.daemons

2016-07-04 Thread Ashwin Raaghav
>> node to 1. But the number of
>> pyspark.daemons process is still not coming down. It looks like initially
>> there is one Pyspark.daemons process and this in turn spawns as many
>> pyspark.daemons processes as the number of cores in the machine.
>>
>> Any help is apprecia
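For what it's worth, pyspark.daemon forks one Python worker per concurrently running task in an executor, so the worker count follows the number of task slots rather than any Python-side setting. A minimal sketch of capping those slots, with illustrative values (check the cap is acceptable for your workload):

    from pyspark import SparkConf, SparkContext

    conf = (SparkConf()
            .setAppName("limit-pyspark-daemons-sketch")
            # Concurrent tasks per executor = executor cores / cpus per task,
            # and each concurrent task gets its own Python worker process.
            .set("spark.executor.cores", "1")
            .set("spark.task.cpus", "1")
            # Reuse Python workers between tasks instead of forking new ones
            # (this is already the default in recent releases).
            .set("spark.python.worker.reuse", "true"))
    sc = SparkContext(conf=conf)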

Re: Adding h5 files in a zip to use with PySpark

2016-06-15 Thread Ashwin Raaghav
--
Regards,
Ashwin Raaghav
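On the question in this thread's subject, one common pattern (a sketch, not taken from the thread; it assumes h5py is installed on every executor, and data.zip and sample.h5 are hypothetical names) is to ship the zip with SparkContext.addFile and unpack it on the executors:

    import zipfile

    from pyspark import SparkContext, SparkFiles

    sc = SparkContext(appName="h5-in-zip-sketch")
    sc.addFile("data.zip")  # hypothetical archive holding the .h5 files

    def read_h5_keys(_):
        import os
        import h5py  # assumed to be installed on every executor
        local_zip = SparkFiles.get("data.zip")
        extract_dir = os.path.join(os.path.dirname(local_zip), "h5_data")
        with zipfile.ZipFile(local_zip) as zf:
            zf.extractall(extract_dir)
        # Open one of the extracted files and list its top-level datasets.
        sample = os.path.join(extract_dir, "sample.h5")  # hypothetical file name
        with h5py.File(sample, "r") as f:
            return list(f.keys())

    print(sc.parallelize([0], 1).map(read_h5_keys).collect())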