Re: spark standalone with multiple workers gives a warning

2016-10-06 Thread Ofer Eliassaf
> So how would I start a cluster of 3? SPARK_WORKER_INSTANCES is the only way I see to start the standalone cluster, and the only way I see to define it is in spark-env.sh. The spark-submit option, SPARK_EXECUTOR_INSTANCES, and spark.executor.instances are all related to submitting the job.
>
> Any ideas?
>
> Thanks
> Assaf

-- Regards, Ofer Eliassaf
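The setting the question refers to lives in conf/spark-env.sh and is read only by the worker launch scripts, not by spark-submit. A minimal sketch of such a file, with illustrative values not taken from the thread:

```shell
# conf/spark-env.sh -- illustrative values, not from the thread.
# Start 3 worker instances on each host that runs the worker launch
# scripts (start-all.sh / start-slave.sh). spark-submit ignores this.
export SPARK_WORKER_INSTANCES=3
# With several workers per host, cap each one so that together they
# do not oversubscribe the machine:
export SPARK_WORKER_CORES=4
export SPARK_WORKER_MEMORY=8g
```

By contrast, spark.executor.instances and SPARK_EXECUTOR_INSTANCES shape a single submitted application, which is why they do not help with starting the cluster itself.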

pyspark cluster mode on standalone deployment

2016-09-27 Thread Ofer Eliassaf
…availability in Python Spark. Currently only the YARN deployment supports it. Bringing in the huge YARN installation just for this feature is not fun at all. Does someone have a time estimate for this?

Dynamic Resource Allocation in a standalone

2016-10-27 Thread Ofer Eliassaf
…applications will get the total number of cores until a new application arrives…
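The behavior described here (one application holding all the cores until another shows up) is what dynamic allocation is meant to relieve. A hedged sketch of enabling it for an application on a standalone cluster, with illustrative master URL, timeout, and app name; on standalone, the external shuffle service must also be enabled on each worker:

```shell
# Illustrative spark-submit flags (master URL and app name are
# assumptions). Dynamic allocation lets an idle application release
# executors so a newly arriving application can get cores.
spark-submit \
  --master spark://master:7077 \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.shuffle.service.enabled=true \
  --conf spark.dynamicAllocation.executorIdleTimeout=60s \
  app.py
```

On standalone workers, spark.shuffle.service.enabled=true also needs to be set when the workers are started so the external shuffle service runs on each of them.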

Re: PySpark TaskContext

2016-11-24 Thread Ofer Eliassaf

Re: PySpark TaskContext

2016-11-24 Thread Ofer Eliassaf

Re: pyspark cluster mode on standalone deployment

2017-03-05 Thread Ofer Eliassaf
Anyone? Please? Is this getting any priority?

On Tue, Sep 27, 2016 at 3:38 PM, Ofer Eliassaf <ofer.elias...@gmail.com> wrote:
> Is there any plan to support Python Spark running in "cluster mode" on a standalone deployment?
>
> There is this famous survey
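For context, this is the invocation that standalone deployments rejected for Python applications at the time of this thread; a sketch with illustrative master URL and script name:

```shell
# Cluster deploy mode for a Python app against a standalone master.
# At the time of this thread, spark-submit refused this combination
# for Python applications; only YARN supported it.
spark-submit \
  --master spark://master:7077 \
  --deploy-mode cluster \
  my_app.py
```

In client mode (the default) the same command works on standalone, but the driver then runs on the submitting machine rather than inside the cluster, which is exactly what the thread is asking to avoid.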

PySpark Structured Streaming - using previous iteration computed results in current iteration

2018-05-16 Thread Ofer Eliassaf
…hour. We want to keep the labels and the sample ids from iteration N around for the next iteration (N+1), where we do a join with the new sample window so that samples that existed in the previous (N) iteration inherit their labels.
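The carry-forward join described above can be sketched outside Spark; a minimal pure-Python illustration of the intended logic (the names sample_id and label are assumptions, and in Structured Streaming this would typically be realized with a stream-static join or foreachBatch rather than plain dicts):

```python
def inherit_labels(prev_labels, new_samples):
    """Carry labels from window N into window N+1.

    prev_labels: {sample_id: label} computed in iteration N.
    new_samples: sample ids observed in window N+1.
    Returns {sample_id: label or None}; samples also present in
    window N inherit their old label, new samples get None.
    """
    return {sid: prev_labels.get(sid) for sid in new_samples}

prev = {"a": 1, "b": 2}          # labels from iteration N
window_n1 = ["b", "c"]           # sample ids in window N+1
result = inherit_labels(prev, window_n1)
print(result)  # {'b': 2, 'c': None}
```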