Unsubscribe

2017-02-05 Thread satish saley
Unsubscribe

Sent from Yahoo Mail for iPhone

Re: Environment tab meaning

2016-06-07 Thread satish saley
> Jacek Laskowski
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
> On Tue, Jun 7, 2016 at 8:11 PM, satish saley <satishsale...@gmail.com> wrote:
>> Hi,
>> In the Spark history server, we see the Environment tab. Does it show the environment of the Driver, the Executor, or both?
>> Jobs | Stages | Storage | Environment | Executors

Environment tab meaning

2016-06-07 Thread satish saley
Hi, In the Spark history server we see an Environment tab. Does it show the environment of the Driver, the Executor, or both?
- Jobs
- Stages
- Storage

duplicate jar problem in yarn-cluster mode

2016-05-17 Thread satish saley
Hello, I am executing simple code in yarn-cluster mode with the following options: --master yarn-cluster --name Spark-FileCopy --class my.example.SparkFileCopy --properties-file spark-defaults.conf --queue saleyq --executor-memory 1G --driver-memory 1G --conf spark.john.snow.is.back=true --jars
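A hedged sketch of such a yarn-cluster submission (the jar paths and application jar name below are placeholders, not from the original command): one common cause of duplicate-jar warnings is listing the application jar again under --jars, since in yarn-cluster mode the application jar is distributed to the cluster anyway.

```shell
# Sketch only: jar paths are placeholders.
# Avoid repeating the application jar in --jars; in yarn-cluster mode
# the app jar is already shipped to the cluster, so listing it again
# uploads and localizes it a second time.
spark-submit \
  --master yarn-cluster \
  --name Spark-FileCopy \
  --class my.example.SparkFileCopy \
  --properties-file spark-defaults.conf \
  --queue saleyq \
  --executor-memory 1G \
  --driver-memory 1G \
  --jars /path/to/extra-dependency.jar \
  /path/to/spark-filecopy.jar
```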

pyspark.zip and py4j-0.9-src.zip

2016-05-15 Thread satish saley
Hi, Is there any way to pull pyspark.zip and py4j-0.9-src.zip into a Maven project?
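For Spark 1.6, pyspark was not published as a Maven artifact, so a common workaround is to skip Maven and put the two zips shipped with the Spark distribution on PYTHONPATH instead. A minimal sketch, assuming SPARK_HOME points at an unpacked 1.6 distribution (the path is a placeholder):

```shell
# Assumption: SPARK_HOME is an unpacked Spark 1.6 distribution.
SPARK_HOME=/path/to/spark-1.6
export PYTHONPATH="$SPARK_HOME/python/lib/pyspark.zip:$SPARK_HOME/python/lib/py4j-0.9-src.zip:$PYTHONPATH"
# Show the two entries we just prepended:
echo "$PYTHONPATH" | tr ':' '\n' | head -2
```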

Re: System memory 186646528 must be at least 4.718592E8.

2016-05-13 Thread satish saley
> ...n(s"Executor memory $executorMemory must be at least " + ...
>
> On Fri, May 13, 2016 at 12:47 PM, satish saley <satishsale...@gmail.com> wrote:
>> Hello,
>> I am running
>> https://github.com/apache/spark/blob/branch-1.6/examples/src/main/python/p

System memory 186646528 must be at least 4.718592E8.

2016-05-13 Thread satish saley
Hello, I am running the https://github.com/apache/spark/blob/branch-1.6/examples/src/main/python/pi.py example but am facing the following exception. What is the unit of the memory value in the error? The configs are: --master local[*]
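The value in the error is in bytes: 4.718592E8 is scientific notation for 471859200 bytes, i.e. 450 MiB, the minimum Spark 1.6 accepts here. A quick conversion sketch; since in local mode the driver also hosts the executors, the usual fix is to raise --driver-memory above that floor.

```shell
# 4.718592E8 bytes from the error message, converted to MiB:
echo $(( 471859200 / 1024 / 1024 ))
# prints 450, so e.g. --driver-memory 1g clears the check
```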

Re: killing spark job which is submitted using SparkSubmit

2016-05-06 Thread satish saley
> ...application and can only be killed via YARN commands, or if it's batch and completes. The simplest way to tie the driver to your app is to pass in yarn-client as master instead.
>
> On Fri, May 6, 2016 at 2:00 PM satish saley <satishsale...@gmail.com> wrote:

Re: killing spark job which is submitted using SparkSubmit

2016-05-06 Thread satish saley
whenever I kill my application.

> On Fri, May 6, 2016 at 11:58 AM, Anthony May <anthony...@gmail.com> wrote:
> Greetings Satish,
> What are the arguments you're passing in?
>
> On Fri, 6 May 2016 at 12:50 satish saley <satishsale...@gmail.com> wrote:
>> Hello,

killing spark job which is submitted using SparkSubmit

2016-05-06 Thread satish saley
Hello, I am submitting a Spark job using SparkSubmit. When I kill my application, it does not kill the corresponding Spark job. How would I kill the corresponding Spark job? I know one way is to use SparkSubmit again with the appropriate options. Is there any way through which I can tell SparkSubmit
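In yarn-cluster mode the driver runs inside YARN, so killing the submitting process does not stop it; the job is normally stopped through YARN itself. A sketch (the application id below is a placeholder):

```shell
# Find the application id, then kill it via YARN.
# The id below is a placeholder.
yarn application -list -appStates RUNNING
yarn application -kill application_1462500000000_0001
```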

mesos cluster mode

2016-05-05 Thread satish saley
Hi, the Spark documentation says that "cluster mode is currently not supported for Mesos clusters." But below that we can see a Mesos example with cluster mode. I don't have a Mesos cluster to try it out. Which one is true? Shall I interpret it as "cluster mode is currently not supported for Mesos clusters* for
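For what it's worth, Spark 1.4+ does support cluster mode on Mesos through the MesosClusterDispatcher, which may explain the apparent contradiction between the older statement and the newer example. A sketch with placeholder hosts and jar URL:

```shell
# Start the dispatcher once (host/port are placeholders):
$SPARK_HOME/sbin/start-mesos-dispatcher.sh --master mesos://mesos-master:5050

# Then submit against the dispatcher in cluster mode; the jar must be
# reachable from the cluster (e.g. via HTTP or HDFS), hence the URL.
spark-submit \
  --master mesos://dispatcher-host:7077 \
  --deploy-mode cluster \
  --class my.example.Main \
  http://repo.example.com/jars/app.jar
```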

Redirect from yarn to spark history server

2016-05-02 Thread satish saley
Hello, I am running a pyspark job in yarn-cluster mode. I can see the Spark job in YARN, but I am not able to go from the "log history" link in YARN to the Spark history server. How would I keep track of a YARN log and its corresponding log in the Spark history server? Is there any setting in yarn/spark that let
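One pair of settings that makes YARN's UI link through to the Spark history server is event logging plus spark.yarn.historyServer.address. A spark-defaults.conf sketch; the host and HDFS path are placeholders:

```
# spark-defaults.conf sketch; host and path are placeholders
spark.eventLog.enabled            true
spark.eventLog.dir                hdfs:///user/spark/applicationHistory
spark.history.fs.logDirectory     hdfs:///user/spark/applicationHistory
spark.yarn.historyServer.address  historyserver-host:18080
```

With these set, YARN's "Tracking URL" for a finished application redirects to that application's page in the Spark history server.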