Re: yarn-cluster mode error

2016-05-17 Thread Sandeep Nemuri
Can you post the complete stack trace? On Tue, May 17, 2016 at 7:00 PM, wrote: > Hi, > > I am getting the error below while running an application in yarn-cluster mode. > > *ERROR yarn.ApplicationMaster: RECEIVED SIGNAL 15: SIGTERM* > > Can anyone suggest why I am ...

Re: yarn-cluster

2016-05-04 Thread nsalian
Hi, this is a good place to start for Spark and YARN: https://spark.apache.org/docs/1.5.0/running-on-yarn.html. That page is specific to the version you are on; you can toggle between versions. - Neelesh S. Salian, Cloudera
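
For reference, a minimal yarn-cluster submission along the lines of what that page shows (using the bundled SparkPi example; adjust the jar path and resource sizes for your cluster) looks roughly like:

    ./bin/spark-submit \
      --class org.apache.spark.examples.SparkPi \
      --master yarn-cluster \
      --num-executors 3 \
      --executor-memory 2g \
      lib/spark-examples*.jar \
      10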

Re: yarn-cluster

2016-05-03 Thread nsalian
Hello, Thank you for the question. The status UNDEFINED means the application has not completed and has not yet been assigned resources. Upon getting an assignment it will progress to RUNNING and then SUCCEEDED upon completion. It isn't a problem that you should worry about. You should make sure to tune your ...
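
If you want to watch this yourself, a quick check (assuming the standard YARN CLI and your application's ID) is:

    yarn application -status <applicationId>

The report it prints should show both a State (ACCEPTED, RUNNING, FINISHED, ...) and a Final-State, and the Final-State stays UNDEFINED until the application actually finishes.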

Re: (YARN CLUSTER MODE) Where to find logs within Spark RDD processing function ?

2016-04-29 Thread nguyen duc tuan
What does the WebUI show? What do you see when you click on the "stderr" and "stdout" links? These links should contain the stdout and stderr output for each executor. About your custom logging in the executor: are you sure you checked "${spark.yarn.app.container.log.dir}/spark-app.log"? The actual location of this ...
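
For what it's worth, a minimal log4j.properties sketch that writes into the container log directory (the appender name and file name here are just placeholders) would look something like:

    log4j.rootCategory=INFO, file_appender
    log4j.appender.file_appender=org.apache.log4j.FileAppender
    log4j.appender.file_appender.File=${spark.yarn.app.container.log.dir}/spark-app.log
    log4j.appender.file_appender.layout=org.apache.log4j.PatternLayout
    log4j.appender.file_appender.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

With that in place, spark-app.log should end up next to the stdout and stderr files for each container.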

Re: (YARN CLUSTER MODE) Where to find logs within Spark RDD processing function ?

2016-04-29 Thread dev loper
Hi Ted & Nguyen, @Ted, I was under the belief that the log4j.properties file would be taken from the application classpath if a file path is not specified. Please correct me if I am wrong. I tried your approach as well, but I still couldn't find the logs. @nguyen, I am running it on a YARN cluster ...
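
In case it helps, the way a custom log4j.properties is usually shipped in yarn-cluster mode (a sketch; the local path, class name and jar are placeholders) is to upload it with --files and point both the driver and the executors at it explicitly:

    ./bin/spark-submit \
      --master yarn-cluster \
      --files /local/path/log4j.properties \
      --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
      --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
      --class com.example.MyApp my-app.jar

The file distributed via --files lands in each container's working directory, which is on the classpath, so the bare file name in -Dlog4j.configuration should resolve there.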

Re: (YARN CLUSTER MODE) Where to find logs within Spark RDD processing function ?

2016-04-29 Thread nguyen duc tuan
These are the executors' logs, not the driver logs. To see these log files, you have to go to the executor machines where the tasks are running. To see what you print to stdout or stderr, you can either go to the executor machines directly (output is stored in "stdout" and "stderr" files somewhere in the executor ...
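
If log aggregation is enabled on the cluster, the less manual route (a sketch, assuming <applicationId> is the ID shown in the ResourceManager UI) is:

    yarn logs -applicationId <applicationId>

Once the application has finished, this dumps the aggregated stdout, stderr and any other per-container log files in one go.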

Re: yarn-cluster mode throwing NullPointerException

2015-10-12 Thread Venkatakrishnan Sowrirajan
Hi Rachana, Are you by any chance doing something like this in your code: "sparkConf.setMaster("yarn-cluster");"? Setting the yarn-cluster master programmatically on the SparkContext is not supported. I think you are hitting this bug: https://issues.apache.org/jira/browse/SPARK-7504. This got fixed in Spark 1.4.0, ...
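
If that is what is happening, a minimal Scala sketch of the usual workaround (the app name is just illustrative) is to leave the master out of the code entirely and let spark-submit supply it:

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf().setAppName("MyApp")  // no setMaster() here
    val sc = new SparkContext(conf)

and then pass --master yarn-cluster on the spark-submit command line instead.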

Re: yarn-cluster spark-submit process not dying

2015-05-28 Thread Corey Nolet
Thanks Sandy - I was digging through the code in deploy.yarn.Client and literally found that property right before I saw your reply. I'm on 1.2.x right now, which doesn't have the property. I guess I need to update sooner rather than later. On Thu, May 28, 2015 at 3:56 PM, Sandy Ryza ...

Re: yarn-cluster spark-submit process not dying

2015-05-28 Thread Sandy Ryza
Hi Corey, As of this PR, https://github.com/apache/spark/pull/5297/files, this can be controlled with spark.yarn.submit.waitAppCompletion. -Sandy On Thu, May 28, 2015 at 11:48 AM, Corey Nolet cjno...@gmail.com wrote: I am submitting jobs to my yarn cluster via the yarn-cluster mode and I'm ...
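
For anyone finding this thread later: with that change in place (Spark 1.4.0 onward, if I remember correctly), the submitting process can be told to exit as soon as the application is accepted, for example (class and jar names are placeholders):

    ./bin/spark-submit \
      --master yarn-cluster \
      --conf spark.yarn.submit.waitAppCompletion=false \
      --class com.example.MyApp my-app.jar

With the default of true, spark-submit keeps polling and reporting the application's status until it finishes.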