Re: How to debug Spark on Yarn?

2015-04-28 Thread Steve Loughran

On 27 Apr 2015, at 07:51, ÐΞ€ρ@Ҝ (๏̯͡๏) <deepuj...@gmail.com> wrote:

Spark 1.3

1. Viewing stderr/stdout from an executor in the Web UI: while the job is
running I located the executor I was supposed to look at, but those two links
show only 4 special characters in the browser.

2. Tailing the YARN logs:

/apache/hadoop/bin/yarn logs -applicationId application_1429087638744_151059 |
less

threw: "Application has not completed. Logs are only available after an
application completes"


Any other ideas that I can try?


Hadoop 2.6+ has support for streaming the logs of running applications to
HDFS. I don't know whether Spark supports that (I haven't tested it), so I
won't give the details right now.

You can go from the RM to the node managers running the containers, and view 
the logs that way.


From some other notes of mine:




One configuration option that aids debugging is to tell the node managers to
keep container data for a short period after containers finish:



<property>
  <name>yarn.nodemanager.delete.debug-delay-sec</name>
  <value>600</value>
</property>



You can then retrieve the logs either through the web UI, or by connecting to
the server (usually over ssh) and reading them from the log directory.
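As a sketch of the ssh route, assuming the default local log layout (the actual root is whatever yarn.nodemanager.log-dirs points at on your cluster, and the container id below is made up):

```shell
# Sketch only: the log root is an assumption -- use whatever
# yarn.nodemanager.log-dirs is set to on your cluster. The container id
# here is hypothetical, for illustration.
LOG_ROOT=/var/log/hadoop-yarn/containers
APP_ID=application_1429087638744_151059
CONTAINER_ID=container_1429087638744_151059_01_000002

# On the nodemanager host (after ssh-ing in) you would then tail it, e.g.:
#   tail -f "$LOG_ROOT/$APP_ID/$CONTAINER_ID/stderr"
echo "$LOG_ROOT/$APP_ID/$CONTAINER_ID/stderr"
```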

We also recommend making sure that YARN kills container processes promptly;
this property controls how long the node manager waits between sending a
container SIGTERM and SIGKILL:



<property>
  <name>yarn.nodemanager.sleep-delay-before-sigkill.ms</name>
  <value>3</value>
</property>





Re: How to debug Spark on Yarn?

2015-04-27 Thread ๏̯͡๏
1) Application container logs from the RM web UI never load in the browser; I
eventually have to kill the browser.
2) /apache/hadoop/bin/yarn logs -applicationId application_1429087638744_151059
| less emits logs only after the application has completed.

Are there no better ways to see the logs as they are emitted, something
similar to the Hadoop world?


On Mon, Apr 27, 2015 at 1:58 PM, Zoltán Zvara 
wrote:

> You can check container logs from RM web UI or when log-aggregation is
> enabled with the yarn command. There are other, but less convenient
> options.
>


-- 
Deepak


Re: How to debug Spark on Yarn?

2015-04-27 Thread Zoltán Zvara
You can check container logs from RM web UI or when log-aggregation is
enabled with the yarn command. There are other, but less convenient options.

On Mon, Apr 27, 2015 at 8:53 AM ÐΞ€ρ@Ҝ (๏̯͡๏)  wrote:

> Spark 1.3
>
> 1. Viewing stderr/stdout from an executor in the Web UI: while the job is
> running I located the executor I was supposed to look at, but those two
> links show only 4 special characters in the browser.
>
> 2. Tailing the YARN logs:
>
> /apache/hadoop/bin/yarn logs -applicationId
> application_1429087638744_151059 | less
>
> threw: "Application has not completed. Logs are only available after an
> application completes"
>
> Any other ideas that I can try?


Re: How to debug Spark on Yarn?

2015-04-26 Thread ๏̯͡๏
Spark 1.3

1. Viewing stderr/stdout from an executor in the Web UI: while the job is
running I located the executor I was supposed to look at, but those two links
show only 4 special characters in the browser.

2. Tailing the YARN logs:

/apache/hadoop/bin/yarn logs -applicationId
application_1429087638744_151059 | less

threw: "Application has not completed. Logs are only available after an
application completes"


Any other ideas that I can try?



On Sat, Apr 25, 2015 at 12:07 AM, Sven Krasser  wrote:

> On Fri, Apr 24, 2015 at 11:31 AM, Marcelo Vanzin 
> wrote:
>
>>
>> Spark 1.3 should have links to the executor logs in the UI while the
>> application is running. Not yet in the history server, though.
>
>
> You're absolutely correct -- didn't notice it until now. This is a great
> addition!
>
> --
> www.skrasser.com 
>



-- 
Deepak


Re: How to debug Spark on Yarn?

2015-04-24 Thread Sven Krasser
On Fri, Apr 24, 2015 at 11:31 AM, Marcelo Vanzin 
wrote:

>
> Spark 1.3 should have links to the executor logs in the UI while the
> application is running. Not yet in the history server, though.


You're absolutely correct -- didn't notice it until now. This is a great
addition!

-- 
www.skrasser.com 


Re: How to debug Spark on Yarn?

2015-04-24 Thread Marcelo Vanzin
On top of what's been said...

On Wed, Apr 22, 2015 at 10:48 PM, ÐΞ€ρ@Ҝ (๏̯͡๏)  wrote:
> 1) I can go to the Spark UI and see the status of the app, but cannot see
> the logs as the job progresses. How can I see the logs of executors as they
> progress?

Spark 1.3 should have links to the executor logs in the UI while the
application is running. Not yet in the history server, though.

> 2) In case the app fails/completes, the Spark UI vanishes and I get a YARN
> job page that says the job failed; there is no link back to the Spark UI.
> Now I take the job ID, run yarn application logs appid, and my console fills
> up (with huge scrolling) with the logs of all executors. Then I copy and
> paste into a text editor and search for the keywords "Exception" and
> "Job aborted due to". Is this the right way to view logs?

Aside from Ted's suggestion, you could also pipe the "yarn logs"
output to "less".
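As a sketch, the search for failure keywords can also happen in the pipeline itself before paging. A small inline sample stands in for the real "yarn logs -applicationId ..." output here, so the filtering step is self-contained:

```shell
# In real use the input would come from:
#   /apache/hadoop/bin/yarn logs -applicationId application_1429087638744_151059
# A sample log stands in here so the grep stage can be shown on its own.
sample_log='15/04/22 22:45:04 INFO yarn.Client: Application report (state: RUNNING)
15/04/22 22:46:10 ERROR executor.Executor: Exception in task 3.0 in stage 1.0
15/04/22 22:46:12 INFO scheduler.DAGScheduler: Job aborted due to stage failure'

# Keep only the lines worth reading; append "| less" in real use.
printf '%s\n' "$sample_log" | grep -E 'Exception|Job aborted due to'
```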

-- 
Marcelo

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: How to debug Spark on Yarn?

2015-04-24 Thread Sven Krasser
For #1, click on a worker node on the YARN dashboard. From there,
Tools->Local logs->Userlogs has the logs for each application, and you can
view them by executor even while an application is running. (This is for
Hadoop 2.4, things may have changed in 2.6.)
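As a sketch of where that UI navigation ends up, each node manager also serves per-container logs over its web port (8042 by default), so the URL can be built directly; the host, container id, and user below are made up:

```shell
# All of these values are illustrative; substitute real ones from your cluster.
NM_HOST=nm-host.example.com
CONTAINER_ID=container_1429087638744_101363_01_000002
APP_USER=deepak

# NodeManager web UI path for a container's logs (viewable while running):
echo "http://${NM_HOST}:8042/node/containerlogs/${CONTAINER_ID}/${APP_USER}"
```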
-Sven

On Thu, Apr 23, 2015 at 6:27 AM, Ted Yu  wrote:

> For step 2, you can pipe application log to a file instead of
> copy-pasting.
>
> Cheers
>


-- 
www.skrasser.com 


Re: How to debug Spark on Yarn?

2015-04-23 Thread Ted Yu
For step 2, you can pipe application log to a file instead of copy-pasting. 

Cheers





How to debug Spark on Yarn?

2015-04-22 Thread ๏̯͡๏
I submit a Spark app to YARN and I get these messages:

15/04/22 22:45:04 INFO yarn.Client: Application report for
application_1429087638744_101363 (state: RUNNING)

15/04/22 22:45:04 INFO yarn.Client: Application report for
application_1429087638744_101363 (state: RUNNING)

...


1) I can go to the Spark UI and see the status of the app, but cannot see the
logs as the job progresses. How can I see the logs of executors as they
progress?

2) In case the app fails/completes, the Spark UI vanishes and I get a YARN
job page that says the job failed; there is no link back to the Spark UI. Now
I take the job ID, run yarn application logs appid, and my console fills up
(with huge scrolling) with the logs of all executors. Then I copy and paste
into a text editor and search for the keywords "Exception" and "Job aborted
due to". Is this the right way to view logs?

-- 
Deepak