OK, I got it.
When I use the 'yarn logs -applicationId <appId>' command, everything appears in 
the right place.
Thank you!
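
For anyone finding this thread later, Gerard's point can be summed up in a commented fragment. This is only a sketch, not a runnable job: it assumes the `source: DStream[SomeType]` and the enclosing object mixing in org.apache.spark.Logging from the original code quoted below.

```scala
// Sketch only: assumes `source: DStream[SomeType]` and an enclosing object
// that mixes in org.apache.spark.Logging, as in the quoted code below.
source.foreachRDD { rdd =>
  // This closure runs on the DRIVER, so this line lands in the driver log.
  logInfo("+++ForEachRDD+++")
  rdd.foreachPartition { partitionOfRecords =>
    // This closure is serialized and shipped to the EXECUTORS, so this line
    // lands in the executor logs. On YARN, collect them with
    // `yarn logs -applicationId <appId>` after the application finishes.
    logInfo("+++ForEachPartition+++")
  }
}
```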



07.09.2015, 01:44, "Gerard Maas" <gerard.m...@gmail.com>:
> You need to take into consideration 'where' things are executing. The closure 
> of 'foreachRDD' executes on the driver. Therefore, the log statements 
> printed during the execution of that part will be found in the driver logs.
> In contrast, the foreachPartition closure executes on the worker nodes, so you 
> will find the '+++ForEachPartition+++' messages printed in the executor logs.
>
> So both statements execute, but in different locations of the distributed 
> computing environment (aka the cluster).
>
> -kr, Gerard.
>
> On Sun, Sep 6, 2015 at 10:53 PM, Alexey Ponkin <alexey.pon...@ya.ru> wrote:
>> Hi,
>>
>> I have the following code
>>
>> object MyJob extends org.apache.spark.Logging {
>>   ...
>>   val source: DStream[SomeType] = ...
>>
>>   source.foreachRDD { rdd =>
>>     logInfo("+++ForEachRDD+++")
>>     rdd.foreachPartition { partitionOfRecords =>
>>       logInfo("+++ForEachPartition+++")
>>     }
>>   }
>>
>> I was expecting to see both log messages in the job log.
>> But unfortunately you will never see the string '+++ForEachPartition+++' in 
>> the logs, because the foreachPartition block never seems to execute.
>> There is also no error message or anything else in the logs.
>> I wonder: is this a bug or known behavior?
>> I know that org.apache.spark.Logging is a DeveloperApi, but why does it fail 
>> silently with no messages?
>> What should I use instead of org.apache.spark.Logging in spark-streaming jobs?
>>
>> P.S. running spark 1.4.1 (on yarn)
>>
>> Thanks in advance
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org

