Hi Divya,

Have you tried the command "yarn logs -applicationId
application_xxxxxxxxxxxxx_xxxx"?
(where application_xxxxxxxxxxxxx_xxxx is the application id, which can
be found in the output of the spark-submit command or in YARN's web
UI)
It collects the logs from all the executors into one output.
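
If you don't have the application id handy, something along these
lines should work (a sketch, assuming log aggregation is enabled on
your cluster; the id below is a placeholder):

    # list recently finished applications to find the id
    yarn application -list -appStates FINISHED

    # dump the aggregated driver + executor logs to a file
    yarn logs -applicationId application_xxxxxxxxxxxxx_xxxx > app.log

Note that yarn logs only returns the aggregated logs after the
application has finished, and only if yarn.log-aggregation-enable is
set to true on the cluster.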

Regards,
--
  Bedrytski Aliaksandr
  sp...@bedryt.ski



On Thu, Sep 22, 2016, at 06:06, Divya Gehlot wrote:
> Hi,
> I have initialised the logging in my Spark app:
>
> /* Initialize Logging */
> val log = Logger.getLogger(getClass.getName)
>
> Logger.getLogger("org").setLevel(Level.OFF)
> Logger.getLogger("akka").setLevel(Level.OFF)
>
> log.warn("Some text" + Somemap.size)
>
> When I run my Spark job using spark-submit as below:
>
> spark-submit \
>   --master yarn-client \
>   --driver-memory 1G \
>   --executor-memory 1G \
>   --executor-cores 1 \
>   --num-executors 2 \
>   --class MainClass \
>   /home/hadoop/Spark-assembly-1.0.jar
>
> I can see the log in the terminal itself:
> 16/09/22 03:45:31 WARN MainClass$: SomeText  : 10
>
> When I set up this job in a scheduler,
> where can I see these logs?
>
> Thanks,
> Divya
>
