Hi,
I'm not completely sure about this either, but this is what we are doing
currently:
Configure your logging to write to STDOUT, not to a file explicitly. Spark
will capture stdout and stderr and separate the messages into an app/driver
folder structure in the configured worker directory.
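A minimal log4j.properties sketch for this setup (assuming log4j 1.x, which Spark uses; appender name and pattern are illustrative, not required):

```properties
# Route all logging to stdout so Spark's worker can capture it
log4j.rootLogger=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.out
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```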

We then use Logstash to collect the logs and index them into an Elasticsearch
cluster (Spark seems to produce a lot of logging data). With some simple
regex processing, you also get the application id as a searchable field.
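As a sketch of that regex step, the application id can be pulled from the
worker log path. The path layout and id format below are assumptions based on
a typical standalone-mode worker directory (work-dir/app-id/executor-id/stdout);
adjust the pattern to whatever your cluster actually produces:

```python
import re

# Hypothetical worker log path; the "app-<timestamp>-<seq>" id format
# is what Spark standalone mode commonly uses for application ids.
log_path = "/var/spark/work/app-20150320123456-0001/0/stdout"

# Extract the application id so it can be indexed as its own field.
match = re.search(r"(app-\d{14}-\d{4})", log_path)
app_id = match.group(1) if match else None
print(app_id)  # -> app-20150320123456-0001
```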

Regards,
Jeff

2015-03-20 22:37 GMT+01:00 Ted Yu <yuzhih...@gmail.com>:

> Are these jobs the same jobs, just run by different users or, different
> jobs ?
> If the latter, can each application use its own log4j.properties ?
>
> Cheers
>
> On Fri, Mar 20, 2015 at 1:43 PM, Udit Mehta <ume...@groupon.com> wrote:
>
>> Hi,
>>
>> We have spark setup such that there are various users running multiple
>> jobs at the same time. Currently all the logs go to 1 file specified in the
>> log4j.properties.
>> Is it possible to configure log4j in spark for per app/user logging
>> instead of sending all logs to 1 file mentioned in the log4j.properties?
>>
>> Thanks
>> Udit
>>
>
>
