If the spark-defaults.conf file on the machine where you're starting
the Spark app has that config, then that's all that should be needed.
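For reference, a minimal sketch of the relevant spark-defaults.conf
entries (the event log directory here is only an example path, and the
codec used falls back to spark.io.compression.codec, typically lz4):

    spark.eventLog.enabled     true
    spark.eventLog.dir         hdfs:///user/spark/applicationHistory
    spark.eventLog.compress    true

The same setting can also be passed per application, e.g.
spark-submit --conf spark.eventLog.compress=true ...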

On Mon, Mar 26, 2018 at 10:02 AM, Fawze Abujaber <fawz...@gmail.com> wrote:
> Thanks Marcelo,
>
> Yes, I was expecting to see the new apps compressed but I don't. Do I
> need to restart Spark or YARN?
>
> On Mon, 26 Mar 2018 at 19:53 Marcelo Vanzin <van...@cloudera.com> wrote:
>>
>> Log compression is a client setting. Doing that will make new apps
>> write event logs in compressed format.
>>
>> The SHS doesn't compress existing logs.
>>
>> On Mon, Mar 26, 2018 at 9:17 AM, Fawze Abujaber <fawz...@gmail.com> wrote:
>> > Hi All,
>> >
>> > I'm trying to compress the logs at the Spark History Server. I added
>> > spark.eventLog.compress=true to spark-defaults.conf via the Spark
>> > Client
>> > Advanced Configuration Snippet (Safety Valve) for
>> > spark-conf/spark-defaults.conf,
>> >
>> > which I see applied only to the Spark gateway servers' spark conf.
>> >
>> > What am I missing to get this working?
>>
>>
>>
>> --
>> Marcelo



-- 
Marcelo
