Hi Marcelo,

Weird, I just ran spark-shell and its event log is compressed, but my Spark
jobs scheduled via Oozie are still not getting compressed.
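
In case it helps: my guess is that the Oozie launcher isn't picking up the
gateway's spark-defaults.conf, so I'm going to try passing the setting
explicitly in the workflow's spark action. A rough sketch of what I mean
(the action name, class and jar below are just placeholders, not my real job):

    <action name="my-spark-job">
        <spark xmlns="uri:oozie:spark-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <master>yarn</master>
            <mode>cluster</mode>
            <name>MySparkJob</name>
            <class>com.example.MyJob</class>
            <jar>${nameNode}/apps/my-job.jar</jar>
            <!-- pass the event log settings straight to spark-submit -->
            <spark-opts>--conf spark.eventLog.enabled=true --conf spark.eventLog.compress=true</spark-opts>
        </spark>
        <ok to="end"/>
        <error to="fail"/>
    </action>

I'll report back whether that makes the Oozie-launched apps write compressed logs.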

On Mon, Mar 26, 2018 at 8:56 PM, Marcelo Vanzin <van...@cloudera.com> wrote:

> You're either doing something wrong, or talking about different logs.
> I just added that to my config and ran spark-shell.
>
> $ hdfs dfs -ls /user/spark/applicationHistory | grep application_1522085988298_0002
> -rwxrwx---   3 blah blah       9844 2018-03-26 10:54 /user/spark/applicationHistory/application_1522085988298_0002.snappy
>
>
>
> On Mon, Mar 26, 2018 at 10:48 AM, Fawze Abujaber <fawz...@gmail.com> wrote:
> > I distributed this config to all the nodes across the cluster with no
> > success; new Spark logs are still uncompressed.
> >
> > On Mon, Mar 26, 2018 at 8:12 PM, Marcelo Vanzin <van...@cloudera.com> wrote:
> >>
> >> Spark should be using the gateway's configuration. Unless you're
> >> launching the application from a different node, if the setting is
> >> there, Spark should be using it.
> >>
> >> You can also look in the UI's environment page to see the
> >> configuration that the app is using.
> >>
> >> On Mon, Mar 26, 2018 at 10:10 AM, Fawze Abujaber <fawz...@gmail.com> wrote:
> >> > I see this configuration only on the Spark gateway server, and my Spark
> >> > is running on YARN, so I think I'm missing something ...
> >> >
> >> > I'm using Cloudera Manager to set this parameter; maybe I need to add
> >> > this parameter in another configuration.
> >> >
> >> > On Mon, 26 Mar 2018 at 20:05 Marcelo Vanzin <van...@cloudera.com> wrote:
> >> >>
> >> >> If the spark-defaults.conf file in the machine where you're starting
> >> >> the Spark app has that config, then that's all that should be needed.
> >> >>
> >> >> On Mon, Mar 26, 2018 at 10:02 AM, Fawze Abujaber <fawz...@gmail.com> wrote:
> >> >> > Thanks Marcelo,
> >> >> >
> >> >> > Yes, I was expecting to see the new apps compressed, but I don't.
> >> >> > Do I need to restart Spark or YARN?
> >> >> >
> >> >> > On Mon, 26 Mar 2018 at 19:53 Marcelo Vanzin <van...@cloudera.com> wrote:
> >> >> >>
> >> >> >> Log compression is a client setting. Doing that will make new apps
> >> >> >> write event logs in compressed format.
> >> >> >>
> >> >> >> The SHS doesn't compress existing logs.
> >> >> >>
> >> >> >> On Mon, Mar 26, 2018 at 9:17 AM, Fawze Abujaber <fawz...@gmail.com> wrote:
> >> >> >> > Hi All,
> >> >> >> >
> >> >> >> > I'm trying to compress the logs at the Spark history server. I added
> >> >> >> > spark.eventLog.compress=true to spark-defaults.conf via the Spark Client
> >> >> >> > Advanced Configuration Snippet (Safety Valve) for
> >> >> >> > spark-conf/spark-defaults.conf,
> >> >> >> >
> >> >> >> > which I see applied only to the Spark gateway servers' spark conf.
> >> >> >> >
> >> >> >> > What am I missing to get this working?
> >> >> >>
> >> >> >>
> >> >> >>
> >> >> >> --
> >> >> >> Marcelo
> >> >>
> >> >>
> >> >>
> >> >> --
> >> >> Marcelo
> >>
> >>
> >>
> >> --
> >> Marcelo
> >
> >
>
>
>
> --
> Marcelo
>
