Re: Spark logs compression

2018-03-26 Thread Marcelo Vanzin
On Mon, Mar 26, 2018 at 11:01 AM, Fawze Abujaber  wrote:
> Weird, I just ran spark-shell and its log is compressed, but my Spark jobs
> that are scheduled using Oozie are not getting compressed.

Ah, then it's probably a problem with how Oozie is generating the
config for the Spark job. Given your environment it's potentially related
to Cloudera Manager, so I'd try asking in the Cloudera forums first...
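
One way to sidestep a gateway config that Oozie isn't picking up is to pass the event-log settings explicitly on the action itself. This is only a sketch, with placeholder class and jar names; in an Oozie Spark action the flags would go in <spark-opts>:

```shell
# Sketch: pass the event-log settings explicitly instead of relying on the
# gateway's spark-defaults.conf being read. In an Oozie workflow.xml this
# would look like:
#   <spark-opts>--conf spark.eventLog.enabled=true --conf spark.eventLog.compress=true</spark-opts>
#
# Equivalent flags on a plain spark-submit (class and jar are placeholders):
SPARK_OPTS="--conf spark.eventLog.enabled=true --conf spark.eventLog.compress=true"
echo "spark-submit $SPARK_OPTS --class com.example.MyJob my-job.jar"
```

Explicit --conf flags take precedence over spark-defaults.conf, so this also helps confirm whether the problem is the config distribution or the setting itself.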

-- 
Marcelo

-
To unsubscribe e-mail: user-unsubscr...@spark.apache.org



Re: Spark logs compression

2018-03-26 Thread Fawze Abujaber
Hi Marcelo,

Weird, I just ran spark-shell and its log is compressed, but my Spark jobs
that are scheduled using Oozie are not getting compressed.

On Mon, Mar 26, 2018 at 8:56 PM, Marcelo Vanzin  wrote:

> You're either doing something wrong, or talking about different logs.
> I just added that to my config and ran spark-shell.
>
> $ hdfs dfs -ls /user/spark/applicationHistory | grep
> application_1522085988298_0002
> -rwxrwx---   3 blah blah   9844 2018-03-26 10:54
> /user/spark/applicationHistory/application_1522085988298_0002.snappy


Re: Spark logs compression

2018-03-26 Thread Fawze Abujaber
I distributed this config to all the nodes across the cluster, with no
success; new Spark logs are still uncompressed.

On Mon, Mar 26, 2018 at 8:12 PM, Marcelo Vanzin  wrote:

> Spark should be using the gateway's configuration. Unless you're
> launching the application from a different node, if the setting is
> there, Spark should be using it.
>
> You can also look in the UI's environment page to see the
> configuration that the app is using.


Re: Spark logs compression

2018-03-26 Thread Marcelo Vanzin
You're either doing something wrong, or talking about different logs.
I just added that to my config and ran spark-shell.

$ hdfs dfs -ls /user/spark/applicationHistory | grep
application_1522085988298_0002
-rwxrwx---   3 blah blah   9844 2018-03-26 10:54
/user/spark/applicationHistory/application_1522085988298_0002.snappy
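
A compressed event log is easy to spot because the codec name becomes the file extension. A small sketch of that check, run here against local placeholder files standing in for the HDFS directory above:

```shell
# Compressed event logs carry a codec suffix (.snappy, .lz4, .zstd);
# uncompressed ones have no extension. Placeholder local files stand in
# for /user/spark/applicationHistory.
mkdir -p applicationHistory
touch applicationHistory/application_0001.snappy   # written with compression on
touch applicationHistory/application_0002          # written with compression off

for f in applicationHistory/application_*; do
  case "$f" in
    *.snappy|*.lz4|*.zstd) echo "${f##*/}: compressed" ;;
    *)                     echo "${f##*/}: NOT compressed" ;;
  esac
done
# prints:
#   application_0001.snappy: compressed
#   application_0002: NOT compressed
```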



On Mon, Mar 26, 2018 at 10:48 AM, Fawze Abujaber  wrote:
> I distributed this config to all the nodes across the cluster, with no
> success; new Spark logs are still uncompressed.
>
> On Mon, Mar 26, 2018 at 8:12 PM, Marcelo Vanzin  wrote:
>>
>> Spark should be using the gateway's configuration. Unless you're
>> launching the application from a different node, if the setting is
>> there, Spark should be using it.
>>
>> You can also look in the UI's environment page to see the
>> configuration that the app is using.
>>
>> On Mon, Mar 26, 2018 at 10:10 AM, Fawze Abujaber 
>> wrote:
>> > I see this configuration only on the spark gateway server, and my spark
>> > is
>> > running on Yarn, so I think I missing something ...
>> >
>> > I’m using cloudera manager to set this parameter, maybe I need to add
>> > this
>> > parameter in other configuration
>> >
>> > On Mon, 26 Mar 2018 at 20:05 Marcelo Vanzin  wrote:
>> >>
>> >> If the spark-defaults.conf file in the machine where you're starting
>> >> the Spark app has that config, then that's all that should be needed.
>> >>
>> >> On Mon, Mar 26, 2018 at 10:02 AM, Fawze Abujaber 
>> >> wrote:
>> >> > Thanks Marcelo,
>> >> >
>> >> > Yes I was was expecting to see the new apps compressed but I don’t ,
>> >> > do
>> >> > I
>> >> > need to perform restart to spark or Yarn?
>> >> >
>> >> > On Mon, 26 Mar 2018 at 19:53 Marcelo Vanzin 
>> >> > wrote:
>> >> >>
>> >> >> Log compression is a client setting. Doing that will make new apps
>> >> >> write event logs in compressed format.
>> >> >>
>> >> >> The SHS doesn't compress existing logs.
>> >> >>
>> >> >> On Mon, Mar 26, 2018 at 9:17 AM, Fawze Abujaber 
>> >> >> wrote:
>> >> >> > Hi All,
>> >> >> >
>> >> >> > I'm trying to compress the logs at SPark history server, i added
>> >> >> > spark.eventLog.compress=true to spark-defaults.conf to spark Spark
>> >> >> > Client
>> >> >> > Advanced Configuration Snippet (Safety Valve) for
>> >> >> > spark-conf/spark-defaults.conf
>> >> >> >
>> >> >> > which i see applied only to the spark gateway servers spark conf.
>> >> >> >
>> >> >> > What i missing to get this working ?
>> >> >>
>> >> >>
>> >> >>
>> >> >> --
>> >> >> Marcelo
>> >>
>> >>
>> >>
>> >> --
>> >> Marcelo
>>
>>
>>
>> --
>> Marcelo
>
>



-- 
Marcelo




Re: Spark logs compression

2018-03-26 Thread Marcelo Vanzin
Spark should be using the gateway's configuration. Unless you're
launching the application from a different node, if the setting is
there, Spark should be using it.

You can also look in the UI's environment page to see the
configuration that the app is using.

On Mon, Mar 26, 2018 at 10:10 AM, Fawze Abujaber  wrote:
> I see this configuration only on the Spark gateway server, and my Spark is
> running on YARN, so I think I'm missing something ...
>
> I'm using Cloudera Manager to set this parameter; maybe I need to add this
> parameter in another configuration.



-- 
Marcelo




Re: Spark logs compression

2018-03-26 Thread Fawze Abujaber
I see this configuration only on the Spark gateway server, and my Spark is
running on YARN, so I think I'm missing something ...

I'm using Cloudera Manager to set this parameter; maybe I need to add this
parameter in another configuration.

On Mon, 26 Mar 2018 at 20:05 Marcelo Vanzin  wrote:

> If the spark-defaults.conf file in the machine where you're starting
> the Spark app has that config, then that's all that should be needed.


Re: Spark logs compression

2018-03-26 Thread Marcelo Vanzin
If the spark-defaults.conf file in the machine where you're starting
the Spark app has that config, then that's all that should be needed.

On Mon, Mar 26, 2018 at 10:02 AM, Fawze Abujaber  wrote:
> Thanks Marcelo,
>
> > Yes, I was expecting to see the new apps compressed, but I don't; do I
> > need to restart Spark or YARN?



-- 
Marcelo




Re: Spark logs compression

2018-03-26 Thread Fawze Abujaber
Thanks Marcelo,

Yes, I was expecting to see the new apps compressed, but I don't; do I
need to restart Spark or YARN?

On Mon, 26 Mar 2018 at 19:53 Marcelo Vanzin  wrote:

> Log compression is a client setting. Doing that will make new apps
> write event logs in compressed format.
>
> The SHS doesn't compress existing logs.


Re: Spark logs compression

2018-03-26 Thread Marcelo Vanzin
Log compression is a client setting. Doing that will make new apps
write event logs in compressed format.

The SHS doesn't compress existing logs.

On Mon, Mar 26, 2018 at 9:17 AM, Fawze Abujaber  wrote:
> Hi All,
>
> > I'm trying to compress the logs at the Spark history server. I added
> > spark.eventLog.compress=true to spark-defaults.conf via the Spark Client
> > Advanced Configuration Snippet (Safety Valve) for
> > spark-conf/spark-defaults.conf,
> > which I see applied only to the Spark gateway servers' spark conf.
> >
> > What am I missing to get this working?



-- 
Marcelo




Spark logs compression

2018-03-26 Thread Fawze Abujaber
Hi All,

I'm trying to compress the logs at the Spark history server. I added
spark.eventLog.compress=true to spark-defaults.conf via the Spark Client
Advanced Configuration Snippet (Safety Valve) for
spark-conf/spark-defaults.conf,
which I see applied only to the Spark gateway servers' spark conf.

What am I missing to get this working?
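
For reference, the whole fix comes down to two client-side lines. A minimal sketch using a local placeholder path (on a real gateway the file is typically /etc/spark/conf/spark-defaults.conf, and under Cloudera Manager the lines are entered through the safety valve rather than edited by hand):

```shell
# Minimal sketch with a local placeholder path; the setting is read by the
# submitting client, so it must be present on the host that launches the app.
conf=./spark-defaults.conf
cat > "$conf" <<'EOF'
spark.eventLog.enabled   true
spark.eventLog.compress  true
EOF

# Verify the two lines on the submitting host:
grep -E '^spark\.eventLog\.(enabled|compress)' "$conf"
```

After these lines are in place, only applications started afterwards write compressed event logs; as noted in the thread, the history server does not compress logs that already exist.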