On Mon, Mar 26, 2018 at 11:01 AM, Fawze Abujaber wrote:
> Weird, I just ran spark-shell and its log is compressed, but my Spark jobs
> scheduled through Oozie are not getting compressed.
Ah, then it's probably a problem with how Oozie is generating the
config for the Spark job. Given your env i
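If Oozie is generating its own Spark configuration, one possible workaround is to pass the flag explicitly in the workflow's Spark action via spark-opts. A sketch only; the action names, master value, and jar path below are placeholders, not taken from the thread:

```
<!-- Sketch of an Oozie Spark action; names and paths are placeholders. -->
<spark xmlns="uri:oozie:spark-action:0.1">
    <master>yarn-cluster</master>
    <name>MyJob</name>
    <class>com.example.MyJob</class>
    <jar>${appPath}/lib/myjob.jar</jar>
    <spark-opts>--conf spark.eventLog.compress=true</spark-opts>
</spark>
```

Options in spark-opts are forwarded to spark-submit, so they apply regardless of what spark-defaults.conf Oozie picks up.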
Hi Marcelo,
Weird, I just ran spark-shell and its log is compressed, but my Spark jobs
scheduled through Oozie are not getting compressed.
On Mon, Mar 26, 2018 at 8:56 PM, Marcelo Vanzin wrote:
> You're either doing something wrong, or talking about different logs.
> I just added that to my config and ran spark-shell.
I distributed this config to all the nodes across the cluster with no
success; new Spark logs are still uncompressed.
On Mon, Mar 26, 2018 at 8:12 PM, Marcelo Vanzin wrote:
> Spark should be using the gateway's configuration. Unless you're
> launching the application from a different node, if the setting is
> there, Spark should be using it.
You're either doing something wrong, or talking about different logs.
I just added that to my config and ran spark-shell.
$ hdfs dfs -ls /user/spark/applicationHistory | grep
application_1522085988298_0002
-rwxrwx--- 3 blah blah 9844 2018-03-26 10:54
/user/spark/applicationHistory/applicat
Spark should be using the gateway's configuration. Unless you're
launching the application from a different node, if the setting is
there, Spark should be using it.
You can also look in the UI's environment page to see the
configuration that the app is using.
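Besides the environment page, a quick way to sanity-check the client config is to grep the spark-defaults.conf on the launching host. A minimal sketch; the default path is an assumption (on CDH gateways the client config usually lives under /etc/spark/conf):

```shell
# Minimal sketch: check whether a given spark-defaults.conf enables
# event-log compression. The fallback path is an assumption; adjust
# for your cluster layout.
check_compress() {
  if grep -q '^spark\.eventLog\.compress[[:space:]=]\{1,\}true' "$1" 2>/dev/null; then
    echo "compression enabled in $1"
  else
    echo "compression NOT set in $1"
  fi
}

check_compress "${1:-/etc/spark/conf/spark-defaults.conf}"
```

If this reports the setting missing on the node that actually launches the job, that would explain uncompressed logs.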
On Mon, Mar 26, 2018 at 10:10 AM, Fawze Abujaber wrote:
I see this configuration only on the Spark gateway server, and my Spark is
running on YARN, so I think I'm missing something ...
I'm using Cloudera Manager to set this parameter; maybe I need to add this
parameter in another configuration.
On Mon, 26 Mar 2018 at 20:05 Marcelo Vanzin wrote:
> If the spark-defaults.conf file in the machine where you're starting
> the Spark app has that config, then that's all that should be needed.
If the spark-defaults.conf file in the machine where you're starting
the Spark app has that config, then that's all that should be needed.
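For reference, the relevant lines in spark-defaults.conf on the launching machine would look like this (a sketch; spark.eventLog.enabled is shown alongside it only because compression has no effect unless event logging itself is on):

```
# spark-defaults.conf on the client/gateway node (sketch)
spark.eventLog.enabled   true
spark.eventLog.compress  true
```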
On Mon, Mar 26, 2018 at 10:02 AM, Fawze Abujaber wrote:
> Thanks Marcelo,
>
> Yes, I was expecting to see the new apps compressed but I don't. Do I
> need to restart Spark or YARN?
Thanks Marcelo,
Yes, I was expecting to see the new apps compressed but I don't. Do I need
to restart Spark or YARN?
On Mon, 26 Mar 2018 at 19:53 Marcelo Vanzin wrote:
> Log compression is a client setting. Doing that will make new apps
> write event logs in compressed format.
>
Log compression is a client setting. Doing that will make new apps
write event logs in compressed format.
The SHS doesn't compress existing logs.
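Since it is a client setting, one way to test it in isolation is to pass it on the command line for a single submission. Illustrative only; the example class and jar path below are placeholders typical of a CDH install, not taken from the thread:

```shell
# Illustrative one-off test of event-log compression; jar path and
# class are placeholders, adjust for your install.
spark-submit \
  --conf spark.eventLog.enabled=true \
  --conf spark.eventLog.compress=true \
  --class org.apache.spark.examples.SparkPi \
  /opt/cloudera/parcels/CDH/lib/spark/lib/spark-examples.jar 10
```

If that run's event log comes out compressed while the Oozie-launched ones do not, the problem is in the config Oozie hands to Spark, not in the cluster defaults.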
On Mon, Mar 26, 2018 at 9:17 AM, Fawze Abujaber wrote:
> Hi All,
>
> I'm trying to compress the logs at the Spark history server. I added
> spark.eventLog.compress=true via the Spark Client Advanced Configuration
> Snippet (Safety Valve) for spark-conf/spark-defaults.conf, which I see
> applied only to the Spark gateway servers' Spark conf.
Hi All,
I'm trying to compress the logs at the Spark history server. I added
spark.eventLog.compress=true via the Spark Client Advanced Configuration
Snippet (Safety Valve) for spark-conf/spark-defaults.conf, which I see
applied only to the Spark gateway servers' Spark conf.