GitHub user AnthonyTruchet opened a pull request:

    https://github.com/apache/spark/pull/15023

    Backport [SPARK-5847] Allow for configuring MetricsSystem's prefix

    This is a backport of #14270. Because the spark.internal.config system
    does not exist in branch 1.6, a simpler substitution scheme for `${}`
    references in the `spark.metrics.namespace` value, based only on the
    Spark configuration, had to be added to preserve the behaviour discussed
    in the tickets; that behaviour is covered by tests.
    
    This backport is contributed by Criteo SA under the Apache v2 licence.
    
    ## What changes were proposed in this pull request?
    
    This adds a new property to SparkConf called `spark.metrics.namespace`
    that allows users to set a custom namespace for executor and driver
    metrics in the metrics system.
    
    By default, the root namespace used for driver or executor metrics is
    the value of `spark.app.id`. However, users often want to track these
    metrics across applications, which is hard to do with the application ID
    (i.e. `spark.app.id`) since it changes with every invocation of the app.
    For such use cases, users can set the `spark.metrics.namespace` property
    to a fixed value or to a reference to another Spark configuration key,
    such as `${spark.app.name}`, which is then resolved and used as the root
    namespace of the metrics system (the app name in our example). Metrics
    from sources other than the driver and executors are never prefixed with
    `spark.app.id`, nor does the `spark.metrics.namespace` property have any
    effect on such metrics.
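    The `${}` resolution over Spark configuration values can be pictured
    with a stdlib-only sketch. The object and method names below are
    illustrative, not the actual code from the patch:

    ```scala
    import scala.util.matching.Regex

    // Illustrative sketch of a minimal ${key} substitution scheme over a
    // configuration map; not the patch's actual implementation.
    object NamespaceSubstitution {
      // Matches ${some.config.key} references inside a value.
      private val Ref: Regex = """\$\{([^}]+)\}""".r

      // Replace each ${key} with the matching config value; references to
      // unknown keys are left as-is (an assumption for this sketch).
      def substitute(value: String, conf: Map[String, String]): String =
        Ref.replaceAllIn(value, m =>
          Regex.quoteReplacement(conf.getOrElse(m.group(1), m.matched)))
    }
    ```

    With `spark.app.name` set to `my-job`, substituting the value
    `${spark.app.name}` would yield `my-job` as the root namespace, while a
    plain literal value is passed through unchanged.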
    
    
    ## How was this patch tested?
    
    Added new unit tests, modified existing unit tests.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/criteo-forks/spark backport-SPARK-5847

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/15023.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #15023
    
----
commit c1c57fbb8593346410995578668cc7bff79f77e0
Author: Anthony Truchet <a.truc...@criteo.com>
Date:   2016-09-01T09:37:37Z

    [SPARK-5847][CORE][BRANCH-1.6] Allow for configuring MetricsSystem's use of app ID to namespace all metrics
    
    This is a backport of #14270. Because the spark.internal.config system
    does not exist in branch 1.6, a simpler substitution scheme for `${}`
    references in the `spark.metrics.namespace` value, based only on the
    Spark configuration, had to be added to preserve the behaviour discussed
    in the tickets; that behaviour is covered by tests.
    
    This backport is contributed by Criteo SA under the Apache v2 licence.
    
    This adds a new property to SparkConf called `spark.metrics.namespace`
    that allows users to set a custom namespace for executor and driver
    metrics in the metrics system.
    
    By default, the root namespace used for driver or executor metrics is
    the value of `spark.app.id`. However, users often want to track these
    metrics across applications, which is hard to do with the application ID
    (i.e. `spark.app.id`) since it changes with every invocation of the app.
    For such use cases, users can set the `spark.metrics.namespace` property
    to a fixed value or to a reference to another Spark configuration key,
    such as `${spark.app.name}`, which is then resolved and used as the root
    namespace of the metrics system (the app name in our example). Metrics
    from sources other than the driver and executors are never prefixed with
    `spark.app.id`, nor does the `spark.metrics.namespace` property have any
    effect on such metrics.
    
    Added new unit tests, modified existing unit tests.

----

