That was intentional - what's your use case that requires configs not
starting with "spark."?
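
For what it's worth, the usual workaround is to namespace your own keys
under "spark.", since only keys with that prefix are forwarded to the
executors. A minimal sketch (the "spark.myapp.*" / "myapp.*" key names
below are made up for illustration):

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("conf-propagation-demo")
      // Forwarded to executors: key starts with "spark."
      .set("spark.myapp.endpoint", "http://example.com/api")
      // NOT forwarded: no "spark." prefix, visible on the driver only
      .set("myapp.timeout.ms", "30000")
    val sc = new SparkContext(conf)

On an executor, SparkEnv.get.conf.get("spark.myapp.endpoint") should
resolve, while the unprefixed key will only be present on the driver.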


On Thu, Aug 13, 2015 at 8:16 AM, rfarrjr <rfar...@gmail.com> wrote:

> Ran into an issue setting a property on the SparkConf that wasn't made
> available on the worker.  After some digging[1] I noticed that only
> properties that start with "spark." are sent by the scheduler.  I'm not
> sure if this was intended behavior or not.
>
> Using Spark Streaming 1.4.1 running on Java 8.
>
> ~Robert
>
> [1]
>
> https://github.com/apache/spark/blob/v1.4.1/core/src/main/scala/org/apache/spark/scheduler/cluster/CoarseGrainedSchedulerBackend.scala#L243
