You can pass them in the environment map used to create the SparkContext; those variables are then set in each executor's environment.
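A minimal sketch in Scala (the constructor with an `environment` parameter, as in Spark 0.9.x; the master URL, Spark home, and Oracle paths below are placeholders you would replace with your own):

```scala
import org.apache.spark.SparkContext

// Hypothetical paths -- substitute your actual Oracle installation.
val env = Map(
  "ORACLE_HOME"     -> "/opt/oracle/product/11.2.0",
  "LD_LIBRARY_PATH" -> "/opt/oracle/product/11.2.0/lib"
)

val sc = new SparkContext(
  "spark://master:7077", // master URL (placeholder)
  "my-oracle-job",       // application name
  "/opt/spark",          // Spark home on the workers (placeholder)
  Nil,                   // jars to distribute
  env                    // environment variables for each executor
)
```

Note these are set on the executor JVMs, not on the driver; if the driver also needs them, export them in the shell before launching the job.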

On Tue, Mar 25, 2014 at 2:29 PM, santhoma <santhosh.tho...@yahoo.com> wrote:

> Hello
>
> I have a requirement to set some env values for my spark jobs.
> Does anyone know how to set them? Specifically the following variables:
>
> 1) ORACLE_HOME
> 2) LD_LIBRARY_PATH
>
> thanks
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/How-to-set-environment-variable-for-a-spark-job-tp3180.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>



-- 
Sourav Chandra
Senior Software Engineer
· · · · · · · · · · · · · · · · · · · · · · · · · · · · · · · · ·
sourav.chan...@livestream.com
o: +91 80 4121 8723
m: +91 988 699 3746
skype: sourav.chandra
Livestream
"Ajmera Summit", First Floor, #3/D, 68 Ward, 3rd Cross, 7th C Main, 3rd Block, Koramangala Industrial Area, Bangalore 560034
www.livestream.com
