Re: how to run spark job on yarn with jni lib?

2014-09-30 Thread taqilabon
We've decided not to run our jobs on YARN for now, so I've stopped trying this.
Anyway, thanks for your suggestions.
At least your solutions may help people who must run their jobs on YARN : )






Re: how to run spark job on yarn with jni lib?

2014-09-29 Thread mbaryu
You will also need to run 'ldconfig' on each host so the ld.so.conf change is
read and made active.  You may also need to restart the Spark processes (the
JVMs) on each node so the loader picks up the new configuration for them.
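
A quick way to check whether the library is now visible where it matters is to attempt the load from inside the executors themselves. A rough diagnostic sketch in Scala, assuming a hypothetical library name "mylib" (libmylib.so); it reports, per host, whether System.loadLibrary succeeds and what java.library.path the executor JVM sees:

    import org.apache.spark.{SparkConf, SparkContext}
    import java.net.InetAddress

    object NativeLoadCheck {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("native-load-check"))
        // Use enough partitions that tasks land on every node of a small cluster.
        val report = sc.parallelize(1 to 200, 200).mapPartitions { _ =>
          val host = InetAddress.getLocalHost.getHostName
          val status =
            try { System.loadLibrary("mylib"); "loaded" }   // hypothetical library name
            catch { case e: UnsatisfiedLinkError => s"failed: ${e.getMessage}" }
          val libPath = System.getProperty("java.library.path")
          Iterator(s"$host -> $status (java.library.path=$libPath)")
        }.distinct().collect()
        report.foreach(println)
        sc.stop()
      }
    }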






Re: how to run spark job on yarn with jni lib?

2014-09-26 Thread Marcelo Vanzin
I assume you did those things on all machines, not just on the machine
launching the job?

I've seen that workaround used successfully (well, actually, they
copied the library to /usr/lib or something, but same idea).

On Thu, Sep 25, 2014 at 7:45 PM, taqilabon g945...@gmail.com wrote:
 You're right, I'm suffering from SPARK-1719.
 I've tried to add their location to /etc/ld.so.conf and I've submitted my
 job as a yarn-client,
 but the problem is the same: my native libraries are not loaded.
 Does this method work in your case?







-- 
Marcelo

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



how to run spark job on yarn with jni lib?

2014-09-25 Thread taqilabon
Hi all,

I tried to run my Spark job on YARN.
In my application, I need to call third-party JNI libraries from the Spark job.
However, I can't find a way to make the Spark job load my native libraries.
Does anyone know how to solve this problem?
Thanks.

Ziv Huang
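
A minimal sketch of the shape of such a job, with a hypothetical native library libmylib.so and a placeholder computation. The key point is that System.loadLibrary runs inside each executor JVM on the YARN nodes, so the library has to be resolvable there, not just on the machine that submits the job:

    import org.apache.spark.{SparkConf, SparkContext}

    object JniJob {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("jni-on-yarn"))

        val result = sc.parallelize(1 to 100, 4).mapPartitions { it =>
          // Executes in the executor JVMs on the YARN nodes, so libmylib.so must
          // be on java.library.path or the system loader path there; otherwise
          // this throws UnsatisfiedLinkError.
          System.loadLibrary("mylib")     // hypothetical library name
          it.map(x => x * 2)              // placeholder for the real JNI-backed call
        }.sum()

        println(result)
        sc.stop()
      }
    }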






Re: how to run spark job on yarn with jni lib?

2014-09-25 Thread Marcelo Vanzin
Hmmm, you might be suffering from SPARK-1719.

Not sure what the proper workaround is, but it sounds like your native
libs are not in any of the standard lib directories; one workaround
might be to copy them there, or add their location to /etc/ld.so.conf
(I'm assuming Linux).
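
For reference, the configuration-based route looks roughly like the sketch below (Scala, with a hypothetical /opt/native directory that already holds the library on every node). The library-path settings themselves are standard Spark options; the catch discussed in this thread is that on affected versions they may not be applied to the YARN containers, which is why the /usr/lib or /etc/ld.so.conf workaround comes up.

    import org.apache.spark.{SparkConf, SparkContext}

    object JniJobWithLibraryPath {
      def main(args: Array[String]): Unit = {
        // Sketch only: /opt/native is a hypothetical directory holding libmylib.so
        // on every node. The same settings can be given to spark-submit via
        // --driver-library-path and --conf spark.executor.extraLibraryPath=...
        val conf = new SparkConf()
          .setAppName("jni-on-yarn")
          .set("spark.driver.extraLibraryPath", "/opt/native")
          .set("spark.executor.extraLibraryPath", "/opt/native")
        val sc = new SparkContext(conf)
        // ... build RDDs that call into the JNI-backed code here ...
        sc.stop()
      }
    }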

On Thu, Sep 25, 2014 at 8:34 AM, taqilabon g945...@gmail.com wrote:
 Hi all,

 I tried to run my Spark job on YARN.
 In my application, I need to call third-party JNI libraries from the Spark job.
 However, I can't find a way to make the Spark job load my native libraries.
 Does anyone know how to solve this problem?
 Thanks.

 Ziv Huang







-- 
Marcelo

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: how to run spark job on yarn with jni lib?

2014-09-25 Thread taqilabon
You're right, I'm suffering from SPARK-1719.
I've tried to add their location to /etc/ld.so.conf and I've submitted my
job as a yarn-client,
but the problem is the same: my native libraries are not loaded.
Does this method work in your case?
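
If the search-path route keeps failing, one more thing that may be worth trying is loading the library by absolute path with System.load instead of System.loadLibrary, which bypasses java.library.path and ld.so.conf entirely (any libraries the .so itself depends on still have to be resolvable by the system loader). A sketch with a hypothetical path:

    import org.apache.spark.{SparkConf, SparkContext}

    object AbsolutePathLoad {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("jni-absolute-path"))
        val result = sc.parallelize(1 to 100, 4).mapPartitions { it =>
          // Hypothetical location; the file must exist at this exact path on
          // every YARN node that runs executors.
          System.load("/opt/native/libmylib.so")
          it.map(x => x * 2)               // placeholder for the real JNI-backed call
        }.sum()
        println(result)
        sc.stop()
      }
    }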


