I would say instead of LD_LIBRARY_PATH you might want to use
java.library.path, in the following way:

java -Djava.library.path=/path/to/my/library

or pass java.library.path along with spark-submit.
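
For example, passing it through spark-submit might look like this (an
untested sketch; /path/to/my/library is a placeholder for wherever the
library actually lives on each node):

spark-submit \
  --conf "spark.driver.extraJavaOptions=-Djava.library.path=/path/to/my/library" \
  --conf "spark.executor.extraJavaOptions=-Djava.library.path=/path/to/my/library" \
  ...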

On Sat, Nov 26, 2016 at 6:44 PM, Gmail <vgour...@gmail.com> wrote:

> Maybe you've already checked these out. Some basic questions that come to
> my mind are:
> 1) is this library "foolib" or "foo-C-library" available on the worker
> node?
> 2) if yes, is it accessible by the user/program (rwx)?
>
> Thanks,
> Vasu.
>
> On Nov 26, 2016, at 5:08 PM, kant kodali <kanth...@gmail.com> wrote:
>
> If it is working for the standalone program, I would think you can apply
> the same settings across all the Spark worker and client machines and
> give that a try. Let's start with that.
>
> On Sat, Nov 26, 2016 at 11:59 AM, vineet chadha <start.vin...@gmail.com>
> wrote:
>
>> Just subscribed to Spark User, so forwarding the message again.
>>
>> On Sat, Nov 26, 2016 at 11:50 AM, vineet chadha <start.vin...@gmail.com>
>> wrote:
>>
>>> Thanks Kant. Can you give me a sample program which allows me to call
>>> JNI from an executor task? I have JNI working in a standalone
>>> Scala/Java program.
>>>
>>> Regards,
>>> Vineet
>>>
>>> On Sat, Nov 26, 2016 at 11:43 AM, kant kodali <kanth...@gmail.com>
>>> wrote:
>>>
>>>> Yes, this is a Java JNI question, nothing to do with Spark really.
>>>>
>>>> A java.lang.UnsatisfiedLinkError typically means the way you set up
>>>> LD_LIBRARY_PATH is wrong, unless you tell us that it is working for
>>>> other cases but not this one.
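>>>>
>>>> A quick sanity check (just a sketch, not Spark-specific) is to print
>>>> what the JVM will actually search, both on the driver and inside a
>>>> task:
>>>>
>>>> // the directories System.loadLibrary will scan
>>>> println(System.getProperty("java.library.path"))
>>>> // what the shell environment handed the JVM
>>>> println(System.getenv("LD_LIBRARY_PATH"))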
>>>>
>>>> On Sat, Nov 26, 2016 at 11:23 AM, Reynold Xin <r...@databricks.com>
>>>> wrote:
>>>>
>>>>> That's just standard JNI and has nothing to do with Spark, does it?
>>>>>
>>>>>
>>>>> On Sat, Nov 26, 2016 at 11:19 AM, vineet chadha <
>>>>> start.vin...@gmail.com> wrote:
>>>>>
>>>>>> Thanks Reynold for the quick reply.
>>>>>>
>>>>>> I have tried the following:
>>>>>>
>>>>>> class MySimpleApp {
>>>>>>   // --- Native methods
>>>>>>   @native def fooMethod(foo: String): String
>>>>>> }
>>>>>>
>>>>>> object MySimpleApp {
>>>>>>   @volatile var loaded = false
>>>>>>   def loadResources(): Unit = {
>>>>>>     System.loadLibrary("foo-C-library")
>>>>>>     loaded = true
>>>>>>   }
>>>>>>   def main(): Unit = {
>>>>>>     // sc is an existing SparkContext
>>>>>>     sc.parallelize(1 to 10).mapPartitions { iter =>
>>>>>>       if (!MySimpleApp.loaded) {
>>>>>>         MySimpleApp.loadResources()
>>>>>>       }
>>>>>>       val simpleInstance = new MySimpleApp
>>>>>>       simpleInstance.fooMethod("fooString")
>>>>>>       iter
>>>>>>     }
>>>>>>   }
>>>>>> }
>>>>>>
>>>>>> I don't see a way to invoke fooMethod, which is implemented in
>>>>>> foo-C-library. Am I missing something? If possible, can you point me
>>>>>> to an existing implementation which I can refer to?
>>>>>>
>>>>>> Thanks again.
>>>>>>
>>>>>>
>>>>>> On Fri, Nov 25, 2016 at 3:32 PM, Reynold Xin <r...@databricks.com>
>>>>>> wrote:
>>>>>>
>>>>>>> bcc dev@ and add user@
>>>>>>>
>>>>>>>
>>>>>>> This is more of a user@ list question than a dev@ list question.
>>>>>>> You can do something like this:
>>>>>>>
>>>>>>> object MySimpleApp {
>>>>>>>   // define some idempotent way to load resources,
>>>>>>>   // e.g. with a flag or lazy val
>>>>>>>   def loadResources(): Unit = ...
>>>>>>>
>>>>>>>   def main() = {
>>>>>>>     ...
>>>>>>>
>>>>>>>     sc.parallelize(1 to 10).mapPartitions { iter =>
>>>>>>>       MySimpleApp.loadResources()
>>>>>>>
>>>>>>>       // do whatever you want with the iterator
>>>>>>>     }
>>>>>>>   }
>>>>>>> }
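>>>>>>>
>>>>>>> For the idempotent load, a lazy val is probably the simplest sketch
>>>>>>> (NativeLoader is just an illustrative name; a lazy val initializer
>>>>>>> runs at most once per JVM, i.e. once per executor):
>>>>>>>
>>>>>>> object NativeLoader {
>>>>>>>   // first access loads the library; later accesses are no-ops
>>>>>>>   lazy val loaded: Boolean = { System.loadLibrary("foo-C-library"); true }
>>>>>>> }
>>>>>>>
>>>>>>> Then referencing NativeLoader.loaded at the top of mapPartitions
>>>>>>> ensures the library is loaded before any native call.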
>>>>>>>
>>>>>>> On Fri, Nov 25, 2016 at 2:33 PM, vineet chadha <
>>>>>>> start.vin...@gmail.com> wrote:
>>>>>>>
>>>>>>>> Hi,
>>>>>>>>
>>>>>>>> I am trying to invoke a C library from the Spark stack using the
>>>>>>>> JNI interface (here is the sample application code):
>>>>>>>>
>>>>>>>>
>>>>>>>> import org.apache.spark.{SparkConf, SparkContext}
>>>>>>>>
>>>>>>>> class SimpleApp {
>>>>>>>>   // --- Native methods
>>>>>>>>   @native def foo(Top: String): String
>>>>>>>> }
>>>>>>>>
>>>>>>>> object SimpleApp {
>>>>>>>>   def main(args: Array[String]) {
>>>>>>>>     val conf = new SparkConf().setAppName("Simple Application")
>>>>>>>>       .set("SPARK_LIBRARY_PATH", "lib")
>>>>>>>>     val sc = new SparkContext(conf)
>>>>>>>>     // load the native library on the driver
>>>>>>>>     System.loadLibrary("foolib")
>>>>>>>>     // instantiate the class
>>>>>>>>     val simpleAppInstance = new SimpleApp
>>>>>>>>     // String passing - working
>>>>>>>>     val ret = simpleAppInstance.foo("fooString")
>>>>>>>>   }
>>>>>>>> }
>>>>>>>> The above code works fine.
>>>>>>>>
>>>>>>>> I have set up LD_LIBRARY_PATH, spark.executor.extraClassPath, and
>>>>>>>> spark.executor.extraLibraryPath on the worker node.
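>>>>>>>>
>>>>>>>> For reference, in spark-defaults.conf the settings look roughly
>>>>>>>> like this (the paths are placeholders for wherever the library and
>>>>>>>> jars actually live on the workers):
>>>>>>>>
>>>>>>>> spark.executor.extraLibraryPath  /path/to/native/libs
>>>>>>>> spark.executor.extraClassPath    /path/to/app/jars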
>>>>>>>>
>>>>>>>> How can I invoke the JNI library from a worker node? Where should I
>>>>>>>> load it in the executor?
>>>>>>>> Calling System.loadLibrary("foolib") inside the worker node gives
>>>>>>>> me the following error:
>>>>>>>>
>>>>>>>> Exception in thread "main" java.lang.UnsatisfiedLinkError:
>>>>>>>>
>>>>>>>> Any help would be really appreciated.
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>
