On 27 Nov 2016, at 02:55, kant kodali 
<kanth...@gmail.com> wrote:

I would say instead of LD_LIBRARY_PATH you might want to use java.library.path

in the following way

java -Djava.library.path=/path/to/my/library or pass java.library.path along 
with spark-submit


This is only going to set up paths on the submitting system; to load JNI code 
in the executors, the binary needs to be sent to the far end and then put on 
the Java library path there.

Copy the relevant binary to somewhere on the library search path of the 
destination machine. Do that and you shouldn't have to worry about other JVM 
options (though it's been a few years since I did any JNI).
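If it helps, the executor-side library path can also be set from the driver conf. A sketch only: `/opt/native/lib` is a made-up directory, and the shared object must already exist there on every worker (shipped by your deploy process or via `spark.files`):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch: /opt/native/lib is hypothetical; it must contain
// libfoo-C-library.so (or your platform's equivalent) on every worker.
val conf = new SparkConf()
  .setAppName("JniExample")
  // Prepended to each executor JVM's native library search path, so a
  // System.loadLibrary("foo-C-library") inside a task can resolve there.
  .set("spark.executor.extraLibraryPath", "/opt/native/lib")
val sc = new SparkContext(conf)
```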

One trick: write a simple main() object/entry point that calls the JNI method 
and doesn't attempt to use any Spark libraries; have it log any exception and 
return an error code if the call fails. This lets you use it as a link test 
after deployment: if you can't run that class, things are broken before you go 
anywhere near Spark.
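A minimal version of that link test might look like this; the library name and native method are placeholders for your own binding:

```scala
// Standalone link test: deliberately no Spark imports, so it can be run
// with plain `java` on any worker node after the native library is deployed.
// "foo-C-library" and fooMethod are placeholders for your own JNI binding.
object JniLinkTest {
  @native def fooMethod(s: String): String

  def main(args: Array[String]): Unit =
    try {
      System.loadLibrary("foo-C-library")
      println(fooMethod("ping"))
    } catch {
      case e: UnsatisfiedLinkError =>
        e.printStackTrace()
        sys.exit(1) // non-zero exit = native setup is broken on this host
    }
}
```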


On Sat, Nov 26, 2016 at 6:44 PM, Gmail 
<vgour...@gmail.com> wrote:
Maybe you've already checked these out. Some basic questions that come to my 
mind are:
1) is this library "foolib" or "foo-C-library" available on the worker node?
2) if yes, is it accessible by the user/program (rwx)?

Thanks,
Vasu.

On Nov 26, 2016, at 5:08 PM, kant kodali 
<kanth...@gmail.com> wrote:

If it is working for a standalone program, I would think you can apply the same 
settings across all the Spark worker and client machines and give that a try. 
Let's start with that.

On Sat, Nov 26, 2016 at 11:59 AM, vineet chadha 
<start.vin...@gmail.com> wrote:
Just subscribed to Spark User. So, forwarding message again.

On Sat, Nov 26, 2016 at 11:50 AM, vineet chadha 
<start.vin...@gmail.com> wrote:
Thanks, Kant. Can you give me a sample program that lets me call JNI from an 
executor task? I have JNI working in a standalone Scala/Java program.

Regards,
Vineet

On Sat, Nov 26, 2016 at 11:43 AM, kant kodali 
<kanth...@gmail.com> wrote:
Yes, this is a Java JNI question. Nothing to do with Spark, really.

A java.lang.UnsatisfiedLinkError typically means the way you set up 
LD_LIBRARY_PATH is wrong, unless you tell us that it is working for other cases 
but not this one.
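One quick way to check that setup is to print what the JVM will actually search, on the machine where the load fails; `foo-C-library` below is just the example name from this thread:

```scala
// Prints the directories the JVM searches for System.loadLibrary, and the
// platform-specific file name it expects for a given library name
// (e.g. libfoo-C-library.so on Linux, foo-C-library.dll on Windows).
object LibPathCheck {
  def main(args: Array[String]): Unit = {
    println("java.library.path = " + System.getProperty("java.library.path"))
    println("expected file     = " + System.mapLibraryName("foo-C-library"))
  }
}
```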

On Sat, Nov 26, 2016 at 11:23 AM, Reynold Xin 
<r...@databricks.com> wrote:
That's just standard JNI and has nothing to do with Spark, does it?


On Sat, Nov 26, 2016 at 11:19 AM, vineet chadha 
<start.vin...@gmail.com> wrote:
Thanks, Reynold, for the quick reply.

I have tried the following:

import org.apache.spark.{SparkConf, SparkContext}

class MySimpleApp {
  // --- Native methods
  @native def fooMethod(foo: String): String
}

object MySimpleApp {
  // var, not val: reassigning inside loadResources() only worked because
  // `val flag = true` silently created a new local there
  @volatile private var loaded = false

  def loadResources(): Unit = synchronized {
    if (!loaded) {
      System.loadLibrary("foo-C-library")
      loaded = true
    }
  }

  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("MySimpleApp"))
    sc.parallelize(1 to 10).mapPartitions { iter =>
      MySimpleApp.loadResources()
      // instantiate inside the partition, not scoped inside an if-block,
      // so the instance is visible where it is used
      val simpleInstance = new MySimpleApp
      iter.map(_ => simpleInstance.fooMethod("fooString"))
    }.collect()
  }
}

I don't see a way to invoke fooMethod, which is implemented in foo-C-library. 
Am I missing something? If possible, can you point me to an existing 
implementation I can refer to?

Thanks again.


On Fri, Nov 25, 2016 at 3:32 PM, Reynold Xin 
<r...@databricks.com> wrote:
bcc dev@ and add user@


This is more a user@ list question rather than a dev@ list question. You can do 
something like this:

object MySimpleApp {
  def loadResources(): Unit = // define some idempotent way to load resources, 
e.g. with a flag or lazy val

  def main() = {
    ...

    sc.parallelize(1 to 10).mapPartitions { iter =>
      MySimpleApp.loadResources()

      // do whatever you want with the iterator
    }
  }
}
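One concrete "idempotent way", as the comment in the sketch above suggests: a lazy val body runs at most once per JVM and its initialization is thread-safe, so every task on an executor can call it and the library still loads exactly once per executor JVM. The library name is a placeholder:

```scala
// Idempotent, thread-safe native-library loader for use inside executors.
object NativeLoader {
  private lazy val loaded: Boolean = {
    System.loadLibrary("foo-C-library") // placeholder library name
    true // evaluated only on first access; later calls are no-ops
  }

  def ensureLoaded(): Unit = loaded
}
```

Tasks would then call `NativeLoader.ensureLoaded()` at the top of the mapPartitions closure before touching any native method.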





On Fri, Nov 25, 2016 at 2:33 PM, vineet chadha 
<start.vin...@gmail.com> wrote:
Hi,

I am trying to invoke a C library from the Spark stack using the JNI interface 
(here is the sample application code):


import org.apache.spark.{SparkConf, SparkContext}

class SimpleApp {
  // --- Native methods
  @native def foo(top: String): String
}

object SimpleApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("SimpleApplication")
      .set("SPARK_LIBRARY_PATH", "lib")
    val sc = new SparkContext(conf)
    System.loadLibrary("foolib")
    // instantiate the class
    val simpleAppInstance = new SimpleApp
    // String passing - working
    val ret = simpleAppInstance.foo("fooString")
  }
}

The above code works fine.

I have set up LD_LIBRARY_PATH, spark.executor.extraClassPath, and 
spark.executor.extraLibraryPath on the worker node.

How can I invoke the JNI library from a worker node? Where should I load it in 
the executor?
Calling System.loadLibrary("foolib") inside the worker node gives me the 
following error:

Exception in thread "main" java.lang.UnsatisfiedLinkError:

Any help would be really appreciated.