That's just standard JNI and has nothing to do with Spark, does it?

On Sat, Nov 26, 2016 at 11:19 AM, vineet chadha <start.vin...@gmail.com>
wrote:

> Thanks, Reynold, for the quick reply.
>
>  I have tried the following:
>
> class MySimpleApp {
>   // --- Native methods
>   @native def fooMethod(foo: String): String
> }
>
> object MySimpleApp {
>   @volatile var flag = false
>
>   def loadResources(): Unit = {
>     System.loadLibrary("foo-C-library")
>     flag = true
>   }
>
>   def main(args: Array[String]): Unit = {
>     // sc: SparkContext, created as in my earlier example
>     sc.parallelize(1 to 10).mapPartitions { iter =>
>       if (!flag) {
>         loadResources()
>       }
>       val simpleInstance = new MySimpleApp
>       simpleInstance.fooMethod("fooString")
>       iter
>     }
>   }
> }
>
> I don't see a way to invoke fooMethod, which is implemented in
> foo-C-library. Am I missing something? If possible, can you point me to an
> existing implementation that I can refer to.
>
> Thanks again.
>
> ~
>
> On Fri, Nov 25, 2016 at 3:32 PM, Reynold Xin <r...@databricks.com> wrote:
>
>> bcc dev@ and add user@
>>
>>
>> This is more of a user@ list question than a dev@ list question. You can
>> do something like this:
>>
>> object MySimpleApp {
>>   def loadResources(): Unit = {
>>     // define some idempotent way to load resources,
>>     // e.g. with a flag or lazy val
>>   }
>>
>>   def main() = {
>>     ...
>>
>>     sc.parallelize(1 to 10).mapPartitions { iter =>
>>       MySimpleApp.loadResources()
>>
>>       // do whatever you want with the iterator
>>       iter
>>     }
>>   }
>> }
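>>
>> A minimal sketch of the lazy-val variant (assuming the library is named
>> "foo-C-library", as in your example):
>>
>> object MySimpleApp {
>>   // Referencing this lazy val runs System.loadLibrary exactly once per
>>   // executor JVM, however many tasks touch it.
>>   lazy val loadResources: Unit = System.loadLibrary("foo-C-library")
>> }
>>
>> Any task that references MySimpleApp.loadResources inside mapPartitions
>> triggers the load on first use and is a no-op afterwards.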
>>
>>
>>
>>
>>
>> On Fri, Nov 25, 2016 at 2:33 PM, vineet chadha <start.vin...@gmail.com>
>> wrote:
>>
>>> Hi,
>>>
>>> I am trying to invoke a C library from the Spark stack using the JNI
>>> interface (here is the sample application code):
>>>
>>>
>>> class SimpleApp {
>>>   // --- Native methods
>>>   @native def foo(top: String): String
>>> }
>>>
>>> object SimpleApp {
>>>   def main(args: Array[String]) {
>>>     val conf = new SparkConf()
>>>       .setAppName("Simple Application")
>>>       .set("SPARK_LIBRARY_PATH", "lib")
>>>     val sc = new SparkContext(conf)
>>>     System.loadLibrary("foolib")
>>>     // instantiate the class
>>>     val simpleAppInstance = new SimpleApp
>>>     // String passing - working
>>>     val ret = simpleAppInstance.foo("fooString")
>>>   }
>>> }
>>>
>>> The above code works fine.
>>>
>>> I have set up LD_LIBRARY_PATH, spark.executor.extraClassPath, and
>>> spark.executor.extraLibraryPath on the worker nodes.
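>>>
>>> For reference, a sketch of how I set the library path programmatically
>>> (the /opt/native/lib directory is a placeholder for wherever the .so
>>> actually lives on each worker):
>>>
>>> import org.apache.spark.SparkConf
>>>
>>> val conf = new SparkConf()
>>>   .setAppName("Simple Application")
>>>   // placeholder path: directory on each worker containing libfoolib.so
>>>   .set("spark.executor.extraLibraryPath", "/opt/native/lib")
>>>   .set("spark.driver.extraLibraryPath", "/opt/native/lib")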
>>>
>>> How can I invoke the JNI library from a worker node? Where should I load
>>> it in the executor?
>>> Calling System.loadLibrary("foolib") inside the worker node gives me the
>>> following error:
>>>
>>> Exception in thread "main" java.lang.UnsatisfiedLinkError:
>>>
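>>> One approach I am exploring (a sketch; the file name libfoolib.so and the
>>> local path are placeholders): ship the library from the driver with
>>> sc.addFile and load it on each executor via SparkFiles:
>>>
>>> import org.apache.spark.SparkFiles
>>>
>>> object NativeLoader {
>>>   // Assumes the driver called sc.addFile("/local/path/libfoolib.so");
>>>   // SparkFiles.get resolves the copy shipped to this executor, and the
>>>   // lazy val ensures the load happens at most once per executor JVM.
>>>   lazy val loaded: Unit = System.load(SparkFiles.get("libfoolib.so"))
>>> }
>>>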
>>> Any help would be really appreciated.