bcc dev@ and add user@

This is more of a user@ list question than a dev@ list question. You can do
something like this:

object MySimpleApp {
  // Define loadResources() in some idempotent way, e.g. guarded by a flag
  // or a lazy val, so that calling it more than once per JVM is safe.
  def loadResources(): Unit = ...

  def main(args: Array[String]): Unit = {
    ...

    sc.parallelize(1 to 10).mapPartitions { iter =>
      MySimpleApp.loadResources()

      // do whatever you want with the iterator
      iter
    }
  }
}

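To make that concrete, here is one way to fill in loadResources(). This is
just a sketch, assuming the library name is "foolib" as in your snippet and
that libfoolib.so is already visible on the executors' library path (which
your LD_LIBRARY_PATH / spark.executor.extraLibraryPath setup should provide):

object MySimpleApp {
  // The lazy val body runs at most once per executor JVM, the first time it
  // is referenced; later calls to loadResources() are no-ops.
  private lazy val resourcesLoaded: Unit =
    System.loadLibrary("foolib")

  def loadResources(): Unit = resourcesLoaded
}

Because a Scala object is a per-JVM singleton, all tasks on the same executor
share the one loaded library, so calling loadResources() at the top of
mapPartitions is cheap.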

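If the .so is not already installed on every worker, another option (again
just a sketch, assuming the file is named libfoolib.so and sits in a local
lib/ directory on the driver) is to ship it with the job and load it by
absolute path:

import org.apache.spark.SparkFiles

// On the driver, before running any jobs: distribute the .so to executors.
sc.addFile("lib/libfoolib.so")

// On the executors, inside loadResources(): load by absolute path instead
// of by name.
System.load(SparkFiles.get("libfoolib.so"))

That way you do not have to keep java.library.path consistent across the
cluster.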

On Fri, Nov 25, 2016 at 2:33 PM, vineet chadha <start.vin...@gmail.com>
wrote:

> Hi,
>
> I am trying to invoke a C library from the Spark stack using the JNI
> interface (here is the sample application code):
>
> class SimpleApp {
>   // --- Native methods
>   @native def foo(Top: String): String
> }
>
> object SimpleApp {
>   def main(args: Array[String]) {
>
>     val conf = new SparkConf()
>       .setAppName("SimpleApplication")
>       .set("SPARK_LIBRARY_PATH", "lib")
>     val sc = new SparkContext(conf)
>     System.loadLibrary("foolib")
>     // instantiate the class
>     val SimpleAppInstance = new SimpleApp
>     // String passing - working
>     val ret = SimpleAppInstance.foo("fooString")
>   }
> }
>
> The above code works fine.
>
> I have set up LD_LIBRARY_PATH, spark.executor.extraClassPath, and
> spark.executor.extraLibraryPath on the worker nodes.
>
> How can I invoke the JNI library from a worker node? Where should I load it
> in the executor?
> Calling System.loadLibrary("foolib") inside the worker node gives me the
> following error:
>
> Exception in thread "main" java.lang.UnsatisfiedLinkError:
>
> Any help would be really appreciated.
