Re: Third party library

2016-12-13 Thread vineet chadha
Thanks Jakob for sharing the link. Will try it out. Regards, Vineet On Tue, Dec 13, 2016 at 3:00 PM, Jakob Odersky wrote:
> Hi Vineet,
> great to see you solved the problem! Since this just appeared in my
> inbox, I wanted to take the opportunity for a shameless plug:

Re: Third party library

2016-12-13 Thread Jakob Odersky
Hi Vineet, great to see you solved the problem! Since this just appeared in my inbox, I wanted to take the opportunity for a shameless plug: https://github.com/jodersky/sbt-jni. In case you're using sbt and also developing the native library, this plugin may help with the pains of building and

Re: Third party library

2016-12-13 Thread vineet chadha
Thanks Steve and Kant. Apologies for the late reply as I was out on vacation. Got it working. For other users:

    def loadResources() {
      System.loadLibrary("foolib")
      val MyInstance = new MyClass
      val retstr = MyInstance.foo("mystring") // the native method being invoked
    }
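A sketch of how a loader like this can be wired into a job so the library is loaded on the executors rather than only on the driver; foolib, MyClass, and foo come from the snippet above, everything else is illustrative:

    import org.apache.spark.{SparkConf, SparkContext}

    val sc = new SparkContext(new SparkConf().setAppName("jni-example"))
    val results = sc.parallelize(Seq("mystring"))
      .mapPartitions { iter =>
        loadResources()               // runs inside the executor JVM, not the driver
        val instance = new MyClass
        iter.map(s => instance.foo(s)) // invoke the native method per record
      }
      .collect()

System.loadLibrary ignores repeated loads of the same library within one classloader, so calling loadResources() once per partition is harmless.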

Re: Third party library

2016-11-27 Thread Steve Loughran
On 27 Nov 2016, at 02:55, kant kodali wrote: I would say instead of LD_LIBRARY_PATH you might want to use java.library.path in the following way: java -Djava.library.path=/path/to/my/library, or pass java.library.path along with spark-submit

Re: Third party library

2016-11-26 Thread kant kodali
I would say instead of LD_LIBRARY_PATH you might want to use java.library.path in the following way:

    java -Djava.library.path=/path/to/my/library

or pass java.library.path along with spark-submit. On Sat, Nov 26, 2016 at 6:44 PM, Gmail wrote:
> Maybe you've already checked
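A sketch of how that property is commonly passed for a cluster run, so it reaches the executor JVMs as well as the driver; the path, class, and jar names are placeholders, and in client mode the driver setting must instead be given as --driver-java-options because the driver JVM has already started:

    spark-submit \
      --conf "spark.driver.extraJavaOptions=-Djava.library.path=/path/to/my/library" \
      --conf "spark.executor.extraJavaOptions=-Djava.library.path=/path/to/my/library" \
      --class com.example.MyApp my-app.jar

Spark also offers spark.executor.extraLibraryPath and --driver-library-path, which prepend to the native library search path directly.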

Re: Third party library

2016-11-26 Thread Gmail
Maybe you've already checked these out. Some basic questions that come to my mind are: 1) is this library "foolib" or "foo-C-library" available on the worker node? 2) if yes, is it accessible by the user/program (rwx)? Thanks, Vasu.
> On Nov 26, 2016, at 5:08 PM, kant kodali

Re: Third party library

2016-11-26 Thread kant kodali
If it is working for a standalone program, I would think you can apply the same settings across all the Spark worker and client machines and give that a try. Let's start with that. On Sat, Nov 26, 2016 at 11:59 AM, vineet chadha wrote:
> Just subscribed to Spark User. So,

Re: Third party library

2016-11-26 Thread vineet chadha
Just subscribed to Spark User, so forwarding the message again. On Sat, Nov 26, 2016 at 11:50 AM, vineet chadha wrote:
> Thanks Kant. Can you give me a sample program which allows me to call JNI
> from an executor task? I have JNI working in a standalone program in

Re: Third party library

2016-11-26 Thread kant kodali
Yes, this is a Java JNI question, nothing to do with Spark really. java.lang.UnsatisfiedLinkError typically means the way you set up LD_LIBRARY_PATH is wrong, unless you tell us that it is working for other cases but not this one. On Sat, Nov 26, 2016 at 11:23 AM, Reynold Xin

Re: Third party library

2016-11-26 Thread Reynold Xin
That's just standard JNI and has nothing to do with Spark, does it? On Sat, Nov 26, 2016 at 11:19 AM, vineet chadha wrote:
> Thanks Reynold for the quick reply.
>
> I have tried the following:
>
>     class MySimpleApp {
>       // --- Native methods
>       @native def fooMethod(foo:

Re: Third party library

2016-11-25 Thread Reynold Xin
bcc dev@ and add user@. This is more a user@ list question rather than a dev@ list question. You can do something like this:

    object MySimpleApp {
      def loadResources(): Unit = // define some idempotent way to load resources, e.g. with a flag or lazy val

      def main() = { ...
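A sketch of the lazy-val flavor of that idempotent loader; foolib is the placeholder library name used elsewhere in this thread, and any task that may touch native code simply references MySimpleApp.loadResources first:

    object MySimpleApp {
      // the body of a lazy val runs at most once per JVM, so however many
      // tasks force the load, only the first call does real work
      lazy val loadResources: Unit = System.loadLibrary("foolib")
    }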

Re: Change protobuf version or any other third party library version in Spark application

2015-09-15 Thread Lan Jiang
I am happy to report that after setting spark.driver.userClassPathFirst, I can use protobuf 3 with spark-shell. Looks like the classloading issue is in the driver, not the executor. Marcelo, thank you very much for the tip! Lan
> On Sep 15, 2015, at 1:40 PM, Marcelo Vanzin wrote:

Re: Change protobuf version or any other third party library version in Spark application

2015-09-15 Thread Marcelo Vanzin
Hi, Just "spark.executor.userClassPathFirst" is not enough. You should also set "spark.driver.userClassPathFirst". Also not that I don't think this was really tested with the shell, but that should work with regular apps started using spark-submit. If that doesn't work, I'd recommend shading, as

Re: Change protobuf version or any other third party library version in Spark application

2015-09-15 Thread Steve Loughran
On 15 Sep 2015, at 05:47, Lan Jiang wrote: Hi, there, I am using Spark 1.4.1. Protobuf 2.5 is included in Spark 1.4.1 by default. However, I would like to use Protobuf 3 in my Spark application so that I can use some new features such as Map

Re: Change protobuf version or any other third party library version in Spark application

2015-09-15 Thread Lan Jiang
Yong
> Date: Tue, 15 Sep 2015 09:33:40 -0500
> Subject: Re: Change protobuf version or any other third party library version in Spark application
> From: ljia...@gmail.com
> To: java8...@hotmail.com
> CC: ste...@hortonworks.com; user@spark.apache.org
>
> Steve,

Re: Change protobuf version or any other third party library version in Spark application

2015-09-15 Thread Guru Medasani
parameter:
> https://issues.apache.org/jira/browse/SPARK-2996
>
> Yong
>
> Subject: Re: Change protobuf version or any other third party library version in Spark application
> From: ste...@hortonworks.com

RE: Change protobuf version or any other third party library version in Spark application

2015-09-15 Thread java8964
If you use Standalone mode, just start spark-shell like the following:

    spark-shell --jars your_uber_jar --conf spark.files.userClassPathFirst=true

Yong
Date: Tue, 15 Sep 2015 09:33:40 -0500
Subject: Re: Change protobuf version or any other third party library version in Spark application
From: ljia

Re: Change protobuf version or any other third party library version in Spark application

2015-09-15 Thread Lan Jiang
setting depends on your deployment mode, check this for the parameter:
> https://issues.apache.org/jira/browse/SPARK-2996
>
> Yong
>
> ------
> Subject: Re: Change protobuf version or any other third party library version in Spark application
> From:

RE: Change protobuf version or any other third party library version in Spark application

2015-09-15 Thread java8964
parameter: https://issues.apache.org/jira/browse/SPARK-2996
Yong
Subject: Re: Change protobuf version or any other third party library version in Spark application
From: ste...@hortonworks.com
To: ljia...@gmail.com
CC: user@spark.apache.org
Date: Tue, 15 Sep 2015 09:19:28 +
On 15 Sep 2

Change protobuf version or any other third party library version in Spark application

2015-09-14 Thread Lan Jiang
Hi, there, I am using Spark 1.4.1. Protobuf 2.5 is included in Spark 1.4.1 by default. However, I would like to use Protobuf 3 in my Spark application so that I can use some new features such as Map support. Is there any way to do that? Right now if I build an uber.jar with dependencies
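Marcelo's shading suggestion, quoted earlier in this thread, is the other common way out: relocate the protobuf 3 packages inside the uber jar so they can no longer collide with Spark's protobuf 2.5. A sketch using sbt-assembly's shade rules; the relocated package name is illustrative, and maven-shade-plugin's relocation feature is the Maven equivalent:

    // build.sbt, with the sbt-assembly plugin enabled
    assemblyShadeRules in assembly := Seq(
      ShadeRule.rename("com.google.protobuf.**" -> "myapp.shaded.protobuf.@1").inAll
    )

The .inAll scope rewrites references in the project classes as well as in every dependency jar.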

Re: Why can't Spark Streaming recover from the checkpoint directory when using a third party library for processingmulti-line JSON?

2015-03-04 Thread Emre Sevinc
I've also tried the following:

    Configuration hadoopConfiguration = new Configuration();
    hadoopConfiguration.set("multilinejsoninputformat.member", "itemSet");
    JavaStreamingContext ssc = JavaStreamingContext.getOrCreate(checkpointDirectory, hadoopConfiguration, factory, false);

but I
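For comparison, the Scala API has the same overload; a sketch where checkpointDirectory and createContext are placeholders, and note that per its scaladoc the Configuration argument is the one used for reading the checkpoint files themselves:

    import org.apache.hadoop.conf.Configuration
    import org.apache.spark.streaming.StreamingContext

    val hadoopConf = new Configuration()
    hadoopConf.set("multilinejsoninputformat.member", "itemSet")

    val ssc = StreamingContext.getOrCreate(
      checkpointDirectory,    // where the checkpoint metadata lives
      () => createContext(),  // only called when no checkpoint exists yet
      hadoopConf,             // used to read the checkpoint files
      createOnError = false)

Settings the job itself needs at runtime may still have to be re-applied inside the creating function.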

Re: Why can't Spark Streaming recover from the checkpoint directory when using a third party library for processingmulti-line JSON?

2015-03-04 Thread Tathagata Das
That could be a corner case bug. How do you add the 3rd party library to the class path of the driver? Through spark-submit? Could you give the command you used? TD On Wed, Mar 4, 2015 at 12:42 AM, Emre Sevinc emre.sev...@gmail.com wrote: I've also tried the following: Configuration

Re: Why can't Spark Streaming recover from the checkpoint directory when using a third party library for processingmulti-line JSON?

2015-03-04 Thread Emre Sevinc
I'm adding this 3rd party library to my Maven pom.xml file so that it's embedded into the JAR I send to spark-submit:

    <dependency>
      <groupId>json-mapreduce</groupId>
      <artifactId>json-mapreduce</artifactId>
      <version>1.0-SNAPSHOT</version>
      <exclusions>
        <exclusion>

Why can't Spark Streaming recover from the checkpoint directory when using a third party library for processingmulti-line JSON?

2015-03-03 Thread Emre Sevinc
Hello, I have a Spark Streaming application (that uses Spark 1.2.1) that listens to an input directory and, when new JSON files are copied to that directory, processes them and writes them to an output directory. It uses a 3rd party library to process the multi-line JSON files (
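For context, a custom Hadoop input format like this is typically hooked into Spark Streaming through fileStream. A sketch under the assumption that the library exposes an input format class named MultiLineJsonInputFormat keyed by LongWritable/Text; that class name, the paths, and the batch interval are all hypothetical:

    import org.apache.hadoop.io.{LongWritable, Text}
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    def createContext(): StreamingContext = {
      val conf = new SparkConf().setAppName("multiline-json-streaming")
      val ssc = new StreamingContext(conf, Seconds(30))
      ssc.checkpoint("/tmp/checkpoint")
      ssc.fileStream[LongWritable, Text, MultiLineJsonInputFormat]("/input/dir")
        .map(_._2.toString)                   // keep only the JSON payload
        .saveAsTextFiles("/output/dir/batch") // one output directory per batch
      ssc
    }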