From: Prajod S Vettiyattil (WT01 - BAS) <prajod.vettiyat...@wipro.com>
Cc: user <user@spark.apache.org>
Subject: Re: Running in cluster mode causes native library linking to fail
Hello guys,
After lots of time trying to make things work, I finally found what was causing
the issue:
I was calling the function from
Found out that I did not reply to the group in my original reply.
>
>
>
> *From:* Prajod S Vettiyattil (WT01 - BAS)
> *Sent:* 15 October 2015 11:45
> *To:* 'Bernardo Vecchia Stein' <bernardovst...@gmail.com>
> *Subject:* RE: Running in cluster mode causes native library linking to fail
Regards,
Prajod
From: Bernardo Vecchia Stein [mailto:bernardovst...@gmail.com]
Sent: 15 October 2015 00:36
To: Prajod S Vettiyattil (WT01 - BAS) <prajod.vettiyat...@wipro.com>
Subject: Re: Running in cluster mode causes native library linking to fail
Hello Prajod,
Thanks for your reply.
Hi Deenar,
Yes, the native library is installed on all machines of the cluster. I
tried a simpler approach by just using System.load() and passing the exact
path of the library, and things still won't work (I get exactly the same
error and message).
Any ideas of what might be failing?
Thanks,
Bernardo
Hi Bernardo,
So is this in distributed mode? or single node? Maybe fix the issue with a
single node first ;)
You are right that Spark finds the library but not the *.so file. I also
use System.load() with LD_LIBRARY_PATH set, and I am able to
execute without issues. Maybe you'd like to double-check that.
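For readers of the archive, the two loading calls being compared in this exchange behave differently, and that difference can be sketched as below. This is an illustrative Java snippet (the thread's project is Scala, but the JVM calls are identical); the path /opt/native/libfoo.so and the name "foo" are placeholders, not taken from the original mails.

```java
// Sketch: System.load() vs System.loadLibrary() on the JVM.
// Both throw UnsatisfiedLinkError when the .so cannot be resolved,
// which is the error class this thread is chasing.
public class LoadVariants {
    public static void main(String[] args) {
        // System.load takes the absolute path of the file itself.
        tryLoad(() -> System.load("/opt/native/libfoo.so"), "System.load");
        // System.loadLibrary takes the bare name: "foo" maps to libfoo.so,
        // searched via java.library.path (extended by LD_LIBRARY_PATH and
        // Spark's extraLibraryPath settings).
        tryLoad(() -> System.loadLibrary("foo"), "System.loadLibrary");
    }

    private static void tryLoad(Runnable r, String label) {
        try {
            r.run();
            System.out.println(label + ": ok");
        } catch (UnsatisfiedLinkError e) {
            System.out.println(label + ": " + e.getMessage());
        }
    }
}
```

Run on a machine without the placeholder library, both calls fall into the catch branch and print the linker's message, mirroring the failure Bernardo reports.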
Hi Renato,
I am using a single master and a single worker node, both in the same
machine, to simplify everything. I have tested with System.loadLibrary() as
well (setting all the necessary paths) and get the same error. Just double
checked everything and the parameters are fine.
Bernardo
On 14
Sorry Bernardo, I just double checked. I use: System.loadLibrary();
Could you also try that?
Renato M.
2015-10-14 21:51 GMT+02:00 Renato Marroquín Mogrovejo <renatoj.marroq...@gmail.com>:
> Hi Bernardo,
>
> So is this in distributed mode? or single node? Maybe fix the issue with a
> single node first ;)
You can also try setting the env variable LD_LIBRARY_PATH to point where
your compiled libraries are.
Renato M.
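A minimal sketch of Renato's suggestion, assuming the compiled libraries live under a hypothetical /opt/native/lib: export the variable in the shell that launches spark-submit so the driver JVM inherits it.

```shell
# Prepend the (hypothetical) native-library directory to LD_LIBRARY_PATH
# before invoking spark-submit, so the launched JVM inherits it.
export LD_LIBRARY_PATH=/opt/native/lib:${LD_LIBRARY_PATH:-}
echo "$LD_LIBRARY_PATH"
```

Note that only processes started from this shell inherit the variable; executors launched by an already-running worker daemon need the path configured on that daemon's side as well.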
2015-10-14 21:07 GMT+02:00 Bernardo Vecchia Stein <bernardovst...@gmail.com>:
> Hi Deenar,
>
> Yes, the native library is installed on all machines of the cluster. I
> tried a simpler approach by just using System.load() and passing the exact
> path of the library, and things still won't work.
Hi Renato,
I have done that as well, but so far no luck. I believe spark is finding
the library correctly, otherwise the error message would be "no libraryname
found" or something like that. The problem seems to be something else, and
I'm not sure how to find it.
Thanks,
Bernardo
On 14 October
Hi Bernardo
Is the native library installed on all machines of your cluster and are you
setting both the spark.driver.extraLibraryPath and
spark.executor.extraLibraryPath?
Deenar
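The two settings Deenar mentions can be passed on the spark-submit command line. The following is an invocation sketch only: /opt/native/lib, the class name, master URL, and jar are all placeholders, not details from this thread.

```shell
# Sketch: pass both library-path settings to spark-submit.
# spark.driver.extraLibraryPath affects the driver JVM;
# spark.executor.extraLibraryPath affects each executor JVM.
spark-submit \
  --class com.example.Main \
  --master spark://master:7077 \
  --deploy-mode cluster \
  --conf spark.driver.extraLibraryPath=/opt/native/lib \
  --conf spark.executor.extraLibraryPath=/opt/native/lib \
  app.jar
```

The same pair can instead be set once in conf/spark-defaults.conf on the submitting machine.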
On 14 October 2015 at 05:44, Bernardo Vecchia Stein <
bernardovst...@gmail.com> wrote:
> Hello,
>
> I am trying to run some Scala code in cluster mode using spark-submit.
Hello,
I am trying to run some Scala code in cluster mode using spark-submit. This
code uses addLibrary to link with a .so that exists on the machine, and
this library has a function to be called natively (there's a native
definition as needed in the code).
The problem I'm facing is: whenever I
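The pattern the original message describes can be reduced to a self-contained sketch: a class that declares a native method and loads the backing library. This is illustrative Java rather than the author's Scala, and "nativeops" and compute are invented names; on a node where the .so is not on the library path, the catch branch shows the failure mode discussed in this thread.

```java
// Sketch of a class backed by a native library (hypothetical names).
public class NativeOps {
    // Implemented in the shared object; JNI resolves it at first call.
    public static native int compute(int x);

    public static void main(String[] args) {
        try {
            // "nativeops" maps to libnativeops.so, found via
            // java.library.path / LD_LIBRARY_PATH on each node.
            System.loadLibrary("nativeops");
            System.out.println("result=" + compute(42));
        } catch (UnsatisfiedLinkError e) {
            // In cluster mode this must succeed on every executor's JVM,
            // not just on the machine that ran spark-submit.
            System.out.println("link error: " + e.getMessage());
        }
    }
}
```

The key cluster-mode point: the load happens inside whichever JVM executes the code, so the library must be present and resolvable on every worker node, not only where the job was submitted.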