Hi,

For jars, use spark-submit --jars. I'm not sure about .so files. Could those
be distributed through --jars as well?
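
For the jars, something along these lines should work; for the .so, shipping
it with --files might be one option, but note the class name and paths below
are just placeholders and I haven't tested the native-library part, so please
verify on your cluster:

  # --jars ships the extra jars to every executor.
  # --files should place libfoo.so in each container's working directory
  # (on YARN), and spark.executor.extraLibraryPath=. adds that directory
  # to the native library search path. Untested; treat as a sketch.
  spark-submit \
    --class com.example.MyJob \
    --master yarn \
    --deploy-mode cluster \
    --jars /opt/deps/dep1.jar,/opt/deps/dep2.jar \
    --files /opt/native/libfoo.so \
    --conf spark.executor.extraLibraryPath=. \
    my-job.jar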

Jacek
11.03.2016 8:07 PM "prateek arora" <prateek.arora...@gmail.com> wrote:

> Hi
>
> I have a multi-node cluster, and my Spark jobs depend on a native
> library (.so files) and some jar files.
>
> Can someone please explain the best way to distribute these dependent
> files across the nodes?
>
> Right now I copy the dependent files to all nodes using Chef.
>
> Regards
> Prateek
>
