I would recommend uploading those jars to HDFS and passing the --jars
option to spark-submit with an HDFS URI instead of a local filesystem
URI. That way the executors fetch the jars directly from HDFS rather
than from the driver, which can otherwise become a bottleneck.
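
A minimal sketch of what that looks like, assuming hypothetical paths
and jar names (adjust the HDFS location, jar names, and main class to
your setup):

```shell
# Upload the dependency jars to HDFS once (hypothetical paths):
hdfs dfs -mkdir -p /user/david/libs
hdfs dfs -put netlib-native.jar mllib-extras.jar /user/david/libs/

# Submit with HDFS URIs so executors pull the jars from HDFS,
# not from the driver:
spark-submit \
  --class com.example.MyApp \
  --master yarn \
  --jars hdfs:///user/david/libs/netlib-native.jar,hdfs:///user/david/libs/mllib-extras.jar \
  my-app.jar
```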

Sincerely,

DB Tsai
-------------------------------------------------------
Blog: https://www.dbtsai.com


On Tue, Mar 24, 2015 at 4:13 AM, Xi Shen <davidshe...@gmail.com> wrote:
> Hi,
>
> I am doing ML using Spark MLlib. However, I do not have full control over
> the cluster; I am using Microsoft Azure HDInsight.
>
> I want to deploy BLAS (or whatever native dependencies are required) to
> accelerate the computation, but I don't know how to deploy those DLLs
> when I submit my JAR to the cluster.
>
> I know how to pack those DLLs into a jar. The real challenge is how to let
> the system find them...
>
>
> Thanks,
> David
>

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
