Both spark-submit and spark-shell have a --jars option for passing
additional jars to the cluster. They will be added to the driver and
executor classpaths.
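
For example, a minimal sketch (the application jar, main class, and
dependency paths below are placeholders):

    spark-submit \
      --class com.example.MyApp \
      --master yarn-cluster \
      --jars /path/to/dep1.jar,/path/to/dep2.jar \
      my-app.jar

spark-shell takes the same option:

    spark-shell --jars /path/to/dep1.jar,/path/to/dep2.jar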

Dean Wampler, Ph.D.
Author: Programming Scala, 2nd Edition
<http://shop.oreilly.com/product/0636920033073.do> (O'Reilly)
Typesafe <http://typesafe.com>
@deanwampler <http://twitter.com/deanwampler>
http://polyglotprogramming.com

On Tue, Mar 24, 2015 at 4:13 AM, Xi Shen <davidshe...@gmail.com> wrote:

> Hi,
>
> I am doing ML using Spark MLlib. However, I do not have full control of
> the cluster; I am using Microsoft Azure HDInsight.
>
> I want to deploy BLAS, or whatever dependencies are required, to accelerate
> the computation, but I don't know how to deploy those DLLs when I submit my
> JAR to the cluster.
>
> I know how to pack those DLLs into a jar. The real challenge is how to let
> the system find them...
>
> Thanks,
> David
