Are you deploying the Windows DLLs to a Linux machine?

Sincerely,

DB Tsai
-------------------------------------------------------
Blog: https://www.dbtsai.com


On Wed, Mar 25, 2015 at 3:57 AM, Xi Shen <davidshe...@gmail.com> wrote:
> I think you meant to use "--files" to deploy the DLLs. I gave it a try,
> but it did not work.
>
> From the Spark UI, Environment tab, I can see
>
> spark.yarn.dist.files
>
> file:/c:/openblas/libgcc_s_seh-1.dll,file:/c:/openblas/libblas3.dll,file:/c:/openblas/libgfortran-3.dll,file:/c:/openblas/liblapack3.dll,file:/c:/openblas/libquadmath-0.dll
>
> I think my DLLs are all deployed, but I still get the warning message
> that the native BLAS library cannot be loaded.
>
> Any idea?
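>
> (One guess I have not verified: "--files" only copies the DLLs into
> each YARN container's working directory, and the executor JVM searches
> only java.library.path, so that directory may still need to be added
> explicitly, e.g. something like:
>
>   spark-submit \
>     --files file:/c:/openblas/libblas3.dll,file:/c:/openblas/liblapack3.dll \
>     --conf "spark.executor.extraJavaOptions=-Djava.library.path=." \
>     --conf "spark.driver.extraJavaOptions=-Djava.library.path=." \
>     ...
>
> where "." is the container working directory the files get localized
> into. The file names netlib-java expects may also matter.)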
>
>
> Thanks,
> David
>
>
> On Wed, Mar 25, 2015 at 5:40 AM DB Tsai <dbt...@dbtsai.com> wrote:
>>
>> I would recommend uploading those jars to HDFS and using the "--jars"
>> option in spark-submit with HDFS URIs instead of local filesystem
>> URIs. That way the executors avoid fetching the jars from the driver,
>> which can be a bottleneck.
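>>
>> For example (the jar, path, and class names below are just
>> placeholders):
>>
>>   hadoop fs -put my-deps.jar /user/me/libs/my-deps.jar
>>
>>   spark-submit --master yarn \
>>     --jars hdfs:///user/me/libs/my-deps.jar \
>>     --class com.example.MyApp my-app.jar
>>
>> With an hdfs:// URI, each executor fetches the jar from HDFS directly
>> instead of pulling it from the driver.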
>>
>> Sincerely,
>>
>> DB Tsai
>> -------------------------------------------------------
>> Blog: https://www.dbtsai.com
>>
>>
>> On Tue, Mar 24, 2015 at 4:13 AM, Xi Shen <davidshe...@gmail.com> wrote:
>> > Hi,
>> >
>> > I am doing ML using Spark MLlib. However, I do not have full control
>> > over the cluster; I am using Microsoft Azure HDInsight.
>> >
>> > I want to deploy BLAS, or whatever dependencies are required, to
>> > accelerate the computation. But I don't know how to deploy those DLLs
>> > when I submit my JAR to the cluster.
>> >
>> > I know how to pack those DLLs into a jar. The real challenge is how
>> > to let the system find them...
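>> >
>> > (The closest thing I have found so far -- untested -- is the
>> > "--files" option of spark-submit, which copies local files into each
>> > YARN container, e.g. with placeholder class and jar names:
>> >
>> >   spark-submit --files c:/openblas/libblas3.dll,c:/openblas/liblapack3.dll \
>> >     --class com.example.MyApp my-app.jar
>> >
>> > though even then the JVM presumably has to be told where to look for
>> > them, since it cannot load a native library from inside a jar.)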
>> >
>> >
>> > Thanks,
>> > David
>> >

