[ https://issues.apache.org/jira/browse/SPARK-5010?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xiangrui Meng closed SPARK-5010.
--------------------------------
    Resolution: Not a Problem

I'm closing this issue because it is an upstream problem with the native BLAS library.
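For anyone hitting the same error: a quick way to confirm that the problem is on the BLAS side is to check whether the libblas.so.3 the dynamic loader resolves actually exports the CBLAS interface. A minimal sketch, assuming the library is linked at /usr/lib/libblas.so.3 (the exact path varies by distribution):

    # list which libblas.so.3 the dynamic loader will pick up
    ldconfig -p | grep libblas.so.3

    # check whether that library exports the missing CBLAS symbol
    nm -D /usr/lib/libblas.so.3 | grep cblas_dscal

If the second command prints nothing, the linked library was built without the CBLAS interface, and netlib-java's JNI stub cannot resolve cblas_dscal at load time, which matches the error reported below.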

> native openblas library doesn't work: undefined symbol: cblas_dscal
> -------------------------------------------------------------------
>
>                 Key: SPARK-5010
>                 URL: https://issues.apache.org/jira/browse/SPARK-5010
>             Project: Spark
>          Issue Type: Bug
>          Components: MLlib
>    Affects Versions: 1.3.0
>         Environment: standalone
>            Reporter: Tomas Hudik
>            Priority: Minor
>              Labels: mllib, openblas
>
> 1. Compiled and installed the OpenBLAS library.
> 2. ln -s libopenblas_sandybridgep-r0.2.13.so /usr/lib/libblas.so.3
> 3. Compiled and built Spark:
> mvn -Pnetlib-lgpl -DskipTests clean compile package
> 4. Ran: bin/run-example mllib.LinearRegression data/mllib/sample_libsvm_data.txt
> 14/12/30 18:39:57 INFO BlockManagerMaster: Trying to register BlockManager
> 14/12/30 18:39:57 INFO BlockManagerMasterActor: Registering block manager 
> localhost:34297 with 265.1 MB RAM, BlockManagerId(<driver>, localhost, 34297)
> 14/12/30 18:39:57 INFO BlockManagerMaster: Registered BlockManager
> 14/12/30 18:39:58 WARN NativeCodeLoader: Unable to load native-hadoop library 
> for your platform... using builtin-java classes where applicable
> 14/12/30 18:39:58 WARN LoadSnappy: Snappy native library not loaded
> Training: 80, test: 20.
> /usr/local/lib/jdk1.8.0//bin/java: symbol lookup error: 
> /tmp/jniloader1826801168744171087netlib-native_system-linux-x86_64.so: 
> undefined symbol: cblas_dscal
> I followed the "Dependencies" section of the guide at https://spark.apache.org/docs/latest/mllib-guide.html.
> Am I missing something?
> How to force Spark to use openblas library?
> Thanks, Tomas
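
On the reporter's last question: on Debian/Ubuntu-style systems the usual way to make OpenBLAS the system BLAS is the alternatives mechanism rather than a hand-made symlink. A hedged sketch; the link path and the OpenBLAS install prefix below are assumptions and differ between distributions:

    # register the OpenBLAS build as an alternative for libblas.so.3 and select it
    sudo update-alternatives --install /usr/lib/libblas.so.3 libblas.so.3 \
        /opt/OpenBLAS/lib/libopenblas.so 40
    sudo update-alternatives --config libblas.so.3

Spark itself only needs to be built with -Pnetlib-lgpl, as in the steps above; which native implementation is actually used is then decided by where the system's libblas.so.3 points.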



