I'm trying to use netlib-java with MLlib and SparkR on YARN. I've compiled Spark with -Pnetlib-lgpl and can see the necessary classes in the Spark assembly jar. The nodes have /usr/lib64/liblapack.so.3, /usr/lib64/libblas.so.3, and /usr/lib/libgfortran.so.3.
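
For reference, here's roughly how I start SparkR (a minimal sketch, not my exact invocation: the master string, app name, and the extraLibraryPath settings are illustrative, and the /usr/lib64 path just matches the node layout above):

library(SparkR)

# Sketch only: point the driver and executors at the directory holding the
# system liblapack.so.3 / libblas.so.3 so netlib-java's NativeSystemLAPACK
# can find them. spark.driver.extraLibraryPath and
# spark.executor.extraLibraryPath are standard Spark properties.
sc <- sparkR.init(master = "yarn-client", appName = "netlib-test",
                  sparkEnvir = list(spark.driver.extraLibraryPath   = "/usr/lib64",
                                    spark.executor.extraLibraryPath = "/usr/lib64"))
sqlContext <- sparkRSQL.init(sc)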

Running:

data <- read.df(sqlContext, 'data.csv', 'com.databricks.spark.csv')
mdl <- glm(C2 ~ ., data, family = "gaussian")
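
As a sanity check, I can also try the same kind of call on a small synthetic DataFrame (hedged sketch; the column names and data here are illustrative, not from my real job):

# Sanity check with synthetic data (illustrative only): if this fits fine
# even with the pure-Java fallback, the failure is specific to the data.csv
# job rather than to glm itself.
local_df <- data.frame(y = rnorm(100), x1 = rnorm(100), x2 = rnorm(100))
df <- createDataFrame(sqlContext, local_df)
m <- glm(y ~ x1 + x2, data = df, family = "gaussian")
head(predict(m, df))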

But I get the error:

15/11/06 21:17:27 WARN LAPACK: Failed to load implementation from: com.github.fommil.netlib.NativeSystemLAPACK
15/11/06 21:17:27 WARN LAPACK: Failed to load implementation from: com.github.fommil.netlib.NativeRefLAPACK
15/11/06 21:17:27 ERROR RBackendHandler: fitRModelFormula on org.apache.spark.ml.api.r.SparkRWrappers failed
Error in invokeJava(isStatic = TRUE, className, methodName, ...) :
  java.lang.AssertionError: assertion failed: lapack.dpotrs returned 18.
        at scala.Predef$.assert(Predef.scala:179)
        at org.apache.spark.mllib.linalg.CholeskyDecomposition$.solve(CholeskyDecomposition.scala:40)
        at org.apache.spark.ml.optim.WeightedLeastSquares.fit(WeightedLeastSquares.scala:114)
        at org.apache.spark.ml.regression.LinearRegression.train(LinearRegression.scala:166)
        at org.apache.spark.ml.regression.LinearRegression.train(LinearRegression.scala:65)
        at org.apache.spark.ml.Predictor.fit(Predictor.scala:90)
        at org.apache.spark.ml.Predictor.fit(Predictor.scala:71)
        at org.apache.spark.ml.Pipeline$$anonfun$fit$2.apply(Pipeline.scala:138)
        at org.apache.spark.ml.Pipeline$$anonfun$fit$2.apply(Pipeline.scala:134)
Has anyone gotten this working?

Thanks,
Tom
