Is there anything other than the spark assembly that needs to be in the
classpath? I verified the assembly was built right and it's in the classpath
(else nothing would work).
Thanks,
Tom
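One hedged way to double-check that verification is to list the assembly jar's contents and look for the netlib-java classes the -Pnetlib-lgpl profile pulls in. The jar path below is a placeholder, not from this thread; the package prefix com/github/fommil/netlib is where netlib-java's classes live.

```shell
# Sketch: confirm the netlib-lgpl classes actually made it into the assembly.
# ASSEMBLY's default value is a hypothetical path -- substitute your own.
ASSEMBLY="${ASSEMBLY:-/path/to/spark-assembly.jar}"
if [ -f "$ASSEMBLY" ]; then
  # Classes bundled by -Pnetlib-lgpl live under com/github/fommil/netlib
  jar tf "$ASSEMBLY" | grep 'com/github/fommil/netlib' | head
  result=checked
else
  result=no-jar
  echo "assembly jar not found: $ASSEMBLY"
fi
```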
On Tuesday, November 10, 2015 8:29 PM, Shivaram Venkataraman wrote:
Nothing more -- the only two things I can think of are: (a) is there
something else on the classpath that comes before this LGPL JAR? I've
seen cases where two versions of netlib-java on the classpath can mess
things up. (b) There is something about the way SparkR is using
reflection to invoke
I think this is happening in the driver. Could you check the classpath
of the JVM that gets started? If you use spark-submit on YARN the
classpath is set up before R gets launched, so it should match the
behavior of Scala / Python.
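A hedged sketch of how one might check the classpath of an already-running JVM: dump its full command line (which includes any -cp or --driver-class-path arguments). Here $$ stands in for the driver's pid; in practice you would find it with jps or ps first.

```shell
# Sketch: show the command line (including classpath flags) of a running
# process. $$ is this shell's own pid, used as a stand-in for the driver.
pid=$$
args=$(ps -ww -o args= -p "$pid")
echo "$args"
# With a JDK on the node, the resolved property can also be read with:
#   jinfo <pid> | grep java.class.path
```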
Thanks
Shivaram
On Fri, Nov 6, 2015 at 1:39 PM, Tom Graves wrote:
I'm trying to use the netlib-java stuff with MLlib and SparkR on YARN. I've
compiled with -Pnetlib-lgpl and I see the necessary things in the spark assembly
jar. The nodes have /usr/lib64/liblapack.so.3, /usr/lib64/libblas.so.3, and
/usr/lib/libgfortran.so.3.
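A small sketch for sanity-checking the native side on each node: confirm the three shared objects listed above exist, since netlib-java's system implementation needs all of them (gfortran included) to resolve at load time.

```shell
# Sketch: verify the native BLAS/LAPACK stack is complete on a node.
# Paths are the ones from the message above.
count=0
for lib in /usr/lib64/liblapack.so.3 /usr/lib64/libblas.so.3 \
           /usr/lib/libgfortran.so.3; do
  count=$((count + 1))
  if [ -e "$lib" ]; then
    echo "found:   $lib"
    # ldd "$lib" | grep 'not found'   # uncomment to spot missing deps
  else
    echo "missing: $lib"
  fi
done
```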
Running: data <- read.df(sqlContext,