I have been unsuccessful at incorporating an external jar into a SparkR
program.  Does anyone know how to do this successfully?

JarTest.java
=================
package com.myco;

public class JarTest {
   public static double myStaticMethod() {
       return 5.515;
   }

}
=================
JarTest.R
=================
Sys.setenv(SPARK_HOME="/usr/local/spark-1.4.0-bin-hadoop2.6/")
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))

library(SparkR)

#sparkR.stop() # Stop if you want to rerun the code
sc = sparkR.init(master="local", sparkJars=c("/locOfMyJar/JarTest.jar"))

SparkR:::callJStatic("java.lang.Math", "max", 5, 2) # OK: 5
SparkR:::callJStatic("java.lang.Math", "min", 5, 2) # OK: 2
SparkR:::callJStatic("com.myco.JarTest", "myStaticMethod") # Fails, see below

# 15/06/22 13:38:07 ERROR RBackendHandler: myStaticMethod on com.myco.JarTest failed
# java.lang.ClassNotFoundException: com.myco.JarTest
# at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
# at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
# at java.security.AccessController.doPrivileged(Native Method)
# at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
# at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
# at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
# at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
# at java.lang.Class.forName0(Native Method)
# at java.lang.Class.forName(Class.java:190)
# at org.apache.spark.api.r.RBackendHandler.handleMethodCall(RBackendHandler.scala:101)
# at org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:74)
# at org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:36)
# at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
# at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
# at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
# at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
# at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
# at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
# at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:163)
# at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
# at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
# at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:787)
# at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:130)
# at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
# at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
# at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
# at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
# at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
# at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
# at java.lang.Thread.run(Thread.java:744)
# Error: returnStatus == 0 is not TRUE
=================

So basically I build the jar like this:
> javac JarTest.java
> jar -cf JarTest.jar JarTest.class
> jar -tf JarTest.jar
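(One thing I am not sure about: since the class declares `package com.myco;`, I believe the jar entry has to be `com/myco/JarTest.class`, not `JarTest.class` at the jar root, or the JVM throws exactly this ClassNotFoundException. A build that preserves the package directories would look something like this; the filenames match the ones above:)

```shell
# Compile into package directories (-d .) so the class file lands
# at com/myco/JarTest.class, then jar that full path:
javac -d . JarTest.java
jar -cf JarTest.jar com/myco/JarTest.class
jar -tf JarTest.jar   # should list com/myco/JarTest.class
```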

Then I run RStudio or R with the commands you see in JarTest.R (making sure
to point at your own jar file).  As you can see from the comments, the
backend does not find the Java class.  Does anyone know a way to make this
work correctly?
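(For completeness, the other route I am aware of, untested on my side, is handing the jar to the backend JVM at launch time via the sparkR front end rather than via sparkR.init(); the SPARK_HOME and jar paths here are the ones from my setup above:)

```shell
# Launch the SparkR shell with the jar on the driver classpath;
# this drops you into an interactive R session:
/usr/local/spark-1.4.0-bin-hadoop2.6/bin/sparkR --jars /locOfMyJar/JarTest.jar
```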

Thanks!

--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/External-Jar-file-with-SparkR-tp23433.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
