Hi All,

I am attaching the full procedure I adopted to run R from Apache Zeppelin
using Datalayer's zeppelin-R repo on GitHub, but I am facing a voidEval
function error. I have been stuck on this problem for a full week now.
Any help would be appreciated.

Please find attached Steps.txt, which lists the steps I followed.

Here is the code I wrote:

%r
print(1:5)

and the output it returns:

voidEval failed



Here is the error log:

ERROR StatusLogger No log4j2 configuration file found. Using default
configuration: logging only errors to the console.

StartRserve: first connect try failed with: Cannot connect: Connection
refused

StartRserve: waiting for Rserve to start ... (java.lang.UNIXProcess@1c9c5521
)

StartRserve: Rserve>

StartRserve: Rserve>Attaching package: ‘SparkR’

StartRserve: Rserve>

StartRserve: Rserve>The following objects are masked from ‘package:base’:

StartRserve: Rserve>

StartRserve: Rserve>    intersect, rbind, sample, summary, table

StartRserve: Rserve>

StartRserve: Rserve>Launching java with spark-submit command
/Users/ashish.dalal/downloads/spark/bin/spark-submit   sparkr-shell
/var/folders/62/5997pbmn0qjg8766pv4mjxwrr8n54y/T//RtmpDWgRDn/backend_portd74679a20ac

StartRserve: Rserve>SLF4J: Class path contains multiple SLF4J bindings.

StartRserve: Rserve>SLF4J: Found binding in
[jar:file:/Users/ashish.dalal/Downloads/incubator-zeppelin/interpreter/R/log4j-slf4j-impl-2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]

StartRserve: Rserve>SLF4J: Found binding in
[jar:file:/Users/ashish.dalal/Downloads/incubator-zeppelin/interpreter/R/slf4j-log4j12-1.7.12.jar!/org/slf4j/impl/StaticLoggerBinder.class]

StartRserve: Rserve>SLF4J: Found binding in
[jar:file:/Users/ashish.dalal/Downloads/incubator-zeppelin/zeppelin-interpreter/target/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]

StartRserve: Rserve>SLF4J: Found binding in
[jar:file:/Users/ashish.dalal/Downloads/incubator-zeppelin/zeppelin-server/target/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]

StartRserve: Rserve>SLF4J: Found binding in
[jar:file:/Users/ashish.dalal/Downloads/incubator-zeppelin/zeppelin-zengine/target/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]

StartRserve: Rserve>SLF4J: Found binding in
[jar:file:/Users/ashish.dalal/Downloads/spark/assembly/target/scala-2.10/spark-assembly-1.5.0-SNAPSHOT-hadoop2.2.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]

StartRserve: Rserve>SLF4J: See
http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

StartRserve: Rserve>SLF4J: Actual binding is of type
[org.apache.logging.slf4j.Log4jLoggerFactory]

StartRserve: Rserve>ERROR StatusLogger No log4j2 configuration file found.
Using default configuration: logging only errors to the console.

StartRserve: Rserve>log4j:ERROR setFile(null,true) call failed.

StartRserve: Rserve>java.io.FileNotFoundException:  (No such file or
directory)

StartRserve: Rserve>    at java.io.FileOutputStream.open(Native Method)

StartRserve: Rserve>    at
java.io.FileOutputStream.<init>(FileOutputStream.java:221)

StartRserve: Rserve>    at
java.io.FileOutputStream.<init>(FileOutputStream.java:142)

StartRserve: Rserve>    at
org.apache.log4j.FileAppender.setFile(FileAppender.java:294)

StartRserve: Rserve>    at
org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)

StartRserve: Rserve>    at
org.apache.log4j.DailyRollingFileAppender.activateOptions(DailyRollingFileAppender.java:223)

StartRserve: Rserve>    at
org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)

StartRserve: Rserve>    at
org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)

StartRserve: Rserve>    at
org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)

StartRserve: Rserve>    at
org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:809)

StartRserve: Rserve>    at
org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:735)

StartRserve: Rserve>    at
org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:615)

StartRserve: Rserve>    at
org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:502)

StartRserve: Rserve>    at
org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:547)

StartRserve: Rserve>    at
org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:483)

StartRserve: Rserve>    at
org.apache.log4j.LogManager.<clinit>(LogManager.java:127)

StartRserve: Rserve>    at
org.apache.log4j.Logger.getLogger(Logger.java:104)

StartRserve: Rserve>    at
org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:75)

StartRserve: Rserve>    at
org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)

StartRserve: Rserve>    at
org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:52)

StartRserve: Rserve>    at
org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2034)

StartRserve: Rserve>    at
scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)

StartRserve: Rserve>    at
org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2025)

StartRserve: Rserve>    at
org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:55)

StartRserve: Rserve>    at
org.apache.spark.rpc.akka.AkkaRpcEnvFactory.create(AkkaRpcEnv.scala:253)

StartRserve: Rserve>    at
org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:53)

StartRserve: Rserve>    at
org.apache.spark.SparkEnv$.create(SparkEnv.scala:252)

StartRserve: Rserve>    at
org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:193)

StartRserve: Rserve>    at
org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267)

StartRserve: Rserve>    at
org.apache.spark.SparkContext.<init>(SparkContext.scala:432)

StartRserve: Rserve>    at
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)

StartRserve: Rserve>    at
org.apache.spark.api.r.RRDD$.createSparkContext(RRDD.scala:375)

StartRserve: Rserve>    at
org.apache.spark.api.r.RRDD.createSparkContext(RRDD.scala)

StartRserve: Rserve>    at
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

StartRserve: Rserve>    at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)

StartRserve: Rserve>    at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

StartRserve: Rserve>    at java.lang.reflect.Method.invoke(Method.java:606)

StartRserve: Rserve>    at
org.apache.spark.api.r.RBackendHandler.handleMethodCall(RBackendHandler.scala:132)

StartRserve: Rserve>    at
org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:79)

StartRserve: Rserve>    at
org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:38)

StartRserve: Rserve>    at
io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)

StartRserve: Rserve>    at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)

StartRserve: Rserve>    at
io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)

StartRserve: Rserve>    at
io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)

StartRserve: Rserve>    at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)

StartRserve: Rserve>    at
io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)

StartRserve: Rserve>    at
io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:244)

StartRserve: Rserve>    at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)

StartRserve: Rserve>    at
io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)

StartRserve: Rserve>    at
io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)



14:30:34.949 [pool-3-thread-2] ERROR io.datalayer.zeppelin.R.RInterpreter -
Exception while connecting to Rserve

org.rosuda.REngine.Rserve.RserveException: voidEval failed

        at
org.rosuda.REngine.Rserve.RConnection.voidEval(RConnection.java:209)
~[Rserve-1.8.2-SNAPSHOT.jar:?]

        at
io.datalayer.zeppelin.R.RInterpreter.interpret(RInterpreter.java:117)
[zeppelin-R-1.0.0-SNAPSHOT.jar:?]

        at
io.datalayer.zeppelin.R.RInterpreter.interpret(RInterpreter.java:98)
[zeppelin-R-1.0.0-SNAPSHOT.jar:?]

        at
org.apache.zeppelin.interpreter.ClassloaderInterpreter.interpret(ClassloaderInterpreter.java:57)
[zeppelin-interpreter-0.6.0-incubating-SNAPSHOT.jar:0.6.0-incubating-SNAPSHOT]

        at
org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:93)
[zeppelin-interpreter-0.6.0-incubating-SNAPSHOT.jar:0.6.0-incubating-SNAPSHOT]

        at
org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:276)
[zeppelin-interpreter-0.6.0-incubating-SNAPSHOT.jar:0.6.0-incubating-SNAPSHOT]

        at org.apache.zeppelin.scheduler.Job.run(Job.java:170)
[zeppelin-interpreter-0.6.0-incubating-SNAPSHOT.jar:0.6.0-incubating-SNAPSHOT]

        at
org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:118)
[zeppelin-interpreter-0.6.0-incubating-SNAPSHOT.jar:0.6.0-incubating-SNAPSHOT]

        at
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
[?:1.7.0_79]

        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
[?:1.7.0_79]

        at
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
[?:1.7.0_79]

        at
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
[?:1.7.0_79]

        at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
[?:1.7.0_79]

        at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
[?:1.7.0_79]

        at java.lang.Thread.run(Thread.java:745) [?:1.7.0_79]



14:30:34.995 [pool-3-thread-2] ERROR io.datalayer.zeppelin.R.RInterpreter -
Exception while connecting to Rserve

org.rosuda.REngine.Rserve.RserveException: voidEval failed

        at
org.rosuda.REngine.Rserve.RConnection.voidEval(RConnection.java:209)
~[Rserve-1.8.2-SNAPSHOT.jar:?]

        at
io.datalayer.zeppelin.R.RInterpreter.interpret(RInterpreter.java:117)
[zeppelin-R-1.0.0-SNAPSHOT.jar:?]

        at
org.apache.zeppelin.interpreter.ClassloaderInterpreter.interpret(ClassloaderInterpreter.java:57)
[zeppelin-interpreter-0.6.0-incubating-SNAPSHOT.jar:0.6.0-incubating-SNAPSHOT]

        at
org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:93)
[zeppelin-interpreter-0.6.0-incubating-SNAPSHOT.jar:0.6.0-incubating-SNAPSHOT]

        at
org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:276)
[zeppelin-interpreter-0.6.0-incubating-SNAPSHOT.jar:0.6.0-incubating-SNAPSHOT]

        at org.apache.zeppelin.scheduler.Job.run(Job.java:170)
[zeppelin-interpreter-0.6.0-incubating-SNAPSHOT.jar:0.6.0-incubating-SNAPSHOT]

        at
org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:118)
[zeppelin-interpreter-0.6.0-incubating-SNAPSHOT.jar:0.6.0-incubating-SNAPSHOT]

        at
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
[?:1.7.0_79]

        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
[?:1.7.0_79]

        at
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
[?:1.7.0_79]

        at
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
[?:1.7.0_79]

        at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
[?:1.7.0_79]

        at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
[?:1.7.0_79]

        at java.lang.Thread.run(Thread.java:745) [?:1.7.0_79]

On Thu, Jul 30, 2015 at 9:25 PM, Ashish Dalal <dalalstree...@gmail.com>
wrote:

> I am able to see that R interpreter is registered in the Interpreter Group
> but when I write R code and evaluate it, it throws an error saying
> "voidEval failed".
>
> Thanks
>
> On Thu, Jul 30, 2015 at 8:00 PM, <felixcheun...@hotmail.com> wrote:
>
>> *There* is another effort (that is not GPLv3 licensed) I am part of for
>> a SparkR interpreter.
>>
>> What is the issue you are seeing?
>>
>> From: Ashish Dalal
>> Sent: Wednesday, July 29, 2:02 PM
>> Subject: Adding R interpreter to Zeppelin
>> To: users@zeppelin.incubator.apache.org
>>
>> Hi all,
>>
>> I am working on adding R interpreter to Zeppelin.
>>
>> There is a github repo by datalayer.io  (
>> https://github.com/datalayer/zeppelin-R) and I have added its
>> RInterpreter.java & RInterpreterTest.java along with modifying pom.xml, but
>> I am still not able to run R code in Apache Zeppelin.
>>
>> I am able to see the R interpreter registered in the list of interpreters
>> when I start Zeppelin, but the R code I write is not getting evaluated.
>>
>> Please help if you can.
>>
>> Thanks!
>>
>> Regards,
>>
>> Ashish Dalal
>>
>>
>
1. Go to the GitHub repo at https://github.com/apache/spark and clone it into 
your downloads directory.
2. Go to your home directory and create a .Rprofile file (e.g. with vi).
3. Add the following lines to it to set your SparkR home directory, load 
SparkR, and initialize a Spark context.

Sys.setenv(SPARK_HOME="/Users/ashish.dalal/downloads/spark")
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
library(SparkR)
sc <- sparkR.init(master="local") 

4. Create a .profile file in your home directory and set the following Maven 
option inside it:
     export MAVEN_OPTS="-Xmx512m -XX:MaxPermSize=512m"
5. In the SPARK_HOME setting above, write the full path of the directory where 
Spark is installed between the quotes. Save the file and return to the 
terminal.
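
As a quick sanity check for steps 3-5, the path can be verified from the
shell. This is only a sketch; the $HOME-relative location is an assumption
mirroring the /Users/ashish.dalal/downloads/spark path used above.

```shell
# Assumed checkout location, mirroring the path used earlier in this thread.
export SPARK_HOME="$HOME/downloads/spark"

# After the -Psparkr build in step 6, SparkR's R package should land here,
# which is the directory .Rprofile adds to .libPaths():
echo "$SPARK_HOME/R/lib"
```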

6. Go back to the spark directory in the terminal and run the following 
command to build Spark from source with SparkR enabled:

build/mvn -DskipTests -Psparkr package

7. Wait for the Spark build to finish.
8. Once Spark is built successfully, go to the Apache Zeppelin repo at 
https://github.com/apache/incubator-zeppelin/
and clone it into your downloads directory.

9. Build Zeppelin from source by running the following command:
mvn clean install -DskipTests -Pspark-1.4

10. Once Zeppelin is built successfully, go to the zeppelin-R repo at 
https://github.com/datalayer/zeppelin-R
and clone it into the incubator-zeppelin directory.

11. Change into the zeppelin-R directory and run the following command:
mvn install

12. Once zeppelin-R is built successfully, you will see a target folder in 
your zeppelin-R directory that was not there before.

13. Create conf/zeppelin-site.xml by copying conf/zeppelin-site.xml.template 
to conf/zeppelin-site.xml.

14. Add your interpreter class name to the zeppelin.interpreters property in 
conf/zeppelin-site.xml.

The property value is a comma-separated list of interpreter class names, for 
example:

<property>
<name>zeppelin.interpreters</name>
<value>org.apache.zeppelin.spark.SparkInterpreter,org.apache.zeppelin.spark.PySparkInterpreter,org.apache.zeppelin.spark.SparkSqlInterpreter,org.apache.zeppelin.spark.DepInterpreter,org.apache.zeppelin.markdown.Markdown,org.apache.zeppelin.shell.ShellInterpreter,org.apache.zeppelin.hive.HiveInterpreter,io.datalayer.zeppelin.R.RInterpreter</value>
</property> 

15. Go to the incubator-zeppelin directory, then to its interpreter 
subdirectory, and make a directory named "R".

16. Copy all the dependency jars into the R folder: first copy the jar files 
from the target folder inside the zeppelin-R directory to the R folder inside 
the incubator-zeppelin/interpreter/ directory.

17. Once those jars are copied, go to the zeppelin-R directory and run the 
following command:
mvn dependency:copy-dependencies

18. Once the dependencies are copied, you will see a target/dependency folder 
filled with all the dependencies, including transitive ones.

19. Now copy all the jar files from the dependency folder to the R folder 
inside the interpreter directory.
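
The jar-copy steps above can be sketched in shell as follows. The checkout
locations are assumptions based on the paths that appear earlier in this
thread.

```shell
# Hedged sketch of the jar-copy steps; ZEPPELIN_HOME and the zeppelin-R
# checkout location are assumptions mirroring the paths used in the log.
ZEPPELIN_HOME="$HOME/Downloads/incubator-zeppelin"
ZEPPELIN_R="$ZEPPELIN_HOME/zeppelin-R"

# Directory Zeppelin loads the R interpreter jars from.
mkdir -p "$ZEPPELIN_HOME/interpreter/R"

# Interpreter jar built by `mvn install`, then every (transitive) dependency
# gathered by `mvn dependency:copy-dependencies`.
# `|| true` keeps the sketch safe to run before the jars exist.
cp "$ZEPPELIN_R"/target/*.jar "$ZEPPELIN_HOME/interpreter/R/" 2>/dev/null || true
cp "$ZEPPELIN_R"/target/dependency/*.jar "$ZEPPELIN_HOME/interpreter/R/" 2>/dev/null || true
```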

20. Once the files are copied, go to the zeppelin-zengine directory inside 
the incubator-zeppelin directory and add the RInterpreter class to the 
ZeppelinConfiguration.java file under src/main/java/org/apache/zeppelin/conf/.

21. Now go to the incubator-zeppelin directory and run mvn install 
-DskipTests -Pspark-1.4

22. Install the Rserve package and load it in R.
23. Install the Rserve package and load it in RStudio.
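
Since the first error in the log is a refused connection to Rserve, it may be
worth confirming the daemon is actually running and listening before starting
Zeppelin. A minimal sketch, assuming R is on the PATH and Rserve uses its
default port:

```shell
RSERVE_PORT=6311   # Rserve's default TCP port

if command -v R >/dev/null 2>&1; then
  # Install Rserve once, then start the daemon from a terminal R session so
  # it keeps listening after the session ends.
  R -e 'if (!requireNamespace("Rserve", quietly = TRUE)) install.packages("Rserve", repos = "https://cloud.r-project.org")'
  R CMD Rserve --vanilla
  # "Connection refused" here reproduces the first StartRserve error above.
  nc -z localhost "$RSERVE_PORT" && echo "Rserve is listening on $RSERVE_PORT"
else
  echo "R is not on the PATH"
fi
```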
24. Now go to the incubator-zeppelin directory and run 
./bin/zeppelin-daemon.sh start
25. Open localhost:8080 in a web browser and create a new notebook.
26. Type %r and then write any R code, e.g. print(1:5).

27. At this point I get the voidEval function error shown in the log above.
