RE: Spark error when loading phoenix-spark dependency

2016-09-05 Thread Vikash Kumar
Hi,
I am loading the library through the UI of the Spark interpreter as follows (an equivalent %dep sketch appears after the list):


1. org.apache.phoenix:phoenix-spark:4.4.0-HBase-1.1

   Excluded: org.scala-lang:scala-library, org.scala-lang:scala-compiler,
   org.scala-lang:scala-reflect, org.apache.phoenix:phoenix-core

2. org.apache.phoenix:phoenix-core:4.4.0-HBase-1.1

   Excluded: com.sun.jersey:jersey-core, com.sun.jersey:jersey-server,
   com.sun.jersey:jersey-client, org.ow2.asm:asm, io.netty:netty
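
For reference, the same two dependencies with the same exclusions can also be expressed with Zeppelin's dynamic dependency loader; a sketch, assuming the %dep interpreter is enabled and the paragraph runs before the Spark interpreter starts:

%dep
z.reset()  // clear any previously loaded artifacts
z.load("org.apache.phoenix:phoenix-spark:4.4.0-HBase-1.1")
  .excludes("org.scala-lang:scala-library,org.scala-lang:scala-compiler,org.scala-lang:scala-reflect,org.apache.phoenix:phoenix-core")
z.load("org.apache.phoenix:phoenix-core:4.4.0-HBase-1.1")
  .excludes("com.sun.jersey:jersey-core,com.sun.jersey:jersey-server,com.sun.jersey:jersey-client,org.ow2.asm:asm,io.netty:netty")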

Thanks and Regards,
Vikash Kumar

From: astros...@gmail.com [mailto:astros...@gmail.com] On Behalf Of Hyung Sung Shim
Sent: Tuesday, September 6, 2016 10:47 AM
To: users
Subject: Re: Spark error when loading phoenix-spark dependency

Hello.
How did you load library?


2016-09-06 13:49 GMT+09:00 Vikash Kumar:
Hi,
Is there anyone who is getting the same errors?

Thanks and Regards,
Vikash Kumar

From: Vikash Kumar [mailto:vikash.ku...@resilinc.com]
Sent: Thursday, September 1, 2016 11:08 AM
To: users@zeppelin.apache.org
Subject: Spark error when loading phoenix-spark dependency

Hi all,
I am getting the following error when loading the org.apache.phoenix:phoenix-spark:4.4.0-HBase-1.1 dependency from the Spark interpreter. I am using Zeppelin version 0.6.2-SNAPSHOT with Spark 1.6.1 and HDP 2.7.1.

The packages that I am importing are:
import org.apache.phoenix.spark._
import org.apache.phoenix.spark.PhoenixRDD._
import java.sql.{Date, Timestamp}
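
For context, these imports are typically used along the following lines (a sketch only; the table name, columns, and ZooKeeper quorum here are hypothetical placeholders, and the error below occurs before any such code runs):

val rdd = sc.phoenixTableAsRDD(
  "MY_TABLE",                      // hypothetical Phoenix table
  Seq("ID", "CREATED_AT"),         // hypothetical columns to select
  zkUrl = Some("zk-host:2181")     // hypothetical ZooKeeper quorum
)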
My build command is:
mvn clean package -DskipTests -Drat.ignoreErrors=true -Dcheckstyle.skip=true -Pspark-1.6 -Dspark.version=1.6.1 -Phadoop-2.6 -Pyarn
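
Since a NoSuchMethodError on a Spark internal class often indicates a mismatch between the Spark version Zeppelin was built against and the one it runs against, the interpreter's actual Spark version can be checked from a notebook paragraph once it opens; a trivial sketch:

sc.version  // expected to print 1.6.1 if the runtime matches the build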


java.lang.NoSuchMethodError: org.apache.spark.util.Utils$.resolveURIs(Ljava/lang/String;)Ljava/lang/String;
    at org.apache.spark.repl.SparkILoop$.getAddedJars(SparkILoop.scala:1079)
    at org.apache.spark.repl.SparkILoop.createInterpreter(SparkILoop.scala:210)
    at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:698)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:93)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:341)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:176)
    at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)





Thanks and Regards,
Vikash Kumar



Re: Spark error when loading phoenix-spark dependency

2016-09-05 Thread Hyung Sung Shim
You can refer to [1] for how to load a dependency library.

[1]
http://zeppelin.apache.org/docs/0.7.0-SNAPSHOT/interpreter/spark.html#dependencyloading



RE: Spark error when loading phoenix-spark dependency

2016-09-05 Thread Vikash Kumar
Hi,
Is there anyone who is getting the same errors?

Thanks and Regards,
Vikash Kumar



Spark External JAR and Netty issue

2016-09-05 Thread Michael Pedersen
Hello,

I'm trying to include an external JAR file following the documentation for
dependency management:

https://zeppelin.apache.org/docs/latest/manual/dependencymanagement.html

The docs state, under "Loading Dependencies to Interpreter", that: "You can
enter not only groupId:artifactId:version but also local file in artifact
field". So, I enter the full path to my local JAR file.

But when running a Spark job through the notebook, execution hangs and a
Netty NoSuchMethod error message appears in the log; a full stack trace is
copied below. This is using the latest binaries for Zeppelin (0.6.1),
running locally.

Does anybody have a suggestion for how this can be resolved? Do I need to
provide the required Netty dependencies myself?
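
One direction I am wondering about: if the assembly bundles its own (older) Netty, it could clash with the Netty that Spark ships, so shading Netty inside the assembly might avoid the conflict. A sketch, assuming the JAR is built with sbt-assembly 0.14+ (not verified):

// build.sbt (sketch): rename the bundled io.netty packages so they
// cannot collide with the Netty classes that Spark/Zeppelin provide
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("io.netty.**" -> "shadednetty.@1").inAll
)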

Many thanks,
Michael

Excerpt from Zeppelin logs:

INFO [2016-09-05 17:20:53,794] ({Executor task launch worker-0} Logging.scala[logInfo]:54) - Fetching spark://192.168.1.81:54957/jars/[...]-assembly-1.0.jar to /private/var/folders/1v/_8cfp9313dv1sdmdhwggyp94gn/T/spark-a0b6d56e-a884-4de3-8ded-65b1c21717f3/userFiles-98918d40-5946-4926-aa06-c5808cea7120/fetchFileTemp536145571300926075.tmp

ERROR [2016-09-05 17:20:53,831] ({shuffle-server-1} TransportRequestHandler.java[operationComplete]:200) - Error sending result StreamResponse{streamId=/jars/[...]-assembly-1.0.jar, byteCount=32640349, body=FileSegmentManagedBuffer{file=[...]/zeppelin-0.6.1-bin-netinst/local-repo/2BVRQKH3U/[...]-assembly-1.0.jar, offset=0, length=32640349}} to /192.168.1.81:54959; closing connection

io.netty.handler.codec.EncoderException: java.lang.NoSuchMethodError: io.netty.channel.DefaultFileRegion.<init>(Ljava/io/File;JJ)V
    at io.netty.handler.codec.MessageToMessageEncoder.write(MessageToMessageEncoder.java:107)
    at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:658)
    at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:716)
    at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:651)
    at io.netty.handler.timeout.IdleStateHandler.write(IdleStateHandler.java:266)
    at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:658)
    at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:716)
    at io.netty.channel.AbstractChannelHandlerContext.writeAndFlush(AbstractChannelHandlerContext.java:706)
    at io.netty.channel.AbstractChannelHandlerContext.writeAndFlush(AbstractChannelHandlerContext.java:741)
    at io.netty.channel.DefaultChannelPipeline.writeAndFlush(DefaultChannelPipeline.java:895)
    at io.netty.channel.AbstractChannel.writeAndFlush(AbstractChannel.java:240)
    at org.apache.spark.network.server.TransportRequestHandler.respond(TransportRequestHandler.java:193)
    at org.apache.spark.network.server.TransportRequestHandler.processStreamRequest(TransportRequestHandler.java:149)
    at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:110)
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:119)
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51)
    at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
    at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:254)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
    at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:787)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:130)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)

at