Hi All,
I currently do the following:

val jsonDStream = getJsonDStream()
jsonDStream.foreachRDD { rdd =>
  val jsonDF = spark.read.json(rdd)
  jsonDF.createOrReplaceTempView("dataframe")
}
client.startStream()
%spark.sql select * from dataframe
I can see the data, and every time I click run
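The behaviour being described follows from createOrReplaceTempView replacing the named view on every micro-batch, so a later %spark.sql paragraph only ever sees the most recent batch. Here is a toy model of that replace-per-batch semantics in plain Scala, without Spark (all names here are illustrative stand-ins, not Spark API):

```scala
import scala.collection.mutable

object TempViewModel {
  // Registry of view name -> current contents, standing in for the
  // session catalog that createOrReplaceTempView writes into.
  val views = mutable.Map.empty[String, Seq[String]]

  // Stand-in for foreachRDD: invokes the handler once per micro-batch.
  def foreachRDD(batches: Seq[Seq[String]])(handler: Seq[String] => Unit): Unit =
    batches.foreach(handler)

  def main(args: Array[String]): Unit = {
    val batches = Seq(
      Seq("""{"id":1}"""),
      Seq("""{"id":2}""", """{"id":3}""")
    )
    foreachRDD(batches) { rdd =>
      // createOrReplaceTempView: the latest batch replaces the view.
      views("dataframe") = rdd
    }
    // A later "select * from dataframe" only sees the most recent batch.
    println(views("dataframe").size) // prints 2
  }
}
```

The point of the sketch: the view is overwritten on each batch, not appended to, which is why re-running the SQL paragraph shows fresh data rather than an accumulating table.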
I'm building Zeppelin from source. I suppose that means the "default spark
interpreter", which has my custom Spark deps, is included in the built Zeppelin
dist. That solves my problem! Thanks for the explanation.
2017-04-23 5:08 GMT+02:00 moon soo Lee :
> Hi,
>
> 'conf/interpreter-list'
Thanks, moon. Unfortunately I’m not an admin for the system I’m using and don’t
control when it gets updated.
Do you happen to know which version of Zeppelin this issue was fixed in? Was it
only fixed in 0.7.2?
Thanks, Lucas.
From: moon soo Lee [mailto:m...@apache.org]
Sent: 22 April 2017 06:32
To:
@Chaoran Yu Yeah, I don't think it's a dependency issue. You wouldn't be able
to call the methods at all if you were missing dependencies.
I am also in a similar boat, though I am trying to get Spark Streaming and
Zeppelin to work, except that I have my own indirect receiver (not the direct
stream). That Twitter example is
@Chaoran Yu I finally got it working. Here is my code. I usually code in
Java but tried to convert it into Scala below.
import spark.implicits._
import org.apache.spark.SparkContext
import org.apache.spark.streaming._

// Scala, not Java: no type-first declarations or trailing semicolons,
// and classOf[Hello] instead of Hello.class
val sparkConf = sc.getConf
sparkConf.setJars(SparkContext.jarOfClass(classOf[Hello]).toSeq)
val