Grace (Huang Jie)
From: Jason Dai [mailto:jason@gmail.com]
Sent: Wednesday, March 11, 2015 10:45 PM
To: Irfan Ahmad
Cc: Tobias Pfeiffer; Cheng, Hao; Mohit Anchlia; user@spark.apache.org; Shao,
Saisai; Dai, Jason; Huang, Jie
Subject: Re: SQL with Spark Streaming
Sorry, typo; should be https://github.com/intel-spark/stream-sql
Hi,
On Thu, Mar 12, 2015 at 12:08 AM, Huang, Jie jie.hu...@intel.com wrote:
According to my understanding, your approach is to register a series of
tables by using transformWith, right? And then you can get a new DStream
(i.e., SchemaDStream), which consists of lots of SchemaRDDs.
Yep. A previous prototype is available at
https://github.com/Intel-bigdata/spark-streamsql, and a talk was given at
last year's Spark Summit (
http://spark-summit.org/2014/talk/streamsql-on-spark-manipulating-streams-by-sql-using-spark
)
We are currently porting the prototype to use the latest
Thanks,
-Jason
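For readers following the thread: before a dedicated SchemaDStream type, the usual way to approximate "SQL on a stream" is to register each micro-batch as a temporary table inside foreachRDD. The following is a rough sketch against the Spark 1.x Java API, not code from the prototype itself; the host/port and the table and column names are invented for illustration.

```java
// Sketch: run SQL over each micro-batch of a DStream (Spark 1.x APIs).
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SQLContext;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.StructField;
import org.apache.spark.sql.types.StructType;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class StreamSqlSketch {
    public static void main(String[] args) throws Exception {
        SparkConf conf = new SparkConf().setAppName("StreamSqlSketch");
        JavaStreamingContext jssc = new JavaStreamingContext(conf, new Duration(2000));

        // Example source: lines of text from a socket (host/port are made up).
        JavaDStream<String> lines = jssc.socketTextStream("localhost", 9999);

        lines.foreachRDD(rdd -> {
            // Fetch the singleton SQLContext inside the closure rather than
            // capturing one from the driver scope.
            SQLContext sql = SQLContext.getOrCreate(rdd.context());
            JavaRDD<Row> rows = rdd.map(RowFactory::create);
            StructType schema = DataTypes.createStructType(new StructField[] {
                DataTypes.createStructField("line", DataTypes.StringType, false) });
            // Register just this batch's data as a temp table, then query it.
            sql.createDataFrame(rows, schema).registerTempTable("lines");
            sql.sql("SELECT line, COUNT(*) AS c FROM lines GROUP BY line").show();
        });

        jssc.start();
        jssc.awaitTermination();
    }
}
```

The temp table only ever holds the current batch, so each query runs over one micro-batch at a time; windowed queries would need window() on the DStream first.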
On Wed, Mar 11, 2015 at 10:19 PM, Irfan Ahmad ir...@cloudphysics.com
wrote:
Got a 404 on that link: https://github.com/Intel-bigdata/spark-streamsql
Irfan Ahmad
CTO | Co-Founder | CloudPhysics (http://www.cloudphysics.com)
Best of VMworld Finalist
Best Cloud Management Award
NetworkWorld 10 Startups to Watch
EMA Most Notable Vendor
On Wed, Mar 11, 2015 at 6:41 AM,
Hi,
On Wed, Mar 11, 2015 at 9:33 AM, Cheng, Hao hao.ch...@intel.com wrote:
Intel has a prototype for doing this, SaiSai and Jason are the authors.
Probably you can ask them for some materials.
The github repository is here: https://github.com/intel-spark/stream-sql
Also, what I did is
Does Spark Streaming also support SQL? Something like how Esper does CEP.
From: Mohit Anchlia [mailto:mohitanch...@gmail.com]
Sent: Wednesday, March 11, 2015 8:12 AM
To: user@spark.apache.org
Subject: SQL with Spark Streaming
Does Spark Streaming also support SQL? Something like how Esper does CEP.
00:06:45 -0500
Subject: Re: Not Serializable exception when integrating SQL and Spark Streaming
To: bigdat...@live.com
CC: lian.cs@gmail.com; user@spark.apache.org
The various spark contexts generally aren't serializable because you can't use
them on the executors anyway. We made SQLContext
Generally you can use -Dsun.io.serialization.extendedDebugInfo=true to
enable serialization debugging information when serialization exceptions
are raised.
On 12/24/14 1:32 PM, bigdata4u wrote:
I am trying to use SQL over Spark Streaming using Java, but I am getting a
serialization exception.
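To make the suggested flag concrete, here is a minimal, self-contained illustration; the class names (FakeContext, Task) are invented for the example and are not from the poster's code. Serializing an object that drags in a non-serializable field fails with NotSerializableException, and re-running the JVM with -Dsun.io.serialization.extendedDebugInfo=true adds the chain of fields that led to the offending object to the stack trace.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class SerializationDebugDemo {
    // Stand-in for a context-like object that cannot be serialized.
    static class FakeContext { }

    // A "task" that accidentally captures the non-serializable context.
    static class Task implements Serializable {
        final FakeContext ctx = new FakeContext(); // the offending field
    }

    /** Tries to serialize a Task; returns "ok" or the failure description. */
    static String attemptSerialize() {
        try (ObjectOutputStream out =
                 new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(new Task());
            return "ok";
        } catch (IOException e) {
            // With -Dsun.io.serialization.extendedDebugInfo=true the stack
            // trace additionally shows the field path (Task.ctx) being written.
            return e.getClass().getSimpleName() + ": " + e.getMessage();
        }
    }

    public static void main(String[] args) {
        // → NotSerializableException: SerializationDebugDemo$FakeContext
        System.out.println(attemptSerialize());
    }
}
```

The same mechanism applies to Spark closures: the exception names the non-serializable class, and the debug flag shows which captured field pulled it in.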
Date: Wed, 24 Dec 2014 16:23:30 +0800
From: lian.cs@gmail.com
To: bigdat...@live.com; user@spark.apache.org
Subject: Re: Not Serializable exception when integrating SQL and Spark Streaming
Generally you can use -Dsun.io.serialization.extendedDebugInfo=true to
enable serialization debugging information when serialization exceptions
are raised.
I am trying to use SQL over Spark Streaming using Java, but I am getting a
serialization exception.
public static void main(String[] args) {
    SparkConf sparkConf = new SparkConf().setAppName("NumberCount");
    JavaSparkContext jc = new JavaSparkContext(sparkConf);
    JavaStreamingContext jssc