RE: Re: Spark assembly in Maven repo?

2015-12-14 Thread Xiaoyong Zhu
Thanks for the info!

Xiaoyong

From: Sean Owen [mailto:so...@cloudera.com]
Sent: Monday, December 14, 2015 12:20 AM
To: Xiaoyong Zhu <xiaoy...@microsoft.com>
Cc: user <user@spark.apache.org>
Subject: Re: Re: Spark assembly in Maven repo?


Yes, though I think the Maven Central repository is more canonical.

http://repo1.maven.org/maven2/org/apache/spark/spark-core_2.10/1.5.2/

On Mon, Dec 14, 2015, 06:35 Xiaoyong Zhu <xiaoy...@microsoft.com> wrote:
Thanks! Do you mean something here (for example, for 1.5.1 using Scala 2.10)?
https://repository.apache.org/content/repositories/releases/org/apache/spark/spark-core_2.10/1.5.1/

Xiaoyong

From: Sean Owen [mailto:so...@cloudera.com]
Sent: Saturday, December 12, 2015 12:45 AM
To: Xiaoyong Zhu <xiaoy...@microsoft.com>
Cc: user <user@spark.apache.org>

Subject: Re: Re: Spark assembly in Maven repo?

That's exactly what the various artifacts in the Maven repo are for. The API 
classes for core are in the core artifact and so on. You don't need an assembly.
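For readers landing on this thread: depending on the per-module artifacts instead of an assembly is an ordinary pom.xml dependency (the version here is just an example; spark-sql, spark-streaming, etc. work the same way):

```xml
<!-- Pull the Spark core API classes from Maven Central. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.5.2</version>
  <!-- 'provided' is typical when the cluster supplies Spark at runtime. -->
  <scope>provided</scope>
</dependency>
```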

On Sat, Dec 12, 2015 at 12:32 AM, Xiaoyong Zhu <xiaoy...@microsoft.com> wrote:
Yes, our scenario is to treat the Spark assembly as an "SDK" so users can develop Spark applications easily without downloading Spark themselves. In this case, which way do you think would be good?

Xiaoyong

From: fightf...@163.com [mailto:fightf...@163.com]
Sent: Friday, December 11, 2015 12:08 AM
To: Mark Hamstra <m...@clearstorydata.com>
Cc: Xiaoyong Zhu <xiaoy...@microsoft.com>; Jeff Zhang <zjf...@gmail.com>; user <user@spark.apache.org>; Zhaomin Xu <z...@microsoft.com>; Joe Zhang (SDE) <gui...@microsoft.com>
Subject: Re: Re: Spark assembly in Maven repo?

Agreed that an assembly jar is not good to publish. However, what he really needs is to fetch an updatable Maven jar file.


fightf...@163.com

From: Mark Hamstra <m...@clearstorydata.com>
Date: 2015-12-11 15:34
To: fightf...@163.com
Cc: Xiaoyong Zhu <xiaoy...@microsoft.com>; Jeff Zhang <zjf...@gmail.com>; user <user@spark.apache.org>; Zhaomin Xu <z...@microsoft.com>; Joe Zhang (SDE) <gui...@microsoft.com>
Subject: Re: RE: Spark assembly in Maven repo?
No, publishing a Spark assembly jar is not fine. See the doc attached to https://issues.apache.org/jira/browse/SPARK-11157 and be aware that a likely goal of Spark 2.0 will be the elimination of assemblies.

On Thu, Dec 10, 2015 at 11:19 PM, fightf...@163.com <fightf...@163.com> wrote:
Using Maven to download the assembly jar is fine. I would recommend deploying this assembly jar to your local Maven repo, e.g. a Nexus repo, or more likely a snapshot repository.


fightf...@163.com

From: Xiaoyong Zhu <xiaoy...@microsoft.com>
Date: 2015-12-11 15:10
To: Jeff Zhang <zjf...@gmail.com>
Cc: user@spark.apache.org; Zhaomin Xu <z...@microsoft.com>; Joe Zhang (SDE) <gui...@microsoft.com>
Subject: RE: Spark assembly in Maven repo?
Sorry, I didn't make it clear. It's actually not a "dependency"; rather, we are building a plugin for IntelliJ with which we want to distribute this jar. But since the jar is updated frequently, we don't want to distribute it together with our plugin; we would like to download it via Maven.

In this case, what's the recommended way?

RE: Re: Spark assembly in Maven repo?

2015-12-13 Thread Xiaoyong Zhu
Thanks! Do you mean something here (for example, for 1.5.1 using Scala 2.10)?
https://repository.apache.org/content/repositories/releases/org/apache/spark/spark-core_2.10/1.5.1/

Xiaoyong



RE: Re: Spark assembly in Maven repo?

2015-12-11 Thread Xiaoyong Zhu
Yes, our scenario is to treat the Spark assembly as an "SDK" so users can develop Spark applications easily without downloading Spark themselves. In this case, which way do you think would be good?

Xiaoyong


RE: Spark assembly in Maven repo?

2015-12-10 Thread Xiaoyong Zhu
Sorry, I didn't make it clear. It's actually not a "dependency"; rather, we are building a plugin for IntelliJ with which we want to distribute this jar. But since the jar is updated frequently, we don't want to distribute it together with our plugin; we would like to download it via Maven.

In this case, what's the recommended way?

Xiaoyong
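One way to do that download step (a sketch, assuming the plugin can shell out to a local Maven installation; the coordinates and versions are examples):

```shell
# Resolve the artifact (and its pom) into the local repository (~/.m2):
mvn org.apache.maven.plugins:maven-dependency-plugin:2.10:get \
    -Dartifact=org.apache.spark:spark-core_2.10:1.5.2

# Or copy the jar itself into a directory the plugin controls:
mvn org.apache.maven.plugins:maven-dependency-plugin:2.10:copy \
    -Dartifact=org.apache.spark:spark-core_2.10:1.5.2 \
    -DoutputDirectory=./lib
```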

From: Jeff Zhang [mailto:zjf...@gmail.com]
Sent: Thursday, December 10, 2015 11:03 PM
To: Xiaoyong Zhu <xiaoy...@microsoft.com>
Cc: user@spark.apache.org
Subject: Re: Spark assembly in Maven repo?

I don't think making the assembly jar a dependency is good practice. You may run into jar-hell issues in that case.

On Fri, Dec 11, 2015 at 2:46 PM, Xiaoyong Zhu <xiaoy...@microsoft.com> wrote:
Hi Experts,

We have a project which has a dependency on the following jar:

spark-assembly-<spark-version>-hadoop<hadoop-version>.jar

for example:
spark-assembly-1.4.1.2.3.3.0-2983-hadoop2.7.1.2.3.3.0-2983.jar

Since this assembly might be updated in the future, is there a Maven repo that has the above Spark assembly jar? Or should we create and upload it to Maven Central?

Thanks!

Xiaoyong




--
Best Regards

Jeff Zhang


Spark assembly in Maven repo?

2015-12-10 Thread Xiaoyong Zhu
Hi Experts,

We have a project which has a dependency on the following jar:

spark-assembly-<spark-version>-hadoop<hadoop-version>.jar

for example:
spark-assembly-1.4.1.2.3.3.0-2983-hadoop2.7.1.2.3.3.0-2983.jar

Since this assembly might be updated in the future, is there a Maven repo that has the above Spark assembly jar? Or should we create and upload it to Maven Central?

Thanks!

Xiaoyong



how to get Spark stage DAGs through the REST APIs?

2015-11-03 Thread Xiaoyong Zhu
Hi experts

It seems that the Spark stage DAGs below are available in the Spark UI / Spark History Server; however, they are not available from any of the Spark REST APIs.
Am I missing something, or is there a way to get this kind of data from the REST APIs? We are using Spark on YARN.
[attachment: image001.png, a screenshot of a stage DAG]

Xiaoyong
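For context: the monitoring REST API (served under /api/v1 on the UI and history server since Spark 1.4) does expose per-stage data as JSON, though not the rendered DAG visualization itself, so whether it covers this need depends on the use case. A sketch of reducing such a response (the payload below is illustrative, not captured from a real cluster):

```python
import json

# Illustrative JSON in the shape returned by
# GET /api/v1/applications/<app-id>/stages (fields abbreviated).
sample_response = json.dumps([
    {"stageId": 3, "status": "COMPLETE", "name": "saveAsTextFile at Demo.scala:42", "numTasks": 8},
    {"stageId": 2, "status": "ACTIVE", "name": "map at Demo.scala:40", "numTasks": 8},
])

def stage_summaries(payload):
    """Reduce a /stages response to (stageId, status, name) tuples, newest stage first."""
    return sorted(
        ((s["stageId"], s["status"], s["name"]) for s in json.loads(payload)),
        reverse=True,
    )

for stage_id, status, name in stage_summaries(sample_response):
    print("stage %d [%s] %s" % (stage_id, status, name))
```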



is there a way to interact with Spark clusters remotely?

2014-12-15 Thread Xiaoyong Zhu
Hi experts

I am wondering if there is a way to interact with Spark remotely, i.e. no access to the cluster required, but instead submitting Python/Scala scripts to the cluster and getting results back via (REST) APIs.
That would facilitate the development process a lot.

Xiaoyong


RE: is there a way to interact with Spark clusters remotely?

2014-12-15 Thread Xiaoyong Zhu
Thanks all for your information! What Pietro mentioned seems to be the appropriate solution. I also found a slide deck talking about it:
http://www.slideshare.net/EvanChan2/spark-summit-2014-spark-job-server-talk
Several quick questions:

1. Is it already available in the Spark main branch? (It seems not, but I am not sure whether it is planned.)

2. It seems that the current job server can only submit Java jars (or Scala, I guess?). Is there any plan to support Python in the future?

Thanks, and any information would be appreciated!

Xiaoyong

From: Pietro Gentile [mailto:pietro.gentil...@gmail.com]
Sent: Monday, December 15, 2014 10:33 PM
To: Xiaoyong Zhu
Subject: Re: is there a way to interact with Spark clusters remotely?

Hi,

try this: https://github.com/spark-jobserver/spark-jobserver.

Best Regards,

Pietro Gentile
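For a concrete feel of the jobserver model, usage looks roughly like the examples in its README (a sketch; the host, port, app name, and example class follow the project's docs and are not verified here):

```shell
# Upload an application jar under an app name:
curl --data-binary @target/job.jar localhost:8090/jars/wordcount

# Submit a job from that jar; the server answers with a job id as JSON:
curl -d "input.string = a b c a b" \
  'localhost:8090/jobs?appName=wordcount&classPath=spark.jobserver.WordCountExample'

# Poll the job id for status and the result:
curl localhost:8090/jobs/<job-id>
```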








Spark Streaming Python APIs?

2014-12-14 Thread Xiaoyong Zhu
Hi Spark experts,

Are there any Python APIs for Spark Streaming? I didn't find them in the Spark Streaming programming guide:
http://spark.apache.org/docs/latest/streaming-programming-guide.html

Xiaoyong



RE: Spark Streaming Python APIs?

2014-12-14 Thread Xiaoyong Zhu
Cool thanks!

Xiaoyong

From: Shao, Saisai [mailto:saisai.s...@intel.com]
Sent: Monday, December 15, 2014 10:57 AM
To: Xiaoyong Zhu
Cc: user@spark.apache.org
Subject: RE: Spark Streaming Python APIs?

AFAIK, this will be a new feature in version 1.2; you can check out the master branch or the 1.2 branch to give it a try.

Thanks
Jerry
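For reference, the Python streaming API that landed in the 1.2 branch looks roughly like the classic network word count below (a sketch; it assumes a Spark 1.2+ installation with the pyspark streaming module on the path and a text source listening on port 9999):

```python
from pyspark import SparkContext
from pyspark.streaming import StreamingContext

sc = SparkContext(appName="PythonStreamingWordCount")
ssc = StreamingContext(sc, batchDuration=1)  # 1-second micro-batches

# Count words arriving on a local socket, printing each batch's counts.
lines = ssc.socketTextStream("localhost", 9999)
counts = (lines.flatMap(lambda line: line.split(" "))
               .map(lambda word: (word, 1))
               .reduceByKey(lambda a, b: a + b))
counts.pprint()

ssc.start()
ssc.awaitTermination()
```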




RE: Spark Streaming Python APIs?

2014-12-14 Thread Xiaoyong Zhu
Btw I have seen the Python-related docs in the 1.2 docs here:
http://people.apache.org/~pwendell/spark-1.2.0-rc2-docs/streaming-programming-guide.html

Xiaoyong




Spark SQL Roadmap?

2014-12-13 Thread Xiaoyong Zhu
Dear Spark experts, I am very interested in Spark SQL's availability going forward. Could someone share information about the following questions?

1. Is there an ETA for the Spark SQL release?

2. I heard there is also a Hive on Spark program. What's the difference between Spark SQL and Hive on Spark?

Thanks!
Xiaoyong


RE: Spark SQL Roadmap?

2014-12-13 Thread Xiaoyong Zhu
Thanks Denny for the information!
For #1, what I meant is the Spark SQL beta/official release date (as today it is still in the alpha phase). Though I see it already has most of the basic functionality, I don't know when the next milestone, i.e. beta, will happen.
For #2, thanks for the information! I read it and it's really useful! My take is that Hive on Spark is still Hive (thus keeping all the metastore information and Hive interfaces such as the REST APIs), while Spark SQL is an extension of Spark that uses several interfaces (HiveContext, for example) to support running Hive queries. Is this correct?

A follow-up question: does Spark SQL have REST APIs, like those WebHCat exposes, to help users submit queries remotely, rather than logging into a cluster and executing commands in the spark-sql command line?

Xiaoyong

From: Denny Lee [mailto:denny.g@gmail.com]
Sent: Saturday, December 13, 2014 10:59 PM
To: Xiaoyong Zhu; user@spark.apache.org
Subject: Re: Spark SQL Roadmap?

Hi Xiaoyong,

SparkSQL has already been released and has been part of the Spark code-base since Spark 1.0.  The latest stable release is Spark 1.1 (here's the Spark SQL Programming Guide: http://spark.apache.org/docs/1.1.0/sql-programming-guide.html) and we're currently voting on Spark 1.2.

Hive on Spark is an initiative by Cloudera to help folks who are already using Hive; instead of using traditional MR it will utilize Spark.  For more information, check out http://blog.cloudera.com/blog/2014/07/apache-hive-on-apache-spark-motivations-and-design-principles/.

For anyone who is building new projects in Spark, IMHO I would suggest jumping 
to SparkSQL first.

HTH!
Denny
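On the remote-access question above: Spark SQL does not expose a WebHCat-style REST API, but since 1.1 it ships a HiveServer2-compatible Thrift JDBC/ODBC server, so queries can be submitted remotely over JDBC rather than from a local spark-sql shell (a sketch; paths are relative to a Spark distribution, and the master and port are examples):

```shell
# Start the Thrift JDBC/ODBC server; it accepts the same options as spark-submit:
./sbin/start-thriftserver.sh --master yarn-client

# Connect from a remote client with the bundled beeline shell and run SQL:
./bin/beeline -u jdbc:hive2://localhost:10000
```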

