Spark YARN Shuffle service wire compatibility

2015-10-22 Thread Jong Wook Kim
Hi, I’d like to know if there is a guarantee that the Spark YARN shuffle service 
maintains wire compatibility between 1.x versions.

I could run a Spark 1.5 job against YARN NodeManagers running the 1.4 shuffle 
service, but that might have been just a coincidence.

We are now upgrading CDH from 5.3 to 5.4, whose NodeManagers already have the 
1.3 shuffle service on their classpath, and I am concerned that it might not 
always be compatible with 1.5 jobs that expect the shuffle service.
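
For background (this is from the Spark docs on running on YARN, not from this 
thread): the shuffle service a job talks to is whatever 
spark-<version>-yarn-shuffle.jar sits on the NodeManager classpath, registered 
as a YARN auxiliary service. A minimal sketch of that registration in 
yarn-site.xml:

```xml
<!-- yarn-site.xml: register Spark's external shuffle service on each NodeManager -->
<property>
  <name>yarn.nodemanager.aux-services</name>
  <value>mapreduce_shuffle,spark_shuffle</value>
</property>
<property>
  <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
  <value>org.apache.spark.network.yarn.YarnShuffleService</value>
</property>
```

Applications opt in with spark.shuffle.service.enabled=true, so the job-side 
Spark version and the NodeManager-side jar version can differ, which is exactly 
where the wire-compatibility question arises.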


Jong Wook
-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Spark as a service

2015-03-25 Thread Irfan Ahmad
You're welcome. How did it go?


*Irfan Ahmad*
CTO | Co-Founder | *CloudPhysics* http://www.cloudphysics.com
Best of VMworld Finalist
Best Cloud Management Award
NetworkWorld 10 Startups to Watch
EMA Most Notable Vendor

On Wed, Mar 25, 2015 at 7:53 AM, Ashish Mukherjee 
ashish.mukher...@gmail.com wrote:

 Thank you





Spark as a service

2015-03-24 Thread Ashish Mukherjee
Hello,

As of now, if I have to execute a Spark job, I need to create a jar and
deploy it. If I need to run dynamically formed SQL from a Web application,
is there any way of using Spark SQL in this manner? Perhaps through a Web
Service or something similar.

Regards,
Ashish


Re: Spark as a service

2015-03-24 Thread Jeffrey Jedele
Hi Ashish,
this might be what you're looking for:

https://spark.apache.org/docs/latest/sql-programming-guide.html#running-the-thrift-jdbcodbc-server

Regards,
Jeff
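
To make the linked guide concrete for later readers: once Spark is built with 
Thrift-server support, the server is started from sbin and any 
HiveServer2-compatible JDBC client can connect to it. A command sketch (the 
script names and the default port 10000 are from the linked guide; the master 
URL here is an assumption for a local setup):

```shell
# start the Thrift JDBC/ODBC server (accepts the usual spark-submit options)
./sbin/start-thriftserver.sh --master local[2]

# connect with the bundled Beeline client and run SQL interactively
./bin/beeline -u jdbc:hive2://localhost:10000
```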




Re: Spark as a service

2015-03-24 Thread Jeffrey Jedele
I don't think there's a general approach to that - the use cases are just
too different. If you really need it, you will probably have to implement it
yourself in the driver of your application.

PS: Make sure to use the reply-to-all button so that the mailing list is
included in your reply. Otherwise only I will get your mail.

Regards,
Jeff

2015-03-24 12:01 GMT+01:00 Ashish Mukherjee ashish.mukher...@gmail.com:

 Hi Jeffrey,

 Thanks. Yes, this resolves the SQL problem. My bad - I was looking for
 something which would work for Spark Streaming and other Spark jobs too,
 not just SQL.

 Regards,
 Ashish





Re: Spark as a service

2015-03-24 Thread Todd Nist
Perhaps this project, https://github.com/calrissian/spark-jetty-server,
could help with your requirements.




Re: Spark as a service

2015-03-24 Thread Irfan Ahmad
Also look at the Spark Kernel and Spark Job Server projects.

Irfan



Re: Hive on Spark with Spark as a service on CDH5.2

2015-03-17 Thread Arush Kharbanda
Hive on Spark and accessing a HiveContext from the shell are separate things.

Hive on Spark:
https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started

To access Hive from Spark you need a build with -Phive:

http://spark.apache.org/docs/1.2.1/building-spark.html#building-with-hive-and-jdbc-support
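
Per the linked 1.2.1 build page, the Hive profiles are added to the Maven 
invocation; a sketch (the YARN and Hive profile names are as documented there, 
and the exact Hadoop profile depends on your cluster):

```shell
# build Spark 1.2.x with Hive and Thrift-server support
mvn -Pyarn -Phive -Phive-thriftserver -DskipTests clean package
```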




-- 


*Arush Kharbanda* || Technical Teamlead

ar...@sigmoidanalytics.com || www.sigmoidanalytics.com


Hive on Spark with Spark as a service on CDH5.2

2015-03-17 Thread anu
*I am not clear whether Spark SQL supports Hive on Spark when Spark is run as a
service in CDH 5.2.*

Can someone please clarify this? If it is possible, what configuration
changes do I have to make to import a HiveContext in the Spark shell, as well
as to be able to do a spark-submit for the job to run on the entire cluster?



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Hive-on-Spark-with-Spark-as-a-service-on-CDH5-2-tp22091.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.




Creating Apache Spark-powered “As Service” applications

2015-01-16 Thread olegshirokikh
The question is about ways to create a Windows desktop-based and/or
web-based client application that is able to connect and talk, at run time,
to a server hosting a Spark application (either local or on-premise cloud
distributions).

Any language/architecture may work. So far, I've seen two things that may
help with that, but I'm not yet sure whether they would be the best
alternative or how they work:

1. Spark Job Server - https://github.com/spark-jobserver/spark-jobserver -
defines a REST API for Spark
2. Hue -
http://gethue.com/get-started-with-spark-deploy-spark-server-and-compute-pi-from-your-web-browser/
- uses item 1)

Any advice would be appreciated. A simple toy example program (or steps)
showing, e.g., how to build such a client that simply creates a Spark
Context on a local machine, reads a text file, and returns basic stats would
be the ideal answer!
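
For what it's worth, item 1's REST API can be addressed from any HTTP client.
A hedged sketch of building its request URLs (the endpoint shapes follow the
spark-jobserver project's README; the app and class names below are
hypothetical placeholders, not part of this thread):

```python
# Sketch of a thin client for spark-jobserver's REST API, stdlib only.
# Assumed endpoints (per the project's README): POST /jars/<appName> to
# register a jar, POST /jobs?appName=...&classPath=... to run a job.
from urllib.parse import urlencode


def upload_jar_url(base: str, app_name: str) -> str:
    """URL for registering an application jar under a name."""
    return f"{base}/jars/{app_name}"


def submit_job_url(base: str, app_name: str, class_path: str,
                   sync: bool = False) -> str:
    """URL for running a job from a previously uploaded jar."""
    params = {"appName": app_name, "classPath": class_path}
    if sync:
        params["sync"] = "true"  # ask the server to block until the job finishes
    return f"{base}/jobs?{urlencode(params)}"


if __name__ == "__main__":
    base = "http://localhost:8090"  # jobserver's documented default port
    print(upload_jar_url(base, "wordcount"))
    print(submit_job_url(base, "wordcount", "example.WordCountJob", sync=True))
```

The actual POSTs would go through any HTTP library; only the URL construction
is shown here because the job and class names are placeholders.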



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Creating-Apache-Spark-powered-As-Service-applications-tp21193.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.




Re: Creating Apache Spark-powered “As Service” applications

2015-01-16 Thread Corey Nolet
There's also an example of running a SparkContext in a java servlet
container from Calrissian: https://github.com/calrissian/spark-jetty-server




Re: Creating Apache Spark-powered “As Service” applications

2015-01-16 Thread Robert C Senkbeil
Hi,

You can take a look at the Spark Kernel project:
https://github.com/ibm-et/spark-kernel

The Spark Kernel's goal is to serve as the foundation for interactive
applications. The project provides a client library in Scala that abstracts
connecting to the kernel (containing a Spark Context), which can be
embedded into a web application. We demonstrated this at StataConf when we
embedded the Spark Kernel client into a Play application to provide an
interactive web application that communicates to Spark via the Spark Kernel
(hosting a Spark Context).

A getting started section can be found here:
https://github.com/ibm-et/spark-kernel/wiki/Getting-Started-with-the-Spark-Kernel

If you have any other questions, feel free to email me or communicate over
our mailing list:

spark-ker...@googlegroups.com

https://groups.google.com/forum/#!forum/spark-kernel

Signed,
Chip Senkbeil
IBM Emerging Technology Software Engineer






RE: Creating Apache Spark-powered “As Service” applications

2015-01-16 Thread Oleg Shirokikh
Thanks a lot, Robert – I’ll definitely investigate this and will probably come 
back with questions.

P.S. I’m new to this Spark forum. I’m getting responses through emails, but 
they are not appearing as “replies” in the thread, which is kind of 
inconvenient. Is there something I should tweak?

Thanks,
Oleg
