Re: Spark Server - How to implement

2014-12-12 Thread Manoj Samel
Thanks Marcelo.

Spark Gurus/Databricks team - do you have something on the roadmap for such a Spark server?

Thanks,

On Thu, Dec 11, 2014 at 5:43 PM, Marcelo Vanzin van...@cloudera.com wrote:

 Oops, sorry, fat fingers.

 We've been playing with something like that inside Hive:
 https://github.com/apache/hive/tree/spark/spark-client

 That seems to have at least a few of the characteristics you're
 looking for; but it's a very young project, and at this moment we're
 not developing it as a public API, but mostly for internal Hive use.
 It can give you a few ideas, though. Also, SPARK-3215.


 --
 Marcelo



Re: Spark Server - How to implement

2014-12-12 Thread Patrick Wendell
Hey Manoj,

One proposal potentially of interest is the Spark Kernel project from
IBM - you should take a look at it. The interface in that project is more
of a remote REPL: you submit commands (as strings) and get back results
(as strings), but you don't have direct programmatic access to state like
in the JobServer. Not sure if this is what you need.

https://issues.apache.org/jira/browse/SPARK-4605
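
To give a feel for that command-in / result-out style, here is a tiny
sketch built on the stock Scala 2.10/2.11 interpreter (scala-compiler on
the classpath). It is only an illustration of the idea - not the Spark
Kernel's actual API or wire protocol - and the class and method names are
made up:

import java.io.{PrintWriter, StringWriter}
import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.IMain

// Toy "remote REPL" service: commands go in as strings, rendered results
// come back as strings. A kernel like that would typically also bind a
// SparkContext into the interpreter and put a network protocol in front.
class StringRepl {
  private val output = new StringWriter()
  private val settings = new Settings
  settings.usejavacp.value = true  // let the interpreter see the JVM classpath
  private val interpreter = new IMain(settings, new PrintWriter(output))

  def execute(command: String): String = {
    output.getBuffer.setLength(0)  // drop output from the previous command
    interpreter.interpret(command)
    output.toString.trim
  }
}

object StringReplDemo {
  def main(args: Array[String]): Unit = {
    val repl = new StringRepl
    println(repl.execute("val x = 21 * 2"))            // "x: Int = 42"
    println(repl.execute("List(1, 2, 3).map(_ * x)"))  // e.g. "res0: List[Int] = List(42, 84, 126)"
  }
}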

This type of higher-level execution context is something we've generally
considered outside the scope of the core Spark distribution, because such
contexts can be built cleanly on top of the stable API, and from what I've
seen of applications that build on Spark, the requirements differ quite a
bit from one application to the next. I'm guessing that in the next year
we'll see a handful of community projects pop up around providing various
types of execution services for Spark apps.

- Patrick

On Fri, Dec 12, 2014 at 10:06 AM, Manoj Samel manojsamelt...@gmail.com wrote:
 Thanks Marcelo.

 Spark Gurus/Databricks team - do you have something on the roadmap for such a Spark server?

 Thanks,


-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Spark Server - How to implement

2014-12-11 Thread Manoj Samel
Hi,

If Spark-based services are to be exposed as a continuously available
server, what are the options?

* The API exposed to the client will be proprietary and fine-grained (RPC
style), not a job-level API
* The client API need not be SQL, so the Thrift JDBC server does not seem
to be an option ... but I could be wrong here
* The Ooyala implementation is a REST API for job submission, but, as
mentioned above, the desired API is finer-grained, not a job-submission API

Are there any existing implementations?

Or is it a matter of building your own server? Any thoughts on the approach
to use?
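
To make the requirement concrete, here is a rough sketch of the kind of
service I have in mind. The trait, the method names, and the in-process
wiring are all made up for illustration, and the actual RPC transport
(Thrift, Akka, plain sockets, ...) is left out:

import scala.collection.mutable
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.RDD

// Made-up interface: each call is a small operation against long-lived
// state held by the server, not a whole job submission.
trait FineGrainedSparkService {
  def createDataset(name: String, data: Seq[Int]): Unit
  def countGreaterThan(name: String, threshold: Int): Long
  def sample(name: String, n: Int): Seq[Int]
}

// One shared SparkContext plus named, cached RDDs that survive across
// calls. An RPC layer of your choice would sit in front of this class.
// Not thread-safe as written; a real server would synchronize the map.
class SparkServer(master: String) extends FineGrainedSparkService {
  private val sc = new SparkContext(
    new SparkConf().setAppName("spark-server-sketch").setMaster(master))
  private val datasets = mutable.Map.empty[String, RDD[Int]]

  override def createDataset(name: String, data: Seq[Int]): Unit =
    datasets(name) = sc.parallelize(data).cache()

  override def countGreaterThan(name: String, threshold: Int): Long =
    datasets(name).filter(_ > threshold).count()

  override def sample(name: String, n: Int): Seq[Int] =
    datasets(name).take(n).toSeq

  def stop(): Unit = sc.stop()
}

object SparkServerSketch {
  def main(args: Array[String]): Unit = {
    val server = new SparkServer("local[*]")
    server.createDataset("numbers", 1 to 100)
    println(server.countGreaterThan("numbers", 90))  // 10
    println(server.sample("numbers", 3))             // first three elements
    server.stop()
  }
}

The point is that the context and the cached datasets stay alive across
many small calls, rather than each call being its own job submission.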

Thanks,


Re: Spark Server - How to implement

2014-12-11 Thread Marcelo Vanzin
Oops, sorry, fat fingers.

We've been playing with something like that inside Hive:
https://github.com/apache/hive/tree/spark/spark-client

That seems to have at least a few of the characteristics you're
looking for; but it's a very young project, and at this moment we're
not developing it as a public API, but mostly for internal Hive use.
It can give you a few ideas, though. Also, SPARK-3215.
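
Just to illustrate the general shape such a client ends up with - this is
not the spark-client API, the names are invented, and the real thing keeps
the context in a separate driver process and sends it work remotely;
everything here stays in one process to keep it short:

import java.util.concurrent.{Callable, Executors, Future}
import org.apache.spark.{SparkConf, SparkContext}

// Invented interface: submit a function that runs against a long-lived
// SparkContext and get a Future back with the result.
trait RemoteSparkClient {
  def submit[T](job: SparkContext => T): Future[T]
  def stop(): Unit
}

// In-process stand-in: the context lives in this JVM and jobs run on a
// single worker thread instead of being sent to a remote driver.
class InProcessSparkClient(master: String) extends RemoteSparkClient {
  private val sc = new SparkContext(
    new SparkConf().setAppName("spark-client-sketch").setMaster(master))
  private val pool = Executors.newSingleThreadExecutor()

  override def submit[T](job: SparkContext => T): Future[T] =
    pool.submit(new Callable[T] { def call(): T = job(sc) })

  override def stop(): Unit = { pool.shutdown(); sc.stop() }
}

object SparkClientSketch {
  def main(args: Array[String]): Unit = {
    val client = new InProcessSparkClient("local[*]")
    val sum = client.submit(sc => sc.parallelize(1 to 10).reduce(_ + _))
    println(sum.get())  // 55
    client.stop()
  }
}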


On Thu, Dec 11, 2014 at 5:41 PM, Marcelo Vanzin van...@cloudera.com wrote:
 Hi Manoj,

 I'm not aware of any public projects that do something like that,
 except for the Ooyala server which you say doesn't cover your needs.

 We've been playing with something like that inside Hive, though:

 --
 Marcelo



-- 
Marcelo

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Spark Server - How to implement

2014-12-11 Thread Marcelo Vanzin
Hi Manoj,

I'm not aware of any public projects that do something like that,
except for the Ooyala server which you say doesn't cover your needs.

We've been playing with something like that inside Hive, though:

On Thu, Dec 11, 2014 at 5:33 PM, Manoj Samel manojsamelt...@gmail.com wrote:
 Hi,

 If Spark-based services are to be exposed as a continuously available
 server, what are the options?

 * The API exposed to the client will be proprietary and fine-grained (RPC
 style), not a job-level API
 * The client API need not be SQL, so the Thrift JDBC server does not seem
 to be an option ... but I could be wrong here
 * The Ooyala implementation is a REST API for job submission, but, as
 mentioned above, the desired API is finer-grained, not a job-submission API

 Are there any existing implementations?

 Or is it a matter of building your own server? Any thoughts on the approach
 to use?

 Thanks,

-- 
Marcelo

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org