There are a variety of REST API services you can use, but consider carefully
whether it makes sense to start a Spark job for each individual request. If,
on the other hand, you want to start a Spark job in response to some
triggering event, then a RESTful service is a good fit.
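As one possibility, Apache Livy exposes a REST endpoint for submitting Spark batch jobs. Here's a minimal sketch of building such a submission request; the Livy host, jar path, and class name are placeholders, not values from your setup:

```python
import json
from urllib import request

def build_batch_request(livy_url, jar_path, class_name, args=None):
    """Build an HTTP POST request asking Livy to launch a Spark batch job."""
    payload = {"file": jar_path, "className": class_name, "args": args or []}
    body = json.dumps(payload).encode("utf-8")
    return request.Request(
        livy_url + "/batches",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# One request per triggering event (not actually sent in this sketch):
req = build_batch_request(
    "http://livy-host:8998",          # placeholder Livy endpoint
    "hdfs:///jobs/etl-assembly.jar",  # placeholder jar on HDFS
    "com.example.EtlJob",             # placeholder main class
)
```

Sending `req` with `urllib.request.urlopen` (or an HTTP client of your choice) would then trigger one Spark job per event, which is the pattern where a REST front end makes sense.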
Whether your Spark cluster is multi-tenant depends on the scheduler you
use, the cluster size, and other factors. You also seem to be mixing
terminology: a given application can certainly generate many tasks, and you
can deploy many applications on a single Spark cluster. Whether they all run
concurrently is where the issue of multi-tenancy comes into the picture.
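To make the terminology concrete: within a single application, actions submitted from separate threads against one SparkContext run as concurrent jobs, scheduled by whichever in-application scheduler you configure (FIFO or FAIR). A rough sketch of that pattern, with a plain function standing in for a Spark action such as rdd.count():

```python
from concurrent.futures import ThreadPoolExecutor

def run_action(n):
    # Stand-in for a Spark action; in a real application each call would
    # be an action on a shared SparkContext, submitted from its own thread.
    return sum(range(n))

# Each submitted call corresponds to one concurrently running "job"
# inside the single application.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(run_action, [10, 100, 1000]))
```

This is concurrency *within* one application; multi-tenancy is the separate question of multiple applications sharing the cluster at the same time.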

On Wed, Jun 15, 2016 at 8:19 PM, Yu Wei <yu20...@hotmail.com> wrote:

> Hi,
>
> I've been learning Spark recently, and I have a question about it.
>
>
> Is it possible to serve web requests from a Spark application directly? Is
> there any library to be used?
>
> Or do I need to write the results from spark to HDFS/HBase?
>
>
> Is a Spark application designed to implement only one single task? Or
> could multiple tasks be combined into one application?
>
> Thanks,
>
> Jared
