Hi,
Spark Jobserver seems to be more mature than Livy, but I think either would
work. You will just get more functionality with Jobserver, except for
impersonation, which is only in Livy.
If you need to publish a business API, I would recommend using Akka HTTP
with Spark actors sharing a preloaded SparkContext, so you can publish a
more user-friendly API. Jobserver has no way to specify endpoint URLs or
API verbs; its job URLs look more like a series of random numbers.
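To make the "shared preloaded context behind friendly endpoints" idea concrete, here is a minimal sketch of the pattern. In practice Akka HTTP and a real SparkContext would play these roles; here a plain Python stdlib HTTP server and a dict stand in, purely to show the shape (the endpoint path, port, and data below are all invented for illustration):

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical stand-in for a SparkContext that is loaded once at startup
# and then shared by every request, instead of being rebuilt per call.
PRELOADED_CONTEXT = {"top_rows": [{"rank": 1, "value": "a"},
                                  {"rank": 2, "value": "b"}]}

class BusinessApi(BaseHTTPRequestHandler):
    def do_GET(self):
        # A user-friendly endpoint name, chosen freely -- unlike Jobserver,
        # where the job URL is an opaque generated ID.
        if self.path.startswith("/api/top-rows"):
            body = json.dumps(PRELOADED_CONTEXT["top_rows"]).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo quiet

def serve_once(port):
    """Start the API in a background thread and return the server handle."""
    server = HTTPServer(("127.0.0.1", port), BusinessApi)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

if __name__ == "__main__":
    server = serve_once(8099)
    with urllib.request.urlopen("http://127.0.0.1:8099/api/top-rows") as r:
        rows = json.loads(r.read())
    print(rows[0]["rank"])  # -> 1
    server.shutdown()
```

The key point is only that the expensive state is built once and shared, and that the route names are yours to choose.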
The other way to publish a business API is to build a classic API
application that submits Jobserver or Livy jobs over HTTP, but I suspect
running two HTTP requests per call adds too much latency.
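For the second option, the middle-tier call to Livy would look roughly like the sketch below, using Livy's batch REST API (POST /batches). The Livy host, jar path, class name, and arguments are all placeholders; the field names ("file", "className", "args") are the ones Livy's batch endpoint expects:

```python
import json
import urllib.request

LIVY_URL = "http://livy-host:8998"  # placeholder Livy server address

def build_batch_payload(jar, class_name, args):
    """Body for POST /batches, which launches a Spark app as a batch job."""
    return {"file": jar, "className": class_name, "args": args}

def submit_batch(payload):
    # In this design there are two HTTP hops per user request:
    # client -> API application, then API application -> Livy (this call).
    req = urllib.request.Request(
        LIVY_URL + "/batches",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

payload = build_batch_payload(
    "hdfs:///jobs/top-rows.jar",  # placeholder jar on HDFS
    "com.example.TopRowsJob",     # placeholder driver class
    ["--top", "10"],              # e.g. the number of top rows requested
)
```

The latency concern above is exactly the second hop: every user request pays for the API application's own request/response to Livy on top of its own.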

2016-11-06 14:06 GMT+01:00 Reza zade <kntrm...@gmail.com>:

> Hi
>
> I have written multiple Spark driver programs that load data from HDFS
> into data frames, run Spark SQL queries on them, and persist the results
> back to HDFS. Now I need to provide a long-running Java program that
> receives requests and their parameters (such as the number of top rows to
> return) from a web application (e.g. a dashboard) via POST and GET, and
> sends the results back to the web application. My web application is
> outside the Spark cluster. Briefly, my goal is to send requests and their
> accompanying data from the web application (via something like POST) to
> the long-running Java program; it then receives the request, runs the
> corresponding Spark driver (Spark app), and returns the results, for
> example in JSON format.
>
>
> What is your suggestion for developing this use case?
> Is Livy a good choice? If your answer is positive, what should I do?
>
> Thanks.
>
>
