The Spark Jobserver will do what you describe. I have built an app where a
bunch of queries are submitted via POST to
http://something/query/<query-name>
(all parameters for the query are in the JSON POST body). This is a Scalatra
layer that talks to Spark Jobserver over HTTP.
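For reference, a minimal sketch of that Scalatra layer, assuming scalaj-http
for the outbound call. The jobserver host, appName and classPath values are
placeholders, and the /jobs endpoint with appName/classPath/sync parameters
is the spark-jobserver REST API as I remember it, so double-check it against
the jobserver version you run:

import org.scalatra.ScalatraServlet
import scalaj.http.Http

class QueryServlet extends ScalatraServlet {

  // Placeholder jobserver location; ours runs next to the Spark master.
  private val jobserver = "http://jobserver-host:8090"

  post("/query/:name") {
    val queryName = params("name")
    val jsonBody  = request.body  // query parameters sent by the dashboard

    // Forward to spark-jobserver; sync=true blocks until the job finishes
    // and returns the job result in the HTTP response body.
    // "my-spark-app" is the jar previously uploaded to the jobserver and
    // com.example.jobs.* are the job classes -- both placeholders here.
    val resp = Http(
        s"$jobserver/jobs?appName=my-spark-app&classPath=com.example.jobs.$queryName&sync=true")
      .header("Content-Type", "application/json")
      .postData(jsonBody)
      .asString

    contentType = "application/json"
    status = resp.code
    resp.body
  }
}

The dashboard then just does something like

curl -X POST -d '{"topN": 10}' http://something/query/TopRowsJob

and gets JSON back.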

On Sun, Nov 6, 2016 at 8:06 AM, Reza zade <kntrm...@gmail.com> wrote:

> Hi
>
> I have written multiple Spark driver programs that load data from HDFS
> into data frames, run Spark SQL queries on it, and persist the results back
> to HDFS. Now I need to provide a long-running Java program that receives
> requests and their parameters (such as the number of top rows to return)
> from a web application (e.g. a dashboard) via POST and GET, and sends the
> results back to the web application. My web application is outside the
> Spark cluster. Briefly, my goal is to send requests and their accompanying
> data from the web application, via something such as POST, to the
> long-running Java program; it then receives the request, runs the
> corresponding Spark driver (Spark app), and returns the results, for
> example in JSON format.
>
>
> What is your suggestion for developing this use case?
> Is Livy a good choice? If your answer is positive, what should I do?
>
> Thanks.
>
>
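On the jobserver side, each query then becomes its own job class. Roughly
like this (untested sketch against the classic spark.jobserver.SparkJob API;
the HDFS path, "data" table and "score" column are just placeholders):

import com.typesafe.config.Config
import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext
import spark.jobserver.{SparkJob, SparkJobInvalid, SparkJobValid, SparkJobValidation}

object TopRowsJob extends SparkJob {

  // Fail fast if the caller did not send the topN parameter.
  override def validate(sc: SparkContext, config: Config): SparkJobValidation =
    if (config.hasPath("topN")) SparkJobValid else SparkJobInvalid("missing topN")

  override def runJob(sc: SparkContext, config: Config): Any = {
    val topN = config.getInt("topN")          // comes from the JSON POST body
    val sqlContext = new SQLContext(sc)

    // Placeholder input path and schema.
    val df = sqlContext.read.parquet("hdfs:///path/to/data")
    df.registerTempTable("data")

    sqlContext
      .sql(s"SELECT * FROM data ORDER BY score DESC LIMIT $topN")
      .toJSON
      .collect()                              // returned to the caller as the job result
  }
}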
