The design largely depends on your use cases. You need to think through the
requirements and rank them.

For example, if your application cares about response time and can tolerate
reading stale data, using a NoSQL database as a middleware layer is a good
option.
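
For instance, a minimal sketch of that pattern, assuming Redis as the NoSQL
store and the Jedis client (all names here are illustrative, not from this
thread):

    import redis.clients.jedis.Jedis

    object ResultCache {
      // One client is enough for a sketch; use a connection pool in practice.
      private val jedis = new Jedis("localhost", 6379)

      // The Spark job writes its computed JSON here with a TTL, which
      // bounds how stale a served response can get.
      def put(key: String, json: String, ttlSeconds: Int): Unit =
        jedis.setex(key, ttlSeconds, json)

      // The REST handler reads the precomputed JSON directly and never
      // touches Spark or HDFS on the request path.
      def get(key: String): Option[String] = Option(jedis.get(key))
    }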

Good Luck,

Xiao Li


2015-10-11 21:00 GMT-07:00 Nuthan Kumar <mnut...@gmail.com>:

> If the data is also on-demand, Spark as the back end is also a good option.
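>
> A minimal sketch of that pattern would be one long-lived SparkSession
> shared by the REST layer (names here are illustrative):
>
>     import org.apache.spark.sql.SparkSession
>
>     object OnDemandQueries {
>       // Reuse a single long-lived session; starting a new context per
>       // request would add seconds of startup latency to every call.
>       private lazy val spark = SparkSession.builder()
>         .appName("rest-on-demand")
>         .getOrCreate()
>
>       // Run a small query and return the rows as JSON strings.
>       def query(sql: String): Array[String] =
>         spark.sql(sql).toJSON.collect()
>     }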
>
> From: Akhil Das
> Sent: Sunday, October 11, 2015 1:32 AM
> To: unk1102
> Cc: user@spark.apache.org
> Subject: Re: Best practices to call small spark jobs as part of REST api
>
> One approach would be to have your Spark job push the computed results
> (JSON) to a database; your REST server can then pull them from there and
> power the UI.
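>
> A minimal sketch of that push step, assuming a PostgreSQL table reachable
> over JDBC (URL, input path, and table name are illustrative):
>
>     import org.apache.spark.sql.{SaveMode, SparkSession}
>
>     object PushResults {
>       def main(args: Array[String]): Unit = {
>         val spark = SparkSession.builder().appName("push-results").getOrCreate()
>
>         // Compute the result set and serialize each row to a JSON string.
>         val results = spark.read.parquet("hdfs:///data/input")
>           .groupBy("category").count()
>         val asJson = results.toJSON.toDF("payload")
>
>         // Overwrite the serving table; the REST server reads from this
>         // table and never talks to Spark directly.
>         asJson.write
>           .mode(SaveMode.Overwrite)
>           .format("jdbc")
>           .option("url", "jdbc:postgresql://db:5432/app")
>           .option("dbtable", "computed_results")
>           .option("user", "app")
>           .save()
>
>         spark.stop()
>       }
>     }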
>
> Thanks
> Best Regards
>
> On Wed, Sep 30, 2015 at 12:26 AM, unk1102 <umesh.ka...@gmail.com> wrote:
>
> Hi, I would like to know the best practices for calling Spark jobs from a
> REST API. My Spark jobs return results as JSON, and that JSON can be used
> by a UI application.
>
> Should we even have a direct HDFS/Spark backend layer in the UI for
> on-demand queries? Please guide. Thanks much.
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Best-practices-to-call-small-spark-jobs-as-part-of-REST-api-tp24872.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
