Hi,

I'm new to Spark and need some architecture tips :-)

I need a way to connect the mobile app with the Spark backend so it can
upload data to it and download data from it.

The use case is that the user does something in the app. These changes
are uploaded to the backend, Spark calculates something, and the next
time the user opens the app it downloads the newly calculated data.

My plan is that the mobile app talks to a Jersey-Tomcat server, and this
server loads the data into Spark and starts the jobs.
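
Roughly what I have in mind on the Jersey side, just as a sketch (the
jar path, the main class and the REST path are made up, and I'm only
assuming that SparkLauncher from the spark-launcher module is a
reasonable way to start a job from Java):

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;

import javax.ws.rs.Consumes;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

import org.apache.spark.launcher.SparkLauncher;

@Path("/upload")
public class UploadResource {

    @POST
    @Consumes(MediaType.APPLICATION_JSON)
    public Response upload(String json) throws IOException {
        // write the uploaded JSON to a file the Spark job can read
        // (good enough for now, everything runs on the same machine)
        java.nio.file.Path input = Files.createTempFile("upload-", ".json");
        Files.write(input, json.getBytes(StandardCharsets.UTF_8));

        // start the Spark job as a child process
        new SparkLauncher()
                .setAppResource("/opt/jobs/process-data.jar")   // made-up jar
                .setMainClass("eu.the4thfloor.ProcessDataJob")  // made-up class
                .setMaster("local[*]")
                .addAppArgs(input.toString())
                .launch();

        return Response.ok().build();
    }
}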

But what is the best way to upload the data to Spark and to start the job?

Currently Jersey, Tomcat and Spark are on the same machine.

I found spark-jobserver [1], but I'm not sure it is the right choice.
The mobile app uploads JSON, Jersey converts it into POJOs to do
something with it, and then it converts it back to JSON to load it into
Spark, which converts it into POJOs again.
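
To make the double conversion concrete, the round trip would look
something like this (Jackson for the JSON handling; the jobserver URL
and parameters are what I understood from its README, not tested, and
"myApp" and the class path are placeholders):

import java.util.Map;

import javax.ws.rs.client.ClientBuilder;
import javax.ws.rs.client.Entity;

import com.fasterxml.jackson.databind.ObjectMapper;

public class JobServerRoundTrip {

    public static void submit(String jsonFromApp) throws Exception {
        ObjectMapper mapper = new ObjectMapper();

        // 1. mobile app -> Jersey: parse the uploaded JSON into objects
        Map<?, ?> data = mapper.readValue(jsonFromApp, Map.class);
        // ... "do something with it" here ...

        // 2. Jersey -> jobserver: serialize it back to JSON and POST it
        String body = mapper.writeValueAsString(data);
        ClientBuilder.newClient()
                .target("http://localhost:8090/jobs")
                .queryParam("appName", "myApp")
                .queryParam("classPath", "eu.the4thfloor.ProcessDataJob")
                .request()
                .post(Entity.text(body));

        // 3. and inside the Spark job the same JSON gets parsed into
        //    POJOs a second time
    }
}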

I also thought about Spark Streaming, but doesn't that mean the
streaming job has to run 24/7?
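
My understanding is that a streaming app is basically this and blocks
in awaitTermination(), so it would have to run all the time (the socket
source is just a placeholder):

import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaReceiverInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class UploadStream {

    public static void main(String[] args) throws InterruptedException {
        SparkConf conf = new SparkConf().setAppName("uploads").setMaster("local[2]");
        JavaStreamingContext ssc = new JavaStreamingContext(conf, Durations.seconds(10));

        // placeholder source: read the uploaded records from a socket
        JavaReceiverInputDStream<String> uploads = ssc.socketTextStream("localhost", 9999);
        uploads.print();

        ssc.start();
        ssc.awaitTermination();   // blocks forever -> the app runs 24/7
    }
}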



[1] ... https://github.com/spark-jobserver/spark-jobserver

-- 

Ralph Bergmann


www              http://www.dasralph.de | http://www.the4thFloor.eu
mail             ra...@dasralph.de
skype            dasralph

facebook         https://www.facebook.com/dasralph
google+          https://plus.google.com/+RalphBergmann
xing             https://www.xing.com/profile/Ralph_Bergmann3
linkedin         https://www.linkedin.com/in/ralphbergmann
gulp             https://www.gulp.de/Profil/RalphBergmann.html
github           https://github.com/the4thfloor


pgp key id       0x421F9B78
pgp fingerprint  CEE3 7AE9 07BE 98DF CD5A E69C F131 4A8E 421F 9B78
