Any help or ideas would be appreciated :)
Thanks

Regards,
Oded Maimon
Scene53.

On Sun, Jul 12, 2015 at 4:49 PM, Oded Maimon <o...@scene53.com> wrote:

> Hi All,
> we are evaluating Spark for real-time analytics. What we are trying to do
> is the following:
>
>    - READER APP - use a custom receiver to get data from RabbitMQ (written
>    in Scala)
>    - ANALYZER APP - use a SparkR application to read the data (windowed),
>    analyze it every minute, and save the results inside Spark
>    - OUTPUT APP - use a Spark application (Scala/Java/Python) to read the
>    results from R every X minutes and send the data to a few external systems
>
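The READER APP above could be sketched as a Spark Streaming custom receiver. This is only a minimal illustration, not a production implementation: the class name, host, and queue name are placeholders I made up, error handling and reconnection are omitted, and it assumes the RabbitMQ Java client is on the classpath.

```scala
import com.rabbitmq.client.{AMQP, ConnectionFactory, DefaultConsumer, Envelope}
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.receiver.Receiver

// Hypothetical minimal receiver: consumes strings from a RabbitMQ queue
// and hands them to Spark via store().
class RabbitMQReceiver(host: String, queue: String)
    extends Receiver[String](StorageLevel.MEMORY_AND_DISK_2) {

  override def onStart(): Unit = {
    // Consume on a separate thread so onStart() returns promptly,
    // as the Receiver contract requires.
    new Thread("RabbitMQ Receiver") {
      override def run(): Unit = receive()
    }.start()
  }

  override def onStop(): Unit = {
    // Sketch only: a real implementation would close the channel
    // and connection here.
  }

  private def receive(): Unit = {
    val factory = new ConnectionFactory()
    factory.setHost(host)
    val connection = factory.newConnection()
    val channel = connection.createChannel()
    // autoAck = true keeps the sketch simple; real code may want manual acks.
    channel.basicConsume(queue, true, new DefaultConsumer(channel) {
      override def handleDelivery(consumerTag: String,
                                  envelope: Envelope,
                                  properties: AMQP.BasicProperties,
                                  body: Array[Byte]): Unit = {
        // store() replicates the message into Spark for downstream processing.
        store(new String(body, "UTF-8"))
      }
    })
  }
}
```

It would then be wired into a StreamingContext with something like `ssc.receiverStream(new RabbitMQReceiver("localhost", "events"))`.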
> Basically, in the end I would like to have the READER COMPONENT as an app
> that always consumes the data and keeps it in Spark,
> have as many ANALYZER COMPONENTS as my data scientists want, and have one
> OUTPUT APP that will read the ANALYZER results and send them to any relevant
> system.
>
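The ANALYZER's "windowed, every minute" step maps onto Spark Streaming's window operations. The email mentions SparkR for this piece; the sketch below shows only the windowing idea, in Scala for brevity, using a socket stream as a stand-in source (the host, port, and the word-count analysis are placeholder assumptions, not the actual pipeline).

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf().setAppName("analyzer-sketch")
// 10-second batches; any batch interval dividing the window length works.
val ssc = new StreamingContext(conf, Seconds(10))

// Stand-in source; in the actual design this would be the custom receiver.
val lines = ssc.socketTextStream("localhost", 9999)

// Window of 60 seconds, sliding every 60 seconds:
// i.e. analyze the last minute of data, once per minute.
val counts = lines
  .window(Seconds(60), Seconds(60))
  .flatMap(_.split(" "))
  .map(word => (word, 1))
  .reduceByKey(_ + _)

counts.print()
ssc.start()
ssc.awaitTermination()
```

Note that `reduceByKeyAndWindow` can express the same thing more efficiently when the reduce function is invertible.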
> What is the right way to do it?
>
> Thanks,
> Oded.

