hi,
We are building an analytics dashboard. Data will be updated every 5
minutes for now and eventually every minute, maybe more frequently. The
volume of incoming data is not huge: maybe 30 records per minute per
customer, although we could have 500 customers. Is streaming the correct
approach for this instead?
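For scale, a quick back-of-envelope check using the figures above (500 customers, 30 records per minute each — both numbers from the question):

```python
# Back-of-envelope throughput from the figures in the question.
records_per_customer_per_min = 30
customers = 500

records_per_min = records_per_customer_per_min * customers
records_per_sec = records_per_min / 60

print(records_per_min)  # 15000 records/minute across all customers
print(records_per_sec)  # 250.0 records/second
```

A few hundred records per second is well within what a small Spark Streaming job handles comfortably.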
Streaming would be easy to implement: all you have to do is create the
stream, do some transformations (depending on your use case), and finally
write the output to your dashboard's backend. What kind of dashboards are
you building? For d3.js-based ones, you can open a websocket and write the
stream output to it.
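The stream → transform → write-to-backend pipeline described above can be sketched in plain Python (not actual Spark Streaming code — just an illustration of the per-batch pattern; the record shape and the `push_to_dashboard` hook are hypothetical stand-ins for the websocket to the d3.js front end):

```python
from collections import defaultdict

def process_batch(records, push_to_dashboard):
    """Aggregate one micro-batch and push the result to the dashboard
    backend (e.g. over a websocket feeding a d3.js front end)."""
    counts = defaultdict(int)
    for record in records:
        # Toy transformation: count records per customer in this batch.
        counts[record["customer_id"]] += 1
    push_to_dashboard(dict(counts))

# Toy usage: a list stands in for the websocket / dashboard backend.
sent = []
batch = [{"customer_id": "a"}, {"customer_id": "a"}, {"customer_id": "b"}]
process_batch(batch, sent.append)
print(sent)  # [{'a': 2, 'b': 1}]
```

In real Spark Streaming the same shape appears as a transformation plus a `foreachRDD` (or `foreachBatch` in newer Structured Streaming) that performs the push.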
Thanks. Yes, d3 ones. Just to clarify: we could take our current system,
which is incrementally adding partitions, and overlay an Apache Spark
Streaming layer to ingest these partitions? Then nightly we could coalesce
those partitions, for example? I presume that while we are carrying out a
coalesce, the end users won't notice anything?
I'm not quite sure if I understood you correctly, but here's the thing: if
you use Spark Streaming, it will likely refresh your dashboard on each
batch, so for every batch your dashboard will be updated with the new
data. And yes, the end user won't feel anything while you do the coalesce.
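In Spark itself the nightly step would typically be a `coalesce(n)` on the data followed by a rewrite; as a plain-Python illustration of the same compaction idea (merging many small partition files into one, with a hypothetical `part-*` file layout):

```python
import os
import tempfile

def compact_partitions(part_dir, out_path):
    """Merge all small partition files in part_dir into one file,
    mimicking a nightly coalesce/compaction job."""
    names = sorted(n for n in os.listdir(part_dir) if n.startswith("part-"))
    with open(out_path, "w") as out:
        for name in names:
            with open(os.path.join(part_dir, name)) as f:
                out.write(f.read())
    return len(names)  # number of small partitions merged

# Toy usage: a temp directory stands in for the partition store.
d = tempfile.mkdtemp()
for i, line in enumerate(["r1\n", "r2\n", "r3\n"]):
    with open(os.path.join(d, f"part-{i:05d}"), "w") as f:
        f.write(line)
merged = os.path.join(d, "merged.txt")
print(compact_partitions(d, merged))  # 3
print(open(merged).read())            # r1, r2, r3 (one per line)
```

Writing the compacted output to a new location and swapping it in atomically is what keeps the dashboard readers unaffected while the job runs.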
Great thanks
On Monday, November 24, 2014, Akhil Das ak...@sigmoidanalytics.com wrote: