If your dashboard is making AJAX/polling requests against, say, a REST API,
you can always create a Spark context in your REST service and use SparkSQL
to query over the Parquet files. The Parquet files are already on disk, so
it seems silly to write both to Parquet and to a DB... unless I'm missing
something in your setup.

On Tue, Sep 16, 2014 at 4:18 AM, Marius Soutier <mps....@gmail.com> wrote:

> Writing to Parquet and querying the result via SparkSQL works great
> (except for some strange SQL parser errors). However, the problem
> remains: how do I get that data back to a dashboard? So I guess I’ll
> have to use a database after all.
>
>
>> You can batch up data & store it into Parquet partitions as well, and
>> query it using another SparkSQL shell. The JDBC driver in SparkSQL is
>> part of 1.1, I believe.
