Seconded. Deterministic materialized keys at specific time granularities are
definitely the way to go. If your frequency is high enough, you could
read/write data at second or millisecond resolution directly into memory and
then roll those up into higher time resolutions on disk. The value, as noted,
could be JSON.
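A minimal sketch of that roll-up in Python (the key formats and the roll_up
helper are illustrative, not anything Riak itself prescribes):

    from collections import defaultdict
    from datetime import datetime, timezone
    import json

    # In-memory buffer keyed at second resolution, e.g. "2015-02-23T13:33:05".
    buffer = defaultdict(list)

    def second_key(ts):
        # Deterministic key at second granularity from a Unix timestamp.
        return datetime.fromtimestamp(ts, tz=timezone.utc).strftime(
            "%Y-%m-%dT%H:%M:%S")

    def record(ts, event):
        buffer[second_key(ts)].append(event)

    def roll_up():
        # Collapse the second-level buffers into hourly JSON blobs ready to
        # persist; the hour key is the first 13 chars ("YYYY-MM-DDTHH").
        hourly = defaultdict(list)
        for key, events in buffer.items():
            hourly[key[:13]].extend(events)
        buffer.clear()
        return {hour: json.dumps(events) for hour, events in hourly.items()}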
On 2/23/15 1:33 PM, Jason Campbell wrote:
Thanks for the info.
The model looks reasonable, but something I would worry about is the
availability of the key data. For example, the timestamps and msg-ids should
be known without key-listing Riak (which is always a very slow operation).
There are several options for this; you can either…
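One way to make that concrete (the bucket names and the per-hour manifest are
a sketch, not a prescription): write each message under a key derived from
data the application already has, and track msg-ids in a small manifest
object per hour, so a reader never has to list keys. With the official
Python client:

    import riak

    client = riak.RiakClient()
    events = client.bucket("events")
    manifests = client.bucket("manifests")

    def store_event(source, ts, msg_id, payload):
        hour = ts.strftime("%Y-%m-%dT%H")
        # The key is fully derivable: source + hour + msg-id.
        events.new("%s/%s/%s" % (source, hour, msg_id), data=payload).store()
        # One small manifest per hour records the msg-ids for that window,
        # so reporting costs one GET here instead of a key-listing.
        m = manifests.get(hour)
        ids = m.data or []
        ids.append(msg_id)
        m.data = ids
        m.store()

Note the manifest update is a read-modify-write, so concurrent writers will
create siblings; a Riak 2.0 set data type (or sibling resolution) would cover
that.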
On 2/22/15 6:16 PM, Jason Campbell wrote:
Coming at this from another angle: if you already have a permanent data store,
and you are only reporting on one hour at a time, can you run the reports
based on the log itself?
A lot of Riak’s advantage comes from the stability and availability of data
storage, but S3 is already doing that for you…
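A sketch of that approach, assuming boto3 and one newline-delimited JSON
object per hour (the bucket layout and the "value" field are invented):

    import json
    import boto3

    s3 = boto3.client("s3")

    def hourly_report(bucket, hour):
        # e.g. hour = "2015-02-22T18", stored as "logs/2015-02-22T18.json".
        body = s3.get_object(Bucket=bucket, Key="logs/%s.json" % hour)["Body"]
        count, total = 0, 0.0
        # An hourly log should fit in memory; read it in one pass.
        for line in body.read().splitlines():
            event = json.loads(line)
            count += 1
            total += event.get("value", 0)
        return {"hour": hour, "events": count, "sum": total}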
Hi Jason, Christopher.
This is supposed to be append-only, time-limited data. I only intend
to keep about 2 weeks’ worth of data (which is yet another thing I need
to figure out, i.e. how to vacate older data).
Re: querying, for the most part the system will be building out hourly
reports based on…
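On vacating older data: with deterministic hour-granularity keys, one option
is to enumerate the hour keys that have aged past the retention window and
delete them on a schedule (a sketch; Bitcask’s built-in key expiry is another
route if that backend is in use):

    from datetime import datetime, timedelta, timezone

    RETENTION = timedelta(days=14)

    def expired_hours(sweep_back=timedelta(days=2), now=None):
        # Yield hour keys between the oldest data we might still hold and
        # the retention cutoff; the caller deletes each one it finds.
        now = now or datetime.now(timezone.utc)
        cutoff = now - RETENTION
        h = cutoff - sweep_back
        while h < cutoff:
            yield h.strftime("%Y-%m-%dT%H")
            h += timedelta(hours=1)

    # for key in expired_hours(): events.delete(key)  # events = riak bucket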
I have the same questions as Christopher.
Does this data need to change, or is it write-once?
What information do you have when querying?
- Will you already have the timestamp and msg-id?
- If not, you may want to consider aggregating everything into a single key.
This is easier if the data isn’t changing…
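For illustration, that single-key shape might look like the following (field
names invented); a report window then costs one GET per hour. The trade-off
is that every write re-writes the whole hour’s value, which is why it suits
write-once data:

    import json

    # One aggregated value per hour instead of one key per message.
    aggregate = {
        "hour": "2015-02-20T17",
        "events": [
            {"msg_id": 1, "ts": "2015-02-20T17:00:03Z", "value": 12.5},
            {"msg_id": 2, "ts": "2015-02-20T17:00:04Z", "value": 9.1},
        ],
    }
    blob = json.dumps(aggregate)  # stored as a single Riak object body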
> On Feb 20, 2015, at 5:35 PM, AM wrote:
Hi All.
I am currently looking at using Riak as a data store for time series
data. Currently we get about 1.5T of data in JSON format that I intend
to persist in Riak. I am having some difficulty figuring out how to
model it such that I can fulfill the use cases I have been handed.
The data…
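Tying the earlier suggestions together, the read side might look like this
with the Python client (connection details and key layout are illustrative):

    import riak

    client = riak.RiakClient(pb_port=8087)
    events = client.bucket("events")

    def fetch_hour(source, hour, msg_ids):
        # msg-ids come from the application or a per-hour manifest, never
        # from key-listing; multiget fetches the hour’s objects in parallel.
        keys = ["%s/%s/%s" % (source, hour, i) for i in msg_ids]
        return [obj.data for obj in events.multiget(keys) if obj.exists]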
Agreed, that’s a useful article, highly recommended.
-John
On Nov 25, 2014, at 2:03 PM, Troy Melhase wrote:
Lots of insight here:
https://highlyscalable.wordpress.com/2012/03/01/nosql-data-modeling-techniques/
On Tue, Nov 25, 2014 at 1:47 PM, John Daily wrote:
I’ve written a bit about modeling patterns at
http://basho.com/riak-development-anti-patterns/
I expanded a bit on it in contributions to the latest version of Eric Redmond’s
outstanding http://littleriakbook.com (make sure to grab one of the downloads,
the HTML is out of date).
-John
I'm experimenting with Riak and trying it out with some of our use cases. Our
basic requirement is better response times with low latency, hence the move
away from costly joins in an RDBMS.
While modelling, some of the mappings can be modelled either as multiple
key-value pairs or as a combination…
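To sketch that trade-off (entity names invented): denormalize the joined data
into one value, or keep separate keys linked by id.

    # Option A: one combined, denormalized value -- a single GET, no join,
    # at the cost of duplicating customer data into every order.
    order_combined = {
        "order_id": "o-1001",
        "total": 59.90,
        "customer": {"id": "c-42", "name": "Alice", "tier": "gold"},
    }

    # Option B: separate key-value pairs linked by id -- two GETs per read,
    # but the customer record is stored once and updated in one place.
    order = {"order_id": "o-1001", "total": 59.90, "customer_id": "c-42"}
    customer = {"id": "c-42", "name": "Alice", "tier": "gold"}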