Hi Manuel,
On 11/02/14 12:42, manuel wrote:
Have you done any benchmarks to measure the maximum number of triples
that can be stored? (I need the order of magnitude.)
We have loaded big datasets (up to 2 billion triples) with good results
(sorry, no concrete benchmarks).
Versioning should have an impact on triple insertion (maybe up to 50%
slower), but no real impact on query time, as long as the queries do not
contain complex patterns involving deleted triples.
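To make the insertion scenario concrete, here is a minimal sketch of batching sensor readings into a single SPARQL 1.1 Update request, which a client could POST to Marmotta's update service. The endpoint URL and the ex: vocabulary are assumptions for illustration, not part of this thread; batching many triples per request helps amortize the per-insert overhead that versioning adds.

```python
# Sketch: batch sensor readings into one SPARQL 1.1 Update request.
# The endpoint URL and the ex: vocabulary are illustrative assumptions.
import urllib.request

MARMOTTA_UPDATE = "http://localhost:8080/marmotta/sparql/update"  # assumed

def build_insert(readings):
    """Build a single INSERT DATA query from (sensor_id, value) pairs."""
    triples = "\n".join(
        '  ex:sensor{0} ex:value "{1}" .'.format(sid, val)
        for sid, val in readings
    )
    return ("PREFIX ex: <http://example.org/>\n"
            "INSERT DATA {\n%s\n}" % triples)

def post_update(query):
    """Prepare the HTTP POST for the update endpoint (not sent here)."""
    return urllib.request.Request(
        MARMOTTA_UPDATE,
        data=query.encode("utf-8"),
        headers={"Content-Type": "application/sparql-update"},
        method="POST",
    )

query = build_insert([(1, 23.5), (2, 17.0)])
req = post_update(query)
```

Sending `req` with `urllib.request.urlopen` (or any HTTP client) would then perform the actual insertion.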
This is an important need for me, as I have to collect data coming
from many sensors.
That would be a good test scenario; we don't currently have one like it.
Moreover, since the semantic data is combined with a lot of binary data
(such as BLOBs), would it eventually be possible to store it in the KiWi
backend?
Not really. We still have the concept of content from the old LMF, but
it's deprecated. Right now we are discussing LDP Binary Resources at
dev@marmotta: http://markmail.org/message/jimm5stgdyore3py
but I'm not sure whether it fits what you need...
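For reference, the LDP drafts under discussion handle binaries by POSTing the raw bytes to an LDP container with their media type, which the server stores as a Non-RDF (binary) resource. The sketch below builds such a client request; the server URL, container path, and filename are hypothetical, and Marmotta's support for this was only being discussed at the time.

```python
# Sketch of an LDP client request creating a binary (Non-RDF) resource:
# POST the raw bytes to a container, declaring their media type.
# The container URL and filename below are made-up examples.
import urllib.request

def make_binary_post(container_url, filename, payload, media_type):
    """Build (but do not send) the LDP creation request."""
    return urllib.request.Request(
        container_url,
        data=payload,
        headers={
            "Content-Type": media_type,  # non-RDF media type => binary resource
            "Slug": filename,            # suggested name for the new resource
        },
        method="POST",
    )

req = make_binary_post(
    "http://localhost:8080/marmotta/ldp/sensors",  # hypothetical container
    "reading-0001.bin",
    b"\x00\x01\x02",
    "application/octet-stream",
)
```

On success an LDP server answers 201 Created with a Location header pointing at the newly created binary resource.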
Cheers,
--
Sergio Fernández
Senior Researcher
Knowledge and Media Technologies
Salzburg Research Forschungsgesellschaft mbH
Jakob-Haringer-Straße 5/3 | 5020 Salzburg, Austria
T: +43 662 2288 318 | M: +43 660 2747 925
[email protected]
http://www.salzburgresearch.at