Thanks, Guillaume. How does that compare to the footprint of the Wikidata
service itself (SQL), rather than WDQS? I presume it sits in a MyISAM
storage engine?
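
For reference, one way to check that (a rough sketch only, assuming access to
a MariaDB replica of the wikidatawiki database and the pymysql driver; the
host, credentials, and schema name below are placeholders, not real
production settings) would be to query information_schema for each table's
storage engine and on-disk size:

# Hypothetical sketch: list storage engine and approximate on-disk size of
# the wikidatawiki SQL tables via information_schema. Connection details are
# placeholders.
import pymysql

conn = pymysql.connect(host="example-replica.host", user="reader",
                       password="secret", database="information_schema")
with conn.cursor() as cur:
    cur.execute(
        """
        SELECT table_name, engine,
               ROUND((data_length + index_length) / 1024 / 1024 / 1024, 1) AS size_gb
        FROM tables
        WHERE table_schema = %s
        ORDER BY (data_length + index_length) DESC
        """,
        ("wikidatawiki",),
    )
    for table_name, engine, size_gb in cur.fetchall():
        # The engine column answers the MyISAM-vs-InnoDB question directly;
        # summing size_gb gives the total SQL footprint to compare with WDQS.
        print(f"{table_name}: {engine}, {size_gb} GB")
conn.close()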

On Tue, Jun 4, 2019 at 11:25 AM Guillaume Lederrey <gleder...@wikimedia.org>
wrote:

> On Tue, Jun 4, 2019 at 12:18 PM Adam Sanchez <a.sanche...@gmail.com>
> wrote:
> >
> > Hello,
> >
> > Does somebody know the minimal hardware requirements (disk size and
> > RAM) for loading a Wikidata dump into Blazegraph?
>
> The actual hardware requirements will depend on your use case. But for
> comparison, our production servers are:
>
> * 16 cores (hyper threaded, 32 threads)
> * 128G RAM
> * 1.5T of SSD storage
>
> > The downloaded dump file wikidata-20190513-all-BETA.ttl is 379G.
> > The bigdata.jnl file, which stores all the triple data in Blazegraph,
> > is 478G and still growing.
> > I have a 1T disk, but it is almost full now.
>
> The current size of our jnl file in production is ~670G.
>
> Hope that helps!
>
>     Guillaume
>
> > Thanks,
> >
> > Adam
> >
>
>
>
> --
> Guillaume Lederrey
> Engineering Manager, Search Platform
> Wikimedia Foundation
> UTC+2 / CEST
>
>


-- 


---
Marco Neumann
KONA
_______________________________________________
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata
