I have been doing it from OpenStreetMap data using a Python script.

First it generates about 100 GB of zipped TTL (Turtle) files, imports them
with a specialized Blazegraph tool, and afterwards uses direct SPARQL
statements to do batch updates when new data becomes available.

https://github.com/Sophox/sophox/tree/main/osm2rdf
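
For illustration, the batch-update step boils down to posting SPARQL
UPDATE requests to Blazegraph over HTTP. The sketch below is not the
actual Sophox code; the endpoint URL, the example way ID, and the
osmway:/osmt: prefixes are placeholders:

    import requests

    # Blazegraph's REST API accepts SPARQL UPDATE posted with the
    # application/sparql-update content type. Placeholder endpoint:
    ENDPOINT = "http://localhost:9999/bigdata/namespace/kb/sparql"

    def run_update(update_query: str) -> None:
        resp = requests.post(
            ENDPOINT,
            data=update_query.encode("utf-8"),
            headers={"Content-Type": "application/sparql-update"},
        )
        resp.raise_for_status()

    # Illustrative only: drop the stored triples for one OSM way,
    # then insert the fresh ones parsed from the latest diff.
    run_update("""
    PREFIX osmway: <https://www.openstreetmap.org/way/>
    PREFIX osmt: <https://wiki.openstreetmap.org/wiki/Key:>
    DELETE WHERE { osmway:42 ?p ?o };
    INSERT DATA { osmway:42 osmt:highway "residential" }
    """)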


On Wed, Apr 14, 2021, 5:07 PM Henry Rosales <hrosmen...@gmail.com> wrote:

> Hi all!
>
> I hope you are doing very well! I’m working on constructing an RDF
> dataset, which I’m ingesting into Wikibase to leverage its ecosystem:
> visualization, the SPARQL endpoint, a script for dump creation, etc. For
> ingestion, I’m using the API (documented at
> https://www.wikidata.org/w/api.php), and it is going well except for the
> slow performance.
>
> I was exploring the QuickStatements module of Wikibase, which allows
> ingestion through the UI. However, I need a way to ingest data
> automatically. I wonder if Wikibase/Wikidata has any script for bulk loads.
>
> Any comments or suggestions are welcome.
>
> Best Regards,
> Henry Rosales
>
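
For reference, the per-entity API route described above usually looks
something like the sketch below (action=wbeditentity). This is only a
sketch under stated assumptions, not anyone's production script: the
API URL is a placeholder, and it presumes an authenticated
requests.Session plus a CSRF token already fetched via
action=query&meta=tokens:

    import json
    import requests

    API = "https://your-wikibase.example/w/api.php"  # placeholder

    def create_item(session: requests.Session, csrf_token: str,
                    label: str) -> dict:
        # Create a new item with a single English label via the
        # wbeditentity module of the MediaWiki action API.
        data = {"labels": {"en": {"language": "en", "value": label}}}
        resp = session.post(API, data={
            "action": "wbeditentity",
            "new": "item",
            "data": json.dumps(data),
            "token": csrf_token,
            "format": "json",
            "maxlag": 5,  # bot etiquette: back off when replication lags
        })
        resp.raise_for_status()
        return resp.json()

Each call costs one HTTP round trip plus a full MediaWiki edit per
entity, which is where the time goes compared with a bulk TTL import
on the triple-store side.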
