> …apparently we can't get more than one page in total from MWAPI.
>
> Cheers,
> Lucas
> On 12.01.19 13:57, Reem Al-Kashif wrote:
>
> Thank you so much, Nicolas & Lucas!
>
> @Lucas this helps a lot! At least I will get an idea about what I need
> until PetScan is …
Nicolas VIGNERON wrote:
>
> Hi Reem,
>
> If this page
> https://www.mediawiki.org/wiki/Wikidata_Query_Service/User_Manual/MWAPI
> is up-to-date, it does not seem possible to get the article size of a
> Wikipedia article (but I must admit I don't use and don't know
> "wikibase:mwapi" a lot).
refuses to cooperate for a reason I don't know yet.. :/
Thanks in advance.
Best,
Reem
--
*Kind regards, Reem Al-Kashif*
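As the thread above notes, wikibase:mwapi does not seem to expose article size, but the plain MediaWiki Action API does: action=query with prop=info returns a `length` field holding the page size in bytes. A minimal Python sketch of that request; the endpoint targets English Wikipedia, and the sample response below is illustrative only (the length value is made up):

```python
# Fetching an article's byte size via the MediaWiki Action API
# (action=query&prop=info), since wikibase:mwapi does not expose it.
from urllib.parse import urlencode

def build_pageinfo_url(title, endpoint="https://en.wikipedia.org/w/api.php"):
    """Build a prop=info query URL; the response's 'length' field is the
    page size in bytes."""
    params = {
        "action": "query",
        "prop": "info",
        "titles": title,
        "format": "json",
        "formatversion": "2",  # flat 'pages' list instead of an id-keyed map
    }
    return endpoint + "?" + urlencode(params)

def extract_lengths(response_json):
    """Map page title -> byte length from a decoded prop=info response."""
    pages = response_json["query"]["pages"]
    return {p["title"]: p["length"] for p in pages if "length" in p}

# Illustrative response shape (formatversion=2); the values are made up.
sample = {
    "query": {
        "pages": [
            {"pageid": 736, "title": "Albert Einstein", "length": 180000},
        ]
    }
}

print(build_pageinfo_url("Albert Einstein"))
print(extract_lengths(sample))
```

Fetching the built URL with any HTTP client and passing the decoded JSON to extract_lengths gives the sizes; `titles` also accepts a `|`-separated batch of up to 50 pages per request.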
Hello!
I posted this
<https://www.wikidata.org/wiki/Wikidata:Project_chat#Querying_Wikidata_compared_with_DBpedia>
to the Project Chat the other day. I'd love to hear more opinions. :)
Best,
Reem
--
*Kind regards, Reem Al-Kashif*
> …body of the email, leaving the
> subject line intact. Thank you.
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
--
*Kind regards, Reem Al-Kashif*
Hi Amir,
I don't think such a tool exists right now, but I believe a SPARQL query can
do it. You can view some SPARQL examples, and maybe adapt some of them, here
(https://m.wikidata.org/wiki/Special:MyLanguage/Wikidata:SPARQL_query_service/queries/examples)
or you can go here …
--
*Kind regards, Reem Al-Kashif*
Hi,
I don't have any ideas about how to develop this, but it seems like an
interesting project!
Best,
Reem
On 30 Mar 2017 10:17, "Gerard Meijssen" wrote:
> Hoi,
> Much of the content of DBpedia and Wikidata has the same origin:
> harvesting data from a Wikipedia.
> Wikimedia-l mailing list, guidelines at:
> https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines
> New messages to: wikimedi...@lists.wikimedia.org
> Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l,
> <mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>
--
*Kind regards, Reem Al-Kashif*
> …ns to expand.
>
> Nemo
>
>
--
*Kind regards, Reem Al-Kashif*
Hi,
I'm just wondering if anybody knows what happened to the DizzyLogic wiki
parser?
Best,
Reem
--
*Kind regards, Reem Al-Kashif*