[Wikidata] Re: Private Information Retrieval

2022-01-29 Thread Ed Summers
It is an interesting idea! Are you concerned at all about how a series of 
queries from a known origin might be interpreted? I guess you can always route 
your queries through Tor, assuming Wikidata continues to allow that?


[Wikidata] Re: Wikidata Query Service scaling update Aug 2021

2021-08-19 Thread Ed Summers
Marko,

First, I just wanted to say it is *awesome* to see this level of
transparency and clarity about the state of the service.

Maybe this is oversimplifying things, but is it accurate to say that
there are two orthogonal problems here?

1. The underlying technology (BlazeGraph) is end-of-life and needs to
be replaced.
2. The Query service is open to the public, with no authentication,
which means the Wikidata team has very little idea who/what depends on
the service.

For 2, I wonder if it might make sense to start requiring registration &
authentication? Or is this not the wiki way?

//Ed


[Wikidata] Re: History of some original Wikidata design decisions?

2021-07-23 Thread Ed Summers

> On Thu, Jul 22, 2021 at 6:56 PM Denny Vrandečić <dvrande...@wikimedia.org> wrote:
> > 
> > I hope that helps with the historical deep dive :) Lydia and I
> > really should write that book!

A Wikidata book would be most excellent, especially one by both of you!
If there's anything interested people can do to help make it happen (a
little crowdfunding or what have you), please let us know.

//Ed


Re: [Wikidata] Linking to place with Wikipedia page but no Wikidata link

2017-09-02 Thread Ed Summers

> On Sep 2, 2017, at 7:47 AM, Jane Darnell  wrote:
> 
> Your note really made me feel so sad. I try to motivate my Wikipedian friends 
> into doing more on Wikidata, and each time they react the way you did, with a 
> sentence like "I imagined that the mapping between Wikipedia and Wikidata was 
> ultra-automated." I guess there is something about the "data" word in the 
> name that makes people assume it is technical, or that being 
> "machine-readable" makes it impossible for humans to read and without "bot" 
> knowledge, there is no place for "normal contributors" to help out.

I appreciate this perspective a great deal. I think it's great that you are 
motivating users to edit Wikidata--it's really important. Wikidata is nothing 
(IMHO) without the human-in-the-loop.

But as a practical matter, wouldn't it be useful if there were stubs in Wikidata 
that would help editors identify which entities need attention? Or would the 
vastness of it cause a problem?

I can certainly see an argument for an embargo period to give counter-vandalism 
efforts a chance to triage the new pages. But after that point, wouldn't it be 
useful if a bot monitored the language Wikipedias for new entries and then 
added them to Wikidata so that people could fill them out?

I'm just throwing ideas around here, and am not trying to be critical of the 
current state of affairs. You all are doing amazing work.

//Ed




Re: [Wikidata] Linking to place with Wikipedia page but no Wikidata link

2017-09-01 Thread Ed Summers
So each language Wikipedia does this on an ad hoc basis?

> On Sep 1, 2017, at 9:36 AM, Jane Darnell  wrote:
> 
> Checking the history of that page shows it was recently created. Not sure how 
> the Finns do this but like the Dutch they probably have a bot that creates 
> Wikidata items after a month or so has passed (this avoids creating items for 
> things that get deleted through the "speedy delete" process). You can create 
> the item yourself, or wait another month I guess.
> https://fi.wikipedia.org/w/index.php?title=Teuro&action=history




Re: [Wikidata] sparql + regex?

2016-09-25 Thread Ed Summers

> On Sep 25, 2016, at 7:19 AM, David Abián  wrote:
> 
> Tracked in Phabricator...
> 
> https://phabricator.wikimedia.org/T146576

Thanks!

//Ed




[Wikidata] sparql + regex?

2016-09-25 Thread Ed Summers
I've been experimenting a bit with queries that contain regular expressions and 
have noticed that they seem to be triggering 502 Bad Gateway errors. Or perhaps 
it's just a coincidence and there were other things going on around 10AM GMT?

Here's an example query where I'm looking for cities that start with "Silver":

---

PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX wd: <http://www.wikidata.org/entity/>
PREFIX wdt: <http://www.wikidata.org/prop/direct/>

SELECT ?s ?label
WHERE {
  ?s wdt:P31 wd:Q515 .
  ?s rdfs:label ?label
  FILTER(regex(?label, "^Silver"))
}
LIMIT 15

---

Am I doing something wrong in the query? Occasionally it seems to work, but 
most of the time it waits for a while and then I get the 502 error. Any 
guidance you may have would be appreciated.
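
In case it helps with debugging, here's a variant of the same query that
restricts labels to English and uses STRSTARTS instead of a regex, which I'd
guess is cheaper for the engine to evaluate. It's only a sketch of a possible
workaround, not a confirmed fix for the 502s:

---

PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX wd: <http://www.wikidata.org/entity/>
PREFIX wdt: <http://www.wikidata.org/prop/direct/>

# Cities whose English label starts with "Silver", without using regex().
SELECT ?s ?label
WHERE {
  ?s wdt:P31 wd:Q515 .
  ?s rdfs:label ?label .
  FILTER(LANG(?label) = "en")
  FILTER(STRSTARTS(?label, "Silver"))
}
LIMIT 15

---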

//Ed




[Wikidata] Wikidata Suggest?

2016-09-22 Thread Ed Summers
I think this question has come up before, but I was wondering whether anyone 
has a working Wikidata Suggest widget of some kind that works similarly 
to Freebase Suggest, which has finally been turned off completely:

https://developers.google.com/freebase/v1/suggest

I've used Freebase entities in the past to provide some measure of data 
normalization, and am in the middle of converting my entities over to Wikidata. 
The specific application is a jobs board that needs to control the names for 
Organizations, Subjects and Locations [1,2].

I need to create suggest-like functionality in the application and was 
wondering if anyone else had done anything like Freebase Suggest already. If 
this is a gap, I was thinking of creating a React component that uses the 
Wikidata SPARQL endpoint [3], which I was pleased to see already supports CORS. 
But I'm not married to React, so if someone has another JavaScript library that 
does this I'd love to hear from you.
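
To give a sense of the shape of the thing, here's roughly the kind of query 
such a component might send as the user types. It's untested, and the Q43229 
(organization) class and the typed prefix are just placeholders; in practice 
something like the wbsearchentities API might be a better fit for autocomplete, 
but it shows what the CORS-enabled SPARQL endpoint makes possible:

---

PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX wd: <http://www.wikidata.org/entity/>
PREFIX wdt: <http://www.wikidata.org/prop/direct/>

# Hypothetical suggest query: organizations whose English label starts with
# whatever the user has typed so far (here "mozil").
SELECT ?item ?label
WHERE {
  ?item wdt:P31 wd:Q43229 .
  ?item rdfs:label ?label .
  FILTER(LANG(?label) = "en")
  FILTER(STRSTARTS(LCASE(?label), "mozil"))
}
LIMIT 10

---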

//Ed

[1] https://github.com/code4lib/shortimer/issues/38
[2] http://jobs.code4lib.org
[3] https://query.wikidata.org/bigdata/namespace/wdq/sparql




Re: [Wikidata] Bounty for a OpenRefine issue regarding Wikidata reconciliation

2015-08-07 Thread Ed Summers
I believe it also still needs human-readable types instead of codes to be 
usable. It would be super to see the source too!

//Ed


> On Aug 7, 2015, at 7:39 AM, Thad Guidry  wrote:
> 
> Magnus,
> 
> You don't have the 
> https://tools.wmflabs.org/wikidata-reconcile/?callback=jsonp working 
> correctly in order to add the reconcile URL of 
> https://tools.wmflabs.org/wikidata-reconcile to OpenRefine.
> 
> OpenRefine needs a URL path that always returns JSON, so it can then append the 
> various parameters, not a URL that supplies HTML and then JSON after the 
> query.
> 
> See this about the JSON Service Metadata that your implementation still needs 
> to do correctly: 
> https://github.com/OpenRefine/OpenRefine/wiki/Reconciliation-Service-API#service-metadata
> 
> If you do that, and the community agrees that your implementation is 
> acceptable, then you win the bounty!
> 
> : 
> 
> Thad
> +ThadGuidry
> 
>> On Fri, Aug 7, 2015 at 4:46 AM, Andrea Zanni  
>> wrote:
>> Yes Magnus. 
>> In my absolute newbiness, I was in fact not understanding 
>> if you already solved the problem, or it's just a matter of integration with 
>> OpenRefine.
>> 
>> Aubrey
>> 
>>> On Fri, Aug 7, 2015 at 11:30 AM, Magnus Manske 
>>>  wrote:
>>> You guys saw this right?
>>> https://tools.wmflabs.org/wikidata-reconcile/
>>> 
 On Fri, Aug 7, 2015 at 9:18 AM Andrea Zanni  
 wrote:
 Yes Thad, 
 I noticed after I posted that there was a recent thread and you were 
 involved :-D
 I'm just learning OpenRefine and I thought the bounty needed to be shared.
 
 Aubrey
 
> On Fri, Aug 7, 2015 at 2:10 AM, Thad Guidry  wrote:
> You found my bounty offered and posted by me :)  And yes, it's still open 
> and not that hard to implement and hack on to get OpenRefine to use a 
> reconcile service.  Our loose documentation on it is here: 
> https://github.com/OpenRefine/OpenRefine/wiki/Reconciliation-Service-API
> 
> If anyone has questions you can mailto: openrefine-...@googlegroups.com
> 
> https://groups.google.com/forum/#!forum/openrefine-dev
> 
> Happy Hacking, and the 1st working implementation wins the bounty!
> 
> 
> Thad
> +ThadGuidry
> 
>> On Thu, Aug 6, 2015 at 5:48 PM, Andrea Zanni  
>> wrote:
>> Here are some details:
>> https://github.com/OpenRefine/OpenRefine/issues/805
>> and here for the $265  bounty: 
>> https://www.bountysource.com/issues/985941-implement-wikidata-reconciliation-was-freebase
>> 
>> I hope someone is interested :-)
>> 
>> Aubrey
>> 


Re: [Wikidata] tool to do multiple searches in Wikidata at once?

2015-07-24 Thread Ed Summers

> On Jul 24, 2015, at 5:37 AM, Magnus Manske  
> wrote:
> 
> Early version:
> https://tools.wmflabs.org/wikidata-reconcile/

Wow, that was crazy fast, Magnus :) I gave it a try in the latest OpenRefine on 
a very simple spreadsheet:

city
Paris
New York City

OpenRefine threw an exception (see below). I think OpenRefine is expecting the 
type key in each result. From the documentation [1] it looks like the value for 
the type key should be an array of types? I don’t know if it’s easy to get at 
that in your app. Speaking of which, is the code available somewhere?

Awesome work!

//Ed

[1] https://github.com/OpenRefine/OpenRefine/wiki/Reconciliation-Service-API#query-response


org.json.JSONException: JSONObject["type"] not found.
    at org.json.JSONObject.get(JSONObject.java:406)
    at org.json.JSONObject.getJSONArray(JSONObject.java:482)
    at com.google.refine.commands.recon.GuessTypesOfColumnCommand.guessTypes(GuessTypesOfColumnCommand.java:216)
    at com.google.refine.commands.recon.GuessTypesOfColumnCommand.doPost(GuessTypesOfColumnCommand.java:89)
    at com.google.refine.RefineServlet.service(RefineServlet.java:177)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
    at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
    at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1166)
    at org.mortbay.servlet.UserAgentFilter.doFilter(UserAgentFilter.java:81)
    at org.mortbay.servlet.GzipFilter.doFilter(GzipFilter.java:132)
    at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1157)
    at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:388)
    at org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
    at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
    at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:765)
    at org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:418)
    at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
    at org.mortbay.jetty.Server.handle(Server.java:326)
    at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
    at org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:923)
    at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:547)
    at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
    at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
    at org.mortbay.jetty.bio.SocketConnector$Connection.run(SocketConnector.java:228)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)









Re: [Wikidata] tool to do multiple searches in Wikidata at once?

2015-07-23 Thread Ed Summers
Hi Sandra,

I suspect this is a bit too nerdy, but I was reconciling some messy CSV data to 
Wikidata recently and ended up writing a little Python library and command line 
tool that uses the wbgetentities API call.

https://github.com/edsu/wikidata_suggest

I also made an awkward video for Day of Digital Humanities here:

https://vimeo.com/128305641

//Ed

> On Jul 23, 2015, at 3:29 AM, Gerard Meijssen  
> wrote:
> 
> Hoi,
> When the list is a list of articles in a Wikipedia, try "Linked items", one 
> magnificent tool by Magnus that can be used for this.
> Thanks,
>  GerardM
> 
> 
> 
> 
> https://tools.wmflabs.org/wikidata-todo/linked_items.php
> 
> On 23 July 2015 at 08:44, Sandra Fauconnier  
> wrote:
> Hi everyone,
> 
> I’ve been in the situation quite often (edit-a-thons; various to do lists) 
> where I had a list of terms (most usually names of a few hundreds of people, 
> or titles of Wikipedia articles), where I wanted to do a quick search on 
> Wikidata to retrieve each of these concept’s Q number.
> Does anyone know of a tool that helps me make this easier? Enter a list of, 
> say, 100 of these search terms, and receive Q number suggestions for each of 
> them? I’ve looked around on wmflabs but have not found anything in that 
> direction (also not with the help of Hay’s awesome tool directory).
> 
> Till now, I’ve done all these searches manually - use an excel sheet, look 
> for each term individually, enter Q number for each term - quite accurate but 
> very time-consuming!
> 
> Would appreciate all help/tips !
> Thanks! Sandra (User:Spinster)
> 


