[Wikitech-l] rel=canonical links changes

2015-03-13 Thread Jiang BIAN
Hi Wikimedia devs,

I noticed "rel=canonical links" in ruwiki is using "https" (while enwiki is
still using "http").

I'd like to understand if this is just a "bug" or is a starting of some
large change across all language sites (then what's the timeline)?
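
For anyone who wants to check a wiki themselves, something like this should
show the canonical link tag (the article URL is just an example; any page
works, and the wget/grep combination is only a rough sketch):

ruwiki:
$ wget -qO- 'https://ru.wikipedia.org/wiki/Google' | grep -o '<link rel="canonical"[^>]*>'

enwiki:
$ wget -qO- 'https://en.wikipedia.org/wiki/Google' | grep -o '<link rel="canonical"[^>]*>'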


Thanks

-- 
Jiang BIAN

This email may be confidential or privileged.  If you received this
communication by mistake, please don't forward it to anyone else, please
erase all copies and attachments, and please let me know that it went to
the wrong person.  Thanks.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] unexpected error info in HTML

2013-08-01 Thread Jiang BIAN
Hi,

I noticed that some pages we crawled contain an error message like this:

Failed to render property P373:
Wikibase\LanguageWithConversion::factory: given languages do not have the
same parent language


But when I open the URL in a browser, there is no such message, and fetching
the page via index.php also returns normal content without the error.

Here are examples you can try:

bad (the zh-cn variant URL returns the error):
$ wget 'http://zh.wikipedia.org/zh-cn/Google'

good (going through index.php does not):
$ wget 'http://zh.wikipedia.org/w/index.php?title=Google'


It looks like something is wrong on the Wikipedia side; is there anything
that needs to be fixed?
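
A quick way to confirm whether the error string is actually present in the
served HTML is to grep the wget output (nothing fancy, just counting
matching lines):

$ wget -qO- 'http://zh.wikipedia.org/zh-cn/Google' | grep -c 'Failed to render property'
$ wget -qO- 'http://zh.wikipedia.org/w/index.php?title=Google' | grep -c 'Failed to render property'

In our crawls the first command shows a non-zero count, while the second
shows 0.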



Thanks

-- 
Jiang BIAN

This email may be confidential or privileged.  If you received this
communication by mistake, please don't forward it to anyone else, please
erase all copies and attachments, and please let me know that it went to
the wrong person.  Thanks.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] unexpected error info in HTML

2013-08-22 Thread Jiang BIAN
We are actually crawling the HTML via a bot, so the bug is not actually fixed
for non-logged-in users, right?

Could you share a link to the bug?

Thanks


On Thu, Aug 22, 2013 at 4:38 PM, Liangent  wrote:

> On Fri, Aug 23, 2013 at 7:06 AM, Sumana Harihareswara <
> suma...@wikimedia.org
> > wrote:
>
> > On 08/01/2013 03:08 AM, Jiang BIAN  wrote:
> > > Hi,
> > >
> > > I noticed some pages we crawled containing error message like this;
> > >
> > >  > class="mw-content-ltr"> > > class="error">Failed to render property P373:
> > > Wikibase\LanguageWithConversion::factory: given languages do not have
> the
> > > same parent language
> > >
> > >
> > > But when I open the url in browser, there is no such message. And using
> > > index.php can also get normal content without error messages.
> > >
> > > Here are examples you can retry:
> > >
> > > bad
> > > $ wget 'http://zh.wikipedia.org/zh-cn/Google'
> > >
> > > good
> > > $ wget 'http://zh.wikipedia.org/w/index.php?title=Google'
> > >
> > >
> > > Looks like something is wrong on Wikipedia side, anything need to fix?
> > >
> > >
> > >
> > > Thanks
> >
> > I checked with Jiang Bian and found out that this is still happening --
> > can anyone help Google out here? :-)
> >
> > --
> > Sumana Harihareswara
> > Engineering Community Manager
> > Wikimedia Foundation
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
>
> There was a bug in some Wikibase version deployed in July which caused this
> error, but a fix was backported soon and since then I've never seen any
> similar error as a logged in user. If you still see some errors only when
> unlogged in at particular URLs (like what you described) now, it's likely
> that those URLs got cached in Squid when the bug was live... In this case
> purging those pages[1] should be able to fix the issue.
>
> [1] https://en.wikipedia.org/wiki/Wikipedia:Purge
>
> -Liangent
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>



-- 
Jiang BIAN

This email may be confidential or privileged.  If you received this
communication by mistake, please don't forward it to anyone else, please
erase all copies and attachments, and please let me know that it went to
the wrong person.  Thanks.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] unexpected error info in HTML

2013-08-22 Thread Jiang BIAN
Thanks for the link, but I think that patch targets the language-variant
related fix.

We actually observed stale caches across a wider range of pages; see the bug
entry: https://bugzilla.wikimedia.org/show_bug.cgi?id=46014
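
If we do end up purging the affected pages in batch (as suggested below), I
would expect something along these lines to work against the action API; the
module name and parameters are the standard ones, but I have not tested
whether anonymous requests need anything extra:

$ wget -qO- --post-data 'action=purge&titles=Google&format=json' 'http://zh.wikipedia.org/w/api.php'

(Multiple titles can be passed pipe-separated in the titles parameter.)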


On Thu, Aug 22, 2013 at 5:26 PM, Liangent  wrote:

> On Fri, Aug 23, 2013 at 8:13 AM, Jiang BIAN  wrote:
>
> > We are actually crawling the HTML via bot, so the bug is not actually
> fixed
> > for non-login user, right?
> >
>
> I can't think of a good way to fix the problem from this aspect besides
> waiting for old cached page to expire, unless some sysadmin is happy to
> nuke all existing Squid cached pages.
>
> However if you have a list of affected pages as you're crawling HTML, which
> we don't have, you can simply purge them in batch and recrawl those pages.
>
>
> > Could you share the bug's link?
> >
>
> There was no bug created in bugzilla... I submitted a patch[1] directly to
> fix the bug once it was spotted.
>
> [1] https://gerrit.wikimedia.org/r/#/c/76060/
>
> -Liangent
>
>
> >
> > Thanks
> >
> >
> > On Thu, Aug 22, 2013 at 4:38 PM, Liangent  wrote:
> >
> > > On Fri, Aug 23, 2013 at 7:06 AM, Sumana Harihareswara <
> > > suma...@wikimedia.org
> > > > wrote:
> > >
> > > > On 08/01/2013 03:08 AM, Jiang BIAN  wrote:
> > > > > Hi,
> > > > >
> > > > > I noticed some pages we crawled containing error message like this;
> > > > >
> > > > >  > > > class="mw-content-ltr"> > > > > class="error">Failed to render property P373:
> > > > > Wikibase\LanguageWithConversion::factory: given languages do not
> have
> > > the
> > > > > same parent language
> > > > >
> > > > >
> > > > > But when I open the url in browser, there is no such message. And
> > using
> > > > > index.php can also get normal content without error messages.
> > > > >
> > > > > Here are examples you can retry:
> > > > >
> > > > > bad
> > > > > $ wget 'http://zh.wikipedia.org/zh-cn/Google'
> > > > >
> > > > > good
> > > > > $ wget 'http://zh.wikipedia.org/w/index.php?title=Google'
> > > > >
> > > > >
> > > > > Looks like something is wrong on Wikipedia side, anything need to
> > fix?
> > > > >
> > > > >
> > > > >
> > > > > Thanks
> > > >
> > > > I checked with Jiang Bian and found out that this is still happening
> --
> > > > can anyone help Google out here? :-)
> > > >
> > > > --
> > > > Sumana Harihareswara
> > > > Engineering Community Manager
> > > > Wikimedia Foundation
> > > >
> > > > ___
> > > > Wikitech-l mailing list
> > > > Wikitech-l@lists.wikimedia.org
> > > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> > > >
> > >
> > > There was a bug in some Wikibase version deployed in July which caused
> > this
> > > error, but a fix was backported soon and since then I've never seen any
> > > similar error as a logged in user. If you still see some errors only
> when
> > > unlogged in at particular URLs (like what you described) now, it's
> likely
> > > that those URLs got cached in Squid when the bug was live... In this
> case
> > > purging those pages[1] should be able to fix the issue.
> > >
> > > [1] https://en.wikipedia.org/wiki/Wikipedia:Purge
> > >
> > > -Liangent
> > > ___
> > > Wikitech-l mailing list
> > > Wikitech-l@lists.wikimedia.org
> > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> > >
> >
> >
> >
> > --
> > Jiang BIAN
> >
> > This email may be confidential or privileged.  If you received this
> > communication by mistake, please don't forward it to anyone else, please
> > erase all copies and attachments, and please let me know that it went to
> > the wrong person.  Thanks.
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>



-- 
Jiang BIAN

This email may be confidential or privileged.  If you received this
communication by mistake, please don't forward it to anyone else, please
erase all copies and attachments, and please let me know that it went to
the wrong person.  Thanks.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Wikimedia REST API hits v1.0

2017-04-06 Thread Jiang BIAN
Congratulations!
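
For anyone who wants a quick first try, a minimal request against the entry
point mentioned below looks roughly like this (the page/summary path is just
one example endpoint from the API docs, so treat it as a sketch rather than
part of the announcement):

$ wget -qO- 'https://en.wikipedia.org/api/rest_v1/page/summary/Google'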

On Thu, Apr 6, 2017 at 5:14 PM, Victoria Coleman 
wrote:

> Awesome! Well done Gabriel & team & everyone!!
>
> Victoria
> > On Apr 6, 2017, at 4:17 PM, Gabriel Wicke  wrote:
> >
> > It is official: The Wikimedia REST API
> > <https://www.mediawiki.org/wiki/REST_API>, your scalable and fresh
> source
> > of Wikimedia content and data in machine-readable formats, is now ready
> for
> > full production use. The 1.0 release means that you can now fully rely on
> > the stability guarantees set out in the API versioning policy
> > <https://www.mediawiki.org/wiki/API_versioning>. Read more about the
> > stability levels, use cases, as well as technical background on how the
> > REST API integrates with our caching layers in our blog post:
> >
> >
> > https://blog.wikimedia.org/2017/04/06/wikimedia-rest-api/
> >
> >
> > We are looking forward to your feedback at
> > https://www.mediawiki.org/wiki/Talk:REST_API, or here on-list.
> >
> >
> > This release was made possible by the hard work of many. First of all,
> > the Services
> > team <https://www.mediawiki.org/wiki/Wikimedia_Services> (Marko Obrovac,
> > Petr Pchelko and Eric Evans), created the general API proxy and storage
> > functionality, and curated the API documentation
> > <https://en.wikipedia.org/api/rest_v1/>. The actual end points are
> > co-designed with, and largely backed, by services developed by the
> > following WMF teams: Editing (Parsing
> > <https://www.mediawiki.org/wiki/Parsing> and citoid
> > <https://www.mediawiki.org/wiki/Citoid>), Reading
> > <https://www.mediawiki.org/wiki/Reading> (Infrastructure
> > <https://www.mediawiki.org/wiki/Wikimedia_Reading_Infrastructure_team>
> and
> > Web <https://www.mediawiki.org/wiki/Reading/Web>), and Analytics
> > <https://www.mediawiki.org/wiki/Analytics>. Volunteer Moritz Schubotz
> and
> > the MathJax <https://www.mathjax.org/> community contributed the math
> end
> > points, and the PDF end point is powered by the open source
> > electron-render-service <https://github.com/msokk/
> electron-render-service>
> > project. Finally, the WMF techops team
> > <https://www.mediawiki.org/wiki/Wikimedia_Technical_Operations> runs the
> > excellent caching infrastructure that makes this API scale so well, and
> > have helped with many aspects from hardware procurement to firewalling.
> >
> >
> > Thank you all for your hard work!
> >
> >
> > We are looking forward to continuing to work with you all on making this
> > API an even better platform for building user experiences, services, and
> > tools.
> >
> >
> > Cheers,
> >
> >
> > Gabriel and the Services team
> >
> >
> > --
> > Gabriel Wicke
> > Principal Engineer, Wikimedia Foundation
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>



-- 
Jiang BIAN

This email may be confidential or privileged.  If you received this
communication by mistake, please don't forward it to anyone else, please
erase all copies and attachments, and please let me know that it went to
the wrong person.  Thanks.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l