https://bugzilla.wikimedia.org/show_bug.cgi?id=34778

--- Comment #14 from Platonides <platoni...@gmail.com> 2012-04-23 18:00:19 UTC ---
(In reply to comment #11)
> Dear Platonides,
> 
> Thank you for the comments on the latest revision.
> 
> Could you please provide pointers to the best practices regarding retrieving
> the title, and how best to parse URLs without stripos() and explode()?

The request object has a getVal() method.
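For instance, a sketch of what that looks like (assuming MediaWiki 1.18+, where the context request is available; adjust to however your hook obtains the request):

```php
$request = $this->getRequest(); // or the global $wgRequest in older hook code
$title = $request->getVal( 'title' );  // null if the parameter is absent
$oldid = $request->getInt( 'oldid' );  // 0 if absent
```

That replaces the stripos()/explode() URL parsing entirely, and it keeps working when the wiki uses short URLs or a different article path.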


> Regarding the timezone, we're not changing it other than to follow the RFC
> specification that all timestamps in HTTP headers MUST be in GMT, "without
> exception". This does not change the UX in the page. Please see:
> http://tools.ietf.org/html/rfc2616#page-20

Of course; so either change the default and set it back to what it was before,
or - better - use a function that doesn't require switching the default
timezone.
I think your mmConvertTimestamp() function could be replaced with a call to
wfTimestamp() with TS_RFC2822 output.
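Something along these lines (a sketch; $rev stands for whatever Revision object you already have at that point):

```php
// wfTimestamp() converts between timestamp formats directly; TS_RFC2822
// output is always in GMT, as the HTTP spec requires, so there is no need
// to touch the process-wide default timezone.
$httpDate = wfTimestamp( TS_RFC2822, $rev->getTimestamp() );
```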


> Could you please confirm what you mean by "HTML injection building links"? We
> do not change the HTML of the returned history page.
You're handcrafting many URLs, such as
 $first['uri'] = $alturi . "?title=" . $title . "&oldid=" . $oldestRevID;

This is horrible practice. It'd lead to HTML injection if output in HTML; in
HTTP headers the server might be tricked into redirecting to an attacker's
website (maybe not possible with the broken way you read them, but still...).
Look at wfExpandUrl() and wfAppendQuery().
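The quoted line could be built with the helpers instead, roughly like this (a sketch; it assumes $title is already in URL form, e.g. from $titleObj->getPrefixedURL()):

```php
// wfAppendQuery() takes care of the ?/& separator and of encoding the
// parameters; wfExpandUrl() turns a local path into an absolute URL,
// which is what you want for a Location header.
$first['uri'] = wfExpandUrl(
	wfAppendQuery( $alturi, array( 'title' => $title, 'oldid' => $oldestRevID ) )
);
```

If you have the Title object at hand, $titleObj->getFullURL( array( 'oldid' => $oldestRevID ) ) does all of it in one call.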


> If there is a better way to discover the names of the Special pages (timegate
> and timemap) generated within the extension, please let us know and we'll
> update the extension.
(Answered by MaxSem)

> We do request only the parts of the history list that are required for the
> different operations.  The timegate needs the closest match, first, last,
> previous and next.  The timemap is a serialization of the set of versions of
> the resource, and thus requires the entire history list.

This unbounded query retrieves every revision of the page:

$xares = $dbr->select( 'revision',
	array( 'rev_id', 'rev_timestamp' ),
	array( "rev_page=$pg_id" ),
	__METHOD__,
	array( 'DISTINCT', 'ORDER BY' => 'rev_id DESC' ) );

Suppose we were visiting https://en.wikipedia.org/wiki/Main_Page which has 4104
revisions. Can you justify why you need all of them instead of just 3 or 4?
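For the timegate case you only need the neighbourhood of the requested datetime, so two small indexed queries would do. A sketch, assuming $ts holds the requested timestamp in TS_MW form (the names follow the query quoted above):

```php
// Revisions at or before the requested datetime: the closest match plus
// its predecessor. A mirror-image query with '>' and ASC gives the next
// revision. LIMIT keeps this to a couple of rows instead of thousands.
$before = $dbr->select( 'revision',
	array( 'rev_id', 'rev_timestamp' ),
	array(
		'rev_page' => $pg_id,
		'rev_timestamp <= ' . $dbr->addQuotes( $ts ),
	),
	__METHOD__,
	array( 'ORDER BY' => 'rev_timestamp DESC', 'LIMIT' => 2 ) );
```

First and last revisions are likewise single LIMIT-1 queries, so the timegate never needs the full list; only the timemap does.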



Also, it'd be helpful if you provided a public repository of the extension. You
can request that it be hosted with the other MediaWiki extensions in our
repository; that'd help later with deployment.

-- 
Configure bugmail: https://bugzilla.wikimedia.org/userprefs.cgi?tab=email
------- You are receiving this mail because: -------
You are the assignee for the bug.
You are on the CC list for the bug.

_______________________________________________
Wikibugs-l mailing list
Wikibugs-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikibugs-l