2010/7/14 Salvatore Loguercio :
>
> Thanks much for the quick answer, I see the problem now.
> I wonder how these large wikitexts could be written on my target Wiki.
> Is there a way to 'force' the PHP max execution time limit through the API?
> If not, I guess I will have to contact a sysop..
>
Thanks much for the quick answer, I see the problem now.
I wonder how these large wikitexts could be written on my target Wiki.
Is there a way to 'force' the PHP max execution time limit through the API?
If not, I guess I will have to contact a sysop..
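For reference, the limit in question is a server-side PHP setting, so it cannot be forced through the API itself; a sysop would raise it in php.ini (or per-request via set_time_limit()). An illustrative fragment, with an arbitrary value:

```ini
; php.ini -- illustrative value; PHP's stock default is 30 seconds
max_execution_time = 300
```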
Roan Kattouw-2 wrote:
>
> 2010/7/13 Sal976 :
>> I wonder if this error is due to server timeout or exceeding number of
>> characters..
>> Any suggestions?
>>
> It spends so much time parsing the wikitext you supplied that PHP's
> max execution time limit was exceeded, which causes an empty (0-byte)
> response. You can fix the error by raising PHP's max execution time
> limit on the server.
Hi,
I am using mwclient 0.6.4 (r93) to import some Wiki pages from en.wikipedia
to another wiki installation (presumably running MediaWiki 1.15).
Everything works fine, except when I try to import 'big' pages, e.g.:
http://en.wikipedia.org/wiki/Grb2
content = MediaWiki text to be imported
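Under the hood such an import ends in an api.php edit POST, whose size grows with the page; a rough standard-library sketch of the request being built (the helper name, token, and text values are placeholders, and a real edit also needs login and a valid edit token):

```python
from urllib.parse import urlencode

def build_edit_request(host, title, text, token='TOKEN'):
    # Standard MediaWiki action=edit POST parameters; host, title, and
    # token below are placeholders, not the thread's real values.
    url = 'http://%s/w/api.php' % host
    body = urlencode({
        'action': 'edit',
        'format': 'json',
        'title': title,
        'text': text,
        'token': token,
    })
    return url, body

url, body = build_edit_request('en.wikipedia.org', 'Grb2', 'wikitext here')
```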