I'm using CentOS 5.5.
Yes, I have php-pear:
yum install php-pear*
Loaded plugins: fastestmirror
Loading mirror speeds from cached hostfile
Setting up Install Process
Package 1:php-pear-1.4.9-6.el5.noarch already installed and latest version
Package php-pear-Net-URL-1.0.15-1.el5.centos.noarch already installed
fo&format=json&action=query&redirects=true
Class PEAR_Error not found; skipped loading
I installed the MediaWiki software via SVN on CentOS; I installed all the PHP modules
and Perl too.
--
From: "Enrique"
Sent: Thursday, September 16
To: "MediaWiki announcements and site admin list"
Subject: Re: [Mediawiki-l] blank page
> Install php-pear
>
> On Sep 15, 2010 4:49 PM, "Enrique" wrote:
>
> Hello all.
> Most pages show up blank after installing the MediaWiki software and
> importing the XML dump. I enabled the
Hello all.
Most pages show up blank after installing the MediaWiki software and
importing the XML dump. I enabled $wgDebugLogFile and $wgShowExceptionDetails
and see this error in the log file:
Class PEAR_Error not found; skipped loading
Thanks
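For context, enabling those two settings usually comes down to a couple of lines in LocalSettings.php; this is only a sketch, and the log path here is a placeholder rather than the one actually in use:

$wgDebugLogFile = '/tmp/mediawiki-debug.log'; // placeholder path, must be writable by the web server
$wgShowExceptionDetails = true; // show full exception details instead of a blank page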
Many image thumbnails in my wiki do not show in Internet Explorer, but they do
in Mozilla.
Any ideas?
Thanks.
___
MediaWiki-l mailing list
MediaWiki-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
Fine, I set http_proxy in the wgetrc config file and it works fine.
Thanks
--
From: "Platonides"
Sent: Wednesday, December 23, 2009 8:46 AM
To:
Subject: Re: [Mediawiki-l] error in header
> Enrique wrote:
>> Hello all
>> I'm
[function.file-get-contents]: failed to open stream:
php_network_getaddresses: getaddrinfo failed: Name or service not known in
/home/www/es-wiki/includes/HttpFunctions.php on line 113
I don't know if I need some package or something
thanks
------
Thanks, I updated Java and it works fine.
--
From: "Enrique"
Sent: Tuesday, December 22, 2009 1:29 PM
To: "MediaWiki announcements and site admin list"
Subject: Re: [Mediawiki-l] importing dump
> I'm using: java fu
I'm using: java full version "1.5.0_17-b04" on Debian Lenny
--
From: "Platonides"
Sent: Tuesday, December 22, 2009 9:57 AM
To:
Subject: Re: [Mediawiki-l] importing dump
> Enrique wrote:
>> Hello all,
>>
Hello all,
I'm importing eswiki-20091123-pages-meta-current.xml and have downloaded
mwdumper.jar from
http://downloads.dbpedia.org/mwdumper_invalid_contributor.zip (via
https://bugzilla.wikimedia.org/show_bug.cgi?id=18328), but I receive an error.
So I'm trying with mwimport and receive an error too.
Hi all!
I'm planning to migrate my local offline Wikipedia to another, more powerful server;
it has CentOS 5.2 x64 installed for now.
Can I copy the compiled math functions (the /math directory) of the old server (Debian Lenny
based) to my new server (CentOS), to skip the step of installing OCaml and
compiling texvc?
I'm downloading images with the wikix software, but the amount of image data
and the internet traffic is too high.
I have configured $wgForeignFileRepos, but the response of the site is too
slow.
Has someone made some change or method in the wikix code to
download only thumbnails??
Sorry, I forgot to include more details.
The error always occurs when it is working on the same path:
/images/4/47/Cour_intérieure_logis_royal_chateau_d'Amboise.JPG
- Original Message -
From: "Enrique"
To:
Sent: Friday, July 10, 2009 2:38 PM
Subject: [Mediawiki-l] Fatal error
I'm running php /maintenance/rebuildImages.php --missing. It runs fine, then after
some time it shows me an error: Fatal error: Call to a member function recordUpload() on
a non-object in /maintenance/rebuildImages.php on line 196
I downloaded MediaWiki from SVN.
MediaWiki 1.16alpha
PHP 5.2.6-1+lenny3 (apach
I'm receiving this error or notice: untested generator 'MediaWiki 1.15alpha',
expect trouble ahead while importing eswiki-20090615-pages-articles.xml with
the mwimport Perl script. It imports all pages, but I don't know what this
notice means. I'm using MediaWiki 1.15.0 downloaded from SVN, I believe
I increased physical memory to 3 GB, but the problem continues when I run
importDump.php
- Original Message -
From: "Platonides"
To:
Sent: Friday, July 03, 2009 9:58 AM
Subject: Re: [Mediawiki-l] articles and templates
> Enrique wrote:
>> I downloaded eswiki-200
of memory
> On 7/2/09, Enrique wrote:
>>
>> I'm importing eswiki-pages-articles.xml with the importdump.php script; I
>> tried
>> many times and always receive this error, sometimes referring to another
>> include, but always when it reaches line 52600 (1.87 pages/sec
I'm importing eswiki-pages-articles.xml with the importdump.php script; I have tried
many times and always receive this error, sometimes referring to another
include, but always when it reaches line 52600 (1.87 pages/sec 1.87
revs/sec)
in progress.log.
The error line:
52600 (1.87 pages/sec 1.87 revs/sec)
Fatal
When I click on many links, they go to an article that does not
correspond to the link.
How do I fix this after importing the .xml dump with mwdumper.jar?
I downloaded eswiki-20090615-pages-articles and imported it with
mwdumper.jar. When I go to see an article, in the place of the templates
there are articles, and then I see those articles in many other pages.
Last year I did the same procedure and saw the main page; now I don't. In
the place of my templates I h
I'm importing eswiki-20090615-pages-articles.xml via the importDump.php
script, and it shows me this error:
Fatal error: Out of memory (allocated 869793792) (tried to allocate
122881 bytes) in /home/www/wikipedia-es/includes/parser/Parser.php on
line 404. The problem looks like it is related to memory; I'm runnin
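For reference, a hedged sketch of giving PHP more memory for the import; the 1024M figure is an arbitrary example. Note that this particular "Out of memory (allocated ...)" wording usually means the process itself ran out of memory (hitting memory_limit is reported as "Allowed memory size ... exhausted" instead), so raising the limit may not be the whole story.

// LocalSettings.php (sketch only); maintenance scripts may apply their own limit on top of this.
ini_set( 'memory_limit', '1024M' ); // raw PHP limit; arbitrary example value
$wgMemoryLimit = '1024M'; // MediaWiki's own cap, mainly used for web requests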
Hello all,
Some weeks ago I sent a post to the mediawiki-l list about a problem related
to an article which shows on all articles. The problem consists of
templates (cite, messages, sources and many others) causing many articles
to be repeated in many other articles.
How can I fix this problem? I have deleted th
I want to make the cache never expire; my MediaWiki is for offline
reading.
What is the maximum value of 'apiThumbCacheExpiry'?
Can I make thumbs never expire?
$wgHTTPTimeout defaults to 3 (seconds); I have increased this
value, but the response of pages is slow while the thumbnail images are
downloa
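For what it's worth, 'apiThumbCacheExpiry' is a key inside the ForeignAPIRepo entry and is given in seconds, so the practical way to make thumbs effectively never expire is a very large value; a sketch, where the Commons apibase is an assumption about the setup:

$wgForeignFileRepos[] = array(
    'class' => 'ForeignAPIRepo',
    'name' => 'shared',
    'apibase' => 'http://commons.wikimedia.org/w/api.php', // assumed remote API endpoint
    'fetchDescription' => false, // skip fetching remote description pages
    'apiThumbCacheExpiry' => 86400 * 3650, // seconds; roughly ten years, effectively "never"
);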
On 2009-06-17 at 13:17 -0400, Enrique wrote:
> Yes, when I go to any page I see not only the same title for all articles,
> including the login form, search, everything; I see the title and the text of that
> article (Carlos Iglecias) and under it the correct text of the article or login
> form, special page...
Finally!!!
Fixed by http://www.mediawiki.org/wiki/Special:Code/MediaWiki/49381
Now, how do I limit my users to seeing only thumbnail images in articles, and not
clicking on an image to see it at original or full size?
Regards
On Mon, 2009-06-22 at 10:13 -0400, Enrique wrote:
> hi
>
Hi,
I'm trying to configure $wgForeignFileRepos to see Wikipedia's images on
my local wiki. To do that I have configured, for the cURL requests:
$wgHTTPTimeout = 3000;
$wgHTTPProxy = "http://myProxyIp:Port";
and
$wgForeignFileRepos[] = array(
'class' => 'ForeignAPIRepo',
'name' => 'shared',
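In case it helps to see the proxy part on its own, a minimal sketch with placeholder host and port; note that $wgHTTPTimeout is counted in seconds, so 3000 is a very long timeout:

$wgHTTPTimeout = 30; // seconds; example value
$wgHTTPProxy = 'http://192.168.1.10:3128'; // placeholder proxy address and port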
configuring $wgForeignFileRepos
* MediaWiki version: 1.15.0
* PHP version: 5.2.6-1+lenny3 with Suhosin-Patch 0.9.6.2 (cli)
* MySQL version: 14.12 Distrib 5.0.51a, for debian-linux-gnu
(i486) using readline 5.2
* URL: (local Intranet)
Hello all, I'm trying to set $wgFore
nometriaTangente.svg.png
ForeignAPIRepo::getThumbUrlFromCache could not write to thumb path
The permissions under the images/ directory are 777; in images/thumb there are many
folders with image names, but they are empty.
On Fri, 2009-06-19 at 06:55 -0400, Enrique wrote:
> I'm configuring $wgHTTPProxy = true; for CURL request an
I'm configuring $wgHTTPProxy = true; for the cURL requests, and
$wgForeignFileRepos to see remote images on my local Wikipedia. I don't
know where to set our proxy IP address, because cURL is sending
requests to 0.0.0.1:
Here are some lines from my debug log:
Http::request: GET http://commons.wikimedia.
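A guess at what is happening: with $wgHTTPProxy set to the boolean true, cURL ends up being handed "1" as the proxy, which would explain the requests going to 0.0.0.1. The setting normally holds the proxy address itself, roughly like this sketch (address and port are placeholders):

$wgHTTPProxy = 'http://192.168.1.10:3128'; // your proxy's IP or hostname and port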
Yes, when I go to any page I see not only the same title for all articles,
including the login form, search, everything; I see the title and the text of that
article (Carlos Iglecias) and under it the correct text of the article or login
form, special page..., and when I log in as WikiSysop I see a different
title but st
* MediaWiki version: 1.12
* PHP version: PHP 5.2.6-1+lenny3 with Suhosin-Patch 0.9.6.2 (cli)
* MySQL version: Ver 14.12 Distrib 5.0.51a, for debian-linux-gnu
(i486) using readline 5.2
* URL: (Local Intranet)
I have installed MediaWiki on Debian Lenny for my local intranet.
Anywhere I go in my local MediaWiki I see the same article at the
top of the page; when I log in to my site as the system administrator I see
another article. I am looking for a way to delete that article but can't do
Hello all,
I'm back again with the same problem, and no solution.
I have reinstalled my server, but the problem continues: at the top of any
page I see the same article, followed by the actual article or page.
Hello all,
I have been working for many days trying to configure MediaWiki for offline
reading on my local network for my users.
I installed MediaWiki on a Debian box via apt-get install mediawiki
mediawiki-math and it worked fine after configuring it, but when I downloaded
eswiki-20090601-pages-articles.xml.bz2
Hello all,
I have installed MediaWiki from apt-get install mediawiki mediawiki-math
mediawiki-extensions, configured it on a Debian box, and dumped
eswiki-20090601-pages-articles.xml.bz2 into the database with mwdumper.jar. All
works fine; I looked at some articles and the math formulas look OK, but when I run
rebuild.p
Hi all,
All my articles have the same header, and it is an article. I dumped
pages-articles into my database and am now running the rebuildall.php script.
I have the same problem: sometimes all links point to the same article. I
hope that when rebuildall.php finishes all will work fine. Does someone know how to fix
Hello all,
I have a Wikipedia copy for local intranet reading. I update my local
Wikipedia from time to time, depending on the pages-articles download and its date,
and do the update with mwdumper.jar followed by running rebuildall.php, but this
process is long and the server responds slowly while the process is ru
- Original Message -
From: "Platonides"
To:
Sent: Friday, May 22, 2009 2:13 PM
Subject: Re: [Mediawiki-l] help please $wgForeignFileRepos
Enrique wrote:
> I'm trying to configure my local Wikipedia with this directive to see images
> of
> Commons on my wi
Hello all,
I installed MediaWiki 1.14 and dumped the pages-articles of the Wikipedia site;
afterwards, when I go to the main page it redirects me to the official Wikipedia site.
Some time ago I did the same procedure with MediaWiki 1.12 and
the pages-articles of that time and it worked fine, and now I did this again
I recently installed MediaWiki 1.14.0 and dumped the pages-articles. It
works fine apparently, but sometimes, when I click on a link in an article, it
points to another article that is not the one corresponding to the link.
Sorry for my English.
I'm trying to configure my local Wikipedia with this directive to see Commons
images on my wiki.
I put this in my LocalSettings.php but don't see any change:
$wgForeignFileRepos[] = array(
'class' => 'ForeignAPIRepo',
'name' => 'shared',
'apibase'
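For comparison, a complete hedged example of that array as it is commonly written, with Commons as the assumed remote repository; everything other than the setting keys is a placeholder:

$wgForeignFileRepos[] = array(
    'class' => 'ForeignAPIRepo',
    'name' => 'shared',
    'apibase' => 'http://commons.wikimedia.org/w/api.php', // assumed remote API endpoint
    'fetchDescription' => false, // skip fetching remote description pages
);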
Hello all,
I noticed many errors in my /var/log/apache/error.log referring to the
/var/www/w file. This file does not really exist, and these errors are degrading
my system's response.
[Mon May 11 11:28:39 2009] [error] [client 192.168.157.100] File does not
exist: /var/www/w, referer:
http://192.168.157.66
Hello all,
I have a local Wikipedia working fine for my intranet users; I have an
internet connection through a proxy.
I need a way to show images, caching the original images (preferably thumbnails only),
to make my wiki look better; my users have no internet access.
What can I do??
Another question is if I ca
Hello all,
Some time ago I installed the MediaWiki software and imported an XML dump
from the Wikipedia downloads for offline viewing on my intranet.
I want to set up only some categories, or some method to delete entries by category, for
example: pornography, politics and many others that do not matter for me and my
users.
???
- Original Message -
From: "Platonides"
To:
Sent: Tuesday, January 20, 2009 3:28 PM
> Enrique wrote:
>> Hello all
>> I have a copy of Wikipedia on my local intranet for offline user
>> reading.
>> It works fine with formulas, etc., etc.
>>
I have tried downloading some images with wikix, but I have an internet connection
that is slow and unstable. It works, but some images get a 404 error. At this
moment I have 40 GB of images, but it would be a great idea for me if, when a user
looks at a page, the images were downloaded at that moment and cached.
Sorry about my English.
Hello all,
I have a copy of Wikipedia on my local intranet for offline user reading.
It works fine with formulas, etc., etc.
I need, if it is possible, to show the images from the original Wikipedia to my users and
cache them for future use and reduce my internet access, maybe through my
proxy server, if some