Re: [Wikitech-l] Problems uploading word 2007 .doc files

2013-01-09 Thread Aran Dunkley
Thanks, yes that's the same problem, and they have some potential
workarounds there.

On 08/01/13 16:51, Luke Welling WMF wrote:
> Is this bug the same issue?  It looks like somebody put up a partial fix
>
> https://bugzilla.wikimedia.org/show_bug.cgi?id=38432
>
> - Luke Welling
>
>
> On Mon, Jan 7, 2013 at 6:30 PM, Aran Dunkley wrote:
>
>> The file was a .doc, but I've tried changing it to .docx and get the same
>> result. Some other Word 2007 .doc and .docx files upload with no
>> problem. But I don't see how it can complain when I have the MIME type
>> verification disabled - completely disabling the verification would be
>> no problem, since only specific users can upload.
>>
>> P.S. This is MW 1.19.2 on Ubuntu Server 11.10 with PHP 5.3.6.
>>
>> On 07/01/13 21:21, Matma Rex wrote:
>>> On Mon, 07 Jan 2013 23:47:58 +0100, Aran Dunkley
>>>  wrote:
>>>
>>>> Hello, can someone please help me with this .doc upload problem? I've
>>>> tried everything; even setting $wgVerifyMimeType to false fails to
>>>> solve it. No matter what I do, I keep getting the following error when I
>>>> upload *some* Word 2007 .doc files:
>>>>
>>>> The file is a corrupt or otherwise unreadable ZIP file. It cannot be
>>>> properly checked for security.
>>>>
>>>> I don't know how that check can even be happening with $wgVerifyMimeType
>>>> disabled, but the error still occurs!
>>> Word 2007 uses a .docx format as far as I know, not .doc. Which one
>>> were you using in your configuration?
>>>
>>> Also, .docx files are essentially ZIP files with magic data inside.
>>>
>>> ___
>>> Wikitech-l mailing list
>>> Wikitech-l@lists.wikimedia.org
>>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>




Re: [Wikitech-l] Problems uploading word 2007 .doc files

2013-01-07 Thread Aran Dunkley
The file was a .doc, but I've tried changing it to .docx and get the same
result. Some other Word 2007 .doc and .docx files upload with no
problem. But I don't see how it can complain when I have the MIME type
verification disabled - completely disabling the verification would be
no problem, since only specific users can upload.

P.S. This is MW 1.19.2 on Ubuntu Server 11.10 with PHP 5.3.6.

On 07/01/13 21:21, Matma Rex wrote:
> On Mon, 07 Jan 2013 23:47:58 +0100, Aran Dunkley
>  wrote:
>
>> Hello, can someone please help me with this .doc upload problem? I've
>> tried everything; even setting $wgVerifyMimeType to false fails to
>> solve it. No matter what I do, I keep getting the following error when I
>> upload *some* Word 2007 .doc files:
>>
>> The file is a corrupt or otherwise unreadable ZIP file. It cannot be
>> properly checked for security.
>>
>> I don't know how that check can even be happening with $wgVerifyMimeType
>> disabled, but the error still occurs!
>
> Word 2007 uses a .docx format as far as I know, not .doc. Which one
> were you using in your configuration?
>
> Also, .docx files are essentially ZIP files with magic data inside.
>
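As Matma Rex notes above, .docx (Office Open XML) files are ZIP containers, which is why an upload security scan treats them as ZIP archives. A quick sketch of the signature involved - the first four bytes of any ZIP are the local-file-header magic `PK\x03\x04`. The stand-in file below is fabricated, not a real document:

```shell
# Build a fake .docx carrying only the ZIP magic bytes, then inspect them.
printf 'PK\003\004(rest of archive...)' > sample.docx
sig=$(head -c 4 sample.docx | od -An -tx1 | tr -d ' \n')
echo "$sig"    # 504b0304 -> the file claims to be a ZIP container
```

A scanner that sees this signature will run its ZIP consistency checks regardless of the file extension, which would explain the "corrupt or otherwise unreadable ZIP file" message on files whose internal ZIP structure is unusual.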




[Wikitech-l] Problems uploading word 2007 .doc files

2013-01-07 Thread Aran Dunkley
Hello, can someone please help me with this .doc upload problem? I've
tried everything; even setting $wgVerifyMimeType to false fails to
solve it. No matter what I do, I keep getting the following error when I
upload *some* Word 2007 .doc files:

The file is a corrupt or otherwise unreadable ZIP file. It cannot be
properly checked for security.

I don't know how that check can even be happening with $wgVerifyMimeType
disabled, but the error still occurs!



[Wikitech-l] Clone a specific extension version

2012-12-05 Thread Aran Dunkley
Hi Guys,
How do I get a specific version of an extension using git?
I want to get Validator 0.4.1.4 and Maps 1.0.5, but I can't figure out
how to use git to do this...
Thanks,
Aran
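Assuming the upstream extension repos tag their releases (the tag name below is taken from the version asked about; any real clone URL would come from the extension's page on mediawiki.org), the workflow is clone, list tags, then check the wanted tag out. A self-contained demonstration on a throwaway local repo:

```shell
# Create a disposable repo standing in for a cloned extension.
demo=$(mktemp -d)
cd "$demo"
git init -q .
git config user.email demo@example.org
git config user.name demo
echo 'release' > Validator.php
git add . && git commit -qm 'tagged release'
git tag 0.4.1.4               # upstream releases would be tagged like this
echo 'trunk' > Validator.php
git add . && git commit -qm 'later development'
# Pin the working tree to the tagged release:
git tag -l                    # list available versions
git checkout -q 0.4.1.4       # detached HEAD at the 0.4.1.4 release
cat Validator.php             # prints "release"
```

Against a real clone, `git tag -l` shows which versions were actually tagged; if a version was never tagged, checking out the matching commit by hash is the fallback.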



Re: [Wikitech-l] Awful trouble with 1.19 adding to html

2012-09-27 Thread Aran Dunkley
I removed all line breaks from the text being returned - but the
confusing thing is that the content being returned is marked as
isHTML (not wikitext), so the parser should be leaving it alone
completely and not checking for line breaks.

On 27/09/12 16:39, Isarra Yos wrote:
> On 27/09/2012 07:56, Aran Dunkley wrote:
>> Hello, does anyone here know why the parser insists on adding  tags
>> to HTML results returned by parser functions?
>>
>> I'm trying to upgrade the TreeAndMenu extension to work with MediaWiki
>> 1.19 and I can't get the parser to stop adding  tags throughout the
>> result returned by the parser-function expansion.
>>
>> I've set isHTML = true and tried setting noparse to true and many other
>> things, but whatever I do - even removing all whitespace - the tags
>> still get added!
>
> Is it inserting extra linebreaks? I don't really know anything about
> what you're working with specifically, but general wikitext can be
> really thrown off by an unexpected linebreak.
>




[Wikitech-l] Awful trouble with 1.19 adding to html

2012-09-27 Thread Aran Dunkley
Hello, does anyone here know why the parser insists on adding  tags
to HTML results returned by parser functions?

I'm trying to upgrade the TreeAndMenu extension to work with MediaWiki
1.19 and I can't get the parser to stop adding  tags throughout the
result returned by the parser-function expansion.

I've set isHTML = true and tried setting noparse to true and many other
things, but whatever I do - even removing all whitespace - the tags
still get added!
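For reference, the isHTML/noparse flags mentioned above are returned from the parser-function callback roughly like this - a sketch against the 1.19-era API, with all names invented (not from TreeAndMenu), and magic-word registration omitted for brevity:

```php
<?php
// Register the parser function when the parser initialises.
$wgHooks['ParserFirstCallInit'][] = 'wfDemoTreeSetup';

function wfDemoTreeSetup( $parser ) {
	$parser->setFunctionHook( 'demotree', 'wfDemoTreeRender' );
	return true;
}

function wfDemoTreeRender( $parser, $arg = '' ) {
	$html = '<div class="demo-tree">' . htmlspecialchars( $arg ) . '</div>';
	// 'isHTML' marks the result as final HTML and 'noparse' stops wikitext
	// re-parsing; even so, blank lines inside $html can still trip the
	// parser's block-level wrapping pass, so avoid emitting them.
	return array( $html, 'isHTML' => true, 'noparse' => true );
}
```

If the tags still appear with this return shape, the wrapping is likely happening in a later output pass rather than during the parser-function expansion itself.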



Re: [Wikitech-l] Loading jQuery

2011-08-01 Thread Aran Dunkley
I've figured it out now thanks :) the problem was that my scripts had
things running inline that needed to be deferred until after the JS
modules had loaded.

On 02/08/11 18:10, Roan Kattouw wrote:
> On Mon, Aug 1, 2011 at 1:55 PM, Aran Dunkley  wrote:
>> Yes, I'm using both jQuery and jQuery UI in some of my extensions, and
>> they've broken in MediaWiki 1.17 because neither of them is being
>> loaded any more. I've tried many different variations of using
>> $wgOut->addModules and setting items in $wgResourceModules, but no matter
>> what I do I can't seem to get the jQuery scripts to load. The browser's
>> error log just says that $, jQuery and other functions aren't defined.
>>
> That's very strange. jQuery should definitely be there, otherwise
> something is very wrong. I wouldn't know how to debug this without
> more information. Is the JavaScript console reporting any JS errors?
> 
> Roan Kattouw (Catrope)
> 




Re: [Wikitech-l] Loading jQuery

2011-08-01 Thread Aran Dunkley
Yes, I'm using both jQuery and jQuery UI in some of my extensions, and
they've broken in MediaWiki 1.17 because neither of them is being
loaded any more. I've tried many different variations of using
$wgOut->addModules and setting items in $wgResourceModules, but no matter
what I do I can't seem to get the jQuery scripts to load. The browser's
error log just says that $, jQuery and other functions aren't defined.

On 01/08/11 23:51, Jelle Zijlstra wrote:
> Are you aware that jQuery and jQuery UI are two different things?
> 
> 2011/8/1 Aran Dunkley 
> 
>> Hello, I'm trying to update some of my extensions to work with the
>> ResourceLoader in 1.17, and I can't work out how to get jQuery to load.
>> I've tried the example at
>>
>> http://www.mediawiki.org/wiki/ResourceLoader/Migration_guide_for_extension_developers
>>
>> and tried adding it directly,
>> $out->addModules( array( 'jquery.ui' ) );
>>
>> But nothing I do will actually get the script to load and become
>> available. Can anyone tell me the new syntax for getting jQuery there?
>>
>> Thanks,
>> Aran
>>
>>




[Wikitech-l] Loading jQuery

2011-07-31 Thread Aran Dunkley
Hello, I'm trying to update some of my extensions to work with the
ResourceLoader in 1.17, and I can't work out how to get jQuery to load.
I've tried the example at
http://www.mediawiki.org/wiki/ResourceLoader/Migration_guide_for_extension_developers

and tried adding it directly,
$out->addModules( array( 'jquery.ui' ) );

But nothing I do will actually get the script to load and become
available. Can anyone tell me the new syntax for getting jQuery there?

Thanks,
Aran
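For the archive: a minimal 1.17-style registration, roughly following the migration guide linked above. The module name and file paths below are invented, not from the original extensions:

```php
<?php
// Register the extension's own script as a ResourceLoader module and
// declare jquery.ui as a dependency, so ResourceLoader loads jQuery UI
// (and jQuery itself) before the script runs.
$wgResourceModules['ext.myExtension'] = array(
	'scripts'       => 'modules/ext.myExtension.js',
	'dependencies'  => array( 'jquery.ui' ),
	'localBasePath' => dirname( __FILE__ ),
	'remoteExtPath' => 'MyExtension',
);

// Then, wherever the page output is built:
$wgOut->addModules( 'ext.myExtension' );
```

One caveat worth noting: ResourceLoader delivers modules asynchronously near the bottom of the page, so inline `<script>` code emitted during parsing will still see `$` and `jQuery` as undefined; the jQuery-dependent code has to live inside the registered module (or otherwise wait for it), which matches the resolution posted in this thread.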



Re: [Wikitech-l] Nuking deleted pages and their revisions

2011-05-30 Thread Aran Dunkley
Hello,

I have a lot of deleted pages in my wiki and was wondering how to free
up the database by getting rid of them completely. I don't want to lose
the history of the pages that aren't deleted, though - is there an
extension for this?

Thanks,
Aran
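If memory serves, no extension is needed: deleted pages' revisions live in the archive table, and MediaWiki core ships a maintenance script that purges it, while the histories of live pages stay untouched in the revision table. A sketch, assuming a standard install layout and run from the wiki root (back up the database first, e.g. with mysqldump):

```shell
# Dry run first: without --delete the script only reports what it would
# remove from the archive table.
php maintenance/deleteArchivedRevisions.php
# Then the real purge:
php maintenance/deleteArchivedRevisions.php --delete
```

The exact flag behaviour is worth double-checking against the script in your MediaWiki version before running it with --delete.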




[Wikitech-l] Cite extension problem

2010-12-13 Thread Aran Dunkley
Hello, I've just upgraded the Cite extension on our wikis and it's now
giving me this error on all refs: "Cite error: Ran out of custom link
labels for group "". Define more in the
[[MediaWiki:cite_link_label_group-]] message." Does anyone know the
quickest way to fix this?



Re: [Wikitech-l] Database dump character set problems

2010-04-15 Thread Aran Dunkley
$wgDBmysql5 is set to false. SHOW CREATE TABLE page gives this on
both the original and the new server:

mwiki_page | CREATE TABLE `mwiki_page` (
  `page_id` int(8) unsigned NOT NULL AUTO_INCREMENT,
  `page_namespace` int(11) NOT NULL DEFAULT '0',
  `page_title` varchar(255) CHARACTER SET utf8 COLLATE utf8_bin NOT NULL,
  `page_restrictions` tinyblob NOT NULL,
  `page_counter` bigint(20) unsigned NOT NULL DEFAULT '0',
  `page_is_redirect` tinyint(1) unsigned NOT NULL DEFAULT '0',
  `page_is_new` tinyint(1) unsigned NOT NULL DEFAULT '0',
  `page_random` double unsigned NOT NULL DEFAULT '0',
  `page_touched` varchar(14) CHARACTER SET latin1 COLLATE latin1_bin NOT NULL DEFAULT '',
  `page_latest` int(8) unsigned NOT NULL DEFAULT '0',
  `page_len` int(8) unsigned NOT NULL DEFAULT '0',
  PRIMARY KEY (`page_id`),
  UNIQUE KEY `name_title` (`page_namespace`,`page_title`),
  KEY `page_random` (`page_random`),
  KEY `page_len` (`page_len`)
) ENGINE=MyISAM AUTO_INCREMENT=19105 DEFAULT CHARSET=latin1 COLLATE=latin1_spanish_ci
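The mix shown above - a latin1 table default alongside utf8 columns - is the classic setup for double-encoding during a dump/restore: UTF-8 bytes that get re-encoded as if they were latin1 double up every multibyte character. A self-contained sketch of the effect and its reversal with iconv (file names are illustrative; the usual repair is to apply the reverse conversion to the dump before re-importing):

```shell
printf 'Categoría' > title.txt                    # correct UTF-8 title
iconv -f latin1 -t utf8 title.txt > mangled.txt   # bytes misread as latin1
cat mangled.txt                                   # mojibake: "Ã" plus an
                                                  # invisible soft hyphen
iconv -f utf8 -t latin1 mangled.txt > fixed.txt   # reverse conversion
cat fixed.txt                                     # "Categoría" again
```

This is also why mysqldump's --default-character-set flag alone may not help: if the conversion already happened once, the dump must be converted back, not just read with a different charset.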

Platonides wrote:
> Aran Dunkley escribió:
>   
>> Hi I'm wondering if anyone can help with this multibyte character 
>> corruption:
>> http://aqes.organicdesign.tv/Categor%C3%ADa:Arquitecto
>>
>> The site was moved from a shared host to a dedicated server, but now
>> it's not rendering the multibyte characters properly in the titles,
>> though the content seems OK. The MediaWiki version went from 1.15.1 to
>> 1.15.3, the MySQL version from 5.1.30 to 5.1.45, and all the
>> configuration seems identical on the new server.
>>
>> I've tried all the tips about --default-character-set=latin1 or utf8,
>> etc., mentioned in the MW manual page, but nothing seems to change no
>> matter what I use for export or import.
>>
>> A solution would be greatly appreciated as their site has been down for 
>> almost a week now while trying to solve this.
>>
>> Thanks,
>> Aran
>> 
>
> What do you have $wgDBmysql5 set to?
> What's the output of doing SHOW CREATE TABLE page; ?
>
>
>
>   




[Wikitech-l] Database dump character set problems

2010-04-15 Thread Aran Dunkley
Hi, I'm wondering if anyone can help with this multibyte character
corruption:
http://aqes.organicdesign.tv/Categor%C3%ADa:Arquitecto

The site was moved from a shared host to a dedicated server, but now
it's not rendering the multibyte characters properly in the titles,
though the content seems OK. The MediaWiki version went from 1.15.1 to
1.15.3, the MySQL version from 5.1.30 to 5.1.45, and all the
configuration seems identical on the new server.

I've tried all the tips about --default-character-set=latin1 or utf8,
etc., mentioned in the MW manual page, but nothing seems to change no
matter what I use for export or import.

A solution would be greatly appreciated as their site has been down for 
almost a week now while trying to solve this.

Thanks,
Aran
