Well, now ImageMagick gets 20 GB instead of just 4 for a single run.
So it is very likely that ImageMagick won't run out of disk space and
will delete its temporary files. I just did a test with a lot of images,
and all files were deleted properly. If the issue comes up again I will
implement Tim's suggestion.
Yet another example of poor coding; instead of fixing the code, the root
issue is ignored.
On Sun, May 25, 2014 at 7:27 PM, Jeremy Baron wrote:
On Sun, May 25, 2014 at 7:36 PM, Dirk Hünniger
wrote:
> Partitioning manually did the trick. Now I have 20 GB mounted on /tmp.
So, you're planning to just let it fill up with 20 GB of corrupt files
instead of 4 GB?
(see the research Tim did for you)
-Jeremy
Hi,
Partitioning manually did the trick. Now I have 20 GB mounted on /tmp.
Thanks a lot
Yours Dirk
On 2014-05-25 20:32, Tim Landscheidt wrote:
Dirk Hünniger wrote:
> I currently have a problem: I have only 4 GB of disk
> space, which causes permanent outages of my service. If
> wiki pages with more than 4 GB of raw images are processed,
> ImageMagick crashes and does not free the disk anymore,
> leaving the system unusable. I got a m
On May 25, 2014 2:07 PM, "Dirk Hünniger"
wrote:
> I cannot do that. ImageMagick is using the temporary directory itself.
Surely there is some way: either by tricking convert (chroot?), by setting a
parameter or an environment variable, or at least by pausing periodically and
cleaning while there's not any inst
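For what it's worth, ImageMagick does read its temporary directory from the environment: the MAGICK_TMPDIR variable and the `-limit disk` resource cap are documented ImageMagick features. A minimal sketch, assuming a large scratch volume at the path shown (path and cap are illustrative):

```shell
# Point ImageMagick's scratch files at a big volume and cap its disk use.
# /mnt/scratch is an assumed mount point; adjust the 18GiB cap to taste.
export MAGICK_TMPDIR=/mnt/scratch/im-tmp
mkdir -p "$MAGICK_TMPDIR"
convert input.png -limit disk 18GiB -resize 50% output.png
```

With the disk limit in place, a runaway conversion aborts instead of filling the whole volume.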
Hi,
I cannot do that. ImageMagick is using the temporary directory itself.
The output of lvmdiskscan:
dhun@mediawiki2latex:~$ sudo lvmdiskscan
/dev/ram0 [ 64.00 MiB]
/dev/ram1 [ 64.00 MiB]
/dev/vda1 [ 7.63 GiB]
/dev/ram2 [ 64.00 MiB]
/dev/vda2 [ 1.91 GiB]
/dev/ram
(and of course, clean up after your crashes… you could make a working dir
per invocation and delete contents for just that instance on crash)
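A sketch of that per-invocation cleanup in plain shell (the convert call and paths are illustrative; the key point is that the trap runs on any exit, crash included):

```shell
#!/bin/sh
# Give each conversion its own scratch directory and always remove it,
# so a crashed convert cannot leave orphaned temporary files behind.
workdir=$(mktemp -d "${TMPDIR:-/tmp}/m2l.XXXXXX")
trap 'rm -rf "$workdir"' EXIT INT TERM
MAGICK_TMPDIR="$workdir" convert "$1" -resize 50% "$2"   # illustrative invocation
```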
-Jeremy
___
Labs-l mailing list
Labs-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/labs-l
It looks much better now.
Regarding the \newcommand definitions, it's more natural to say:
第4章 ("Chapter 4") instead of 章4
第2页 ("page 2") instead of 页2
图8 ("figure 8") instead of 图形8
and in Chinese you don't need to (and shouldn't) add spaces between words.
I feel there are some extra ones added, especially near links.
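If the generated preamble keeps using macros like \mychapterbabel, the prefix style above could be produced along these lines (a sketch; every macro except \mychapterbabel is an assumption, not something taken from Dirk's file):

```latex
% Put the number between a prefix and a suffix, so headings come out as
% 第4章 ("Chapter 4") rather than 章4.
\newcommand{\mychapterprefix}{第}  % hypothetical new macro
\newcommand{\mychapterbabel}{章}   % from the language file in this thread
% In the chapter heading template:
%   \mychapterprefix\thechapter\mychapterbabel
```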
-Liangent
On May 26, 2014 1:0
On May 25, 2014 1:42 PM, "Dirk Hünniger"
wrote:
Hi,
I currently have a problem: I have only 4 GB of disk space, which
causes permanent outages of my service. If wiki pages with more than 4
GB of raw images are processed, ImageMagick crashes and does not free
the disk anymore, leaving the system unusable. I got a medium instance so
there
Hi,
I made a language file for Chinese now and installed it on the server.
So please have a try:
http://mediawiki2latex.wmflabs.org/
Yours Dirk
PS: The language file:
\HyphSubstLet{ngerman}{ngerman-x-latest}
\usepackage{xeCJK}
\setCJKmainfont{WenQuanYi Zen Hei}
\newcommand{\mychapterbabel}{章}
\n
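For reference, a minimal self-contained document built around the same packages should compile with xelatex when the WenQuanYi Zen Hei font is installed; everything beyond the preamble lines quoted above is an assumption:

```latex
% Minimal xeCJK smoke test (compile with: xelatex test.tex).
\documentclass{article}
\usepackage{xeCJK}                  % CJK fonts and line breaking under XeTeX
\setCJKmainfont{WenQuanYi Zen Hei}  % font named in the language file above
\begin{document}
第4章 这是一个中文排版测试。% "Chapter 4: a Chinese typesetting test."
\end{document}
```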
I failed to compile your document in its original form:
(../headers/babel.tex
(/var/lib/texmf/tex/generic/babel/babel.sty
(/usr/share/texlive/texmf-dist/tex/generic/babel-english/english.ldf
(/usr/share/texlive/texmf-dist/tex/generic/babel/babel.def
! Undefined control sequence.
\initiate@active@
On May 25, 2014 10:49 AM, "Petr Bena" wrote:
> It's a shame that the developers didn't provide any feedback links; there
> are a number of things that should be improved in these "Games", but...
> where should I report that, hm?
Use the bitbucket issue tracker at [1]. I agree it should be better
publicized.
https://bitbucket.org/magnusmanske/wikidata-game/issues?status=new&status=open
This is linked on the main page ;p
On 25 May 2014 16:49, Petr Bena wrote:
It's a shame that the developers didn't provide any feedback links; there
are a number of things that should be improved in these "Games", but...
where should I report that, hm? Maybe they have a Bugzilla section, maybe
not. But there should be some link on their web pages.
On Sun, May 25, 2014 at 4:50 PM, Ti
Emilio J. Rodríguez-Posada wrote:
Hello;
I have discovered the Wikidata games today.[1] I have tried to do
something like that for a long time. I did a first approach with the "Images
for biographies" tool[2] (I have to migrate it to Labs from Toolserver).
I believe that many people can be involved in wiki projects doing tiny
Hi,
I am sending you the LaTeX source of the main page of the Chinese
Wikipedia as an attachment to your personal email. You can look at it, but
if you want to compile it you need Ubuntu 14.04 and have to run:
sudo apt-get install mediawiki2latex
xelatex main.tex
Yours Dirk
PS: If about the -c co
I don't really have an idea about how to "go for the command line version
and use the -c command line option"; I know nothing about Haskell anyway...
I hope that it's available on the web; is it possible to add a checkbox or
something?
-Liangent
On Sun, May 25, 2014 at 7:18 PM, Dirk Hünniger w
Hi,
I didn't take any special care about CJK. It's a bit hard for me since I
cannot read any of these languages myself. Maybe you can have a look at
the LaTeX source and tell me what I need to change. Currently no CJK
package is loaded. The only thing I am doing is switching to TTF fonts
that c
I had a try using an article on the Chinese Wikipedia. Although I'm not sure
whether the cause is in the generated LaTeX source or in the way you invoke
LaTeX, the most notable problem is that in the output PDF word wrap doesn't
take place correctly, so almost every line overflows. See
https://en.wikipedia.org/wi
Hi,
if you want the TeX source, go for the command line version and use the
-c command line option. If you want to convert from TeX to MediaWiki, use
pandoc. In the imprint of each PDF there is a link to the SourceForge
page. It's slow, but I cannot make it any faster. It's mostly the runtime
of L
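The pandoc step mentioned above, as a concrete sketch (file names are illustrative; `latex` and `mediawiki` are both formats pandoc supports):

```shell
# Turn generated LaTeX back into MediaWiki markup with pandoc.
pandoc -f latex -t mediawiki main.tex -o main.wiki
```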
Hi,
On Sun, 25 May 2014, at 18:02, Dirk Hünniger wrote:
> It's not a private server anymore. It's now running on Wmflabs already.
>
> http://mediawiki2latex.wmflabs.org
I would probably link to the source code and a bug tracker on its main page.
- I see it generated a PDF. Nicely formatted. :) But
I am talking about the short term. It will take time before Wikidata fully
supports list generation, and then it will take time to transfer all lists, e.g.
from WikiLovesMonuments, to Wikidata. Some lists in de.wikipedia are
created manually in a form that will be hard to transfer.
Greetings Tim
On 2014-05-25
Hi,
It's not a private server anymore. It's now running on Wmflabs already.
http://mediawiki2latex.wmflabs.org
Yours Dirk
On 2014-05-25 08:24, Gryllida wrote:
Hi,
On Sun, 22 Dec 2013, at 21:25, Dirk Hünniger wrote:
Hello,
I wrote a MediaWiki-to-LaTeX compiler. It is part of the current version
o