On 7-10-2013 21:18, Vincent van Ravesteijn wrote:
On 7-10-2013 21:05, Enrico Forestieri wrote:
On Mon, Oct 07, 2013 at 02:28:18PM -0400, Richard Heck wrote:
It is very strange that I can't reproduce, if this is the cause,
and, indeed, that people haven't been seeing this bug all along.
Actually, all my files are converted correctly. But using the one
attached to the bug report reproducibly triggers the behavior.
Note that the converted files are always mangled in different
ways, which is a symptom of a race condition. Also, I have some large
files (~700k) that convert flawlessly. So the bug somehow also depends
on the structure of the document.

When I open the document in a plain text editor, it complains about characters that cannot be represented in the current codeset (or something like that).

I'm now looking at commit d5a41946a (Jurgen Spitzmuller; May 19 2009).

In this commit the static char array is replaced by a static vector. The vector is resized whenever maxoutbufsize is larger than its current size. However, once the vector has grown, it stays that large forever.

Maybe the old code always overwrote the full variable, while the new code does not.
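To illustrate the suspected pattern (this is a hypothetical sketch, not the actual code from d5a41946a; the function and variable names are invented): a static vector that only ever grows will keep stale bytes from an earlier, larger call in its tail. If a caller then reads the whole buffer instead of only the bytes actually produced, old data leaks into the output.

```cpp
#include <cstring>
#include <string>
#include <vector>

// Hypothetical sketch of the grow-only static buffer pattern.
// "convert" stands in for the real conversion routine; the names
// are assumptions for illustration only.
std::string convert(const char* in, std::size_t maxoutbufsize)
{
    static std::vector<char> out;
    if (maxoutbufsize > out.size())
        out.resize(maxoutbufsize);   // grows, but never shrinks

    // Simulated "conversion": only strlen(in) bytes are overwritten,
    // not the whole buffer.
    std::memcpy(out.data(), in, std::strlen(in));

    // Bug: using out.size() instead of the number of bytes actually
    // produced picks up stale bytes from an earlier, larger call.
    return std::string(out.data(), out.size());
}
```

After a first call with an 11-byte input, a second call with a 3-byte input still returns 11 bytes, the last 8 of which are leftovers from the first call. The old fixed-size char array would not have had this exact failure mode if it was always written in full, which may be why the bug only appeared after the commit.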

I could now reproduce it a few times. To do this:
- I switched to the Czech UI of Ubuntu,
- I set the default buffer size to 0.
I don't know whether the above steps are actually required.

Something else I noticed: sometimes the macros are exported as "\newcommand" and sometimes as "\renewcommand". This seems to be rather random.

Vincent
