On 2015-09-08 07:43, Adrian Johnson wrote:
On 08/09/15 21:06, Even Rouault wrote:
Hi,

An excessively large page count can make the gmallocn() call in
Catalog::cachePageTree() crash, even when we call it with a low page number.

Even


+      // Cap the page count to avoid huge memory allocations later and to avoid crashes.
+      // This is the maximum number of indirect objects as per ISO 32000-1:2008 (Table C-1)

Table C-1 is a list of minimum limits for 32-bit readers.


+      // We could probably decrease that number again. PDFium for example uses 1 Mi
+      else if (numPages > 8 * 1024 * 1024) {
+        error(errSyntaxWarning, -1,
+              "Page count ({0:d}) too big. Limiting number of reported pages to 8 Mi",
+              numPages);

Instead of imposing an arbitrary limit, we should just check for gmallocn()
returning NULL and print an error.

For broken PDFs that report an invalid size (see bug 85140), we could check
whether the page count exceeds the number of objects in the XRef.

By the way, bug 91353 is also about an invalid (fuzzed) PDF claiming millions of pages. I thought about walking the page tree to count the pages whenever the claimed count seems unreasonable, but it didn't seem worth the effort. I like your idea better.
_______________________________________________
poppler mailing list
poppler@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/poppler
