Hey friends,

Yesterday we received a security report [1] on curl that identified a possible performance issue. We deemed it not a security problem, but it still triggered this question:

When a user sets a URL in libcurl, it accepts a string of up to 8 megabytes. If that URL contains the maximum possible number of "../" occurrences, they all need to be "optimized away" in the dotdot normalization phase of the URL parsing.

That makes 2,666,663 three-byte sequences to remove in the worst possible case.
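For anyone who wants to reproduce this, here is a rough standalone sketch (mine, written for this mail, not taken from the report) that builds such a worst-case URL and times the parse with curl_url_set(). The exact 8000000-byte cap and the minimal "http://a/" prefix are assumptions of the sketch; with those numbers, (8000000 - 9) / 3 comes out to the 2,666,663 mentioned above:

/* build with: gcc worstcase.c -lcurl */
#include <curl/curl.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>

int main(void)
{
  const size_t max = 8000000;       /* assumed input cap, in bytes */
  const char *prefix = "http://a/"; /* minimal 9-byte scheme + host */
  size_t plen = strlen(prefix);
  size_t ndots = (max - plen) / 3;  /* ~2.67 million "../" triples */
  char *url = malloc(plen + ndots * 3 + 1);
  CURLU *h = curl_url();
  if(!url || !h)
    return 1;

  /* fill the path with back-to-back "../" sequences */
  memcpy(url, prefix, plen);
  for(size_t i = 0; i < ndots; i++)
    memcpy(url + plen + i * 3, "../", 3);
  url[plen + ndots * 3] = '\0';

  /* time the parse, which includes the dotdot normalization */
  clock_t start = clock();
  CURLUcode rc = curl_url_set(h, CURLUPART_URL, url, 0);
  clock_t stop = clock();
  printf("rc=%d, %.3f seconds for %zu \"../\" sequences\n",
         (int)rc, (double)(stop - start) / CLOCKS_PER_SEC, ndots);

  curl_url_cleanup(h);
  free(url);
  return 0;
}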

While asking ourselves how much faster we can make that code, I figured we could also open up this question:

How long a URL do we really need to support? In practice, a URL longer than a few tens of kilobytes is not likely to actually work on the internet. Most HTTP servers, for example, have a limit of somewhere around 8-9 kilobytes for the path component.

Maybe we could save ourselves and our users some future trouble by reducing the maximum URL length? Maybe we can set it to 1 megabyte (or even lower) without any legitimate user ever noticing?

Thoughts?

[1] = https://hackerone.com/reports/3463608

--

 / daniel.haxx.se || https://rock-solid.curl.dev