Re: URL lengths
In article <20171128115425.22cudniq3zrfba3l@somnambulist.local>,
   Daniel Silverstone wrote:

> I don't believe we limit URL length per se, though they get interned and
> as such four gigabytes per URL is probably the absolute limit. In
> addition, POST data is nominally unlimited, though I believe we have a
> similar four gigabyte limit.

Right, so I will make an arbitrary, comparatively small allowance, and
then politely refuse to handle anything larger.

Thank you all for your assistance.

John

--
| John Williams | joh...@ukgateway.net
Names for Soul Band:- Soul Doubt
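A minimal sketch of the approach John describes, in C: check against a
fixed allowance and refuse anything larger. MAX_URL_ALLOWANCE is a
hypothetical value chosen for illustration, not anything NetSurf or the
thread defines.

#include <stdio.h>
#include <string.h>

/* Arbitrary, comparatively small allowance -- illustrative only. */
#define MAX_URL_ALLOWANCE 2048

/* Return 1 if the URL fits the allowance, 0 after politely refusing. */
static int accept_url(const char *url)
{
        if (strlen(url) > MAX_URL_ALLOWANCE) {
                fprintf(stderr, "URL longer than %d bytes; refusing.\n",
                        MAX_URL_ALLOWANCE);
                return 0;
        }
        return 1;
}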
Re: URL lengths
On Mon, Nov 27, 2017 at 18:08:46 +0000, John Williams wrote:

> What is the maximum URL length (including POST data) that NetSurf can
> handle?

I don't believe we limit URL length per se, though they get interned and
as such four gigabytes per URL is probably the absolute limit. In
addition, POST data is nominally unlimited, though I believe we have a
similar four gigabyte limit.

So if you want to store any possible URL plus POST data you'd need eight
gigabytes per allocation.

D.

--
Daniel Silverstone                       http://www.netsurf-browser.org/
PGP mail accepted and encouraged.        Key Id: 3CCE BABE 206C 3B69
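As a worked illustration of Daniel's arithmetic, assuming each
four-gigabyte cap corresponds to a 32-bit length (an assumption on my
part, not something the thread states):

#include <inttypes.h>
#include <stdio.h>

int main(void)
{
        /* Each 4 GiB cap taken as a 32-bit length, i.e. UINT32_MAX bytes. */
        uint64_t max_url  = UINT32_MAX;  /* interned URL */
        uint64_t max_post = UINT32_MAX;  /* POST payload */

        /* ~8 GiB: the worst case for holding both in full. */
        printf("worst case: %" PRIu64 " bytes\n", max_url + max_post);
        return 0;
}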
Re: URL lengths
On Mon, Nov 27, 2017 at 06:08:46PM +0000, John Williams wrote:

> A genuine real-life number would be better!

From memory, they are dynamically allocated as needed and can be of
arbitrary length. This is a lot more efficient than using statically-sized
buffers dimensioned for the worst case, especially in a piece of software
that might be storing information on tens of thousands of URLs (everything
in your history, the links to all the resources on every page you have
open, etc.).

My advice is firstly either to measure and then allocate, or to read in
the data and reallocate as needed dynamically. Secondly, use a different
language; BBC BASIC really isn't good for this sort of thing :) (If you
were using a modern interpreted language like Lua, Python, or even Perl,
this question would never have occurred to you.)

B.
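A minimal sketch of the "read in and reallocate as needed" pattern, in C
rather than BBC BASIC; the starting capacity and doubling policy are my
own illustrative choices, not from the thread:

#include <stdio.h>
#include <stdlib.h>

/* Read an entire stream into a growable, NUL-terminated buffer.
 * Returns NULL on allocation failure; caller frees the result. */
static char *read_all(FILE *in)
{
        size_t cap = 256, len = 0;
        char *buf = malloc(cap);
        if (buf == NULL)
                return NULL;

        int c;
        while ((c = fgetc(in)) != EOF) {
                if (len + 1 >= cap) {
                        /* Double rather than grow byte-by-byte, so the
                         * number of realloc calls stays logarithmic. */
                        cap *= 2;
                        char *tmp = realloc(buf, cap);
                        if (tmp == NULL) {
                                free(buf);
                                return NULL;
                        }
                        buf = tmp;
                }
                buf[len++] = (char)c;
        }
        buf[len] = '\0';
        return buf;
}

Calling read_all(stdin) and freeing the result covers the common case;
the measure-then-allocate alternative needs a source whose size is known
up front.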