hi!

Guys, this is nasty! Back in my university days, when I was a sysadmin, we
used to download 4.4GB files (e.g. DVD-format ISO images) and it was a
breeze. Now, when I do:
wget -c http://some-some-site/dvd/<platform>/<some-dvd-iso-format.iso>

After fetching 4.0GB I get an error: File size exceeded (core dumped).
After googling a bit I saw that I'm not the only one hitting this. Note
that I'm doing this on a network that has no restrictive policies in
place, meaning "download to your heart's content"! Issuing the ulimit
command doesn't help at all.
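
For anyone who wants to compare notes, these are the basic checks I have
in mind; the download directory below is just an example from my box:

# per-process file size limit; "unlimited" means ulimit isn't the cap
ulimit -f
# wget needs large-file support; versions before 1.10 cannot write past 2GB
wget --version | head -1
# filesystem type of the download directory; vfat (FAT32) caps a single
# file at 4GB, suspiciously close to where my transfers die
df -T /srv/downloads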

1) Has anyone in the group found a way to overcome this problem? (the
range-based workaround I plan to try is sketched after this list)
2) Is this some sort of version issue? A bug in the TCP/IP stack?
3) Tests were done on debian40r0, debian40r1, ubuntu6.06, ubuntu7.04,
ubuntu7.10, and SLES10.
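
For reference, the workaround I plan to try next is fetching the image in
byte ranges that each stay below the 4GB mark, using curl's --range (-r)
option, then joining the pieces; the URL, filenames, and offsets here are
placeholders:

# pull the ISO down as two sub-4GB pieces
curl -r 0-2147483647 -o dvd.part1 http://some-some-site/dvd/big.iso
curl -r 2147483648-  -o dvd.part2 http://some-some-site/dvd/big.iso
# join and verify against the published checksum; note the joined file
# still needs a filesystem that allows files bigger than 4GB
cat dvd.part1 dvd.part2 > big.iso
md5sum big.iso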

Can anyone from the list give me an in-depth explanation, or some idea of
what's causing this issue?

Is there a policy in place on the internet infrastructure that limits
this kind of fetching activity?



-- 
Ronald Allan V. Tomimbang
_________________________________________________
Philippine Linux Users' Group (PLUG) Mailing List
[email protected] (#PLUG @ irc.free.net.ph)
Read the Guidelines: http://linux.org.ph/lists
Searchable Archives: http://archives.free.net.ph
