Zarabozo, Francisco (GE, Corporate) wrote:
Hello All,

Thank you for your answers. I think you're both right about handling files in chunks instead of loading a huge file into memory at once. However, there's definitely something wrong with the way XML::Simple manages memory, which is my point.

The file I'm loading is 124 MB. There's no reason at all for parsing it to allocate 3 GB of memory.
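For what it's worth, a stream-oriented parser keeps only one record in memory at a time instead of building the whole tree. Below is a minimal sketch using XML::Twig (my own illustration, not code from this thread); the element name 'record' and the file name 'huge.xml' are placeholders for whatever the real document contains.

    use strict;
    use warnings;
    use XML::Twig;

    # Handle one <record> element at a time instead of slurping the whole
    # document into a single data structure the way XML::Simple does.
    my $twig = XML::Twig->new(
        twig_handlers => {
            record => sub {
                my ( $t, $elt ) = @_;
                # ... do whatever work is needed with this record ...
                # Then release the memory held by everything parsed so far.
                $t->purge;
            },
        },
    );
    $twig->parsefile('huge.xml');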


Ye gods, XML is a poor choice as a DB replacement, as you have witnessed first-hand.

BTW, my 2 cents says that Perl bombs out due to excessive recursion depth! I have had it happen on occasion. The trick is to break out of the recursion early.
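To make the "break out early" idea concrete, here is a rough sketch (my own illustration, not anyone's actual code): carry an explicit depth counter while walking the structure and stop descending past a chosen limit.

    use strict;
    use warnings;

    use constant MAX_DEPTH => 50;    # arbitrary limit, chosen for illustration

    sub walk {
        my ( $node, $depth ) = @_;
        $depth //= 0;

        # Bail out before the recursion gets deep enough to hurt.
        if ( $depth > MAX_DEPTH ) {
            warn "giving up at depth $depth\n";
            return;
        }

        return unless ref $node eq 'HASH';
        walk( $node->{$_}, $depth + 1 ) for keys %$node;
    }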
