Hi Carson, the problem is that in 1.3 we use libxml to read in the complete file. The same happens in 1.2, but there we use our own code to do it ...
In order to fix this behaviour, I would have to rewrite rrd_restore entirely to use an incremental parsing approach (xmlReader) ... for now, your best bet with large rrd files is to use 1.2.

cheers
tobi

Yesterday Carson Gaspar wrote:
> Tobias Oetiker wrote:
> > Yesterday Carson Gaspar wrote:
> > > Tobias Oetiker wrote:
> > > > the restore issues may come from the switch to libxml for reading the
> > > > rrd dump files ... does this memory issue occur with any file you
> > > > try to restore, or only with really big ones?
> > >
> > > The memory bloat (compared to 1.2.x) is for all files, but the degree
> > > varies by input file. I haven't determined what the difference is, but
> > > the ones that completely exhaust VM are definitely among the ones with
> > > the most samples.
> >
> > what size are you talking about?
>
> One failing example:
>
> -rw-r--r-- 1 gaspac fir1 966M May 19 14:28 0.xml
>
> 1.2.x uses 1,100,399K (according to "timex -p -m")
>
> 1.3.8 spews thousands of errors:
>
> 0.xml:81729: error: xmlSAX2Characters: out of memory
> v><v> 0.0000000000e+00 </v><v> 0.0000000000e+00 </v><v> 0.0000000000e+00 </v><v>
>
> A 32-bit binary can't deal at all. I tried building a 64-bit binary and
> (as I recall) it got to over 15GB of VM before dying. (I'd give more exact
> numbers, but rebuilding the huge number of rrdtool dependencies 64-bit
> takes ages.)

--
Tobi Oetiker, OETIKER+PARTNER AG, Aarweg 15 CH-4600 Olten, Switzerland
http://it.oetiker.ch [email protected] ++41 62 775 9902 / sb: -9900

_______________________________________________
rrd-users mailing list
[email protected]
https://lists.oetiker.ch/cgi-bin/listinfo/rrd-users
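[Editor's note] The incremental approach Tobi refers to is libxml2's xmlReader (xmlTextReader) API, which visits one node at a time instead of building the whole document tree in memory, so peak memory no longer scales with file size. As a rough, stdlib-only illustration of the same idea (in Python rather than rrdtool's C, and using a simplified stand-in for the rrd dump layout, not the real schema), each `<v>` value element from the dump is processed and then discarded before parsing continues:

```python
import io
import xml.etree.ElementTree as ET

def count_values(stream):
    """Stream-parse the XML, counting <v> value elements.

    Each element is cleared as soon as its end tag is seen, so
    the parser never holds the full tree in memory at once.
    """
    count = 0
    for event, elem in ET.iterparse(stream, events=("end",)):
        if elem.tag == "v":
            count += 1
        elem.clear()  # drop children/text of the subtree just processed
    return count

# Simplified stand-in for an rrd dump: many <v> elements inside one wrapper.
xml = "<rrd><rra>" + "<v> 0.0000000000e+00 </v>" * 10000 + "</rra></rrd>"
print(count_values(io.StringIO(xml)))
```

A tree parser (`ET.parse`, or libxml's default in rrd_restore 1.3) would materialize all 10000 elements before returning; the streaming loop above touches them one at a time, which is why the xmlReader rewrite would let large restores fit in memory the way the hand-rolled 1.2 parser does.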
