On Sun, 28 Apr 2002, Per Einar Ellefsen wrote:

> At 17:18 28.04.2002, Ernest Lergon wrote:
> >Now I'm scared about the memory consumption:
> >
> >The CSV file has 14.000 records with 18 fields and a size of 2 MB
> >(approx. 150 Bytes per record).
>
> Now a question I would like to ask: do you *need* to read the whole CSV
> info into memory? There are ways to overcome this. For example, looking at
> your data I suppose you might want to look for specific IDs; in that case
> it would be much more efficient to read one line at a time and check if
> it's the correct one. Otherwise you might want to move to a
> relational database; this is the kind of thing RDBMSes excel at.
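
For the line-at-a-time approach, something like the sketch below would do
(the file name, and the assumption that the ID is the first field and the
data has no quoted/embedded commas, are mine; real-world CSV should go
through Text::CSV rather than a bare split):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Scan the CSV one line at a time and return the fields for a given
# ID, so only a single record is ever held in memory at once.
# Assumes the ID is the first comma-separated field.
sub find_record {
    my ($file, $wanted_id) = @_;
    open my $fh, '<', $file or die "Can't open $file: $!";
    while (my $line = <$fh>) {
        chomp $line;
        my @fields = split /,/, $line;
        if ($fields[0] eq $wanted_id) {
            close $fh;
            return \@fields;    # arrayref of the record's fields
        }
    }
    close $fh;
    return undef;               # ID not found
}
```

With 14,000 records of ~150 bytes each this is a quick sequential scan,
and memory use stays flat no matter how large the file grows.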

You might also want to look at loading your CSV data into an MLDBM file,
and then having your apache processes access it from there. That way
most of your data stays on disk, and you access it in much the same way as
before, through a hash of arrays.
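
A rough sketch of that, assuming a one-time load script (the file names
and the DB_File/Storable backend choice are illustrative; MLDBM works
with any DBM module and serializer it supports):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl;
use MLDBM qw(DB_File Storable);   # DBM backend + serializer

# One-time load: each CSV record becomes an array ref keyed by its ID.
tie my %db, 'MLDBM', 'records.db', O_CREAT|O_RDWR, 0640
    or die "Can't tie records.db: $!";

open my $fh, '<', 'data.csv' or die "Can't open data.csv: $!";
while (my $line = <$fh>) {
    chomp $line;
    my @fields = split /,/, $line;
    $db{$fields[0]} = \@fields;   # serialized to disk, not kept in RAM
}
close $fh;

# Later, e.g. in an Apache child: fetch one record by ID on demand.
my $record = $db{'12345'};        # array ref, or undef if absent

untie %db;
```

One MLDBM caveat worth knowing: each fetch returns a copy, so in-place
changes to nested elements (e.g. `$db{$id}[3] = ...`) are silently lost;
you have to assign a whole new value back to the key.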

Andrew McNaughton
