I have been reading really large files using this setup:
file: read/lines %filename
foreach line file [do stuff]
This seems to work well, but what are the limitations of such a read?
Does the entire file get read into memory when issuing the read/lines
command?
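To illustrate what I mean (the filename here is just an example): as I
understand it, read/lines hands back a block! with one string per line,
so presumably the whole file is sitting in memory at once:

lines: read/lines %mylog.txt                ;; reads the entire file
print ["type:" type? lines]                 ;; a block! of strings
print ["lines in memory:" length? lines]    ;; one entry per line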
Seems to work for me anyhow....
Francois
On Thu, 24 Feb 2000 [EMAIL PROTECTED] wrote:
>
> > I need to read a BIG text file line by line but I
> > don't know exactly how to do.
> >
> > As I understand the following will read the entire
> > file to a list of lines and that is not what I want.
> >
> > lines: read/lines %textfile
> >
> > I want to read one line, process it before reading
> > the next line and so on.
> >
>
> Hi Peter:
>
> It's relatively easy to act on files larger than memory, one line at a time.
> I believe BO at REBOL came up with the technique originally. I've adapted it
> and use it for manipulating large log files on my internet servers.
>
> Here it is, enjoy:
>
> hugefile: open/direct/read/lines %huge_file
>
> while [(line: pick hugefile 1) <> none] [
>     ;; do stuff to each line
> ]
>
> close hugefile
>
>
> --Ralph Roberts
>
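
P.S. Adapting Ralph's technique, here is a quick (untested) sketch that
counts the lines of a big log without loading it all at once; %access.log
is just an example path:

count: 0
log: open/direct/read/lines %access.log
while [(line: pick log 1) <> none] [count: count + 1]
close log
print ["total lines:" count]

Because the port is opened with /direct, REBOL does not buffer the whole
file; each pick fetches the next line, so memory use stays flat no matter
how big the log is.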