------- Comment #23 from dir at lanl dot gov  2005-12-22 16:01 -------
What is happening here is actually quite simple. The program reads the green
word for the previous record from the file at location "file_position (u->s)
- length"; that word gives the length of the previous record in bytes. It then
pulls that number out of the buffer and converts it to the correct integer form
for the current machine, giving the number 'm'. The length of a binary record is
always the length of the two green words plus the length of the data, i.e. (m +
2*sizeof (gfc_offset)). The position just before the current record is therefore
file_position (u->s) - m - 2*sizeof (gfc_offset).
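A minimal sketch of that arithmetic in C follows. The type gfc_offset and the
helper name previous_record_start are assumptions for illustration, not the
exact libgfortran code:

  #include <string.h>

  typedef long gfc_offset;   /* assumed width of the record marker */

  /* 'current_pos' is the file position just after the trailing green word
     of the previous record; 'marker_buf' holds the bytes of that word.  */
  static gfc_offset
  previous_record_start (gfc_offset current_pos, const char *marker_buf)
  {
    gfc_offset m;

    /* The green word holds the byte length of the previous record;
       copy it out and let the host interpret it as an integer.  */
    memcpy (&m, marker_buf, sizeof (gfc_offset));

    /* A record occupies its data plus the two green words around it.  */
    return current_pos - m - 2 * sizeof (gfc_offset);
  }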
    I did not change the current coding, and it works for all of my tests. Any
time that p is not NULL, length returns the number of bytes read, which should
be sizeof (gfc_offset). If length is not equal to sizeof (gfc_offset), there is
an error in the file data, so maybe that should be added as an error test.
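A hedged sketch of what such a test could look like; 'p' and 'length' mirror
the values described above, and the check itself is only a suggestion, not the
existing library code:

  #include <stddef.h>

  typedef long gfc_offset;   /* assumed width of the record marker */

  /* Returns nonzero when the green-word read is inconsistent: either the
     read failed (p == NULL) or fewer bytes than a full marker came back,
     which indicates corrupt file data and should abort the backspace.  */
  static int
  bad_green_word (const char *p, size_t length)
  {
    return p == NULL || length != sizeof (gfc_offset);
  }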


-- 


http://gcc.gnu.org/bugzilla/show_bug.cgi?id=25139
