ID:               26510
 Updated by:       [EMAIL PROTECTED]
 Reported By:      thomas303 at web dot de
-Status:           Open
+Status:           Feedback
 Bug Type:         Filesystem function related
 Operating System: Suse Linux 8.2
 PHP Version:      4.3.5-dev
 New Comment:

fgetcsv() supports multi-line rows (a quoted field may contain a newline),
which may explain why a file with X lines, when read with fgetcsv(), would
return fewer rows (but the same amount of total data).

If you cannot provide the original CSV file, could you provide an
equivalent one that replicates the problem? Without a CSV file it is
nearly impossible for us to resolve this problem.
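To illustrate the multi-line behaviour described above, here is a minimal, self-contained sketch (the file name "demo.csv" and the sample data are made up for the demonstration): a quoted field containing a newline spans two physical lines, so fgets() and fgetcsv() report different counts for the same file.

```php
<?php
// Hypothetical data: the second record has a quoted field
// ("e1\ne2") that spans two physical lines.
$data = "a|b|c\nd|\"e1\ne2\"|f\ng|h|i\n";
file_put_contents("demo.csv", $data);

// Counting with fgets(): every physical line is one iteration.
$fp = fopen("demo.csv", "r");
$lines = 0;
while (fgets($fp, 4096) !== false) {
    $lines++;
}
fclose($fp);

// Counting with fgetcsv(): the quoted newline keeps record 2 together.
$fp = fopen("demo.csv", "r");
$rows = 0;
while (fgetcsv($fp, 4096, "|") !== false) {
    $rows++;
}
fclose($fp);

echo "$lines physical lines, $rows CSV rows\n";
unlink("demo.csv");
?>
```

With this data the script reports 4 physical lines but only 3 CSV rows, which is the same kind of discrepancy the reporter observed (fgets() counting more "lines" than fgetcsv() counts rows, with no data lost).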


Previous Comments:
------------------------------------------------------------------------

[2003-12-03 15:24:40] thomas303 at web dot de

Hi, me again...

If I understand it correctly, fgetcsv() is sometimes
- reading and processing a line, then jumping past the next line
- or ignoring one line and processing the next one.

best regards
thomas

------------------------------------------------------------------------

[2003-12-03 15:19:27] thomas303 at web dot de

Hi,

Using fgets() and explode() (as written in the script above), $zaehler
ends up at 20967.

Using fgetcsv() on the same file, it returns 17861.

I am really sorry, I can't provide the file.

Best regards
Thomas

------------------------------------------------------------------------

[2003-12-03 15:15:57] thomas303 at web dot de

#!/usr/local/bin/php
<?php

$file_name = "pks.dat";
$zaehler = 0;

        $fp = fopen($file_name, "r");
        while (feof($fp) === false)
        {
                // The commented lines should work instead of fgetcsv().
                // $new = fgets($fp, 4096);
                // $new_field_values = explode("|", $new);
                $new_field_values = fgetcsv($fp, 4096, "|");
                $zaehler++;
        }
        fclose($fp);
        print $zaehler . "\n";
?>

------------------------------------------------------------------------

[2003-12-03 15:03:10] thomas303 at web dot de

Hi,

I just wanted to send you some additional information about the way I
tested:

I tested importing a File on 3 Systems:
- 2 times php4.3.3
- 1 time php4.3.5-dev

I used the same file in all of these tests, and every time I ran into
the problem with fgetcsv(), which went away when using fgets() and
explode().

On another system with php4.3.3, with the same data structure but
different data in 30,000 rows, I have never had a problem. So it could
be the data, or maybe fgetcsv() is not binary safe.

Some days ago, under the same circumstances and with nearly the same
data and size, I did not have the problem. I have spent hours trying to
reproduce why the error occurs, but I can't.

Maybe a look into the PHP source code helps you more than I can at the
moment.

I can't give you the original file as I already wrote.

But I'll spend some minutes in creating a test script.

Best regards,
Thomas

------------------------------------------------------------------------

[2003-12-03 14:45:45] [EMAIL PROTECTED]

Please provide the problematic CSV file and the shortest possible script
that can be used to replicate the problem.

------------------------------------------------------------------------

The remainder of the comments for this report are too long. To view
the rest of the comments, please view the bug report online at
    http://bugs.php.net/26510

-- 
Edit this bug report at http://bugs.php.net/?id=26510&edit=1
