I don't know if this happens in PHP, but it is similar to C's "add one to a variable", i.e. you have three ways of adding one to a variable:

    1) a = a + 1;
    2) a += 1;
    3) a++ or ++a;

From what I was told, 1) is slower than 2), which is slower than 3). So it would seem that this is the same sort of thing for strings in PHP.

Martin

-----Original Message-----
From: George Whiffen [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, November 21, 2001 11:39 PM
To: [EMAIL PROTECTED]
Subject: [PHP] Different syntax = different performance (concatenating assignment)

Dear All,

I had always thought of concatenating assignment, and concatenation plus assignment back to the same variable, as really just syntactical variations, i.e. I thought that

    $data = $data . <some strings>;

and

    $data .= <some strings>;

were just alternate syntaxes for the same operation. I've always tended to use the long format on the grounds that it was more readable and maintainable.

How wrong I was! It seems the performance on big strings can be hugely different. I think I know why, but I'd appreciate confirmation.

I came across this while investigating a performance issue with writing out a gz-encoded CSV file from an SQL table. The code is something like:

    $data = '';
    while ($row_product = mysql_fetch_array($cur_product)) {
        // replace each of " ' , with a space, strip tags, pad to fixed widths
        $data = $data . '"' . str_pad(strip_tags(strtr($row_product['product_code'], '"\',', '   ')), 40)
              . '","' . str_pad(strip_tags(strtr($row_product['product_name'], '"\',', '   ')), 60)
              . '","' . str_pad(strip_tags(strtr($row_product['product_desc'], '"\',', '   ')), 120)
              . '"' . "\r\n";
    }
    $Size = strlen($data);
    $Crc  = crc32($data);
    $data = gzcompress($data);
    $data = "\x1f\x8b\x08\x00\x00\x00\x00\x00"
          . substr($data, 0, strlen($data) - 4)
          . pack('V', $Crc) . pack('V', $Size);
    fwrite($handle, $data);
    fclose($handle);

There seemed to be plenty of reasons why this ran slowly (5 seconds plus on only a couple of thousand product rows). I suspected each of strtr, strip_tags, str_pad and gzcompress in turn, but it turned out that a simple change from

    $data = $data . ...

to

    $data .= ...

made it run an order of magnitude faster (i.e. under 0.5s).
I guess that in the first case a working copy of $data has to be made, whereas in the second the concatenation is done directly on the existing copy of $data; i.e. the performance difference is just the price of creating and throwing away two thousand copies of $data. Does that make sense?

Does anyone know of other cases where alternate syntaxes can make such a difference to performance? If I get some confirmation of this analysis, I'll bung a note on the manual at http://www.php.net/manual/en/language.operators.string.php

Humbled,

George

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
To contact the list administrators, e-mail: [EMAIL PROTECTED]
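The comparison described in the thread can be reproduced with a minimal, self-contained timing sketch. This is not from the original thread: the chunk size, row count, and variable names are illustrative, and the absolute times (and even the gap between the two styles) depend heavily on the PHP version and machine.

```php
<?php
// Compare the two concatenation styles from the thread.
// Illustrative sketch: chunk size and row count are made up.

$chunk = str_repeat('x', 200);   // roughly one CSV row's worth of text
$rows  = 2000;                   // "a couple of thousand product rows"

// Style 1: $data = $data . ...  (may force a fresh copy of $data each pass)
$t0 = microtime(true);
$data = '';
for ($i = 0; $i < $rows; $i++) {
    $data = $data . $chunk;
}
$copyStyle = microtime(true) - $t0;

// Style 2: $data .= ...  (appends to the existing string)
$t0 = microtime(true);
$data = '';
for ($i = 0; $i < $rows; $i++) {
    $data .= $chunk;
}
$appendStyle = microtime(true) - $t0;

printf("\$data = \$data . : %.4fs\n", $copyStyle);
printf("\$data .=        : %.4fs\n", $appendStyle);
```

On the PHP 4 engines of the era the gap could be an order of magnitude, as George reports; modern PHP engines optimize the long form when the string is not shared, so the two styles may time out nearly identically today.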