php-general Digest 20 Oct 2009 10:28:40 -0000 Issue 6400

Topics (messages 299074 through 299088):

Re: Please don't kick me!
        299074 by: Floyd Resler
        299076 by: Philip Thompson
        299077 by: Floyd Resler
        299078 by: Kim Madsen
        299079 by: ray.bigdoghost.com

Re: Sanitizing potential MySQL strings with no database connection
        299075 by: Dotan Cohen
        299082 by: Jim Lucas

Re: Blocking video streaming
        299080 by: Talawa

Re: Header problem - SOLVED
        299081 by: Kim Madsen

MySQLi and prepared statements
        299083 by: Chris W

Statically linked library?
        299084 by: Ning Shi

Re: PEAR segfaulting
        299085 by: Greg Beaver
        299086 by: Greg Beaver

Access violation error again
        299087 by: Marshall Burns

Re: PHP broadcast mailer
        299088 by: Tom Chubb

Administrivia:

To subscribe to the digest, e-mail:
        [email protected]

To unsubscribe from the digest, e-mail:
        [email protected]

To post to the list, e-mail:
        [email protected]


----------------------------------------------------------------------
--- Begin Message ---
Phillip,
I use ezpdf (http://www.ros.co.nz/pdf/). I've been using it for years and have found it very capable of making any PDF I want.

Take care,
Floyd

On Oct 19, 2009, at 4:47 PM, Philip Thompson wrote:

Hi all.

I know this question has been asked a thousand times on the list, but my searches in the archives are not being nice to me. So... please don't kick me.

Currently, we use DOMPDF to generate PDFs from HTML. However, it's no longer maintained and it has a few bugs that we just can no longer live with. What PDF generating software do you use? It does not have to be free, but it must run on linux and may be command line or run through code. Some of the ones I have researched are...

html2pdf
html2ps
html2fpdf
xhtml2pdf
fpdf
tcpdf

Your thoughts would be appreciated. Oh, my preference would be to send HTML/CSS to a script and have it just automagically convert to PS/PDF.

Thanks,
~Philip

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php




--- End Message ---
--- Begin Message ---
On Oct 19, 2009, at 3:52 PM, Floyd Resler wrote:

Phillip,
I use ezpdf (http://www.ros.co.nz/pdf/). I've been using it for years and have found it very capable of making any PDF I want.

Take care,
Floyd

This one seems fairly neat. However, it appears as though the author no longer keeps up with it - last entry on 6/17/2006. Have you ever run into any problems or setbacks with it?

Thanks,
~Philip

--- End Message ---
--- Begin Message --- Nope. I've never had any troubles with it. I've been able to produce all kinds of PDFs including loan agreements, inventory pick lists with barcodes, and various others. I find it incredibly powerful and easy to use.

Take care,
Floyd

On Oct 19, 2009, at 5:17 PM, Philip Thompson wrote:

On Oct 19, 2009, at 3:52 PM, Floyd Resler wrote:

Phillip,
I use ezpdf (http://www.ros.co.nz/pdf/). I've been using it for years and have found it very capable of making any PDF I want.

Take care,
Floyd

This one seems fairly neat. However, it appears as though the author no longer keeps up with it - last entry on 6/17/2006. Have you ever run into any problems or setbacks with it?

Thanks,
~Philip

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php




--- End Message ---
--- Begin Message ---
Hi Philip

Philip Thompson wrote on 2009-10-19 22:47:
> Hi all.
>
> I know this question has been asked a thousand times on the list, but my searches in the archives are not being nice to me. So... please don't kick me.

Why would anyone do that? We're PHPeople and PHfriends (ho ho)

> Currently, we use DOMPDF to generate PDFs from HTML. However, it's no longer maintained and it has a few bugs that we just can no longer live with. What PDF generating software do you use? It does not have to be free, but it must run on linux and may be command line or run through code. Some of the ones I have researched are...
>
> html2pdf
> html2ps
> html2fpdf
> xhtml2pdf
> fpdf
> tcpdf
>
> Your thoughts would be appreciated. Oh, my preference would be to send HTML/CSS to a script and have it just automagically convert to PS/PDF.

I've been using fpdf for 4-5 years, for invoices among other things, and am very happy with it.


--
Kind regards
Kim Emax - masterminds.dk

--- End Message ---
--- Begin Message --- ----- Original Message ----- From: "Philip Thompson" <[email protected]>
To: "PHP General list" <[email protected]>
Sent: Monday, October 19, 2009 1:47 PM
Subject: [PHP] Please don't kick me!


Hi all.

I know this question has been asked a thousand times on the list, but my searches in the archives are not being nice to me. So... please don't kick me.

Currently, we use DOMPDF to generate PDFs from HTML. However, it's no longer maintained and it has a few bugs that we just can no longer live with. What PDF generating software do you use? It does not have to be free, but it must run on linux and may be command line or run through code. Some of the ones I have researched are...

html2pdf
html2ps
html2fpdf
xhtml2pdf
fpdf
tcpdf

Your thoughts would be appreciated. Oh, my preference would be to send HTML/CSS to a script and have it just automagically convert to PS/PDF.

Thanks,
~Philip




I've been using ezpdf for many years and I think it is still the best option available.
http://www.ros.co.nz/pdf/


Best Regards
--- End Message ---
--- Begin Message ---
2009/10/19 Kim Madsen <[email protected]>:
> Dotan Cohen wrote on 2009-10-18 21:21:
>
>> I thought that one could not test if a database connection is
>> established or not, this is the most relevant thing that I found while
>> googling that:
>> http://bugs.php.net/bug.php?id=29645
>
> from http://www.php.net/manual/en/function.mysql-connect.php
>
> $link = mysql_connect('localhost', 'mysql_user', 'mysql_password');
> if (!$link) {
>    die('Could not connect: ' . mysql_error());
> }
>
> So just test if $link is available
>

I need to know if there is _any_ connection available, not a specific
connection. In one script it may be $link but in another $connection.


>> All the connections are to MySQL databases, but to _different_ MySQL
>> databases on the same host.
>
> Wouldn't this solve your problem?
>
> $link1 = mysql_connect('localhost', 'mysql_user1', 'mysql_password');
> $link2 = mysql_connect('localhost', 'mysql_user2', 'mysql_password');
>
> if($link1) {
> etc...
>
> or I would say that your "different scripts" should require different db
> connection files.
>

Of course they connect differently, each to a different database (all
on localhost).
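
One thought: when no link argument is given, the mysql_* functions fall
back to the most recently opened connection (or try to open a default
one), so something like this might cover the "any connection" case
(just a sketch, assuming the legacy mysql extension; untested):

<?php

if (@mysql_ping()) {
    // Some connection is open and alive; let MySQL do the escaping
    // against that same default link.
    $safe = mysql_real_escape_string($value);
} else {
    // No usable connection; fall back to a manual escape
    $safe = manual_escape($value);   // hypothetical str_replace()-based fallback
}

?>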


-- 
Dotan Cohen

http://what-is-what.com
http://gibberish.co.il

--- End Message ---
--- Begin Message ---
Dotan Cohen wrote:
> 2009/10/19 Kim Madsen <[email protected]>:
>> Dotan Cohen wrote on 2009-10-18 21:21:
>>
>>> I thought that one could not test if a database connection is
>>> established or not, this is the most relevant thing that I found while
>>> googling that:
>>> http://bugs.php.net/bug.php?id=29645
>> from http://www.php.net/manual/en/function.mysql-connect.php
>>
>> $link = mysql_connect('localhost', 'mysql_user', 'mysql_password');
>> if (!$link) {
>>    die('Could not connect: ' . mysql_error());
>> }
>>
>> So just test if $link is available
>>
> 
> I need to know if there is _any_ connection available, not a specific
> connection. In one script it may be $link but in another $connection.
> 

Dotan,

You are making this thing harder than it has to be.

All you need is to replicate the escaping of the same characters that
mysql_real_escape_string() escapes.  Simply do that.  They are listed on the
function's manual page on php.net

http://php.net/mysql_real_escape_string

Here is a function that I mocked up really quick.

I have no idea if it will work, but it is a start down the right road to solve
your problem(s)...

<?php

function clean_string($input) {

  /**
   * Characters to escape (the same set mysql_real_escape_string() handles):
   *    \       \x00    \n      \r      '       "       \x1a
   * Note: the backslash must be replaced first, otherwise the backslashes
   * added by the later replacements would themselves be escaped again.
  **/

  $patterns = array("\\",   "\x00", "\n", "\r", "'",   "\"", "\x1a");
  $replace  = array('\\\\', '\0',   '\n', '\r', "\\'", '\"', '\Z');
  return str_replace($patterns, $replace, $input);
}

?>
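
A quick usage sketch (the table and column names are made up, just to
show the function in use):

<?php

// Hypothetical value and query, only to illustrate clean_string()
$comment = "Line one\nIt's a \"test\"";
$sql = "INSERT INTO comments (body) VALUES ('" . clean_string($comment) . "')";
// $sql now contains:
// INSERT INTO comments (body) VALUES ('Line one\nIt\'s a \"test\"')

?>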

Jim Lucas

--- End Message ---
--- Begin Message ---
Talawa wrote:
Kim Madsen wrote:
Hey

Talawa wrote on 2009-10-19 18:29:
Hello everyone,

I'm posting a message here because I haven't found any solution yet.
I just finished a video streaming service on my website. I use the xmoov script (http://xmoov.com/xmoov-php/) to do that. It works like a charm, but I found an issue. When the video is buffering into the flash player, all other requests are pending until the video is loaded.

I discovered in my search that the fopen() function can block the PHP process.

Does anyone know about this problem?

I've had a similar problem with zip downloads (which also use fopen). I suspect either the headers, a caching problem, or latin1/utf-8.

What do your headers look like? (Firefox has a lovely plugin, "Live HTTP Headers")

Which character encoding do you use?

Show us the bit of code from fopen to fclose.

I forced the charset encoding to ISO-8859-1 in my .htaccess.

Here are the HTTP headers I receive:
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-store, no-cache, must-revalidate, post-check=0,
pre-check=0, max-age=0
Pragma: no-cache
Last-Modified: Mon, 19 Oct 2009 19:00:33 GMT
X-Pad: avoid browser bug
Content-Length: 14145470
Vary: User-Agent
Keep-Alive: timeout=5, max=100
Connection: Keep-Alive
Content-Type: video/x-flv

And here's the PHP code:
if (($fh = fopen($file, 'rb')) === FALSE)
       return;

# assemble packet interval
$packet_interval = 0.3;

# assemble packet size
$packet_size = STREAM_SPEED * 1042;

$seekPos = intval($seekPos);
$fileSize = filesize($file) - (($seekPos > 0) ? $seekPos  + 1 : 0);

session_cache_limiter("nocache");
header("Cache-Control: private",false);
header("Expires: Thu, 19 Nov 1981 08:52:00 GMT");
header("Last-Modified: " . date("D, d M Y H:i:s") . " GMT");
header("Cache-Control: no-store, no-cache, must-revalidate,
post-check=0, pre-check=0, max-age=0");
header("Pragma: no-cache");
header("Content-Type: video/x-flv");
header("X-Pad: avoid browser bug");
header("Content-Length: " . $fileSize);
# FLV file format header
if($seekPos != 0)  {
       print('FLV');
       print(pack('C', 1));
       print(pack('C', 1));
       print(pack('N', 9));
       print(pack('N', 9));
}

# seek to requested file position
fseek($fh, $seekPos);

# output file
while(!feof($fh)) {
       # get start time
       list($usec, $sec) = explode(' ', microtime());
       $time_start = ((float)$usec + (float)$sec);
       # output packet
       print(fread($fh, $packet_size));
       # get end time
       list($usec, $sec) = explode(' ', microtime());
       $time_stop = ((float)$usec + (float)$sec);
       # wait if output is slower than $packet_interval
       $time_difference = $time_stop - $time_start;
       if($time_difference < (float)$packet_interval) {
usleep((float)$packet_interval * 1000000 - (float)$time_difference * 1000000);
       }
}
fclose($fh);


Thanks for the help.

I solved the issue. I had forgotten to close the session before sending the stream.
Add session_write_close(); before sending the stream.
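
In code, the fix is a one-line addition before the headers and the read
loop (placement sketch only, assuming the streaming script quoted above):

<?php

session_start();
// ... read/write whatever $_SESSION data the access checks need ...

// Release the session lock before streaming, so other requests from the
// same session are no longer blocked while the video is being sent.
session_write_close();

// ... then send the headers and run the fread()/print() loop as above ...

?>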

Hope it helps.

Cheers.

--- End Message ---
--- Begin Message --- This has been solved today. Talawa had a similar problem and came up with a solution, namely using session_write_close() before creating the headers. That stunt also solved my problem :-)

--
Kind regards
Kim Emax

Kim Madsen wrote on 2009-10-03 13:30:
Hi PHP people

I have a really strange and annoying problem. I've got a site where
members can download music. A user clicks index.php (in index.php
there's an iframe that opens another file); if certain checks are okay,
a popup window opens download.php, where an mp3 file is fetched
from the server and renamed in the header, then pushed to the end user.
This works fine. But now I want to create zipfiles too, and when a user
downloads a zipfile it's like the whole site is frozen until the download
has completed. My guess is that this is some sort of header problem
(see headers below), due to three headers at the same time, because the
class works as expected on the test page I've created. Input on the
correct headers would be appreciated very much :-)

Mp3 headers:
 $new_filename = "attachment; filename=\"{$artist} - {$title}.mp3\"";
 header('Content-Description: File Transfer');
 header("Content-Type: application/octet-stream");
 header("Content-Length: $size");
 header("Content-Disposition: $new_filename");
 header("Content-Transfer-Encoding: binary");
 readfile($source_file);

Zip headers:
 $zip = new zipfile();
 $zip->add_dir(".");
 $new_filename= "{$artist} - {$title}.mp3";
 if(mysql_num_rows($result)) {
   $zip->add_file($file, $new_filename);
 }
 header("Pragma: public");
 header("Expires: 0");
 header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
 header("Cache-Control: private",false);
 header("Content-type: application/zip");
 #header("Content-Type: application/octet-stream");
 header("Content-disposition: attachment; filename=\"zipTest.zip\"");
 header('Content-Transfer-Encoding: binary');
 ob_end_clean();
 echo $zip->file();

Code example: http://lps.netlinq.dk/test010/test_zip.class.php

Headers (fetched with firefox add-on: live http headers)

This is headers from the site, where the problem occurs:

1. click on the link to a title (Maxwell in this case)
----------------------------------------------------------
http://lps.netlinq.dk/?action=download&track_id=357

GET /?action=download&track_id=357 HTTP/1.1
Host: lps.netlinq.dk
User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.2) Gecko/
20090803 Ubuntu/9.04 (jaunty) Shiretoko/3.5.2
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/
*;q=0.8
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 300
Connection: keep-alive
Referer: http://lps.netlinq.dk/?action=download&track_id=350
Cookie: login_email=kim%40emax.dk;
PHPSESSID=fbb5d6adec802766cf6f638c99ab4f1d

HTTP/1.x 200 OK
Date: Fri, 02 Oct 2009 15:15:21 GMT
Server: Apache/2.2.8 (Ubuntu) PHP/5.2.4-2ubuntu5.6 with Suhosin-Patch
mod_ruby/1.2.6 Ruby/1.8.6(2007-09-24) mod_ssl/2.2.8 OpenSSL/0.9.8g
X-Powered-By: PHP/5.2.4-2ubuntu5.6
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-
check=0
Pragma: no-cache
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 4250
Keep-Alive: timeout=15, max=100
Connection: Keep-Alive
Content-Type: text/html

2. I click on "download zip" (this is a link to index.php)
if conditions are met, then a popup with download.php is activated and
here a zip header is made

----------------------------------------------------------
http://lps.netlinq.dk/index.php

POST /index.php HTTP/1.1
Host: lps.netlinq.dk
User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.2) Gecko/
20090803 Ubuntu/9.04 (jaunty) Shiretoko/3.5.2
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/
*;q=0.8
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 300
Connection: keep-alive
Referer: http://lps.netlinq.dk/?action=download&track_id=357
Cookie: login_email=kim%40emax.dk;
PHPSESSID=fbb5d6adec802766cf6f638c99ab4f1d
Content-Type: application/x-www-form-urlencoded
Content-Length: 131
action=ask_questions&download_zipfile=1&version_id
%5B1065%5D=1&version_id%5B1066%5D=1&version_id%5B1067%5D=1&version_id
%5B1068%5D=1

HTTP/1.x 200 OK
Date: Fri, 02 Oct 2009 15:15:29 GMT
Server: Apache/2.2.8 (Ubuntu) PHP/5.2.4-2ubuntu5.6 with Suhosin-Patch
mod_ruby/1.2.6 Ruby/1.8.6(2007-09-24) mod_ssl/2.2.8 OpenSSL/0.9.8g
X-Powered-By: PHP/5.2.4-2ubuntu5.6
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-
check=0
Pragma: no-cache
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 3216
Keep-Alive: timeout=15, max=99
Connection: Keep-Alive
Content-Type: text/html
----------------------------------------------------------
http://lps.netlinq.dk/download.php?track_id=357&member_id=1&string=41e0cd250ca3a40598e2019fd4c813cc&kbit=320&zipfile=1

GET /download.php?
track_id=357&member_id=1&string=41e0cd250ca3a40598e2019fd4c813cc&kbit=320&zipfile=1
HTTP/1.1
Host: lps.netlinq.dk
User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.2) Gecko/
20090803 Ubuntu/9.04 (jaunty) Shiretoko/3.5.2
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/
*;q=0.8
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 300
Connection: keep-alive
Referer: http://lps.netlinq.dk/index.php
Cookie: login_email=kim%40emax.dk;
PHPSESSID=fbb5d6adec802766cf6f638c99ab4f1d

HTTP/1.x 200 OK
Date: Fri, 02 Oct 2009 15:15:30 GMT
Server: Apache/2.2.8 (Ubuntu) PHP/5.2.4-2ubuntu5.6 with Suhosin-Patch
mod_ruby/1.2.6 Ruby/1.8.6(2007-09-24) mod_ssl/2.2.8 OpenSSL/0.9.8g
X-Powered-By: PHP/5.2.4-2ubuntu5.6
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-
check=0
Pragma: no-cache
Content-Disposition: attachment; filename="Maxwell - Bad Habits
(Remixes).zip"
Content-Transfer-Encoding: binary
Content-Length: 54234978
Keep-Alive: timeout=15, max=98
Connection: Keep-Alive
Content-Type: application/zip
----------------------------------------------------------
3. as long as the zip file is downloading the site "freezes" until
download is complete, then the link I've clicked is "activated"
___________________________________________________________________________

And this is the headers from the test page:
http://lps.netlinq.dk/test010/test_zip.class.php
http://home.emax.dk/~emax/test/test_zip.class.php

Same files, different servers

headers from test010:

http://home.emax.dk/~emax/test/test_zip.class.php?zip_to_browser=1

GET /~emax/test/test_zip.class.php?zip_to_browser=1 HTTP/1.1
Host: home.emax.dk
User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.2) Gecko/
20090803 Ubuntu/9.04 (jaunty) Shiretoko/3.5.2
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/
*;q=0.8
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 300
Connection: keep-alive
Referer: http://home.emax.dk/~emax/test/test_zip.class.php?test=15

HTTP/1.x 200 OK
Date: Fri, 02 Oct 2009 14:31:08 GMT
Server: Apache/1.3.37 (Unix) PHP/4.4.4
X-Powered-By: PHP/4.4.4
Pragma: public
Expires: 0
Cache-Control: must-revalidate, post-check=0, pre-check=0, private
Content-Disposition: attachment; filename="zipTest2.zip"
Content-Transfer-Encoding: binary
Keep-Alive: timeout=15, max=99
Connection: Keep-Alive
Transfer-Encoding: chunked
Content-Type: application/zip
____________________________________________________________________

headers from Netlinq:

GET /test010/test_zip.class.php?zip_to_browser=1 HTTP/1.1
Host: lps.netlinq.dk
User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.2) Gecko/
20090803 Ubuntu/9.04 (jaunty) Shiretoko/3.5.2
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/
*;q=0.8
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 300
Connection: keep-alive
Referer: http://lps.netlinq.dk/test010/test_zip.class.php?test=21
Cookie: login_email=kim%40emax.dk;
PHPSESSID=fbb5d6adec802766cf6f638c99ab4f1d

HTTP/1.x 200 OK
Date: Fri, 02 Oct 2009 14:55:37 GMT
Server: Apache/2.2.8 (Ubuntu) PHP/5.2.4-2ubuntu5.6 with Suhosin-Patch
mod_ruby/1.2.6 Ruby/1.8.6(2007-09-24) mod_ssl/2.2.8 OpenSSL/0.9.8g
X-Powered-By: PHP/5.2.4-2ubuntu5.6
Pragma: public
Expires: 0
Cache-Control: must-revalidate, post-check=0, pre-check=0, private
Content-Disposition: attachment; filename="zipTest2.zip"
Content-Transfer-Encoding: binary
Keep-Alive: timeout=15, max=100
Connection: Keep-Alive
Transfer-Encoding: chunked
Content-Type: application/zip
____________________________________________________________________



--
Kind regards
Kim Emax - masterminds.dk

--- End Message ---
--- Begin Message --- If I am using the mysqli extension and prepared statements, after I execute bind_param, is there a way to print the actual query that gets sent to the server?
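
With mysqli's native prepared statements the statement and the bound
values are sent to the server separately, so the extension never holds a
single interpolated query string to print. For debugging you can check the
MySQL general query log on the server, or rebuild an approximation on the
client side purely for logging. A rough sketch (the table, columns and the
$db/$id/$name variables are placeholders):

<?php

function debug_query($sql, array $params, mysqli $db) {
    // Naive split: this also matches any '?' inside string literals,
    // so treat the result as a logging aid, not as real SQL.
    $pieces = explode('?', $sql);
    $out = array_shift($pieces);
    foreach ($pieces as $i => $piece) {
        $p = isset($params[$i]) ? $params[$i] : null;
        if ($p === null) {
            $out .= 'NULL';
        } elseif (is_int($p) || is_float($p)) {
            $out .= $p;
        } else {
            $out .= "'" . $db->real_escape_string((string) $p) . "'";
        }
        $out .= $piece;
    }
    return $out;
}

$sql  = 'SELECT * FROM users WHERE id = ? AND name = ?';
$stmt = $db->prepare($sql);
$stmt->bind_param('is', $id, $name);
error_log(debug_query($sql, array($id, $name), $db));  // log the approximation
$stmt->execute();

?>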


--
Chris W
KE5GIX

"Protect your digital freedom and privacy, eliminate DRM,
learn more at http://www.defectivebydesign.org/what_is_drm";

Ham Radio Repeater Database.
http://hrrdb.com

--- End Message ---
--- Begin Message ---
Hi,

I'm writing a program to record the system calls my web application makes
by interposing on the libc library. However, I noticed that none of the
system calls generated by the MySQL module in PHP are being recorded. I ran
ltrace and it seems like the module is statically linked, but running "file
/usr/lib/php/modules/mysql.so" says otherwise. I'm using Fedora 11 and am
wondering whether PHP's MySQL support is statically linked by default.
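
One way to check from PHP itself (a sketch; the config path is a guess and
may differ on your setup): temporarily comment out the extension=mysql.so
line (on Fedora it is usually in /etc/php.d/mysql.ini), restart the web
server or use the CLI, and then run:

<?php
// If this still reports bool(true) with the shared mysql.so disabled,
// MySQL support was compiled into PHP statically rather than loaded
// as a dynamic module.
var_dump(extension_loaded('mysql'));
?>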

Thank you.

-- 
Ning

--- End Message ---
--- Begin Message ---
Ashley M. Kirchner wrote:
> 
>    Typing 'pear segmentation fault' in Google produces tons of responses
> so I know I'm not the only one with this issue, but I'll be damned if I
> can figure out what the problem is and how to fix it.  I rolled my own
> PHP 5.3.0 from source.  Compilation went fine, no errors.  Installation
> went without any errors.  I can run 'pecl' and install some packages,
> but when I try to run 'pear', it segfaults after it downloads a package:
> 
> --------------------
> $ pear install DB
> WARNING: "pear/DB" is deprecated in favor of "pear/MDB2"
> downloading DB-1.7.13.tgz ...
> Starting to download DB-1.7.13.tgz (132,246 bytes)
> .............................done: 132,246 bytes
> Segmentation fault
> --------------------
> 
> 
>    Running it with -vvv gives me:
> 
> --------------------
> $ pear -vvv install DB
> 
> Warning: file_exists(): Unable to find the wrapper "channel" - did you
> forget to enable it when you configured PHP? in
> PEAR/Downloader/Package.php on line 1510
> 
> Warning: is_file(): Unable to find the wrapper "channel" - did you
> forget to enable it when you configured PHP? in
> PEAR/Downloader/Package.php on line 1520
> 
> Warning: is_file(): Unable to find the wrapper "channel" - did you
> forget to enable it when you configured PHP? in
> PEAR/Downloader/Package.php on line 1520
> WARNING: "pear/DB" is deprecated in favor of "pear/MDB2"
> pear/DB: Skipping required dependency "pear/PEAR" version 1.9.0, already
> installed as version 1.8.0
> Downloading "http://pear.php.net/get/DB-1.7.13.tgz";
> downloading DB-1.7.13.tgz ...
> Starting to download DB-1.7.13.tgz (132,246 bytes)
> .............................done: 132,246 bytes
> Segmentation fault
> --------------------
> 
> 
>    So, is this a known issue?

Hi,

Most likely your machine has a borked zlib library.  Try:

pear install -Z DB

This will download the uncompressed .tar and should work fine.  If it
does, look to fix your zlib install and re-build PHP.

Greg

--- End Message ---
--- Begin Message ---
Ashley M. Kirchner wrote:
> 
>    Well, it boiled down to zlib.so causing the segfault.  As soon as I
> removed that module, everything worked.  Recompiling just the zlib.so
> module yielded the same result: pear segfaults.
> 
>    So now the question is: is it zlib's fault, or is it pear?
> 
>    At this point I've accomplished what I needed, but it doesn't
> actually fix the problem: enabling zlib will cause pear to segfault.

Ha, I guess I should wait before replying :).

It is zlib's fault.  We've had reported problems for years on some
poorly configured systems (not your problem: several distros decided to
include a borked zlib and apparently some still do for no good reason at
all).

Greg

--- End Message ---
--- Begin Message ---
Hi,

 

            The error I posted about the other day, "PHP has encountered an
Access Violation at .", is recurring on an intermittent basis. I find that
once it starts occurring, it continues for about an entire day, and then it
mysteriously stops for a day or two.

            I am running PHP on a commercial server at Hostway.com, so I
don't have the ability to reboot the server as a test. Nor do I think I have
the ability to check PHP error logs or ini files on my own. I've written to
Hostway support about the problem and am waiting for their reply, but I'm
afraid they might just tell me it's a programming problem they can't help me
with.

            I have reduced the problem to a very simple script that shows it
occurs on a call to "file_get_contents()" on a URL. It can likewise occur on
calls to "simplexml_load_file()". Note that the problem is not directly
caused by these function calls, because I can run scripts with those calls
over and over again until suddenly the error starts occurring without any
discernible cause. Even after the error does start occurring, it does not
occur when I call "file_get_contents()" on a file, only on a URL, but the
URL does not have to be valid, so the problem is not being caused by actual
data transfer in an open stream.

            See below for the code that yields the problem when the problem
is occurring, along with two example outputs from it, and with links to the
script running on the server so you can see the problem yourself, unless the
intermittent phase happens to be one where the error is not being generated
at the moment you try it! The second example shows that the problem occurs
without successfully opening a stream. It seems to me that this means the
problem is somewhere in the server my script is running on, not caused by
any connection to another server.

            An additional detail: the problem started happening after I
started using the Google Maps API to access data on mailing addresses via
calls of the form

simplexml_load_file("http://maps.google.com/maps/geo?output=xml&key="
    . GoogleKey . "&q=" . urlencode($sAnAddress));

I wonder if the problem is caused by something the calls to Google do to
affect the server over time, so that the error is not directly caused by any
single such call, but by the cumulative effect of hundreds of such calls in
some mysterious way.

            Does anyone have any idea what is going on, or how to test or
check things on the server to figure out what the problem is?

Stumped,

Marshall

==============================

<html>
 <head>
  <title>Test</title>
 </head>
 <body>
<?php

$sURL = $_GET['URL'];

echo('<p>Getting ' . $sURL . '<p>');
$sFileCont = file_get_contents('http://' . $sURL);
echo('<br>Dump:' . $sFileCont);

?>
</body>
</html>

==============================

Running this with various inputs, I get:

==============================
www.ennex.com/util/php/test.php?URL=www.Google.com

Getting www.Google.com

PHP has encountered an Access Violation at 0A0591E4
==============================
www.ennex.com/util/php/test.php?URL=www.BadURL.com

Getting www.BadURL.com

Warning: file_get_contents() [function.file-get-contents]:
php_network_getaddresses: getaddrinfo failed: No such host is known. In
D:\WWWRoot\ennex.com\www\util\php\test.php on line 11

PHP has encountered an Access Violation at 0A11B8D6
==============================

--- End Message ---
--- Begin Message ---
2009/10/18 Paul M Foster <[email protected]>

> On Sat, Oct 17, 2009 at 01:41:03AM -0400, Brian Hazelton wrote:
>
> > I am in charge of an email newsletter list and making sure it gets sent
> > out in time. My problem is I have never done broadcast emailing and
> > right now we have 400 subscribers but want to build a system that can
> > scale well regardless of the number of subscribers. Right now I use
> > mysql to store the email and use phpmailer in a loop to send an email to
> > each of the emails in the db. It is already slow with just 400 (takes
> > around 10 min; I think that's slow, isn't it?). Has anyone built a
> > broadcast email script and is willing to help me?
> >
>
> Use PHPList. It's free.
>
> Paul
>
>
I second that, and make sure you add an SPF record to your domain:
http://old.openspf.org/wizard.html
You should be able to get the SMTP limits from your host and these can be
configured in PHPlist.
The bounce handling takes away a lot of admin work once your list starts
getting larger and people's email addresses change/stop working.
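
If you do stay with a PHPMailer loop for now, reusing a single SMTP
connection for the whole run usually speeds it up considerably. A rough
sketch (PHPMailer 5.x method names; the host, sender, and the $subscribers
and $html variables are placeholders):

<?php

require_once 'class.phpmailer.php';   // path depends on your install

$mail = new PHPMailer();
$mail->IsSMTP();
$mail->SMTPKeepAlive = true;          // keep one SMTP connection open for the loop
$mail->Host = 'mail.example.com';
$mail->SetFrom('[email protected]', 'Newsletter');
$mail->Subject = 'Newsletter';
$mail->Body    = $html;

foreach ($subscribers as $address) {  // e.g. fetched from the MySQL table
    $mail->AddAddress($address);
    if (!$mail->Send()) {
        error_log('Send failed for ' . $address . ': ' . $mail->ErrorInfo);
    }
    $mail->ClearAddresses();          // start the next message with no recipients
}
$mail->SmtpClose();

?>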

Tom

--- End Message ---
