php-general Digest 8 Apr 2009 03:47:36 -0000 Issue 6055
Topics (messages 291179 through 291200):

Re: Best Practices for Hiding Errors
        291179 by: Michael A. Peters

file_get_contents for URLs?
        291180 by: Skip Evans
        291182 by: Jan G.B.
        291186 by: Richard Heyes

Re: difficult select problem
        291181 by: PJ
        291184 by: Lex Braun
        291185 by: Bob McConnell
        291188 by: PJ
        291189 by: Michael A. Peters
        291194 by: Bastien Koert
        291199 by: Chris

Re: Out of the blue question..
        291183 by: bruce
        291187 by: Daniel Brown

Re: PHP bandwidth control
        291190 by: tedd
        291192 by: Stuart
        291193 by: Michael A. Peters
        291195 by: Bastien Koert
        291196 by: 9el
        291198 by: Paul M Foster

Re: PHP class or functions to manipulate PDF metadata?
        291191 by: O. Lavell

Re: Possible Server Infection?
        291197 by: sono-io.fannullone.us

Re: PHP require_once() opens some files but not others in same library
        291200 by: Henning Glatter-Gotz

Administrivia:

To subscribe to the digest, e-mail:
        php-general-digest-subscr...@lists.php.net

To unsubscribe from the digest, e-mail:
        php-general-digest-unsubscr...@lists.php.net

To post to the list, e-mail:
        php-gene...@lists.php.net

--

---BeginMessage---
9el wrote:
> Instead of displaying errors to the page, it's better to use an error log file.

I believe by default they are sent to the server error log file regardless of your error_reporting setting. It wouldn't surprise me if there's a slick class out there for parsing the error log and extracting PHP errors to nicely display on a separate page.
---End Message---
---BeginMessage---
Hey all,

I'm doing some maintenance work on an existing system and there is a piece of code that uses file_get_contents() to read data from a URL, which is fine in theory I suppose. But the problem is sometimes the server where that URL lives is not available, and the system hangs indefinitely.

Shouldn't this be done with curl, and if so can it be done so that the call will time out and return control back when the server is not available? Any other recommendations?
I just came across this code and it's one of the client's biggest complaints.

--
Skip Evans
Big Sky Penguin, LLC
503 S Baldwin St, #1
Madison WI 53703
608.250.2720
http://bigskypenguin.com

Those of you who believe in telekinesis, raise my hand.
-- Kurt Vonnegut
---End Message---
---BeginMessage---
Well, you might want to do it with curl, or write your own socket script, or just check the return value of file_get_contents() - it'll be false on failure, and it won't try to fetch an invalid URL forever. I'd guess the error is somewhere else if your script hangs indefinitely. I'm using this function that way in a daily cronjob, and the remote server isn't so stable... trust me. ;)

But the timeout can be set in php.ini, or as suggested in the php.net manual comments:
http://www.php.net/manual/en/function.file-get-contents.php#82527

byebye

2009/4/7 Skip Evans s...@bigskypenguin.com:
> Hey all,
> I'm doing some maintenance work on an existing system and there is a piece of code that uses file_get_contents() to read data from a URL, which is fine in theory I suppose. But the problem is sometimes the server where that URL lives is not available, and the system hangs indefinitely.
> Shouldn't this be done with curl, and if so can it be done so that the call will time out and return control back when the server is not available? Any other recommendations?
> I just came across this code and it's one of the client's biggest complaints.
> --
> Skip Evans
> Big Sky Penguin, LLC
> 503 S Baldwin St, #1
> Madison WI 53703
> 608.250.2720
> http://bigskypenguin.com
> Those of you who believe in telekinesis, raise my hand.
> -- Kurt Vonnegut

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
---End Message---
---BeginMessage---
> Hey all,

Hello.

> I'm doing some maintenance work on an existing system and there is a piece of code that uses file_get_contents() to read data from a URL, which is fine in theory I suppose.
> But the problem is sometimes the server where that URL lives is not available, and the system hangs indefinitely. Shouldn't this be done with curl, and if so can it be done so that the call will time out and return control back when the server is not available?

Looking at the docs alone, it looks like you can pass a stream context as the third argument to file_get_contents(). So create a context, set the timeout on that (using stream_context_create() / stream_context_set_option()), and then pass it to
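In code, the stream-context approach described above might look like the following sketch; the URL and the 5-second timeout are placeholder values:

```php
<?php
// Sketch: file_get_contents() with a read timeout set through a stream
// context, so an unreachable server can't hang the script forever.
// The URL and the 5-second limit are placeholders - adjust to taste.
$context = stream_context_create(array(
    'http' => array(
        'timeout' => 5, // seconds to wait before giving up
    ),
));

$data = @file_get_contents('http://example.com/some/feed', false, $context);

if ($data === false) {
    // Server down or timed out: fail gracefully instead of hanging.
    error_log('Remote fetch failed');
} else {
    // ... process $data as before ...
}
```

The same context array also accepts other HTTP options (method, headers), so it can grow with the script's needs without switching to curl.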
Re: [PHP] difficult select problem
PJ wrote:
> I've searched the web, the tutorials, etc. with no luck, have asked on the MySQL list with no luck, and have even posted here with no replies. So, let's try again: I am trying to limit the search for books to only those that start with A (does not have to be case sensitive); but within that selection there may be a second author whose name may start with any letter of the alphabet. So, the real problem is to find if there is a second author in the selection of books and to display that name. My query shows all the books that begin with A:
>
> $SQL = "SELECT b.*, c.publisher, a.first_name, a.last_name
>         FROM book AS b
>         LEFT JOIN book_publisher AS bp ON b.id = bp.bookID
>         LEFT JOIN publishers AS c ON bp.publishers_id = c.id
>         LEFT JOIN book_author AS ba ON b.id = ba.bookID
>         LEFT JOIN author AS a ON ba.authID = a.id
>         WHERE LEFT(last_name, 1) = '$Auth'";
>
> Within the results there are some books that have 2 authors, and now I have to find if there is a second author in the results and then be able to echo the second author. So far, I have not been able to figure out how to go about that. Do I need to do another query to find the second author, or can it somehow be incorporated into the original query? Or can it be done with a UNION? Please help.

... and we begin ...

$SQL = "SELECT b.*, c.id AS publisher_id, c.publisher,
               a.id AS author_id, a.first_name, a.last_name
        FROM book AS b
        INNER JOIN book_publisher AS bp ON b.id = bp.bookID
        INNER JOIN publishers AS c ON bp.publishers_id = c.id
        INNER JOIN book_author AS ba ON b.id = ba.bookID
        INNER JOIN author AS a ON ba.authID = a.id
        WHERE b.id IN (
            SELECT b.id
            FROM book AS b
            INNER JOIN book_author AS ba ON b.id = ba.bookID
            INNER JOIN author AS a ON ba.authID = a.id
            WHERE a.last_name LIKE '{$Auth}%'
        )";

Ok, with those changes made, you will now see that your result set is going to have duplicate entries for each book. Well, almost duplicate...
Basically what you need to realize is that you are talking about row counts compounding upon themselves... I will try to explain: if you had ten books with a total of 16 unique authors and 8 unique publishers, you would end up with a result set (without any WHERE clause) of 10 x 16 x 8 = 1280 rows. Now, say you put a limit on the select of "author's last name must start with an 'A'" (like you have), that three of the authors match, that those three had written 4 books between them, and that those 4 books were published by 2 unique publishers. You would end up with 3 x 4 x 2 = 24 rows in the result set.

I would consider handling all the above as I have stated, but you will need to add some sorting of the data in PHP to make sense of it all. Something like the following should do. (Completely untested!!! Typed right in the client here...)

$book_information = array();
if ( ( $results = mysql_query($SQL, $db) ) !== false ) {
    while ( $row = mysql_fetch_assoc($results) ) {
        $book_information[$row['id']]['title'] = $row['title'];
        # NOTICE: this row is here so you can replace it with other
        # information from the book table
        $book_information[$row['id']]['...'] = $row['...'];
        $book_information[$row['id']]['authors'][$row['author_id']] =
            array('first_name' => $row['first_name'],
                  'last_name'  => $row['last_name']);
        $book_information[$row['id']]['publishers'][$row['publisher_id']] =
            $row['publisher'];
    }
}

Do a print_r($book_information) and you should get an understanding of what is happening.

Jim Lucas

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
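For illustration, here is a sketch of consuming the grouped $book_information structure described above; the sample book, author, and publisher data are invented:

```php
<?php
// Walk a $book_information structure of the shape built in the thread
// (title, authors keyed by author id, publishers keyed by publisher id)
// and render one line per book. Sample data below is invented.
function print_books(array $book_information)
{
    $out = '';
    foreach ($book_information as $book) {
        $authors = array();
        foreach ($book['authors'] as $author) {
            $authors[] = $author['first_name'] . ' ' . $author['last_name'];
        }
        $out .= $book['title']
              . ' by ' . implode(', ', $authors)
              . ' (' . implode('; ', $book['publishers']) . ")\n";
    }
    return $out;
}

// Invented example: one book, two authors, one publisher.
$books = array(
    42 => array(
        'title'      => 'An Example Title',
        'authors'    => array(
            7 => array('first_name' => 'Alice', 'last_name' => 'Adams'),
            9 => array('first_name' => 'Bob',   'last_name' => 'Brown'),
        ),
        'publishers' => array(3 => 'Acme Press'),
    ),
);

echo print_books($books);
// Prints: An Example Title by Alice Adams, Bob Brown (Acme Press)
```

Because authors are keyed by author id, a second author simply shows up as a second entry in the inner array, which answers PJ's original question about detecting and echoing it.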
[PHP] Re: PHP class or functions to manipulate PDF metadata?
O. Lavell wrote:
> Peter Ford wrote:
>> O. Lavell wrote:
>>> [..] Any and all suggestions are welcome. Thank you in advance.
>>
>> So many people ask about manipulating, editing and generally processing PDF files. In my experience, PDF is a write-once format - any manipulation should have been done in whatever source generated the PDF. I think of a PDF as being a piece of paper: if you want to change the content of a piece of paper it is usually best to chuck it away and start again... Even more so, this would apply to the PDF metadata: metadata is supposed to describe the nature of the document: its author, creation time, etc. That sort of data should be maintained with the document and ideally not changed throughout the document's lifetime (like the footer, or end-papers in a physical book).
>
> Thank you very much for your reply. And it's not that I don't agree with you. Because I do, completely. However... PDFs often come from sources that can't be bothered to fill in the relevant fields correctly, completely, or at all. For those cases I would like the users of my application to be able to correct the values found in the metadata. Upload the PDF, get a nice little HTML form with 4 or 5 values to review or edit. That sort of thing.

I do accept that the metadata should be machine-readable: that part of your project is reasonable and I'm fairly sure that ought to be possible with something simple.

>> The best bet I found so far is PDFTK (http://www.pdfhacks.com/pdftk/) which is a command-line tool that you could presumably call with exec or whatever...
>
> Like I said, this is what I am already doing with the pdfinfo utility from xpdf.

Sorry - I guess I didn't read that bit carefully enough...

> But now that you mentioned pdftk... I just tried it and it does seem to come close to what I want. It is capable of writing a new PDF with the contents of an existing one, with new metadata fed in as a text file. So it shouldn't be very hard to write a little PHP around that process.
> Now I need to think a bit more about this approach. Perhaps it can be implemented using only pure PHP, after all. But for the time being, pdftk will do. So thank you again for pushing me in that direction, even if unintentionally and despite the fact that what I am doing goes against your judgement ;)

As I know only too well, you can't always choose your customers (especially if they choose you...) and you certainly can't control all of the sources of data you have to deal with! I have spent many hours/days/possibly longer hacking through files that are in one form to get data into another, and PDF is the one that always makes me nervous :(

My judgement is certainly not final, or even particularly important: if I had time I would also look into at least getting the metadata with pure PHP.

Good luck...

--
Peter Ford              phone: 01580 89
Developer               fax:   01580 893399
Justcroft International Ltd., Staplehurst, Kent

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
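The pdftk round-trip discussed above might be wrapped roughly like this. This is a sketch, not a finished implementation: the input and output file names are placeholders, and it assumes pdftk is installed and on the PATH. The InfoBegin/InfoKey/InfoValue lines are the plain-text format pdftk's update_info operation reads:

```php
<?php
// Sketch: build the InfoKey/InfoValue text that pdftk's update_info
// operation expects, then shell out to rewrite the PDF's metadata.
// "in.pdf"/"out.pdf" are placeholders; pdftk is assumed to be on the PATH.
function build_pdftk_info(array $fields)
{
    $text = '';
    foreach ($fields as $key => $value) {
        $text .= "InfoBegin\n";
        $text .= "InfoKey: $key\n";
        $text .= "InfoValue: $value\n";
    }
    return $text;
}

// Values as they might come back from the little HTML edit form.
$info = build_pdftk_info(array(
    'Title'  => 'Corrected Title',
    'Author' => 'Jane Doe',
));

$meta = tempnam(sys_get_temp_dir(), 'meta');
file_put_contents($meta, $info);

// Guarded with @ so the sketch stays harmless where exec/pdftk are absent.
@exec('pdftk in.pdf update_info ' . escapeshellarg($meta) . ' output out.pdf');
```

The text-building step is pure string work, so it is easy to unit-test even where pdftk itself isn't available.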
Re: [PHP] PHP bandwidth control
2009/4/7 Chris dmag...@gmail.com:
>> I guess there are multiple ways to engage this problem. It depends how deep you want to log the traffic. If you just want to count the traffic of each image, video etc you could just wrap up each image and video to go through php first with file_get_contents() (look in the php manual, there are some examples of how to work with this), count how many bytes of data will be sent out and log this in a database or however you want to do this.
>
> While it's a good suggestion, don't use file_get_contents because it reads the whole file into memory. If you use it on a 200Meg movie, it uses 200Meg of memory. Use filesize() to work out the size. Then use fpassthru to shove the data through.
> http://www.php.net/fpassthru
>
> --
> Postgresql & php tutorials
> http://www.designmagick.com/

Thanks for the addition! I had that in mind but I didn't know the function fpassthru. That is of course better. So a little pseudo code:

if (used_bandwidth + filesize > allowed_bandwidth)
    error_message()
else
    write_in_database(used_bandwidth = used_bandwidth + filesize)
    fpassthru(file)

Nice one! Good luck playing around with this :)

--
Currently developing a browsergame...
http://www.p-game.de
Trade - Expand - Fight

Follow me on twitter!
http://twitter.com/moortier

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
Re: [PHP] PHP bandwidth control
2009/4/7 Chris dmag...@gmail.com:
>> I guess there are multiple ways to engage this problem. It depends how deep you want to log the traffic. If you just want to count the traffic of each image, video etc you could just wrap up each image and video to go through php first with file_get_contents() (look in the php manual, there are some examples of how to work with this), count how many bytes of data will be sent out and log this in a database or however you want to do this.
>
> While it's a good suggestion, don't use file_get_contents because it reads the whole file into memory. If you use it on a 200Meg movie, it uses 200Meg of memory. Use filesize() to work out the size. Then use fpassthru to shove the data through.
> http://www.php.net/fpassthru
>
> --
> Postgresql & php tutorials
> http://www.designmagick.com/

Just another small addition I just got from the php manual: you can use readfile() instead of fpassthru() so you don't have to use fopen(). Pseudo code updated:

if (used_bandwidth + filesize > allowed_bandwidth)
    error_message()
else
    write_in_database(used_bandwidth = used_bandwidth + filesize)
    readfile(file)

There's always something to learn in PHP Land.

--
Currently developing a browsergame...
http://www.p-game.de
Trade - Expand - Fight

Follow me on twitter!
http://twitter.com/moortier

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
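Spelled out as real PHP, the pseudo code above might look like the sketch below. The quota check is factored into a pure function, and the database write is left as a placeholder comment:

```php
<?php
// Concrete sketch of the quota-then-serve pseudo code from the thread.
// The per-user accounting (where $used/$allowed come from, and how the
// new total gets written back) is assumed to live elsewhere.
function within_quota($used_bandwidth, $filesize, $allowed_bandwidth)
{
    return ($used_bandwidth + $filesize) <= $allowed_bandwidth;
}

function serve_file($path, $used, $allowed)
{
    $size = filesize($path);
    if (!within_quota($used, $size, $allowed)) {
        header('HTTP/1.0 403 Forbidden');
        echo 'Bandwidth quota exceeded';
        return false;
    }
    // write_in_database(used + size) would go here
    header('Content-Length: ' . $size);
    readfile($path); // streams the file without loading it all into memory
    return true;
}
```

Keeping within_quota() free of side effects makes the accounting logic trivially testable, while serve_file() does the header and streaming work.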
Re: [PHP] Out of the blue question..
Didn't really read Bruce's email didya Chris?!! Of course I read it - I guess I misunderstood the intent. No need to bite my head off - sheesh :P -- Postgresql php tutorials http://www.designmagick.com/ -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php
Re: [PHP] PHP bandwidth control
2009/4/7 Phpster phps...@gmail.com:
> Misk.com is also good at $10/
>
> yep
>
> Bastien
>
> Sent from my iPod

Come on now, please. JD clearly said he wants to do this at home to learn something by doing it. I can understand that very well. Giving answers nobody asked for is like posing questions nobody wants to answer.

--
Currently developing a browsergame...
http://www.p-game.de
Trade - Expand - Fight

Follow me on twitter!
http://twitter.com/moortier

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
Re: [PHP] Out of the blue question..
2009/4/7 Chris dmag...@gmail.com: bruce wrote: Hi Ladies/Gents of the list... I've got an issue/question and figured I'd fire it to the list. Over time, I've had a few projects that I've worked on, where I've required someone with skills way beyond mine for a given area. And rather than spend hours trying to figure it out, I've sometimes hired someone to aid for a very short amount of time.. The issues have ranged from serious db optimization, to server security issues, etc... I was wondering, I'm assuming that other developers here have had similar issues. Has anyone thought about doing a group hire of someone with the advanced skills to solve the particular issue for the problem domain. IE if enough people have, or know you're going to have mysql questions... Then we find someone who's good/skilled and more or less have that person on retainer for the group. This of course depends on how many people would want to use the person's skills, and how often we'd need the person, and other issues... I figured that I'd post here to see what the list thinks/thought.. etc... You can post job requests for stuff like this on elance.com, rentacoder.com and I'm sure hundreds of other places. I don't think this list is the appropriate place for it really but that's just my opinion. Didn't really read Bruce's email didya Chris?!! @Bruce: In my experience most common problems that everyone has are documented somewhere on the internet, and for those problems that aren't common I think it's unlikely you'd get enough people willing to pay a retainer for something that's readily available on demand fairly inexpensively, but I've been wrong before. -Stuart -- http://stut.net/ -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php
Re: [PHP] PHP bandwidth control
Misk.com is also good at $10/ yep Bastien Sent from my iPod On Apr 6, 2009, at 23:57, Michael Kubler mdk...@gmail.com wrote: DO NOT USE GO-DADDY. Sorry, just had to say that Go-Daddy will cause all sorts of issues when your domain expires, or if you check for a domain but don't purchase it straight away. When you come back a little bit later you'll have to pay hundreds of dollars for the domain (as they registered it while you were gone), instead of the usual $20/yr type thing. Try Planetdomain (although their website is being re-designed at the moment), MelbourneIT (expensive), or even Google (surprisingly cheap). Actually if you search, there's a website that has a list of the different domain registrars and their costs that you could look at. As for quota control you can pipe everything through PHP which is more CPU intensive but will be more accurate in terms of which user was accessing the account. You could also parse the Apache log files (or whatever the web server is), which is more accurate but also slower. For bandwidth you can use something like the bandwidth mod for Apache which will allow you to prevent your webserver from completely saturating your Internet connection, allowing you to still surf the net or play games while people are accessing your site. Michael Kubler *G*rey *P*hoenix *P*roductions http://www.greyphoenix.biz JD wrote: Excellent, thanks both for the suggestions. I'd like to continue hosting it myself if for no other reason than I want to learn how to manage some of the hardware, software and operating systems that I otherwise don't get much exposure to. I'm treating this as a learning experience. I like the idea of the file_get_contents() as it sounds easier to implement, but, again, I'm using this as a learning experience so maybe I'll try and parse out the log files as you suggest. Again, many thanks! Dave -- Original Message -- From: Michael A. 
Peters mpet...@mac.com To: Yannick Mortier mvmort...@googlemail.com Cc: JD danceintherai...@netzero.com, php-general@lists.php.net Subject: Re: [PHP] PHP bandwidth control Date: Mon, 06 Apr 2009 06:03:12 -0700 Yannick Mortier wrote: 2009/4/6 JD danceintherai...@netzero.com: Hello, I am relatively new to PHP and am trying to make a video/image sharing site for my family to upload and share family videos and pictures. My concern is that because I'm hosting this site at my house, I will quickly exceed my bandwidth limitations each month if all the family members I think will use the site do actually end up using it. What I'd like to do is set up each family member with their own login and track how much bandwidth they use and cap it after a certain amount. The login stuff is easy and I have that figured out, but I haven't been able to figure out a good way to track the bandwidth used by each user that logs in. Is there a good way to do this with PHP? Thanks, Dave -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php I guess there are multiple ways to engage this problem. It depends how deep you want to log the traffic. If you just want to count the traffic of each image, video etc you could just wrap up each image and video to go through php first with file_get_contents() (look in the php manual there are some examples how to work with this), count how many bytes of data will be sent out and log this in a database or however you want to do this. If the bandwith limit is exceeded you don't deliver the image anymore and display an error message instead. If you want to catch all traffic you must parse the log files from you webserver.
To do this you could save the IP with which the login of the user was performed and connect all traffic that was done by that IP to the User. If the traffic limit is exceeded you display an error message. I guess for some family-internal sharing the first approach should be good enough. Just make sure you take some bandwith for the html pages into your calculations. My suggestion would be to do it on a real server and avoid any and all ISP restrictions, present and future. Don't register your domain with your host though, I found it to be a real PITA to switch hosts when you use them as your registrar, getting them to relinquish control of the domain can be a PITA. Instead register with someone like godaddy that lets you specify the nameservers and host elsewhere. Then if you feel like you need to move it to a different host, your current host can't be jerks about it.
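The log-parsing approach mentioned in this thread (connect bytes served to the client IP that a user logged in from) can be sketched like this. The regex assumes Apache's common log format, and the sample lines are invented:

```php
<?php
// Sketch: tally bytes sent per client IP from Apache common-log-format
// lines (ip ident user [date] "request" status bytes). Joining IPs back
// to logged-in users is assumed to happen elsewhere.
function bytes_per_ip(array $log_lines)
{
    $totals = array();
    foreach ($log_lines as $line) {
        if (preg_match('/^(\S+) \S+ \S+ \[[^\]]+\] "[^"]*" \d{3} (\d+)/', $line, $m)) {
            $ip = $m[1];
            $totals[$ip] = (isset($totals[$ip]) ? $totals[$ip] : 0) + (int) $m[2];
        }
    }
    return $totals;
}

// Invented sample lines.
$sample = array(
    '192.0.2.1 - dave [07/Apr/2009:10:00:00 +0000] "GET /video.avi HTTP/1.1" 200 1048576',
    '192.0.2.1 - dave [07/Apr/2009:10:05:00 +0000] "GET /pic.jpg HTTP/1.1" 200 4096',
    '192.0.2.9 - - [07/Apr/2009:10:06:00 +0000] "GET / HTTP/1.1" 200 512',
);

print_r(bytes_per_ip($sample));
// 192.0.2.1 => 1052672, 192.0.2.9 => 512
```

In a real setup this would read the access log with file() or fgets() on a cron schedule rather than taking an in-memory array.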
Re: [PHP] Best Practices for Hiding Errors
It's just an observation ;) Whether you have to use it or not, you have to decide the better way.

Regards,
Igor Escobar
Systems Analyst & Interface Designer

--
Personal Blog ~ blog.igorescobar.com
Online Portfolio ~ www.igorescobar.com
Twitter ~ @igorescobar

On Mon, Apr 6, 2009 at 9:56 PM, Chris dmag...@gmail.com wrote:
> Igor Escobar wrote:
>> Be careful, error suppression is slow.
>
> If it's the only way to stop an error from showing up, what's the problem? PHP will still generate the warning/notice even if display_errors is disabled - which will be even slower. Plus I never said use it everywhere, I said use it in particular cases and comment your code about why you had to use it.
>
> --
> Postgresql & php tutorials
> http://www.designmagick.com/
[PHP] PHP-MYSQL Question
Hi guys,

Please can anyone tell me what I'm doing wrong with the code below? It keeps returning "Unsuccessful".

$result = mysql_query("CREATE TABLE table2(
              table2_id INT NOT NULL PRIMARY KEY AUTO_INCREMENT,
              table1_id INT NOT NULL,
              name VARCHAR(100) NOT NULL,
              school VARCHAR(100) NOT NULL,
              comment TEXT NOT NULL,
              entrydate TIMESTAMP NOT NULL,
              FOREIGN KEY(table1_id) REFERENCES table1(table1_id)
          ) ENGINE = INNODB");
if ($result) {
    print "Successful";
} else {
    print "Unsuccessful";
}

Thanks in advance.
Cheers.
Alugo Abdulazeez.
Re: [PHP] PHP-MYSQL Question
This isn't a PHP question but a MySQL question. You didn't mention whether the table itself is created or not. If not, then it is probably a MySQL error; maybe your installation of MySQL doesn't support INNODB.

SanTa

----- Original Message -----
From: abdulazeez alugo defati...@hotmail.com
To: php-general@lists.php.net
Sent: Tuesday, April 07, 2009 3:05 PM
Subject: [PHP] PHP-MYSQL Question

> Hi guys,
> Please can anyone tell me what I'm doing wrong with the code below? It keeps returning "Unsuccessful".
>
> $result = mysql_query("CREATE TABLE table2(table2_id INT NOT NULL PRIMARY KEY AUTO_INCREMENT, table1_id INT NOT NULL, name VARCHAR(100) NOT NULL, school VARCHAR(100) NOT NULL, comment TEXT NOT NULL, entrydate TIMESTAMP NOT NULL, FOREIGN KEY(table1_id) REFERENCES table1(table1_id)) ENGINE = INNODB");
> if ($result) { print "Successful"; } else { print "Unsuccessful"; }
>
> Thanks in advance.
> Cheers.
> Alugo Abdulazeez.

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
Re: [PHP] PHP-MYSQL Question
abdulazeez alugo wrote:
> Hi guys,
> Please can anyone tell me what I'm doing wrong with the code below? It keeps returning "Unsuccessful".

Why don't you print out mysql_error()? It'll tell you right away.

/Per

--
Per Jessen, Zürich (20.1°C)

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
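Putting Per's suggestion together with the original code might look like the sketch below. It uses the old mysql_* API from the thread and assumes a connection is already open (the call is guarded so the snippet is harmless where ext/mysql is absent):

```php
<?php
// Sketch of Per's advice: surface mysql_error() instead of a bare
// "Unsuccessful". Assumes an open mysql_* connection; guarded because
// the legacy mysql extension may not be present.
$sql = "CREATE TABLE table2 (
            table2_id INT NOT NULL PRIMARY KEY AUTO_INCREMENT,
            table1_id INT NOT NULL,
            name      VARCHAR(100) NOT NULL,
            school    VARCHAR(100) NOT NULL,
            comment   TEXT NOT NULL,
            entrydate TIMESTAMP NOT NULL,
            FOREIGN KEY (table1_id) REFERENCES table1 (table1_id)
        ) ENGINE = INNODB";

if (function_exists('mysql_query')) {
    $result = mysql_query($sql);
    if ($result) {
        print "Successful";
    } else {
        // The actual reason, e.g. a missing table1 or no InnoDB support.
        print "Unsuccessful: " . mysql_error();
    }
}
```

Printing the error text immediately narrows the problem to a concrete MySQL message instead of a guessing game on the list.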
RE: [PHP] PHP-MYSQL Question
> From: p...@computer.org
> Date: Tue, 7 Apr 2009 15:18:35 +0200
> To: php-general@lists.php.net
> Subject: Re: [PHP] PHP-MYSQL Question
>
> abdulazeez alugo wrote:
>> Hi guys,
>> Please can anyone tell me what I'm doing wrong with the code below? It keeps returning "Unsuccessful".
>
> Why don't you print out mysql_error()? It'll tell you right away.
>
> /Per

Thanks Per, I should have thought of that. Now I owe you a beer.

Cheers.
Re: [PHP] PHP bandwidth control
Sorry to side track the issue, but when did this happen to you on GoDaddy? I have never experienced this problem. I have been using them for two years and I often leave domains in the checkout and come back sometimes days later and they're still $7.95. 2009/4/7 Michael Kubler mdk...@gmail.com DO NOT USE GO-DADDY. Sorry, just had to say that Go-Daddy will cause all sorts of issues when your domain expires, or if you check for a domain but don't purchase it straight away. When you come back a little bit later you'll have to pay hundreds of dollars for the domain (as they registered it while you were gone), instead of the usual $20/yr type thing. Try Planetdomain (although their website is being re-designed at the moment), MelbourneIT (expensive), or even Google (surprisingly cheap). Actually if you search, there's a website that has a list of the different domain registrars and their costs that you could look at. As for quota control you can pipe everything through PHP which is more CPU intensive but will be more accurate in terms of which user was accessing the account. You could also parse the Apache log files (or whatever the web server is), which is more accurate but also slower. For bandwidth you can use something like the bandwidth mod for Apache which will allow you to prevent your webserver from completely saturating your Internet connection, allowing you to still surf the net or play games while people are accessing your site. Michael Kubler *G*rey *P*hoenix *P*roductions http://www.greyphoenix.biz JD wrote: Excellent, thanks both for the suggestions. I'd like to continue hosting it myself if for no other reason than I want to learn how to manage some of the hardware, software and operating systems that I otherwise don't get much exposure to. I'm treating this as a learning experience. 
I like the idea of the file_get_contents() as it sounds easier to implement, but, again, I'm using this as a learning experience so maybe I'll try and parse out the log files as you suggest. Again, many thanks! Dave -- Original Message -- From: Michael A. Peters mpet...@mac.com To: Yannick Mortier mvmort...@googlemail.com Cc: JD danceintherai...@netzero.com, php-general@lists.php.net Subject: Re: [PHP] PHP bandwidth control Date: Mon, 06 Apr 2009 06:03:12 -0700 Yannick Mortier wrote: 2009/4/6 JD danceintherai...@netzero.com: Hello, I am relatively new to PHP and am trying to make a video/image sharing site for my family to upload and share family videos and pictures. My concern is that because I'm hosting this site at my house, I will quickly exceed my bandwidth limitations each month if all the family members I think will use the site do actually end up using it. What I'd like to do is set up each family member with their own login and track how much bandwidth they use and cap it after a certain amount. The login stuff is easy and I have that figured out, but I haven't been able to figure out a good way to track the bandwidth used by each user that logs in. Is there a good way to do this with PHP? Thanks, Dave -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php I guess there are multiple ways to engage this problem. It depends how deep you want to log the traffic. If you just want to count the traffic of each image, video etc you could just wrap up each image and video to go through php first with file_get_contents() (look in the php manual there are some examples how to work with this), count how many bytes of data will be sent out and log this in a database or however you want to do this.
If the bandwith limit is exceeded you don't deliver the image anymore and display an error message instead. If you want to catch all traffic you must parse the log files from you webserver. To do this you could save the IP with which the login of the user was performed and connect all traffic that was done by that IP to the User. If the traffic limit is exceeded you display an error message. I guess for some family-internal sharing the first approach should be good enough. Just make sure you take some bandwith for the html pages into your calculations. My suggestion would be to do it on a real server and avoid any and all ISP restrictions, present and future. Don't register your domain with your host though, I found it to be a real PITA to switch hosts when you use them as your registrar, getting them to relinquish control of the domain can be a PITA. Instead register with someone like godaddy that lets you specify the nameservers and host elsewhere. Then if you feel like you need to move it
Re: [PHP] Best Practices for Hiding Errors
2009/4/7 Chris dmag...@gmail.com:
> Igor Escobar wrote:
>> Be careful, error suppression is slow.
>
> If it's the only way to stop an error from showing up, what's the problem? PHP will still generate the warning/notice even if display_errors is disabled - which will be even slower. Plus I never said use it everywhere, I said use it in particular cases and comment your code about why you had to use it.
>
> --
> Postgresql & php tutorials
> http://www.designmagick.com/
>
> --
> PHP General Mailing List (http://www.php.net/)
> To unsubscribe, visit: http://www.php.net/unsub.php

The @ operator on a function call does just the same thing. Basically it's a shortcut that sets error_reporting to 0 before the function call and switches it back afterwards. This is why it is slow. Some better practices would be:

Just turn display_errors off in your php.ini on the production system, if you can, or use ini_set('display_errors', false); in a central include file that gets included into every file.

If you want to use the same script on development and production systems, you can add a SetEnv directive inside your Apache config. Something like

SetEnv SERVER_ROLE development

should do. Then you can do the following:

if ($_SERVER['SERVER_ROLE'] == 'development') {
    error_reporting(E_ALL);
    ini_set('display_errors', true);
} else {
    error_reporting(0);
    ini_set('display_errors', false);
}

So your code will work on both the production and the development server. Feel free to ask if you need further help.

--
Currently developing a browsergame...
http://www.p-game.de
Trade - Expand - Fight

Follow me on twitter!
http://twitter.com/moortier

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
Re: [PHP] Best Practices for Hiding Errors
Instead of displaying errors on the page, it's better to use an error log file. And as @ error suppressors are expensive, it's always better if you can avoid using them.

I'd also suggest the ZCE (Zend Certified Engineer) exam guide for best practices on this matter.

Regards
Lenin

www.twitter.com/nine_L
Re: [PHP] PHP bandwidth control
Awesome, you guys rock! Now I have a bunch of stuff to play around with, and the more I read about these functions and try playing around with them, the more I think I'll learn about this stuff. This is great! Thanks again!

-- Original Message --
From: Yannick Mortier mvmort...@googlemail.com
To: Chris dmag...@gmail.com
Cc: JD danceintherai...@netzero.com, php-general@lists.php.net
Subject: Re: [PHP] PHP bandwidth control
Date: Tue, 7 Apr 2009 10:42:01 +0200

> 2009/4/7 Chris dmag...@gmail.com:
>>> I guess there are multiple ways to engage this problem. It depends how deep you want to log the traffic. If you just want to count the traffic of each image, video etc you could just wrap up each image and video to go through php first with file_get_contents() (look in the php manual, there are some examples of how to work with this), count how many bytes of data will be sent out and log this in a database or however you want to do this.
>>
>> While it's a good suggestion, don't use file_get_contents because it reads the whole file into memory. If you use it on a 200Meg movie, it uses 200Meg of memory. Use filesize() to work out the size. Then use fpassthru to shove the data through.
>> http://www.php.net/fpassthru
>>
>> --
>> Postgresql & php tutorials
>> http://www.designmagick.com/
>
> Just another small addition I just got from the php manual: you can use readfile() instead of fpassthru() so you don't have to use fopen(). Pseudo code updated:
>
> if (used_bandwidth + filesize > allowed_bandwidth)
>     error_message()
> else
>     write_in_database(used_bandwidth = used_bandwidth + filesize)
>     readfile(file)
>
> There's always something to learn in PHP Land.
>
> --
> Currently developing a browsergame...
> http://www.p-game.de
> Trade - Expand - Fight
>
> Follow me on twitter!
> http://twitter.com/moortier
http://thirdpartyoffers.netzero.net/TGL2231/fc/BLSrjnxSDIEW3SA42CWj9n1jKPvjPfebqCKiqTbwOunnXNFT7dskjTO8Tmg/ -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php
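The pseudo code in that message can be fleshed out roughly as follows. This is a sketch of the filesize()/readfile() idea from the thread; get_used_bandwidth() and record_bandwidth() are hypothetical stand-ins for whatever database layer you use:

```php
<?php
// Sketch of the bandwidth-accounting approach: check the quota first,
// record the transfer, then stream the file without loading it into memory.
function serve_file($path, $allowed_bandwidth)
{
    $size = filesize($path);
    if (get_used_bandwidth() + $size > $allowed_bandwidth) {
        // 509 is a common (non-standard) status for this case
        header('HTTP/1.1 509 Bandwidth Limit Exceeded');
        exit('Bandwidth limit exceeded');
    }
    record_bandwidth($size); // used_bandwidth = used_bandwidth + filesize
    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . $size);
    readfile($path);         // streams the file; no fopen() needed
}
```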
Re: [PHP] PHP bandwidth control
Nick Cooper wrote: Sorry to side track the issue, but when did this happen to you on GoDaddy? I have never experienced this problem. I have been using them for two years and I often leave domains in the checkout and come back sometimes days later and they're still $7.95. Same experience. I suspect what happened is a squatter also thought up the name and registered it - GoDaddy will sell domains for squatters. -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php
Re: [PHP] PHP bandwidth control
JD wrote: Theres always something to learn in PHP Land. Yeah - and I always seem to find slick new one or two line solutions after I've written a bunch of lines to clumsily do the same thing. I guess that's how it is when you first start to get semi-serious about a language (emphasis on the semi) ;) -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php
Re: [PHP] Best Practices for Hiding Errors
9el wrote: Instead of displaying errors to the page, it's better to use an error log file. I believe by default they are sent to the server error log file regardless of your error report setting. It wouldn't surprise me if there's a slick class out there for parsing the error log and extracting php errors to nicely display on a separate page. -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php
[PHP] file_get_contents for URLs?
Hey all, I'm doing some maintenance work on an existing system and there is a piece of code that uses file_get_contents() to read data from a URL, which is fine in theory I suppose. But the problem is sometimes the server where that URL lives is not available, and the system hangs indefinitely. Shouldn't this be done with curl, and if so can it be done so that the call will time out and return control back when the server is not available? Any other recommendations? I just came across this code and it's one of the client's biggest complaints. -- Skip Evans Big Sky Penguin, LLC 503 S Baldwin St, #1 Madison WI 53703 608.250.2720 http://bigskypenguin.com Those of you who believe in telekinesis, raise my hand. -- Kurt Vonnegut -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php
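The curl route the poster asks about would look roughly like this - a sketch assuming the curl extension is available, with hard timeouts so an unreachable server returns false instead of hanging:

```php
<?php
// Fetch a URL with curl and explicit timeouts; returns the body as a
// string, or false on failure (including connect/read timeouts).
function fetch_url($url, $timeout = 10)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return body, don't echo it
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);    // give up connecting after 5s
    curl_setopt($ch, CURLOPT_TIMEOUT, $timeout);    // total time limit for the call
    $body = curl_exec($ch);                         // false when the server is down
    curl_close($ch);
    return $body;
}
```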
Re: [PHP] difficult select problem
Gentlemen, First, let me thank you all for responding and offering suggestions. I appreciate it and I am learning things. However, it looks like my message is not getting across: The problem is not to retrieve only the authors whose last names begin with A: 1) which books have a second author? 2) who is the second author? This is determined by table book_author column ordinal (which can be 1 or 2) - if there is only 1 author for a book then there is no ordinal 2 linked to book_author bookID and authID. The structure of the db: book id title sub_title descr comment bk_cover copyright ISBN language sellers author id first_name last_name book_author authID bookID ordinal categories and publishers are not really relevant here... The code I have: $SQL = "SELECT b.*, c.publisher, a.first_name, a.last_name FROM book AS b LEFT JOIN book_publisher as bp ON b.id = bp.bookID LEFT JOIN publishers AS c ON bp.publishers_id = c.id LEFT JOIN book_author AS ba ON b.id = ba.bookID LEFT JOIN author AS a ON ba.authID = a.id WHERE LEFT(last_name, 1) = '$Auth'"; (PLEASE LET ME KNOW IF THERE IS SOMETHING WRONG WITH THE CODE) It gives me these results: *array* 6 => *array* 'id' => string '6' /(length=1)/ 'title' => string 'Nubia.' /(length=6)/ 'sub_title' => string 'Corridor to Africa' /(length=18)/ 'descr' => string '' /(length=0)/ 'comment' => string '' /(length=0)/ 'bk_cover' => string '' /(length=0)/ 'copyright' => string '1977' /(length=4)/ 'ISBN' => string '0691093709' /(length=10)/ 'language' => string 'en' /(length=2)/ 'sellers' => string '' /(length=0)/ 'publisher' => string 'Princeton University Press' /(length=26)/ 'first_name' => string 'William Yewdale' /(length=15)/ 'last_name' => string 'Adams' /(length=5)/ This is the first of 17 books and it would be sufficient if I were not so damned demanding. I know that there are several books in the list that have 2 authors.
At least 1 whose last name begins with A is an ordinal=2 (which is not important when it comes to displaying) and there are a couple where there are 2nd authors whose names begin with other letters of the alphabet. It is these that I am targeting. My question is: How do I filter my query to weed out these renegades? Can it be done using a concat as Author1 and another concat as Author2 (in place of the first_name and last_name joins)? Or do I do another query just for the outlaws? -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php
Re: [PHP] file_get_contents for URLs?
Well, you might want to do it with curl, you might want to write your own socket script, or you can just check the return value of file_get_contents() - it'll be false on failure and it won't try to get an invalid URL forever. I guess the error is somewhere else if your script continues indefinitely. I'm using this function in that way with a daily cronjob, and the remote server isn't so stable... trust me. ;) But setting the timeout can be done in php.ini or as suggested in the php.net manual: http://www.php.net/manual/en/function.file-get-contents.php#82527 byebye 2009/4/7 Skip Evans s...@bigskypenguin.com: Hey all, I'm doing some maintenance work on an existing system and there is a piece of code that uses file_get_contents() to read data from a URL, which is fine in theory I suppose. But the problem is sometimes the server where that URL lives is not available, and the system hangs indefinitely. Shouldn't this be done with curl, and if so can it be done so that the call will time out and return control back when the server is not available? Any other recommendations? I just came across this code and it's one of the client's biggest complaints. -- Skip Evans Big Sky Penguin, LLC 503 S Baldwin St, #1 Madison WI 53703 608.250.2720 http://bigskypenguin.com Those of you who believe in telekinesis, raise my hand. -- Kurt Vonnegut -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php
RE: [PHP] Out of the blue question..
chris... did you read the entire msg.. this isn't/wasn't a hunt for someone for a project... please re-read.. -Original Message- From: Chris [mailto:dmag...@gmail.com] Sent: Monday, April 06, 2009 8:14 PM To: bruce Cc: php-general@lists.php.net Subject: Re: [PHP] Out of the blue question.. bruce wrote: Hi Ladies/Gents of the list... I've got an issue/question and figured I'd fire it to the list. Over time, I've had a few projects that I've worked on, where I've required someone with skills way beyond mine for a given area. And rather than spend hours trying to figure it out, I've sometimes hired someone to aid for a very short amount of time.. The issues have ranged from serious db optimization, to server security issues, etc... I was wondering, I'm assuming that other developers here have had similar issues. Has anyone thought about doing a group hire of someone with the advanced skills to solve the particular issue for the problem domain. IE if enough people have, or know you're going to have mysql questions... Then we find someone who's good/skilled and more or less have that person on retainer for the group. This of course depends on how many people would want to use the person's skills, and how often we'd need the person, and other issues... I figured that I'd post here to see what the list thinks/thought.. etc... You can post job requests for stuff like this on elance.com, rentacoder.com and I'm sure hundreds of other places. I don't think this list is the appropriate place for it really but that's just my opinion. -- Postgresql php tutorials http://www.designmagick.com/ -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php
Re: [PHP] difficult select problem
PJ, On Tue, Apr 7, 2009 at 11:37 AM, PJ af.gour...@videotron.ca wrote: $SQL = SELECT b.*, c.publisher, a.first_name, a.last_name FROM book AS b LEFT JOIN book_publisher as bp ON b.id = bp.bookID LEFT JOIN publishers AS c ON bp.publishers_id = c.id LEFT JOIN book_author AS ba ON b.id = ba.bookID LEFT JOIN author AS a ON ba.authID = a.id WHERE LEFT(last_name, 1 ) = '$Auth' ; (PLEASE LET ME KNOW IF THERE IS SOMETHING WRONG WITH THE CODE) Let me try to clarify what I'm saying about your query. The above query will ONLY return authors who match the WHERE condition, thus have last name starting with A. This query will never find the second author for those books, unless that author's last name also starts with an A. That's why you first need to get a list of the book IDs that match your WHERE condition, and then grab the authors related to those book IDs (whether through two queries or using a sub-query). You've been shown several different ways of doing this through the responses provided. Do the provided queries not work for your test data? - Lex
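The two-query approach described here can be sketched as follows. This assumes a mysqli connection in $db and the $Auth letter from the thread; step 1 collects the matching book IDs, step 2 fetches *all* authors for those books in ordinal order:

```php
<?php
// Step 1: IDs of books that have at least one author whose last name
// starts with the requested letter.
$letter = $db->real_escape_string($Auth);
$ids = array();
$res = $db->query(
    "SELECT DISTINCT ba.bookID
       FROM book_author ba
       JOIN author a ON a.id = ba.authID
      WHERE a.last_name LIKE '$letter%'");
while ($row = $res->fetch_assoc()) {
    $ids[] = (int) $row['bookID'];
}

// Step 2: every author of those books, first and second, via ordinal.
if ($ids) {
    $idList = implode(',', $ids);
    $authors = $db->query(
        "SELECT ba.bookID, a.first_name, a.last_name, ba.ordinal
           FROM book_author ba
           JOIN author a ON a.id = ba.authID
          WHERE ba.bookID IN ($idList)
       ORDER BY ba.bookID, ba.ordinal");
    // ...merge with the book/publisher rows as before
}
```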
RE: [PHP] difficult select problem
From: PJ First, let me thank you all for responding and offering suggestions. I appreciate it and I am learning things. However, it looks like my message is not getting across: The problem is not to retrieve only the authors whose last names begin with A: Actually, it appears you simply don't like the accurate answers you have been given. 1) which books have a second author? 2) who is the second author ? This is determined by table book_author column ordinal (which can be 1 or 2) - if there is only 1 author for a book then there is no ordinal 2 linked to book_author bookID and authID. There is no way to do that in a single select. You need to have at least two and possibly three queries to answer your question. First you get a list of authors where their name begins with 'A'. Then you use that result to select a list of all books with more than one author. Then you can use that result to select all authors for them. Everyone has told you this requires processing beyond what SQL can provide. Why is that a problem? Bob McConnell -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php
Re: [PHP] file_get_contents for URLs?
Hey all, Hello. I'm doing some maintenance work on an existing system and there is a piece of code that uses file_get_contents() to read data from a URL, which is fine in theory I suppose. But the problem is sometimes the server where that URL lives is not available, and the system hangs indefinitely. Shouldn't this be done with curl, and if so can it be done so that the call will time out and return control back when the server is not available? Looking at the docs alone, it looks like you can pass a stream as the third argument to file_get_contents(). So create a stream, set the timeout on that (using stream_context_create() stream_context_set_option() ), and then pass it to file_get_contents(). -- Richard Heyes HTML5 Canvas graphing for Firefox, Chrome, Opera and Safari: http://www.rgraph.net (Updated March 28th) -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php
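The context approach suggested above looks roughly like this (the third argument to file_get_contents() is a stream context, and the http 'timeout' option is in seconds; the URL is just an example):

```php
<?php
// Create a context with a read timeout and pass it to file_get_contents(),
// so a dead remote server makes the call return false instead of hanging.
$context = stream_context_create(array(
    'http' => array(
        'timeout' => 10, // seconds before the read gives up
    ),
));

$data = @file_get_contents('http://example.com/feed.xml', false, $context);
if ($data === false) {
    // remote server down or timed out -- handle it gracefully here
}
```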
Re: [PHP] Out of the blue question..
On Tue, Apr 7, 2009 at 11:56, bruce bedoug...@earthlink.net wrote: chris... did you read the entire msg.. this isn't/wasn't a hunt for someone for a project... please re-read.. It would've been fine even if it was, Bruce. This is the General list. As long as it's PHP-related, you're fine. However, I'd concur with Stut's note, and add that it's probably not even practical to do a group retainer --- this is, without undue bias, one of the best, most diverse, most professional core group of folks in any developer-focused community on the web today. It's all a matter of asking the right questions and taking advantage of the situation - so long as you're willing to *give back* as well. -Original Message- From: Chris [mailto:dmag...@gmail.com] Sent: Monday, April 06, 2009 8:14 PM To: bruce Cc: php-general@lists.php.net Subject: Re: [PHP] Out of the blue question.. bruce wrote: Hi Ladies/Gents of the list... I've got an issue/question and figured I'd fire it to the list. Over time, I've had a few projects that I've worked on, where I've required someone with skills way beyond mine for a given area. And rather than spend hours trying to figure it out, I've sometimes hired someone to aid for a very short amount of time.. The issues have ranged from serious db optimization, to server security issues, etc... I was wondering, I'm assuming that other developers here have had similar issues. Has anyone thought about doing a group hire of someone with the advanced skills to solve the particular issue for the problem domain. IE if enough people have, or know you're going to have mysql questions... Then we find someone who's good/skilled and more or less have that person on retainer for the group. This of course depends on how many people would want to use the person's skills, and how often we'd need the person, and other issues... I figured that I'd post here to see what the list thinks/thought.. etc... 
You can post job requests for stuff like this on elance.com, rentacoder.com and I'm sure hundreds of other places. I don't think this list is the appropriate place for it really but that's just my opinion. -- Postgresql php tutorials http://www.designmagick.com/ -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php -- /Daniel P. Brown daniel.br...@parasane.net || danbr...@php.net http://www.parasane.net/ || http://www.pilotpig.net/ 50% Off All Shared Hosting Plans at PilotPig: Use Coupon DOW1 -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php
Re: [PHP] difficult select problem
Bob McConnell wrote: From: PJ First, let me thank you all for responding and offering suggestions. I appreciate it and I am learning things. However, it looks like my message is not getting across: The problem is not to retrieve only the authors whose last names begin with A: Actually, it appears you simply don't like the accurate answers you have been given. First, let me say that I am pretty fresh to all this. Second, I believe I can get the authors with several more queries as I have done for listings of all the books and by categories, but I am trying to limit the queries thinking that that will speed up the data retrieval. Third, I have tried the answers proposed and have not been able to make them work up to now. Will continue... (it's a bit difficult at the moment as I do not have my entire head about me - nasty cold) Fourth, I don't see any reference to the book_author.ordinal which seems important to me to determine the placement of the authors. 1) which books have a second author? 2) who is the second author ? This is determined by table book_author column ordinal (which can be 1 or 2) - if there is only 1 author for a book then there is no ordinal 2 linked to book_author bookID and authID. There is no way to do that in a single select. You need to have at least two and possibly three queries to answer your question. First you get a list of authors where their name begins with 'A'. Then you use that result to select a list of all books with more than one author. Then you can use that result to select all authors for them. I am a bit confused as to how to use the result in further queries. 
And I don't understand what the difference is in using: $SQL = "SELECT b.id FROM book AS b LEFT JOIN book_author AS ba ON b.id = ba.bookID LEFT JOIN author AS a ON ba.authID = a.id WHERE LEFT(last_name, 1) = '$Auth'"; instead of: $SQL = "SELECT b.id FROM book b, book_author ba, author a WHERE b.id = ba.bookID AND a.id = ba.authID AND LEFT(a.last_name, 1) = '$Auth'"; It seems simpler without the JOINs; but is there an advantage, or is that merely a personal choice among several possibilities? Everyone has told you this requires processing beyond what SQL can provide. Why is that a problem? It's not a problem. I'm just trying to learn what can and cannot be done. For instance, I could just put new columns in the db for author1 and author2. But I suspect it might not be as efficient with a great many books (like thousands). Bob McConnell -- unheralded genius: A clean desk is the sign of a dull mind. - Phil Jourdan --- p...@ptahhotep.com http://www.ptahhotep.com http://www.chiccantine.com/andypantry.php -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php
Re: [PHP] difficult select problem
PJ wrote: Bob McConnell wrote: From: PJ First, let me thank you all for responding and offering suggestions. I appreciate it and I am learning things. However, it looks like my message is not getting across: The problem is not to retrieve only the authors whose last names begin with A: Actually, it appears you simply don't like the accurate answers you have been given. First, let me say that I am pretty fresh to all this. Second, I believe I can get the authors with several more queries as I have done for listings of all the books and by categories, but I am trying to limit the queries thinking that that will speed up the data retrieval. A friend of mine who manages many large scale websites with massive databases says that isn't always the case, especially if you don't have a dedicated SQL server with very fast disks and lots of memory. He's found that in many situations it is faster to do several sql queries and let php sort it out than to use a bunch of joins, subselects, etc. in order to reduce the number of sql queries. Has to do with how sql works on the filesystem, and the IO that can result from sql needing to do a more complex query, and what is fastest varies upon your setup. I think he said sub-selects are the worst because sql has to create a virtual table for the subselect and that can really slow the query down, but I might be mistaken about that. Thus unless he has a problem application that is way too slow on hardware he can't upgrade, he opts for what is easier code to read and maintain. Sometimes that's faster than hard to read queries anyway. -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php
Re: [PHP] PHP bandwidth control
At 2:46 PM +0100 4/7/09, Nick Cooper wrote: Sorry to side track the issue, but when did this happen to you on GoDaddy? I have never experienced this problem. I have been using them for two years and I often leave domains in the checkout and come back sometimes days later and they're still $7.95. This is of interest to me as well. My old registrar iyd.com was sold to hover.com and the new guys have some serious problems. I've been thinking about combining my ~70 domain names into a single registrar and GoDaddy looks good thus far. So, I would like to know what problems people have-had/are-having with GoDaddy. I already know what problems I'm having with my current registrar -- which has been a giant step backwards in services at an increased cost. How can people screw up something they bought? Beats me. Cheers, tedd -- --- http://sperling.com http://ancientstones.com http://earthstones.com -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php
[PHP] Re: PHP class or functions to manipulate PDF metadata?
Peter Ford wrote: O. Lavell wrote: Peter Ford wrote: [..] I do accept that the metadata should be machine-readable: that part of your project is reasonable and I'm fairly sure that ought to be possible with something simple. The best bet I found so far is PDFTK (http://www.pdfhacks.com/pdftk/) which is a command-line tool that you could presumably call with exec or whatever... Like I said, this is what I am already doing with the pdfinfo utility from xpdf. Sorry - I guess I didn't read that bit carefully enough... No problem at all, I was really glad someone wanted to share their thoughts anyway after it first seemed that no one was interested. [..] So thank you again for pushing me in that direction, even if unintentionally and despite the fact that what I am doing goes against your judgement ;) As I know only too well, you can't always choose your customers (especially if they choose you...) and you certainly can't control all of the sources of data you have to deal with! Exactly. I have spent many hours/days/possibly longer hacking through files that are in one form to get data into another, and PDF is the one that always makes me nervous :( So far you, Tedd and I agree on this. The so-called portable document format is a rather convoluted thing. My judgement is certainly not final, or even particularly important: if I had time I would also look into at least getting the metadata with pure PHP. Good luck... Thank you. If I did have the time (to spare) I would feel almost obliged to try to figure it out. Perhaps in a week or two... -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php
Re: [PHP] PHP bandwidth control
2009/4/7 tedd tedd.sperl...@gmail.com: At 2:46 PM +0100 4/7/09, Nick Cooper wrote: Sorry to side track the issue, but when did this happen to you on GoDaddy? I have never experienced this problem. I have been using them for two years and I often leave domains in the checkout and come back sometimes days later and they're still $7.95. This is of interest to me as well. My old registrar iyd.com was sold to hover.com and the new guys have some serious problems. I've been thinking about combining my ~70 domain names into a single registrar and GoDaddy looks good thus far. So, I would like to know what problems people have-had/are-having with GoDaddy. I already know what problems I'm having with my current registrar -- which has been a giant step backwards in services at an increased cost. How can people screw up something they bought? Beats me. I use gandi.net and can't say enough good things about them - never had a problem in nearly 10 years! -Stuart -- http://stut.net/ -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php
Re: [PHP] PHP bandwidth control
tedd wrote: At 2:46 PM +0100 4/7/09, Nick Cooper wrote: Sorry to side track the issue, but when did this happen to you on GoDaddy? I have never experienced this problem. I have been using them for two years and I often leave domains in the checkout and come back sometimes days later and they're still $7.95. This is of interest to me as well. My old registrar iyd.com was sold to hover.com and the new guys have some serious problems. I've been thinking about combining my ~70 domain names into a single registrar and GoDaddy looks good thus far. So, I would like to know what problems people have-had/are-having with GoDaddy. My only problem is that their interface is crappy and inconsistent. However, managing domain names is cake - they do it well. Their domain manager web app is fairly well done. Buying domains can be a PITA as they try to sell you all kinds of stuff with it, and their pages are really busy so you have to scroll down to the continue button for continue after deciding you don't want any of their superfluous stuff. Anyway - within 15 minutes of changing what nameserver should be used with a registered domain name, my ISP nameserver starts using it, every time - and never a problem. e-mail support has been answered within 24 hours, and once with a phone call because the tech didn't understand my question. -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php
Re: [PHP] difficult select problem
On Tue, Apr 7, 2009 at 1:10 PM, Michael A. Peters mpet...@mac.com wrote: PJ wrote: Bob McConnell wrote: From: PJ First, let me thank you all for responding and offering suggestions. I appreciate it and I am learning things. However, it looks like my message is not getting across: The problem is not to retrieve only the authors whose last names begin with A: Actually, it appears you simply don't like the accurate answers you have been given. First, let me say that I am pretty fresh to all this. Second, I believe I can get the authors with several more queries as I have done for listings of all the books and by categories, but I am trying to limit the queries thinking that that will speed up the data retrieval. A friend of mine who manages many large scale websites with massive databases says that isn't always the case, especially if you don't have a dedicated SQL server with very fast disks and lots of memory. He's found that in many situations it is faster to do several sql queries and let php sort it out than to use a bunch of joins, subselects, etc. in order to reduce the number of sql queries. Has to do with how sql works on the filesystem, and the IO that can result from sql needing to do a more complex query, and what is fastest varies upon your setup. I think he said sub-selects are the worst because sql has to create a virtual table for the subselect and that can really slow the query down, but I might be mistaken about that. Thus unless he has a problem application that is way too slow on hardware he can't upgrade, he opts for what is easier code to read and maintain. Sometimes that's faster than hard to read queries anyway. Simple queries will almost always be faster than a large join. Join query response times can be affected by the order of the join if the primary table is not the largest one, among other factors. Another thing to consider here is that the data is relatively static.
Perhaps you could build an XML representation on the first display of the book and pay the piper once in building the data. From there build an XML snippet and store it in the database, perhaps in the main books table. Then on future calls to that book to display the same data, request the XML and use that to display all the data. That way you get the search functionality you are looking for and a one-time hit to build that data into one common format. That would make the future reads of that data much quicker. -- Bastien Cat, the other other white meat
Re: [PHP] PHP bandwidth control
On Tue, Apr 7, 2009 at 1:36 PM, Michael A. Peters mpet...@mac.com wrote: tedd wrote: At 2:46 PM +0100 4/7/09, Nick Cooper wrote: Sorry to side track the issue, but when did this happen to you on GoDaddy? I have never experienced this problem. I have been using them for two years and I often leave domains in the checkout and come back sometimes days later and they're still $7.95. This is of interest to me as well. My old registrar iyd.com was sold to hover.com and the new guys have some serious problems. I've been thinking about combining my ~70 domain names into a single registrar and GoDaddy looks good thus far. So, I would like to know what problems people have-had/are-having with GoDaddy. My only problem is that their interface is crappy and inconsistent. However, managing domain names is cake - they do it well. Their domain manager web app is fairly well done. Buying domains can be a PITA as they try to sell you all kinds of stuff with it, and their pages are really busy so you have to scroll down to the continue button for continue after deciding you don't want any of their superfluous stuff. Anyway - within 15 minutes of changing what nameserver should be used with a registered domain name, my ISP nameserver starts using it, every time - and never a problem. e-mail support has been answered within 24 hours, and once with a phone call because the tech didn't understand my question. -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php I like and have used www.misk.com for the last three years. Its got a nice interface and domains are $10/year -- Bastien Cat, the other other white meat
Re: [PHP] PHP bandwidth control
I hate the bulky interface of GoDaddy.com; it's too tough for slower connections to work with GoDaddy's control panels. Their domain charge seems a bit high as well. But I'm liking www.umbrahosting.com; it has a good cPanel and the controls are good. Their support is very quick. Lenin www.twitter.com/nine_L
Re: [PHP] PHP bandwidth control
On Tue, Apr 07, 2009 at 10:36:23AM -0700, Michael A. Peters wrote: tedd wrote: At 2:46 PM +0100 4/7/09, Nick Cooper wrote: Sorry to side track the issue, but when did this happen to you on GoDaddy? I have never experienced this problem. I have been using them for two years and I often leave domains in the checkout and come back sometimes days later and they're still $7.95. This is of interest to me as well. My old registrar iyd.com was sold to hover.com and the new guys have some serious problems. I've been thinking about combining my ~70 domain names into a single registrar and GoDaddy looks good thus far. So, I would like to know what problems people have-had/are-having with GoDaddy. http://nodaddy.com I know a bunch of engineers at the company which hosts my LUG's lists and website. These guys are UBERgeeks. At one time they recommended GoDaddy, but no more. These guys register and maintain many many more domains than I ever will. I trust their judgment. See the above link for various reasons why GoDaddy is a poor choice. Paul -- Paul M. Foster -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php
Re: [PHP] Possible Server Infection?
On Apr 4, 2009, at 6:51 PM, TG wrote: Anyway, just some thoughts. Good luck! Thanks to TG, Bastien, and Marc. I appreciate the input. Regards, Frank -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php
Re: [PHP] difficult select problem
PJ wrote: Gentlemen, First, let me thank you all for responding and offering suggestions. I appreciate it and I am learning things. However, it looks like my message is not getting across: The problem is not to retrieve only the authors whose last names begin with A: 1) which books have a second author? 2) who is the second author ? This is determined by table book_author column ordinal (which can be 1 or 2) - if there is only 1 author for a book then there is no ordinal 2 linked to book_author bookID and authID. You've been given a few solutions for this and your only reply has been doesn't do what I want. Did you try Jim's suggestion? What didn't it do that you wanted it to? -- Postgresql php tutorials http://www.designmagick.com/ -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php
RE: [PHP] PHP require_once() opens some files but not others in same library
Maybe ask on the zend list - http://framework.zend.com/community/resources since they will be familiar with it. Still sounds like a url is being used for a require/include but *shrug*. The issue resolved itself when I moved the domain to a different server. I had to do this for other reasons and now the previously described issues no longer occur. Cheers Henning -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php