[PHP] Broken pipe
Hi all,

I wrote a PHP script which runs very long queries (hours) against a database. I seem to have a problem when a single query takes a long time (like 5 hours for an UPDATE query). The database log shows the following:

2005-09-30 17:12:13 IDT postgres : LOG: 0: duration: 18730038.678 ms statement: UPDATE product_temp SET nleft=(SELECT
2005-09-30 17:12:13 IDT postgres : LOCATION: exec_simple_query, postgres.c:1035
2005-09-30 17:12:13 IDT postgres : LOG: 08006: could not send data to client: Broken pipe
2005-09-30 17:12:13 IDT postgres : LOCATION: internal_flush, pqcomm.c:1050
2005-09-30 17:12:13 IDT postgres : LOG: 08P01: unexpected EOF on client connection
2005-09-30 17:12:13 IDT postgres : LOCATION: SocketBackend, postgres.c:287
2005-09-30 17:12:13 IDT postgres : LOG: 0: disconnection: session time: 6:04:58.52
2005-09-30 17:12:13 IDT postgres : LOCATION: log_disconnections, postgres.c:3403

After the 5-hour update, the script needs to echo a line into a log file saying that the command finished (just so I know the timing). My assumption is that PHP read the script into memory at startup and opened the connection, then after waiting some time without any "sign of life" it gave up and closed the connection, so when the script came back to it there was no connection left (broken pipe). Am I correct in my assumption? If so, how can I tell PHP to wait as long as I need? Of course, if I'm wrong, I would like to know the real reason too :)

Thanks in advance,
Ben-Nes Yonatan

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
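A likely cause of this symptom is PHP's own execution-time limit (max_execution_time) killing the script mid-query, rather than a timeout on the log-file handle. A minimal sketch of how one might let such a batch script run to completion; the connection string, log path, and the elided query are placeholders, not from the original post:

```php
<?php
// Sketch, assuming a batch/maintenance script (hypothetical names):
// lift PHP's execution-time limit and keep running even if the
// HTTP client that launched the script disconnects.

set_time_limit(0);         // 0 = no max_execution_time for this run
ignore_user_abort(true);   // don't die when the browser goes away

$conn = pg_connect("host=localhost dbname=mydb user=me");  // placeholder DSN
$log  = fopen('/tmp/longrun.log', 'a');

fwrite($log, date('c') . " starting update\n");
pg_query($conn, "UPDATE product_temp SET nleft = (SELECT ...)");  // long query, elided as in the log
fwrite($log, date('c') . " update finished\n");

fclose($log);
?>
```

Note that set_time_limit() has no effect when PHP runs in safe mode, and when run via Apache the web server's own timeout can still cut the request short; running such jobs from the command line avoids both.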
Re: [PHP] blob and long blob
timothy johnson wrote:
> I am building a database that houses photos. I thought everything was
> going fine until I went to upload a file bigger than 2MB. I checked my
> HTML - in the form I have it set to 10MB, in php.ini I have it set to
> 32MB, and in MySQL I am using a LONGBLOB, so shouldn't that handle
> something like 4GB? Anyone have any idea why files above 2MB aren't
> working?

It has been some time since I built such a thing, but check:

ini_set("memory_limit", "18M"); // here it's set to 18MB

Also, I know that PostgreSQL has functions specific to dealing with large objects; maybe MySQL has something similar.

Cheers,
Ben-Nes Yonatan
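The 2MB ceiling is also the stock default of upload_max_filesize in php.ini, which is a separate knob from memory_limit and from the form's MAX_FILE_SIZE field. A quick sketch for checking the PHP-side limits that gate an upload (the suggested .ini values are illustrative):

```php
<?php
// Sketch: print the three PHP-side limits that commonly cap uploads.
// upload_max_filesize defaults to 2M, which matches the symptom here.
// It cannot be raised with ini_set() at runtime - it must be set in
// php.ini (or .htaccess), e.g.:
//   upload_max_filesize = 10M
//   post_max_size = 12M
// MySQL's max_allowed_packet (my.cnf) must also exceed the INSERT size.

echo "upload_max_filesize: " . ini_get('upload_max_filesize') . "\n";
echo "post_max_size:       " . ini_get('post_max_size') . "\n";
echo "memory_limit:        " . ini_get('memory_limit') . "\n";
?>
```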
Re: [PHP] Trimming Text
André Medeiros wrote:
> On Fri, 2005-07-15 at 16:03 +0100, Richard Davey wrote:
>> Hello André,
>>
>> Friday, July 15, 2005, 4:24:23 PM, you wrote:
>>
>> AM> I am trying to trim some text containing HTML tags. What I want to
>> AM> do is to trim the text without trimming the tags or HTML entities,
>> AM> which completely break the design.
>>
>> The problem as I see it is that while it's easy to trim some text and
>> then check whether you were inside an HTML tag or not, it becomes MUCH
>> harder to check whether you were inside nested tags. If there are no
>> nested tags then it's much easier: just trim the string at X characters
>> and then search for the last occurrence of '>' and the last occurrence
>> of '<' - if the first is LESS than the second, then you're in the
>> middle of a tag. This of course doesn't handle nested tags.
>>
>> Best regards,
>> Richard Davey
>> --
>> http://www.launchcode.co.uk - PHP Development Services
>> "I do not fear computers. I fear the lack of them." - Isaac Asimov
>
> Yeah... that's the point :( Nested tags are very possible. I am not
> sure how one would go about it, though.

Sorry that I didn't have time to read everything word for word, but did you try running strip_tags()?
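One way to sidestep the nested-tag problem entirely, following the strip_tags() suggestion above, is to drop the markup before trimming, so the cut can never land inside a tag or an entity. A small sketch; the helper name and the '...' suffix are my own:

```php
<?php
// Sketch: strip the tags first, then trim the plain text.

function trim_html($html, $length) {
    $text = strip_tags($html);          // remove all markup
    $text = html_entity_decode($text);  // so "&amp;" can't be cut in half
    if (strlen($text) <= $length) {
        return $text;
    }
    return substr($text, 0, $length) . '...';
}

echo trim_html('<p>Hello <b>world</b> &amp; everyone</p>', 11);  // Hello world...
?>
```

This of course discards the markup rather than preserving it; if the trimmed excerpt must keep its formatting, the nested-tag bookkeeping Richard describes is unavoidable.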
Re: [PHP] Echo array string index?
Matt Darby wrote:
> I have an array set up as such:
>
> $arr['generated text'] = 'generated number';
>
> What would be the best way to echo the key in a loop? Seems pretty easy
> but I've never attempted it... Thanks all!
>
> Matt Darby

Unless I misunderstood you, you can easily use foreach() for that. If you want to loop over it again and again and echo it inside, you can also just make a for() loop and call print_r() in it; and if you want to use the values of the array inside the for(), you can again use foreach() inside it. Just remember that after the foreach() ends you need to reset() the array to repeat the process with the for().

Cheers,
Ben-Nes Yonatan
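The foreach() approach can be sketched like this (the sample keys and values are invented):

```php
<?php
// Sketch: foreach hands you each key alongside its value.
$arr = array(
    'generated text' => 'generated number',
    'another key'    => 'another number',
);

foreach ($arr as $key => $value) {
    echo "$key => $value\n";
}
?>
```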
[PHP] creating of multi dimensional arrays with unknown depth
Hi all!

I got a string which includes information for categories that I need to build. The string looks like this:

$string = "val1~~val2~~val3~~val4~~val5";

Now, I want to create from this string a multi-dimensional array, not the one-dimensional array that explode('~~', $string) will give me. The desired result from this string is:

$array['val1'] = 'whatever';
$array['val1']['val2'] = 'whatever';
$array['val1']['val2']['val3'] = 'whatever';
$array['val1']['val2']['val3']['val4'] = 'whatever';
$array['val1']['val2']['val3']['val4']['val5'] = 'whatever';

My problem is that the number of dimensions is really unknown and can range from 1 to 20 as far as I know. Is there a way to create such dynamic depth automatically, without using lots of ifs?

Thanks in advance,
Ben-Nes Yonatan
Canaan Surfing Ltd.
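One common way to build nesting of arbitrary depth without any ifs per level is to walk a reference down the array, one level per exploded value. A sketch; I use empty arrays rather than 'whatever' at intermediate levels, since in PHP a key cannot hold both a scalar and a sub-array at once:

```php
<?php
// Sketch: descend by reference, creating each level as needed.
$string = "val1~~val2~~val3~~val4~~val5";

$array = array();
$node  =& $array;                    // start at the root
foreach (explode('~~', $string) as $val) {
    if (!isset($node[$val])) {
        $node[$val] = array();       // create this level if missing
    }
    $node =& $node[$val];            // step down one level
}
unset($node);                        // break the last reference

print_r($array);
?>
```

The unset() at the end matters: without it, a later assignment to $node would silently overwrite the deepest level of $array through the lingering reference.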
Re: [PHP] PHP Memory limit exceeded
> Richard Lynch wrote:
>>> On the other hand, I did find some huge mistake on my side... just the
>>> download of the file requires 28MB (god knows why! but I won't start
>>> to bother you folks about it now); the upload needs 16MB, so clearly
>>> you were correct, but I'm sorry to admit that I didn't understand why
>>> it gets tripled... after all, the result $res is the result of an
>>> INSERT, not a SELECT, therefore it doesn't return all the data but
>>> just an indication of whether the action succeeded or not... or maybe
>>> I'm wrong here also?
>>
>> PHP probably has to create an extra big chunk of text somewhere in the
>> process of sending the data to PostgreSQL.
>
> or it was simpler to write :)
>
>>> About MySQL's function... well, it's MySQL and I'm using
>>> PostgreSQL... I tried to search for something that could help with
>>> uploading file data to the server but didn't find anything useful.
>>
>> Sorry. I suspect they operate the same internally anyway, in terms of
>> transferring data to/from the database and buffering the query.
>
> No, load_file() is a MySQL internal function. The file is read by the
> MySQL server, so it completely bypasses PHP and also the client
> libraries. You need to have the FILE privilege.

Hi, and thanks again Richard and Marek,

Finally I found the solution I craved: large objects in PostgreSQL. I can upload and download files while consuming only a little more memory than the file size itself. Actually, I have to admit that I saw that option before, but I thought it was quite useless for my needs, because the docs say that since version 7.1 of PostgreSQL the large-object interface is partially obsolete, and that one of its remaining benefits is that it can hold up to 2GB instead of the new TOAST's 1GB. So I thought I didn't need it at all if I could get to 1GB anyway - but apparently they didn't write anywhere about the difference in memory consumption... or maybe I don't command the English language as well as I'd wish :)

The large objects manual: http://www.postgresql.org/docs/7.4/interactive/largeobjects.html (for ver 7.4)

Thanks a lot again,
Ben-Nes Yonatan
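For reference, the large-object approach described above looks roughly like this in PHP's pgsql extension. A sketch only: the connection string, file path, and table/column names are invented, and all lo_* calls must run inside a transaction:

```php
<?php
// Sketch of storing and retrieving a file as a PostgreSQL large object.
$conn = pg_connect("host=localhost dbname=mydb user=me");  // placeholder DSN

// Store: the file is streamed to the server in chunks, so PHP never
// holds the whole file in a single string.
pg_query($conn, "BEGIN");
$oid = pg_lo_import($conn, '/tmp/upload.dat');
pg_query($conn, "INSERT INTO test_files (bin_oid, filename)
                 VALUES ($oid, 'upload.dat')");
pg_query($conn, "COMMIT");

// Retrieve: stream the object straight to output, again without
// loading it all into memory at once.
pg_query($conn, "BEGIN");
$lo = pg_lo_open($conn, $oid, 'r');
pg_lo_read_all($lo);          // echoes the object's contents directly
pg_lo_close($lo);
pg_query($conn, "COMMIT");
?>
```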
Re: [PHP] PHP Memory limit exceeded
>> Ben-Nes Yonatan wrote:
>> Hi all,
>>
>> I got a problem with uploading files which hit the memory limit even
>> when their size is not even close to the memory limit itself; let me
>> explain.
>>
>> My server is as follows:
>> 1. PHP 4.3.9
>> 2. DB - PostgreSQL 7.4
>> 3. Apache 1.3.26
>>
>> Here is the code I made for testing the problem (along the way I
>> echoed memory_get_usage() to see how much memory was already allocated
>> for the script):
>>
>> $imagefile = $_FILES['imagefile']; // receive the file
>> echo memory_get_usage(); // 118592 bytes allocated
>> $data = pg_escape_bytea(`cat $imagefile[tmp_name]`);
>> echo memory_get_usage(); // 5570280 bytes allocated
>>
>> $data = "INSERT INTO test_files (bin_data, filename, filesize, filetype)
>>          VALUES ('$data', '$imagefile[name]', '$imagefile[size]',
>>          '$imagefile[type]')";
>> // build the SQL for the insert; I reused the name $data because I
>> // didn't want to keep the previous $data (after all, we want to save
>> // our precious memory, no? :))
>> echo memory_get_usage(); // 5570400 bytes allocated {changed only a little}
>>
>> if (!$res = pg_query($this->conn, $data)) // try to run the insert
>>     return 'error';
>> else
>>     return 'gr8';
>> echo memory_get_usage(); // 5570648 bytes allocated {again changed only a little}
>>
>> Now, as far as I can see, the script needed about 5.5MB of memory to
>> upload a 4.7MB file, but that's what is so weird here... I receive the
>> memory limit error even when the php.ini "memory_limit" is set to 16MB
>> (twice the default of 8MB!). At 32MB it works fine... but that's way
>> too much. I suspect the problem is connected to the pg_query() function
>> itself, but I couldn't pin down what exactly causes it...
>
> Do you suspect it, or is that where the script output ends?
>
> Even between the 2nd and 3rd memory_get_usage() calls your script's
> memory usage increased rapidly - first the SQL query string was
> created, then it was assigned to $data, so for a moment the file was in
> memory twice.
>
> Something similar happens with pg_query() - the file is in the $data
> string, then it is copied somewhere inside the php_pg module, and the
> memory usage triples.
>
> As a solution you might want to implement (or look for an
> implementation of) MySQL's load_file() function.

Hi Marek and Richard, thanks for answering,

First, about the memory consumption: I knew what you said, though I wasn't 100% sure how it works (I'm self-taught, so this part wasn't known/clear to me :)); because of that, the option of 16MB for the memory limit seemed logical to me. On the other hand, I did find some huge mistake on my side... just the download of the file requires 28MB (god knows why! but I won't start to bother you folks about it now); the upload needs 16MB, so clearly you were correct. But I'm sorry to admit that I didn't understand why it gets tripled... after all, $res is the result of an INSERT, not a SELECT, therefore it doesn't return all the data, just an indication of whether the action succeeded... or maybe I'm wrong here also?

About MySQL's function... well, it's MySQL and I'm using PostgreSQL... I tried to search for something that could help with uploading file data to the server but didn't find anything useful.

I think I will go with Richard's solution and just upload the files to the file system, though I don't really like that solution because I'll lose the integrity of the database - but then again, apparently I don't really have any other option, do I? :P

Thanks again for helping me out,
Ben-Nes Yonatan
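For completeness, the filesystem approach discussed above (store the file on disk, keep only its metadata in the database) might look like this. A sketch only: the upload directory, table layout, and connection details are invented for illustration:

```php
<?php
// Sketch: keep the file on disk, store only its path and metadata.
$conn = pg_connect("host=localhost dbname=mydb user=me");  // placeholder DSN
$file = $_FILES['imagefile'];

// move_uploaded_file() renames the temp file in place, so the file's
// contents never pass through a PHP string at all.
$dest = '/var/www/uploads/' . uniqid() . '_' . basename($file['name']);
if (!move_uploaded_file($file['tmp_name'], $dest)) {
    die('upload failed');
}

$sql = sprintf("INSERT INTO test_files (path, filename, filesize, filetype)
                VALUES ('%s', '%s', %d, '%s')",
               pg_escape_string($dest),
               pg_escape_string($file['name']),
               (int)$file['size'],
               pg_escape_string($file['type']));
pg_query($conn, $sql);
?>
```

The trade-off is exactly the one raised in the post: the database no longer guarantees the file exists, so deletions and backups must keep the two stores in sync.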
[PHP] PHP Memory limit exceeded
Hi all,

I got a problem with uploading files which hit the memory limit even when their size is not even close to the memory limit itself; let me explain.

My server is as follows:
1. PHP 4.3.9
2. DB - PostgreSQL 7.4
3. Apache 1.3.26

Here is the code I made for testing the problem (along the way I echoed memory_get_usage() to see how much memory was already allocated for the script):

$imagefile = $_FILES['imagefile']; // receive the file
echo memory_get_usage(); // 118592 bytes allocated
$data = pg_escape_bytea(`cat $imagefile[tmp_name]`);
echo memory_get_usage(); // 5570280 bytes allocated

$data = "INSERT INTO test_files (bin_data, filename, filesize, filetype)
         VALUES ('$data', '$imagefile[name]', '$imagefile[size]',
         '$imagefile[type]')";
// build the SQL for the insert; I reused the name $data because I didn't
// want to keep the previous $data (after all, we want to save our
// precious memory, no? :))
echo memory_get_usage(); // 5570400 bytes allocated {changed only a little}

if (!$res = pg_query($this->conn, $data)) // try to run the insert
    return 'error';
else
    return 'gr8';
echo memory_get_usage(); // 5570648 bytes allocated {again changed only a little}

Now, as far as I can see, the script needed about 5.5MB of memory to upload a 4.7MB file, but that's what is so weird here... I receive the memory limit error even when the php.ini "memory_limit" is set to 16MB (twice the default of 8MB!). At 32MB it works fine... but that's way too much. I suspect the problem is connected to the pg_query() function itself, but I couldn't pin down what exactly causes it...

Any ideas, knowledge, or even just solutions ;) will be extremely helpful.

Thanks in advance,
Ben-Nes Yonatan
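One incidental source of extra copies in the snippet above is the backtick operator: `` `cat $imagefile[tmp_name]` `` spawns a shell and buffers its output before PHP even sees the string. A variant of the same upload using file_get_contents() instead, sketched with a placeholder connection (this reduces copies but does not remove the duplication inside pg_query() discussed in the replies):

```php
<?php
// Sketch: same insert as the post above, but reading the temp file
// directly instead of shelling out to `cat`.
$conn = pg_connect("host=localhost dbname=mydb user=me");  // placeholder DSN

$imagefile = $_FILES['imagefile'];
$data = pg_escape_bytea(file_get_contents($imagefile['tmp_name']));

$sql = "INSERT INTO test_files (bin_data, filename, filesize, filetype)"
     . " VALUES ('$data', '" . pg_escape_string($imagefile['name']) . "', "
     . (int)$imagefile['size']
     . ", '" . pg_escape_string($imagefile['type']) . "')";

$res = pg_query($conn, $sql);
?>
```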
[PHP] counting clicks on a flash ad
Hi all! in the current site that im building there are flash banners for advertisment. now the owner of the site want to know how many times a banner got clicked and viewed. here i got a problem... the flash banners direct to another sites (changing all the time) and i cant figure out a way to tell my server that the banner got clicked and to update my db because of it. i thought on redirecting but as far as i know flash files contain their own link and that link cant be changed (created when the flash get compiled). does anyone know a way to count those clicks? or just got a link to a site which explain alittle? with thanks in advance Ben-Nes Yonatan
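If the banner's destination can be pointed at your own server (many Flash ad banners take their target URL from a parameter passed at embed time rather than a hard-compiled link), a small redirect script can do the counting. A hedged sketch; the script name, banners table, and connection details are all invented for illustration:

```php
<?php
// click.php - sketch of a redirect-based click counter.
// The banner links to click.php?id=N instead of the advertiser;
// we count the click, then forward the visitor.
$conn = pg_connect("host=localhost dbname=mydb user=me");  // placeholder DSN
$id   = (int)$_GET['id'];

pg_query($conn, "UPDATE banners SET clicks = clicks + 1 WHERE id = $id");

$res = pg_query($conn, "SELECT target_url FROM banners WHERE id = $id");
$row = pg_fetch_assoc($res);

header('Location: ' . $row['target_url']);
exit;
?>
```

Views can be counted the same way by serving the banner itself (or a 1x1 tracking image next to it) through a similar script that increments a views column before outputting the file.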