php-general Digest 6 Nov 2007 07:35:19 -0000 Issue 5112

Topics (messages 264072 through 264092):

Re: Can I make a process run in background?
        264072 by: Paul Scott

Re: Mail function doesn't work
        264073 by: Per Jessen
        264075 by: Jim Lucas

Re: More info on timeout problem
        264074 by: Instruct ICC
        264076 by: Jon Westcot
        264077 by: Instruct ICC
        264078 by: Kristen G. Thorson
        264079 by: Wolf
        264080 by: Jon Westcot
        264081 by: Jon Westcot
        264082 by: Jon Westcot
        264083 by: Daniel Brown
        264086 by: Jochem Maas

Re: Looking for ways to prevent timeout
        264084 by: Jochem Maas

Re: Problem with input name, how can I use . (dot) in the name of an input
type text?
        264085 by: Jochem Maas

Re: Close a session knowing it's ID (not the current session)
        264087 by: Jochem Maas

Installing php_pgsql.dll to Apache - this is unreal
        264088 by: Martin Zvarík

gzuncompress() Not Working.
        264089 by: Casey

More info on timeout problem, with code
        264090 by: Jon Westcot

Chinese input character count
        264091 by: Ronald Wiplinger

Memory Allocation Error
        264092 by: heavyccasey.gmail.com

Administrivia:

To subscribe to the digest, e-mail:
        [EMAIL PROTECTED]

To unsubscribe from the digest, e-mail:
        [EMAIL PROTECTED]

To post to the list, e-mail:
        [EMAIL PROTECTED]


----------------------------------------------------------------------
--- Begin Message ---

On Mon, 2007-11-05 at 19:20 +0100, Luca Paolella wrote:

> I want the bot to run a process in background (a periodic message,  
> for example) while listening for events (like a user joining a  
> channel or using a certain command) and consequentially executing the  
> corresponding functions, is it possible? and how?
> 

You could try this archaic code I wrote a few years back:

http://www.phpclasses.org/browse/package/2837.html

or search for PHP Daemon and get some ideas there.
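For a rough idea of what those daemon packages do under the hood, here is a minimal pcntl_fork() sketch (CLI only; requires the pcntl extension; the function name is made up for illustration): the child loops on the periodic job while the parent stays free to listen for IRC events.

```php
<?php
// Sketch only: fork a child that repeats a task while the parent keeps
// running its own event loop. Requires the pcntl extension (CLI SAPI).
function start_periodic_task($task, $interval)
{
    $pid = pcntl_fork();
    if ($pid === -1) {
        die("fork failed\n");
    }
    if ($pid === 0) {            // child process: run the job forever
        while (true) {
            call_user_func($task);
            sleep($interval);
        }
        exit(0);                 // never reached
    }
    return $pid;                 // parent: continue listening for events
}

// Usage (parent stays free for the IRC socket loop):
// $pid = start_periodic_task('send_periodic_message', 60);
// ... event loop here ...
// posix_kill($pid, SIGTERM);   // stop the child on shutdown
```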

--Paul

All Email originating from UWC is covered by disclaimer 
http://www.uwc.ac.za/portal/public/portal_services/disclaimer.htm 

--- End Message ---
--- Begin Message ---
Alberto García Gómez wrote:

> What could happen that my mail function isn't working? I checked my
> php.ini conf twice and it's fine. I tested sendmail manually and it's OK. I
> also tried to send mails with sendmail stopped and started, and nothing
> happens.

Does your mail-server otherwise work?  


/Per Jessen, Zürich

--- End Message ---
--- Begin Message ---
Alberto García Gómez wrote:
What could happen that my mail function isn't working? I checked my php.ini
conf twice and it's fine. I tested sendmail manually and it's OK. I also tried
to send mails with sendmail stopped and started, and nothing happens.

This e-mail was sent from the "Carlos Marx" Polytechnic Institute of
Informatics in Matanzas.
"The great battle will be waged in the field of ideas"

Is your Apache process chroot'ed?

Can PHP see the sendmail binary?
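One can ask PHP those questions directly. The /usr/sbin/sendmail path below is an assumption; compare it against the sendmail_path setting on the affected box:

```php
<?php
// Sanity checks for the mail() path. The sendmail location is an
// assumption; compare against sendmail_path in your php.ini.
$sendmail = '/usr/sbin/sendmail';

echo "sendmail_path: ", ini_get('sendmail_path'), "\n"; // what PHP will run
var_dump(file_exists($sendmail));      // false if hidden by a chroot jail
var_dump(is_executable($sendmail));    // false if the web user can't run it

// Note: mail() returning TRUE only means the message was accepted for
// delivery, not that it actually arrived.
```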

--
Jim Lucas

   "Some men are born to greatness, some achieve greatness,
       and some have greatness thrust upon them."

Twelfth Night, Act II, Scene V
    by William Shakespeare

--- End Message ---
--- Begin Message ---
>     I'm now wondering if some error is occurring that, for some reason, is 
> silently ending the routine.  I'm building what may be a very long SQL INSERT 
> statement for each line in the CSV file that I'm reading; could I be hitting 
> some upper limit for the length of the SQL code?  I'd think that an error 
> would be presented in this case, but maybe I have to do something explicitly 
> to force all errors to display?  Even warnings?
> 
>     Another thing I've noticed is that the "timeout" (I'm not even certain 
> the problem IS a timeout any longer, hence the quotation marks) doesn't 
> happen at the same record every time.  That's why I thought it was a timeout 
> problem at first, and assumed that the varying load on the server would 
> account for the different record numbers processed.  If I were hitting some 
> problem with the SQL statement, I'd expect it to stop at the same record 
> every time.  Or is that misguided thinking, too?

1) When you say, "doesn't happen at the same record every time" are you using 
the same dataset and speaking about the same line number?  Or are you using 
different datasets and noticing that the line number varies?  If it's the same 
dataset, it sounds like "fun" -- as in "a pain in the assets".

2) I'm writing something similar; letting a user upload a CSV file via a 
webpage, then creating an SQL query with many records.  So now I'll be watching 
your thread.  For debugging purposes, create your SQL statement and print it 
out on the webpage (or save it somewhere -- maybe a file).  Don't have your 
webpage script execute the query.  Then see if you get the complete query you 
expect.  Then copy that query into a database tool like phpmyadmin and see if 
you get errors when executing the query.
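The dump-don't-execute approach above can be sketched like this (table name, data, and log path are placeholders):

```php
<?php
// Sketch: build the INSERT, append it to a log file, and only execute it
// once the logged SQL looks right. Table and file names are placeholders.
$row = array('Smith', 'Jane', '42');

$values = array();
foreach ($row as $v) {
    $values[] = "'" . addslashes($v) . "'";   // naive quoting, debug only
}
$sql = 'INSERT INTO mytable VALUES (' . implode(', ', $values) . ');';

file_put_contents('/tmp/sql_debug.log', $sql . "\n", FILE_APPEND);
// Later: paste sql_debug.log into phpMyAdmin and run it there to see
// the real error message, if any.
```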

_________________________________________________________________
Peek-a-boo FREE Tricks & Treats for You!
http://www.reallivemoms.com?ocid=TXT_TAGHM&loc=us

--- End Message ---
--- Begin Message ---
Hi Instruct ICC:

> >     I'm now wondering if some error is occurring that, for some reason,
> > is silently ending the routine.  I'm building what may be a very long SQL
> > INSERT statement for each line in the CSV file that I'm reading; could
> > I be hitting some upper limit for the length of the SQL code?  I'd think
> > that an error would be presented in this case, but maybe I have to do
> > something explicitly to force all errors to display?  Even warnings?
> >
> >     Another thing I've noticed is that the "timeout" (I'm not even certain
> > the problem IS a timeout any longer, hence the quotation marks) doesn't
> > happen at the same record every time.  That's why I thought it was a
> > timeout problem at first, and assumed that the varying load on the server
> > would account for the different record numbers processed.  If I were
> > hitting some problem with the SQL statement, I'd expect it to stop at
> > the same record every time.  Or is that misguided thinking, too?
>
> 1) When you say, "doesn't happen at the same record every time" are you
> using the same dataset and speaking about the same line number?  Or are
> you using different datasets and noticing that the line number varies?  If
> it's the same dataset, it sounds like "fun" -- as in "a pain in the assets".

    Yup, same dataset.  It took me forever to upload it, so I'm trying to
keep it there until I know it's been successfully loaded.  It's got about
30,000 records in it, and each one has 240 fields.

> 2) I'm writing something similar; letting a user upload a CSV file via a
> webpage, then creating an SQL query with many records.  So now I'll
> be watching your thread.  For debugging purposes, create your SQL
> statement and print it out on the webpage (or save it somewhere --
> maybe a file).  Don't have your webpage script execute the query.
> Then see if you get the complete query you expect.  Then copy that
> query into a database tool like phpmyadmin and see if you get errors
> when executing the query.

    Sounds much like what I'm trying to do.  I have had to give up, for the
time being, on using PHP to upload the datafile; it's about 56 MB in size
and nothing I do seems to let me upload anything larger than a 2MB file. :(

    How do I save the individual query statements to a file?  That may give
me a good option for checking a "log" of activity when the process fails
again.

    Thanks for your suggestions!

        Jon

--- End Message ---
--- Begin Message ---
> Sounds much like what I'm trying to do. I have had to give up, for the
> time being, on using PHP to upload the datafile; it's about 56 MB in size
> and nothing I do seems to let me upload anything larger than a 2MB file. :(

I don't know if it's been mentioned in this thread, but 2M is a default setting 
for upload_max_filesize http://php.he.net/manual/en/ini.core.php
You may also need to adjust post_max_size and memory_limit.  My 
upload_max_filesize is at 5M, post_max_size at 8M, and memory_limit at 32M.

From the docs:
post_max_size integer

Sets max size of post data allowed. This setting also affects file upload. To 
upload large files, this value must be larger than upload_max_filesize.

If memory limit is enabled by your configure script, memory_limit also affects 
file uploading. Generally speaking, memory_limit should be larger than 
post_max_size.

When an integer is used, the value is measured in bytes. You may also use 
shorthand notation as described in this FAQ.

If the size of post data is greater than post_max_size, the $_POST and $_FILES 
superglobals are empty. This can be tracked in various ways, e.g. by passing a 
$_GET variable to the script processing the data and then checking whether 
$_GET['processed'] is set.


> How do I save the individual query statements to a file? That may give
> me a good option for checking a "log" of activity when the process fails
> again.


I'm assuming you are building an $sql variable, so you would write that to a 
file instead of executing it in a query.
Look at an example here http://php.he.net/manual/en/function.fwrite.php to 
write data to a file.
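A minimal version of that fwrite() logging, with a placeholder log path:

```php
<?php
// Sketch: append each statement to a log instead of executing it.
// The log path is a placeholder; any writable location will do.
$sql = "INSERT INTO evall VALUES ('example');";

$fp = fopen('/tmp/queries.log', 'ab');   // 'a' = append, 'b' = binary-safe
if ($fp === false) {
    die("could not open log file\n");
}
fwrite($fp, $sql . "\n");
fclose($fp);
```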

_________________________________________________________________
Peek-a-boo FREE Tricks & Treats for You!
http://www.reallivemoms.com?ocid=TXT_TAGHM&loc=us

--- End Message ---
--- Begin Message ---
-----Original Message-----
From: Instruct ICC [mailto:[EMAIL PROTECTED] 
Sent: Monday, November 05, 2007 3:34 PM
To: [EMAIL PROTECTED]
Subject: RE: [PHP] More info on timeout problem


> Sounds much like what I'm trying to do. I have had to give up, for the
> time being, on using PHP to upload the datafile; it's about 56 MB in size
> and nothing I do seems to let me upload anything larger than a 2MB file.
:(

I don't know if it's been mentioned in this thread, but 2M is a default
setting for upload_max_filesize http://php.he.net/manual/en/ini.core.php
You may also need to adjust post_max_size and memory_limit.  My
upload_max_filesize is at 5M, post_max_size at 8M, and memory_limit at
32M.

From the docs:
post_max_size integer

Sets max size of post data allowed. This setting also affects file upload.
To upload large files, this value must be larger than upload_max_filesize.

If memory limit is enabled by your configure script, memory_limit also
affects file uploading. Generally speaking, memory_limit should be larger
than post_max_size.

When an integer is used, the value is measured in bytes. You may also use
shorthand notation as described in this FAQ.

If the size of post data is greater than post_max_size, the $_POST and
$_FILES superglobals are empty. This can be tracked in various ways, e.g. by
passing a $_GET variable to the script processing the data and then checking
whether $_GET['processed'] is set.





I'm jumping in here late, so I haven't seen previous posts.  Another
possible place I have seen limiting post/upload sizes:

There is an Apache directive called LimitRequestSize or somesuch which will
take precedence (silently) over any PHP max post size you set.  I found this
set once before in an <apache_root>/conf.d/php.conf file that I eventually
found.
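For what it's worth, the stock Apache directive for capping request bodies is LimitRequestBody (value in bytes; 0 means unlimited). A hedged example for a vhost, directory block, or .htaccess where AllowOverride permits it:

```apache
# Cap request bodies at 64 MB (value in bytes; 0 = unlimited).
# A PHP post_max_size larger than this will silently never be reached.
LimitRequestBody 67108864
```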

--- End Message ---
--- Begin Message ---
One thing to note, if you have not upped the max file size to be over what you 
are trying to load, the server will hang.

;;;;;;;;;;;;;;;;;;;
; Resource Limits ;
;;;;;;;;;;;;;;;;;;;

max_execution_time = 7200     ; Maximum execution time of each script, in seconds
max_input_time = 7200   ; Maximum amount of time each script may spend parsing request data
memory_limit = 2G      ; Maximum amount of memory a script may consume


; Maximum size of POST data that PHP will accept.
post_max_size = 8M  ; CHANGE THIS!!

; Maximum allowed size for uploaded files.
upload_max_filesize = 2M  ; CHANGE THIS!!

Also look in any php.ini files in Apache's conf.d directory for files that set 
it back to these default limits.

You'll notice I have increased my max execution times, input times, and memory 
limit, but not my upload sizes; that is only because the server I snagged this 
from doesn't do uploads.  I have another server which has an 879M upload limit 
and has no problems with large files getting to it.

Wolf

---- Jon Westcot <[EMAIL PROTECTED]> wrote: 
> Hi Instruct ICC:
> 
> > >     I'm now wondering if some error is occurring that, for some reason,
> is
> > > silently ending the routine.  I'm building what may be a very long SQL
> > > INSERT statement for each line in the CSV file that I'm reading; could
> > > I be hitting some upper limit for the length of the SQL code?  I'd think
> > > that an error would be presented in this case, but maybe I have to do
> > > something explicitly to force all errors to display?  Even warnings?
> > >
> > >     Another thing I've noticed is that the "timeout" (I'm not even
> certain
> > > the problem IS a timeout any longer, hence the quotation marks) doesn't
> > > happen at the same record every time.  That's why I thought it was a
> > > timeout problem at first, and assumed that the varying load on the
> server
> > > would account for the different record numbers processed.  If I were
> > > hitting some problem with the SQL statement, I'd expect it to stop at
> > > the same record every time.  Or is that misguided thinking, too?
> >
> > 1) When you say, "doesn't happen at the same record every time" are you
> > using the same dataset and speaking about the same line number?  Or are
> > you using different datasets and noticing that the line number varies?  If
> it's
> > the same dataset, it sounds like "fun" -- as in "a pain in the assets".
> 
>     Yup, same dataset.  It took me forever to upload it, so I'm trying to
> keep it there until I know it's been successfully loaded.  It's got about
> 30,000 records in it, and each one has 240 fields.
> 
> > 2) I'm writing something similar; letting a user upload a CSV file via a
> > webpage, then creating an SQL query with many records.  So now I'll
> > be watching your thread.  For debugging purposes, create your SQL
> > statement and print it out on the webpage (or save it somewhere --
> > maybe a file).  Don't have your webpage script execute the query.
> > Then see if you get the complete query you expect.  Then copy that
> > query into a database tool like phpmyadmin and see if you get errors
> > when executing the query.
> 
>     Sounds much like what I'm trying to do.  I have had to give up, for the
> time being, on using PHP to upload the datafile; it's about 56 MB in size
> and nothing I do seems to let me upload anything larger than a 2MB file. :(
> 
>     How do I save the individual query statements to a file?  That may give
> me a good option for checking a "log" of activity when the process fails
> again.
> 
>     Thanks for your suggestions!
> 
>         Jon
> 
> -- 
> PHP General Mailing List (http://www.php.net/)
> To unsubscribe, visit: http://www.php.net/unsub.php

--- End Message ---
--- Begin Message ---
Hi Wolf:

    Thanks for the suggestion.  I've tried setting these in a php.ini file,
but that file seems to be constantly ignored, other than the fact that its
presence seems to cause every value to take on its default settings.
::sigh::  I am going to try and put the values into a .htaccess file
instead; at least I seem to have some control over that file.
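For the archives, the .htaccess route looks like the sketch below (values are examples). It only works when PHP runs as an Apache module and the host's AllowOverride permits it; under CGI these lines produce a 500 error and a per-directory php.ini is needed instead:

```apache
# Example per-directory overrides (mod_php only; values are examples)
php_value upload_max_filesize 64M
php_value post_max_size       64M
php_value memory_limit        128M
php_value max_execution_time  1800
```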

    I'm really starting to hate shared servers.

    Jon


----- Original Message -----
From: "Wolf" <[EMAIL PROTECTED]>
To: "Jon Westcot" <[EMAIL PROTECTED]>
Cc: "PHP General" <[EMAIL PROTECTED]>
Sent: Monday, November 05, 2007 2:20 PM
Subject: Re: [PHP] More info on timeout problem


> One thing to note, if you have not upped the max file size to be over what
> you are trying to load, the server will hang.
>
> ;;;;;;;;;;;;;;;;;;;
> ; Resource Limits ;
> ;;;;;;;;;;;;;;;;;;;
>
> max_execution_time = 7200     ; Maximum execution time of each script, in seconds
> max_input_time = 7200   ; Maximum amount of time each script may spend parsing request data
> memory_limit = 2G      ; Maximum amount of memory a script may consume
>
>
> ; Maximum size of POST data that PHP will accept.
> post_max_size = 8M  ; CHANGE THIS!!
>
> ; Maximum allowed size for uploaded files.
> upload_max_filesize = 2M  ; CHANGE THIS!!
>
> Also look in any php.ini files in Apache's conf.d directory for files that
> set it back to these default limits.
>
> You'll notice I have increased my max execution times, input times, and
> memory limit, but not my upload sizes; that is only because the server I
> snagged this from doesn't do uploads.  I have another server which has an
> 879M upload limit and has no problems with large files getting to it.
>
> Wolf
>
> ---- Jon Westcot <[EMAIL PROTECTED]> wrote:
> > Hi Instruct ICC:
> >
> > > >     I'm now wondering if some error is occurring that, for some
> > > > reason, is silently ending the routine.  I'm building what may be a
> > > > very long SQL INSERT statement for each line in the CSV file that I'm
> > > > reading; could I be hitting some upper limit for the length of the
> > > > SQL code?  I'd think that an error would be presented in this case,
> > > > but maybe I have to do something explicitly to force all errors to
> > > > display?  Even warnings?
> > > >
> > > >     Another thing I've noticed is that the "timeout" (I'm not even
> > > > certain the problem IS a timeout any longer, hence the quotation
> > > > marks) doesn't happen at the same record every time.  That's why I
> > > > thought it was a timeout problem at first, and assumed that the
> > > > varying load on the server would account for the different record
> > > > numbers processed.  If I were hitting some problem with the SQL
> > > > statement, I'd expect it to stop at the same record every time.  Or
> > > > is that misguided thinking, too?
> > >
> > > 1) When you say, "doesn't happen at the same record every time" are you
> > > using the same dataset and speaking about the same line number?  Or are
> > > you using different datasets and noticing that the line number varies?
> > > If it's the same dataset, it sounds like "fun" -- as in "a pain in the
> > > assets".
> >
> >     Yup, same dataset.  It took me forever to upload it, so I'm trying to
> > keep it there until I know it's been successfully loaded.  It's got about
> > 30,000 records in it, and each one has 240 fields.
> >
> > > 2) I'm writing something similar; letting a user upload a CSV file via
> > > a webpage, then creating an SQL query with many records.  So now I'll
> > > be watching your thread.  For debugging purposes, create your SQL
> > > statement and print it out on the webpage (or save it somewhere --
> > > maybe a file).  Don't have your webpage script execute the query.
> > > Then see if you get the complete query you expect.  Then copy that
> > > query into a database tool like phpmyadmin and see if you get errors
> > > when executing the query.
> >
> >     Sounds much like what I'm trying to do.  I have had to give up, for
> > the time being, on using PHP to upload the datafile; it's about 56 MB in
> > size and nothing I do seems to let me upload anything larger than a 2MB
> > file. :(
> >
> >     How do I save the individual query statements to a file?  That may
> > give me a good option for checking a "log" of activity when the process
> > fails again.
> >
> >     Thanks for your suggestions!
> >
> >         Jon
> >
> > --
> > PHP General Mailing List (http://www.php.net/)
> > To unsubscribe, visit: http://www.php.net/unsub.php
>
>

--- End Message ---
--- Begin Message ---
Hi Instruct ICC:

> > How do I save the individual query statements to a file? That may give
> > me a good option for checking a "log" of activity when the process fails
> > again.
>
> I'm assuming you are building an $sql variable, so you would write that to
> a file instead of executing it in a query. Look at an example here
> http://php.he.net/manual/en/function.fwrite.php to write data to a file.

    Thanks for the link; I should have guessed that fwrite() was the
command, but I'm starting to get so cross-eyed and synapse-tied here that my
thinking is really muzzy.

    But get this -- I tried your suggestion: I'm writing the SQL query
text to a file and not touching the data in MySQL at all.  And it still
"times out", at constantly random points.

    If it happened in the exact same place every time, I could live with
that, but it's not.  It's entirely random (or at least I haven't seen the
pattern yet), and that just makes me ill.

    Jon

--- End Message ---
--- Begin Message ---
Hi Kristen:

> I'm jumping in here late, so I haven't seen previous posts.  Another
> possible place I have seen limiting post/upload sizes:
>
> There is an Apache directive called LimitRequestSize or somesuch which
> will take precedence (silently) over any PHP max post size you set.  I
> found this set once before in an <apache_root>/conf.d/php.conf file that I
> eventually found.

    I just conducted another test of my process, this time not even creating
the SQL INSERT string in the program, but just trying to read every record
from the input file and just writing a count of records to the file.  It
worked, much to my surprise and pleasure.  But then, when I put back the
code to build the string, it timed out again.

    I'm starting to believe that I'm either using up some resource that
isn't being released, or that the directive you mentioned, LimitRequestSize,
is being trounced upon.  Any ideas on how to find out the value currently
set for this directive?  Or how I can override it?  Can I override it within
my own .htaccess file?

    Thanks for the suggestion!

    Jon

--- End Message ---
--- Begin Message ---
On 11/5/07, Jon Westcot <[EMAIL PROTECTED]> wrote:
>     I'm really starting to hate shared servers.

    In general, keep in mind that a shared server is for low-level
activities.  If you're doing something that big, you may want to move
to a VPS, a smaller hosting company such as <plug
shame="False">PilotPig.net, which might be able to set up an
environment to match your needs, but still cost about the same as a
shared hosting plan</plug>, or even a dedicated server.  In fact, if
you check out the banner at the top of the webpage for the CentOS
project, you'll see that a dedicated server runs about $49/mo.  Then
you have to add on cPanel/Plesk/Helm/whatever you want, and manage it,
but if you're comfortable with that, it's not that expensive.

-- 
Daniel P. Brown
[office] (570-) 587-7080 Ext. 272
[mobile] (570-) 766-8107

Give a man a fish, he'll eat for a day.  Then you'll find out he was
allergic and is hospitalized.  See?  No good deed goes unpunished....

--- End Message ---
--- Begin Message ---
Jon Westcot wrote:
> Hi all:

just show us the code.

--- End Message ---
--- Begin Message ---
Nathan Nobbe wrote:
> On 11/5/07, Jochem Maas <[EMAIL PROTECTED]> wrote:

...

yes yes yes to all that, except for one thing. I'm personally of
the opinion that such 'import' operations should involve no human interaction
and be guaranteed to complete (e.g. auto-resume), save for possibly
initializing the process. the way I see it, you want to try to guarantee that
the import will be atomic in such cases.

there is nothing to stop you having a fancy UI that polls the server
to check the job table (as exampled earlier in this thread) for the status
of a job (and subsequently, possibly retrieve some processed output).

the two processes ('init & review' and 'run job') should be independent; any
form of control (e.g. a cancellation) should happen via some sort of IPC
mechanism and should be optional, apart from possibly initialization (depending
on business requirements). successful import completion should not have to
rely on the user pressing 'continue', or even staying on the page, or anything
of that nature.

also consider that with regard to such tasks it's not efficient to
have someone staring at a progress bar; and it's annoying for the user after
a few seconds, regardless of how pretty.

> 
> -nathan
> 

--- End Message ---
--- Begin Message ---
Jônata Tyska Carvalho wrote:
> with regard to things not to do, don't f'ing reply off-list (unless asked),
> etiquette asks that you keep the conversation on the mailing list. if you
> wAnt to call me an ass because you don't like the way I tried to help that's
> fine
> but please do it in public :-)
> 
> sorry, but I thought that when I hit the reply button it was replying to
> the list, not to the last user who replied. reply-all is the right button
> to hit.

yeah. list idiosyncrasy I guess. sorry for being grouchy (another list
idiosyncrasy I guess).

> 
> 
> On Nov 5, 2007 2:01 PM, Jochem Maas <[EMAIL PROTECTED]> wrote:
> 
>> Jônata Tyska Carvalho wrote:
>>> Well if i cant, i cant! But dont say i dont need!!!!
>> why not?
>>
>>>  Im working with a
>>> framework that works in that way. Need to put the name of the column of
>> please name the framework - I like to know what to avoid :-)
>>
>>> my database like the name of the input. And i CANNOT alter the source
>>> code of framework.
>> why not?
>>
>>  Thanks for your time anyway.
>>
>> you could use the auto_prepend_file to unbork the $_POST array prior to
>> this application running - you can set the auto_prepend_file directive in
>> the relevant webserver config.
>>
>> with regard to things not to do, don't f'ing reply off-list (unless
>> asked),
>> etiquette asks that you keep the conversation on the mailing list. if you
>> wAnt to call me an ass because you don't like the way I tried to help
>> that's fine
>> but please do it in public :-)
>>
>>> On Nov 5, 2007 1:44 PM, Jochem Maas <[EMAIL PROTECTED]
>>> <mailto:[EMAIL PROTECTED]>> wrote:
>>>
>>>     Jônata Tyska Carvalho wrote:
>>>     > then, there is no way to make it work??? i just wanna use an input
>>>     > name like table.name; if i need to write another solution i know
>>>     > how to do it, but right now i need to use the name in that way. =/
>>>
>>>     NO YOU DON'T - you need to work around the problem and make your
>>>     code work.
>>>     you may want 'table.name' but alas you can't have it. so use some
>>>     super simple character substitution in order to work around it. easy
>>>     enough, if rather annoying.
>>>
>>>     then again, unless you're developing something like phpMyAdmin or a
>>>     custom report generation tool, you probably shouldn't be [needing to]
>>>     place table names anywhere in output that goes to the browser .. it
>>>     just doesn't seem right or necessary ... that said you may have a
>>>     perfectly sound reason :-)
>>>
>>>     >
>>>     > On Nov 5, 2007 11:22 AM, Jochem Maas <[EMAIL PROTECTED]> wrote:
>>>     >
>>>     >     Stut wrote:
>>>     >     > Jônata Tyska Carvalho wrote:
>>>     >     >> Im having a big problem because the name of one input type
>>>     >     >> text that is 'table.name' in my html becomes 'table_name'
>>>     >     >> in php, it is a kind of bug??
>>>     >     >> =S
>>>     >     >>
>>>     >     >> <form method="post">
>>>     >     >> <input type="text" name="table.name">
>>>     >     >> </form>
>>>     >     >>
>>>     >     >> in PHP we have:
>>>     >     >>
>>>     >     >> $_POST["table_name"] instead of $_POST["table.name"]
>>>     >     >>
>>>     >     >> someone knows some way to put this to work?? i wanna send
>>>     >     >> 'table.name' and receive in php 'table.name'!
>>>     >     >
>>>     >     > I don't know for certain but that's likely happening because
>>>     >     > a period is not valid in a PHP variable name. One alternative
>>>     >     > would be to use table[name] instead. This will lead to
>>>     >     > $_POST['table']['name'].
>>>     >
>>>     >     I think Stut is correct - the period is a concatenation
>>>     >     operator.
>>>     >     also there are plenty of alternatives to Stut's suggested
>>>     >     'table[name]' naming approach.
>>>     >
>>>     >     that said, given the following code:
>>>     >
>>>     >            $f = "my.bad";
>>>     >            $$f = "MY BAD";
>>>     >            echo $f, "\n", $$f, "\n";
>>>     >
>>>     >     ... I personally feel that $_POST should just contain
>>>     >     'table.name' - which is not an illegal array key.
>>>     >     most likely the reason the var name is transformed is due to
>>>     >     BC: with register_globals set to ON, php is required to
>>>     >     automatically create a variable $table.name (which is not
>>>     >     legal).
>>>     >
>>>     >     >
>>>     >     > -Stut
>>>     >     >
>>>     >
>>>     >
>>>     >
>>>     >
>>>     > --
>>>     > Jônata Tyska Carvalho
>>>     > -----------------------------------------------------------------
>>>     > -- Computer Technician, Colégio Técnico Industrial (CTI)
>>>     > -- Computer Engineering undergraduate
>>>     > Fundação Universidade Federal de Rio Grande (FURG)
>>>
>>>
>>>
>>>
>>> --
>>> Jônata Tyska Carvalho
>>> -----------------------------------------------------------------
>>> -- Computer Technician, Colégio Técnico Industrial (CTI)
>>> -- Computer Engineering undergraduate
>>> Fundação Universidade Federal de Rio Grande (FURG)
>>
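A minimal sketch of the auto_prepend_file idea mentioned above: rebuild $_POST from the raw request body, where the original key names (dots intact) still survive. This only handles application/x-www-form-urlencoded posts, and the function name is made up:

```php
<?php
// Rebuild form data from a raw urlencoded body; unlike PHP's own parser,
// this keeps '.' in key names instead of turning it into '_'.
function parse_raw_post($raw)
{
    $fixed = array();
    foreach (explode('&', $raw) as $pair) {
        if ($pair === '') {
            continue;
        }
        list($k, $v) = array_pad(explode('=', $pair, 2), 2, '');
        $fixed[urldecode($k)] = urldecode($v);
    }
    return $fixed;
}

// In the prepend file itself (pointed to by the auto_prepend_file
// directive in the webserver config):
if (isset($_SERVER['REQUEST_METHOD']) && $_SERVER['REQUEST_METHOD'] === 'POST') {
    $_POST = parse_raw_post(file_get_contents('php://input'));
}
```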

--- End Message ---
--- Begin Message ---
Richard Heyes wrote:
> <?php
>     unlink('/tmp/sess_' . session_id());

unlink(session_save_path().'/'.session_id()); // no?

> ?>
> 
> You'll need to know the session_id of the session you want to close. The
> code above closes/ends the current users session, but simply substitute
> the desired session id (of course you'll need to know this in advance)
> for the call to session_id().
> 
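Building on the correction above, here is a hedged sketch that only assumes the default "files" session handler; the function name is invented, and basename() keeps a crafted id from escaping the session directory:

```php
<?php
// Remove the session file for a known session id ("files" handler only).
function destroy_session_by_id($id)
{
    $path = session_save_path();
    // Some configs use an "N;/path" depth prefix -- keep only the path.
    if (strpos($path, ';') !== false) {
        $path = substr($path, strrpos($path, ';') + 1);
    }
    if ($path === '') {
        $path = sys_get_temp_dir();   // PHP's fallback location
    }
    $file = $path . '/sess_' . basename($id);
    return file_exists($file) ? unlink($file) : false;
}
```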

--- End Message ---
--- Begin Message ---
Hi,
  I have been trying for almost 2 hours now to get this working:

I had Apache and PHP with modules installed as CGI (I also tried it as an Apache module).
Now I have added the php_pgsql.dll extension (in php.ini) and also installed PostgreSQL.

Apache shows an error when starting: "module could not be found".

The path is 100% OK. I also tried 3 different copies of php_pgsql.dll; one showed a different error dialog saying I needed to change some settings in Postgres (but that was probably because that DLL was for PHP 4, and I have PHP 5 - the newest version).

Please help!
Martin

--- End Message ---
--- Begin Message ---

Hello, list!

I'm trying to translate some code from Python to PHP so it will work on my server.

The Python code is:
x = zlib.decompressobj()
x.decompress(blah) # blah is a 100KB string

When I try gzuncompress() on the same data (I checked), it returns a "Data error". I also tried gzinflate() and many user-created gzdecode() functions, with no luck.

After many hours of Googling and test scripts, I have found little information.

I think the problem could be a memory problem. When I try to decompress a shorter string, it works perfectly.

The data fed to the decompression function is the same and therefore should be valid.

The user comments for gzuncompress() say that it might not decompress some zlib data.

Any ideas on how to fix this problem?
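For what it's worth, Python's zlib.compress() emits an RFC 1950 (zlib-wrapped) stream, which gzuncompress() is the matching function for; gzinflate() wants raw RFC 1951 deflate data. A hedged sketch that tries both (the fallback that strips the 2-byte zlib header is a workaround seen in the wild, not guaranteed):

```php
<?php
// Try the zlib-wrapped format first, then fall back to raw deflate.
function try_uncompress($data)
{
    $out = @gzuncompress($data);      // RFC 1950 (zlib header + adler32)
    if ($out !== false) {
        return $out;
    }
    // Workaround: skip the 2-byte zlib header and inflate the raw
    // RFC 1951 stream inside.
    return @gzinflate(substr($data, 2));
}
```

If a binary-safe transfer still fails, comparing strlen() of the string on both ends is a cheap way to catch truncation, which produces the same "Data error".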

Thanks,
Casey

--- End Message ---
--- Begin Message ---
Hi all:

    As requested, here's the code:

<?php
if (isset($_POST['process'])) {
    $old_session_gc_maxlifetime = "";
    $old_max_execution_time = "";
    $old_max_input_time = "";
    $old_session_gc_maxlifetime = ini_set("session.gc_maxlifetime", "1800");
    $old_max_execution_time = ini_set("max_execution_time", "1800");   // 30 minutes
    $old_max_input_time = ini_set("max_input_time", "1800");   // 30 minutes -- doesn't work

    echo "<p>Session.gc_maxlifetime: " . ini_get("session.gc_maxlifetime") . "</p>\n";  // shows 1800
    echo "<p>Max execution time: " . ini_get("max_execution_time") . "</p>\n";  // shows 1800
    echo "<p>Max input time:     " . ini_get("max_input_time") . "</p>\n";  // shows -1

    ignore_user_abort(TRUE);
    set_time_limit(0);

    $query = mysql_query("TRUNCATE evall;");

    echo "<p>Results of Truncate: $query</p>\n";

    $myfile_replace = "uploads/evall.csv";
    $handle = @fopen($myfile_replace, 'rb');
    $save_handle = @fopen("tmp/sql_calls.txt", "wb");

    $process_count = 0;
    if (!$handle) {
        echo "<p>The file ($myfile_replace) could not be opened.<br /></p>\n";
        flush();
    } else {
        echo "<p>The file ($myfile_replace) opened correctly.<br /></p>\n";
        flush();

        $headings = fgetcsv($handle, 10000, ",");  // Just ignore the first row returned.
        $row = 0;
        while (($data = fgetcsv($handle, 10000, ",")) !== FALSE) {
            $row++;
            $num = count($data);
            $insert_query = "INSERT INTO evall VALUES(";
            for ($c = 0; $c < $num; $c++) {
                if ($c > 0) {
                    $insert_query .= ",";
                }
                $insert_query .= '"' . $data[$c] . '"';
            }
            $insert_query .= ");";

            if (fwrite($save_handle, $row . "; " . strlen($insert_query) . ": " . "\n") === FALSE) {
                echo "<p>The query could not be written.</p>\n";
            } else {
                $process_count++;
            }
            if ($row % 1000 == 0) {
                echo "$row records processed so far<br />\n";
                flush();
            }
        }
        echo "<p>File import completed. $row records read; $process_count records added.</p>\n";
        flush();
        fclose($save_handle);
        fclose($handle);
    }

    ini_set("session.gc_maxlifetime", $old_session_gc_maxlifetime);
    ini_set("max_execution_time", $old_max_execution_time);
    ini_set("max_input_time", $old_max_input_time);
}
?>
<form name="form" enctype="multipart/form-data" action="<?php echo $_SERVER['PHP_SELF']; ?>" method="POST" >
    <p>Version 1.9 -- The file uploading process presupposes that the user has uploaded the EVAll.CSV
    file to the &quot;uploads&quot; folder. If this has been done and you're ready to process
    it, click the &lt;Process&gt; button.</p>
  <input type="submit" name="process" value="Process" /><br/><br/>
</form>

--- End Message ---
--- Begin Message ---
I thought I had done it correctly by defining in the header:
    <meta http-equiv="Content-Type" content="text/html; charset=utf-8">
to display Chinese characters correctly. It works.

However, if you type into a form directly, each character gets translated
to a UTF-8 numeric entity, a sequence like &#....;
I could then count the number of entities and get the number of Chinese
characters.

Now I have found that there are other Chinese characters in my database
which I cannot read that way, but on the web they are displayed as correct
Chinese.

How can I count these Chinese characters?
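One hedged way to do the count, assuming the data is UTF-8 (possibly with numeric entities mixed in) and PCRE was built with Unicode property support; the function name is made up:

```php
<?php
// Count Han (Chinese) characters in a string that may contain either
// literal UTF-8 or numeric entities like &#20013;.
function count_han($s)
{
    // Turn &#NNNNN; entities into literal UTF-8 first.
    $s = html_entity_decode($s, ENT_QUOTES, 'UTF-8');
    // \p{Han} matches CJK ideographs; /u makes PCRE treat input as UTF-8.
    return preg_match_all('/\p{Han}/u', $s, $m);
}

// count_han('&#20013;&#25991; abc') counts the two Chinese characters
// and ignores the ASCII.
```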

bye

Ronald

--- End Message ---
--- Begin Message ---
Hi!

I have a script that reads a 120 MB remote file. This raises a Memory
Allocation Error unless I use:
     ini_set('memory_limit', '130M');

I doubt this is good for my server... I tried both fopen and
file_get_contents. This used to work fine in PHP 4 until I upgraded to
PHP 5.

Any ideas?
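A hedged sketch of the usual fix: stream the file in fixed-size chunks instead of pulling all 120 MB into one string, so memory use stays at the chunk size. The URL, destination, and function name are placeholders, and allow_url_fopen must be on for http:// sources:

```php
<?php
// Copy a (possibly remote) file in 64 KB chunks; returns bytes written
// or FALSE on failure. Memory use is bounded by $chunk, not file size.
function stream_copy_chunked($src, $dest, $chunk = 65536)
{
    $in = fopen($src, 'rb');
    if ($in === false) {
        return false;
    }
    $out = fopen($dest, 'wb');
    if ($out === false) {
        fclose($in);
        return false;
    }
    $bytes = 0;
    while (!feof($in)) {
        $bytes += fwrite($out, fread($in, $chunk));
    }
    fclose($in);
    fclose($out);
    return $bytes;
}

// stream_copy_chunked('http://example.com/big.dat', '/tmp/big.dat');
```

PHP 5 also ships stream_copy_to_stream(), which performs the same loop internally in C.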

Thanks.

--- End Message ---
