[PHP] Ob_Flush issue

2008-02-26 Thread Ritesh Nadhani
Hello

I have a sample code like: http://pastebin.ca/919386

I have around 4000 rows returned, so it should show me partial output
at the client after every 100 rows, but it never does. I only get the
output after full completion.

Though if I remove the stepping code and output after every row, then I
can see the updates.

My phpinfo(): http://craig.cs.uiowa.edu/smt/phpinfo.php

Any idea what might be the problem? I want to show a status message
after every 100 rows processed.
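A minimal sketch of the flushing pattern that usually makes the partial
output appear; the loop is illustrative, and it assumes output_buffering
and zlib.output_compression are off in php.ini (a buffering proxy or
mod_deflate in between can still hold everything back):

<?php
// Sketch: stream a status line every 100 rows.
while (ob_get_level() > 0) {
    ob_end_flush();        // close any buffer PHP opened automatically
}
echo str_pad('', 1024);    // some browsers render nothing until ~1KB arrives
flush();

for ($i = 1; $i <= 4000; $i++) {
    // ... process row $i here ...
    if ($i % 100 == 0) {
        echo "Processed $i rows<br>\n";
        flush();           // push the output to the web server / client
    }
}
echo "Done.\n";
?>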

-- 
Ritesh
http://www.riteshn.com

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] Copying 1000s files and showing the progress

2008-02-14 Thread Ritesh Nadhani
Thanks all.

I will work on all the options and let you know how it went :)

On Thu, Feb 14, 2008 at 10:37 AM, Nathan Rixham <[EMAIL PROTECTED]> wrote:
>
> Jochem Maas wrote:
>  > Ritesh Nadhani schreef:
>  >> On Feb 13, 2008 6:03 PM, Richard Lynch <[EMAIL PROTECTED]> wrote:
>  >>> On Wed, February 13, 2008 4:28 am, Ritesh Nadhani wrote:
>  >>>> I have a situation where I have to copy something like 1000 files one
>  >>>> by one to a temporary folder, tar it using the system tar command, and
>  >>>> let the user download the tar file.
>  >>>>
>  >>>> Now while the copy is going on at the server, I want to show some
>  >>>> progress to the user on the client side. Most of the tutorials I found
>  >>>> on the net were about showing progress while a file is being uploaded
>  >>>> from client to server. In that case the client has the info, but in my
>  >>>> case the client has no info.
>  >>>>
>  >>>> A similar problem was solved at
>  >>>> http://menno.b10m.net/blog/blosxom/perl/cgi-upload-hook.html but it's
>  >>>> in Perl and uses some form of hook. I have no clue how to do it in
>  >>>> PHP.
>  >>>>
>  >>>> Any clues or the right direction would be awesome.
>  >>> First of all, don't do that. :-)
>  >>>
>  >>> Instead, set up a "job" system of what should be copied/tarred, and
>  >>> then notify the user via email.
>  >>>
>  >>> Don't make the user sit there waiting for the computer!
>  >>>
>  >>> If you absolutely HAVE to do this due to a pointy-haired boss...
>  >>>
>  >>> <?php
>  >>>   $path = "/full/path/to/1000s/of/files";
>  >>>   $dir = opendir($path) or die("Change that path");
>  >>>   $tmp = uniqid('job'); // tmpname() does not exist; any unique name works
>  >>>   mkdir("/tmp/$tmp");   // the target directory must exist before copying
>  >>>   while (($file = readdir($dir)) !== false){
>  >>>     if ($file === '.' || $file === '..') continue; // skip dir entries
>  >>>     echo "$file\n";
>  >>>     copy("$path/$file", "/tmp/$tmp/$file"); // destination file, not $path
>  >>>   }
>  >>>   exec("tar -cf /tmp/$tmp.tar /tmp/$tmp/", $output, $error);
>  >>>   echo implode("\n", $output);
>  >>>   if ($error){
>  >>>     // handle error here!
>  >>>     die("OS Error: $error");
>  >>>   }
>  >>> ?>
>  >>>
>  >>> shameless plug:
>  >>> //handle error here could perhaps use this:
>  >>> http://l-i-e.com/perror.
>  >>>
>  >>> --
>  >>> Some people have a "gift" link here.
>  >>> Know what I want?
>  >>> I want you to buy a CD from some indie artist.
>  >>> http://cdbaby.com/from/lynch
>  >>> Yeah, I get a buck. So?
>  >>>
>  >>>
>  >>
>  >> I was actually doing what you just gave the code for. As of now, I was
>  >> doing something like:
>  >>
>  >> """"
>  >> for file in files:
>  >>    copy from source to folder
>  >>    echo "Copying files encapsulated in ob_flush()"
>  >>
>  >> tar the files, which takes hardly any time on the filesystem but does
>  >> take some time
>  >>
>  >> Provide the link to the tar
>  >> """
>  >>
>  >> So at the client side it was like:
>  >>
>  >> Copying file #1
>  >> Copying file #2
>  >> 
>  >> Download link
>  >>
>  >> I thought I could apply some funkiness to it by using some AJAX-based
>  >> progress bar, for which the example showed some sort of hooking that
>  >> I thought was too much for such a job. I will talk to my boss
>  >> regarding this and do the necessary.
>  >>
>  >> BTW, what's the issue with the AJAX-based approach? Any particular reason
>  >> other than that it seems to be a hack rather than an elegant solution
>  >> (which is more than enough reason not to implement it... but I wonder
>  >> if there is a technical reason too)?
>  >
>  > the problem with the AJAX approach is the fact that there is absolutely
>  > no reason for a user to sit staring at the screen. Hit 'Go', get an
>  > email when it's done, and do something else in the meantime.
>  >
>  > of course a PHB might demand functionality that gives him/her an excuse to
>  > watch a progress bar ... in which case why not waste man-hours making it a
>  > funky web 2.0 deal ... heck, go the whole hog and use the 'comet technique'
>  > to push update info to the browser.
>  >
>  >>
>
>  Can't see any problem with going down the ajax route myself (depending
>  on how many concurrent users you expect) - a simple ajax-style poll
>  every second would suffice.
>
>  You could always use the fancily named yet really old "comet technique"
>  as well: slap the script in a "magic" iframe and ob_flush whenever you
>  have new data to send.
>
>  To be honest, both are trade-offs on a technology which isn't quite there
>  yet. With ajax you've got to spawn a new thread for every request (say 1
>  per second, 100 concurrent users, that's 100 requests per second minimum
>  on your server), OR with comet you've got 100 long-lasting worker threads
>  going.
>
>  If you do go comet, it's worth investigating the event MPM for Apache 2.x:
>  http://httpd.apache.org/docs/2.2/mod/event.html
>
>  Nath
>
>
>
>  --
>  PHP General Mailing List (http://www.php.net/)
>  To unsubscribe, visit: http://www.php.net/unsub.php
>
>
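A bare-bones sketch of the "magic iframe" / comet technique Nathan
describes above; the file name and the parent-page JavaScript functions are
invented for illustration:

<?php
// progress_iframe.php -- loaded in a hidden <iframe>; each flushed
// <script> block calls a function in the parent page to update a display.
set_time_limit(0);
while (ob_get_level() > 0) ob_end_flush();

$total = 1000;
for ($done = 1; $done <= $total; $done++) {
    // ... copy one file here ...
    echo "<script>parent.updateProgress($done, $total);</script>\n";
    flush();
}
echo "<script>parent.copyFinished();</script>\n";
?>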



-- 
Ritesh
http://www.riteshn.com

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] Copying 1000s files and showing the progress

2008-02-13 Thread Ritesh Nadhani
On Feb 13, 2008 6:03 PM, Richard Lynch <[EMAIL PROTECTED]> wrote:
>
> On Wed, February 13, 2008 4:28 am, Ritesh Nadhani wrote:
> > I have a situation where I have to copy something like 1000 files one
> > by one to a temporary folder, tar it using the system tar command, and
> > let the user download the tar file.
> >
> > Now while the copy is going on at the server, I want to show some
> > progress to the user on the client side. Most of the tutorials I found
> > on the net were about showing progress while a file is being uploaded
> > from client to server. In that case the client has the info, but in my
> > case the client has no info.
> >
> > A similar problem was solved at
> > http://menno.b10m.net/blog/blosxom/perl/cgi-upload-hook.html but it's
> > in Perl and uses some form of hook. I have no clue how to do it in
> > PHP.
> >
> > Any clues or the right direction would be awesome.
>
> First of all, don't do that. :-)
>
> Instead, set up a "job" system of what should be copied/tarred, and
> then notify the user via email.
>
> Don't make the user sit there waiting for the computer!
>
> If you absolutely HAVE to do this due to a pointy-haired boss...
>
> <?php
>   $path = "/full/path/to/1000s/of/files";
>   $dir = opendir($path) or die("Change that path");
>   $tmp = uniqid('job'); // tmpname() does not exist; any unique name works
>   mkdir("/tmp/$tmp");   // the target directory must exist before copying
>   while (($file = readdir($dir)) !== false){
>     if ($file === '.' || $file === '..') continue; // skip dir entries
>     echo "$file\n";
>     copy("$path/$file", "/tmp/$tmp/$file"); // destination file, not $path
>   }
>   exec("tar -cf /tmp/$tmp.tar /tmp/$tmp/", $output, $error);
>   echo implode("\n", $output);
>   if ($error){
>     // handle error here!
>     die("OS Error: $error");
>   }
> ?>
>
> shameless plug:
> //handle error here could perhaps use this:
> http://l-i-e.com/perror.
>
> --
> Some people have a "gift" link here.
> Know what I want?
> I want you to buy a CD from some indie artist.
> http://cdbaby.com/from/lynch
> Yeah, I get a buck. So?
>
>

I was actually doing what you just gave the code for. As of now, I was
doing something like:

""""
for file in files:
   copy from source to folder
   echo "Copying files encapsulated in ob_flush()"

tar the files, which takes hardly any time on the filesystem but does take some time

Provide the link to the tar
"""

So at the client side it was like:

Copying file #1
Copying file #2

Download link
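In PHP, that loop comes out roughly as below; the paths are illustrative,
and the ob_flush()/flush() pair is what actually pushes each status line out:

<?php
// Rough PHP equivalent of the pseudocode above (illustrative paths).
$files = glob('/path/to/source/*');
$n = 0;
foreach ($files as $src) {
    copy($src, '/path/to/tmpdir/' . basename($src));
    $n++;
    echo "Copying file #$n<br>\n";
    @ob_flush();   // flush PHP's output buffer, if one is active
    flush();       // hand the output to the web server
}
exec('tar -cf /path/to/archive.tar -C /path/to tmpdir');
echo '<a href="/path/to/archive.tar">Download link</a>';
?>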

I thought I could apply some funkiness to it by using some AJAX-based
progress bar, for which the example showed some sort of hooking that
I thought was too much for such a job. I will talk to my boss
regarding this and do the necessary.

BTW, what's the issue with the AJAX-based approach? Any particular reason
other than that it seems to be a hack rather than an elegant solution
(which is more than enough reason not to implement it... but I wonder
if there is a technical reason too)?
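For the AJAX route, the usual shape is a worker that records its progress
somewhere shared plus a tiny endpoint the browser polls every second or so;
a sketch with invented file names:

<?php
// progress.php -- polled from the browser. The copy loop writes its state
// after each file, e.g. file_put_contents('/tmp/copy_progress.txt', "$n/$total");
header('Content-Type: text/plain');
$state = @file_get_contents('/tmp/copy_progress.txt');
echo ($state !== false) ? $state : '0/0';
?>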

-- 
Ritesh
http://www.riteshn.com

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



[PHP] Copying 1000s files and showing the progress

2008-02-13 Thread Ritesh Nadhani
Hello All

I have a situation where I have to copy something like 1000 files one
by one to a temporary folder, tar it using the system tar command, and
let the user download the tar file.

Now while the copy is going on at the server, I want to show some progress
to the user on the client side. Most of the tutorials I found on the net
were about showing progress while a file is being uploaded from client to
server. In that case the client has the info, but in my case the client
has no info.

A similar problem was solved at
http://menno.b10m.net/blog/blosxom/perl/cgi-upload-hook.html but it's
in Perl and uses some form of hook. I have no clue how to do it in
PHP.

Any clues or the right direction would be awesome.
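The "job" system suggested in the replies (earlier in this digest) amounts
to recording the request and letting a cron-driven worker do the copy/tar;
a hypothetical sketch, with an invented table and columns:

<?php
// enqueue.php -- reply to the user immediately, do the work later via cron.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare(
    "INSERT INTO jobs (path, email, status) VALUES (?, ?, 'pending')");
$stmt->execute(array($_POST['path'], $_POST['email']));
echo 'Your archive is being built; you will get an email with a download link.';
// worker.php, run from cron, picks up 'pending' jobs, copies and tars the
// files, marks them 'done', and mails the user the link.
?>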

-- 
Ritesh
http://www.riteshn.com

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] PHP with XML database

2007-01-29 Thread Ritesh Nadhani

Thank you Bernhard and Richard

Your comments have been really helpful. I have a meeting with the
professor on Wednesday and I will lay down all these points in front of
him then.

On 1/29/07, Richard Lynch <[EMAIL PROTECTED]> wrote:

On Fri, January 26, 2007 6:10 pm, Ritesh Nadhani wrote:
> Hello
>
> Bernhard Zwischenbrugger wrote:
>> Hi
>>
>> Some questions
>>
>>> As part of my research under my professor I have to implement a web
>>> interface to their benchmarking data.
>>>
>>> PHP is the chosen web language but we are a little worried about the
>>> database. The benchmark data comes to us in XML format (e.g.
>>> http://www.matf.bg.ac.yu/~filip/ArgoLib/smt-lib-xml/Examples/FolEq1.xml).
>>> We have to implement an interface to query them, get data, update
>>> etc.
>>
>> You can parse the XML, extract the data, put it to an SQL DB and move
>> the XML to /dev/null (delete it).
>> If you do that, you don't need an XML DB.
>> Is this possible?
>>
>
> No. My professor is dead against that. Many people have suggested
> doing that. Why? Are XML databases incorrectly implemented, or just bad?

I think it's safe to say that at least some XML databases are
incorrectly implemented, or even bad. :-)

The answers you keep getting are because without a driving reason to
use the features specific to XML, using XML as the database backend is
probably a Bad Idea.

So unless you explain WHY you want the XML db, and unless those
reasons make sense to need an XML db, you'll keep getting people
telling you, effectively, to "Don't do that."
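For a concrete sense of what that extraction route looks like, a
hypothetical sketch; the element names and table are invented, and the
real benchmark schema would drive the mapping:

<?php
// Flatten one benchmark XML file into a SQL table (invented schema).
$xml = simplexml_load_file('benchmark.xml');
$pdo = new PDO('mysql:host=localhost;dbname=benchmarks', 'user', 'pass');
$ins = $pdo->prepare('INSERT INTO results (name, value) VALUES (?, ?)');
foreach ($xml->result as $r) {
    $ins->execute(array((string)$r['name'], (string)$r->value));
}
?>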

>>> We can even change the schema in the form of attributes. The data size
>>> would be around 100 MB per XML file, with around 100 different XMLs.
>>
>> What do you mean by "different XMLs"?
>> Are you looking for a machine that makes SQL tables from XML?
>> What is inside of the 100MB XML? Your example is a MathML Formula.
>>
>
> By different XMLs I meant different XML files. So we can have 40 XML
> files of around 50-100 MB each.
>
> Yes, they will have lots of those MathML formulas. It's benchmarking
> data from a theoretical group that my professor works with. They have
> all their database in XML, so a relational database is not possible;
> otherwise we would have to convert them between XML and relational
> all the time.

How exactly will your converted XML get folded back into the other
data stores?...

This is your big challenge, upon which everything else hinges.

>>> The load would be max 5-10 users at any given time, batch updates once
>>> a month, and heavy load probably 2-3 times a month. Mission criticality
>>> is not important; it can go down sometimes. Which db would you suggest?

You have to define "heavy load" - unless your max 5-10 *is* the "heavy
load", in which case you'd have to work hard to screw this up enough to
not be able to handle the load, I think...

Though I dunno for sure what performance of XML backend is like...

>>> I did some Google research and as of now I like eXist, Sedna (they
>>> seem to have good PHP wrapper support) and Timber. Another important
>>> thing would be good documentation and support.

I've never even heard of eXist nor Sedna, which is why I'm suggesting
you choose something a bit (okay a LOT) more mainstream like MySQL,
where you'll have zillions of fellow users to help out.

> So my question is: out of the six systems listed above, which one
> would you suggest? Has anybody used any of the above systems before?
> What are your experiences?

I've never actually been silly enough to insist on storing actual
data as XML and pretending it's a DB, rather than storing it as a DB and
then writing scripts to output it as if it were XML...

Unless you've got shared data segments scattered all across the world,
forcing it all into XML is probably the First Mistake.  But since
that's your professor's problem, and not yours, I can see how you've
got little choice in the matter.

--
Some people have a "gift" link here.
Know what I want?
I want you to buy a CD from some starving artist.
http://cdbaby.com/browse/from/lynch
Yeah, I get a buck. So?





--
Ritesh
http://www.riteshn.com

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] PHP with XML database

2007-01-26 Thread Ritesh Nadhani

Hello

Bernhard Zwischenbrugger wrote:

Hi

Some questions

As part of my research under my professor I have to implement a web 
interface to their benchmarking data.


PHP is the chosen web language but we are a little worried about the 
database. The benchmark data comes to us in XML format (e.g. 
http://www.matf.bg.ac.yu/~filip/ArgoLib/smt-lib-xml/Examples/FolEq1.xml).

We have to implement an interface to query them, get data, update etc.


You can parse the XML, extract the data, put it to an SQL DB and move
the XML to /dev/null (delete it).
If you do that, you don't need an XML DB.
Is this possible?



No. My professor is dead against that. Many people have suggested
doing that. Why? Are XML databases incorrectly implemented, or just bad?


We can even change the schema in the form of attributes. The data size
would be around 100 MB per XML file, with around 100 different XMLs.


What do you mean by "different XMLs"?
Are you looking for a machine that makes SQL tables from XML?
What is inside of the 100MB XML? Your example is a MathML Formula.



By different XMLs I meant different XML files. So we can have 40 XML
files of around 50-100 MB each.


Yes, they will have lots of those MathML formulas. It's benchmarking data
from a theoretical group that my professor works with. They have all
their database in XML, so a relational database is not possible; otherwise
we would have to convert them between XML and relational all the time.
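At 50-100 MB per file, a tree parser would hurt; if the files ever do get
converted, a pull parser keeps memory flat. A sketch using PHP's XMLReader
(the element name is invented):

<?php
// Stream a large benchmark XML file node by node instead of loading it all.
$reader = new XMLReader();
$reader->open('benchmark.xml');
$doc = new DOMDocument();
while ($reader->read()) {
    if ($reader->nodeType == XMLReader::ELEMENT && $reader->name == 'formula') {
        // lift just this element out as a small SimpleXML fragment
        $node = simplexml_import_dom($doc->importNode($reader->expand(), true));
        // ... write $node's data to the relational store here ...
    }
}
$reader->close();
?>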


The load would be max 5-10 users at any given time, batch updates once a
month, and heavy load probably 2-3 times a month. Mission criticality is
not important; it can go down sometimes. Which db would you suggest?


I did some Google research and as of now I like eXist, Sedna (they seem to
have good PHP wrapper support) and Timber. Another important thing would be
good documentation and support.


With an XML DB you can query data using XPath. Is that the thing you
want? Oracle supports that, for example.
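For a sense of what XPath querying looks like from PHP (via the built-in
SimpleXML here, not any particular XML database; the path is invented):

<?php
$xml = simplexml_load_file('benchmark.xml');
// all results that took longer than 60 seconds (invented schema)
$slow = $xml->xpath('//result[@time > 60]');
foreach ($slow as $r) {
    echo $r['name'], "\n";
}
?>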



Yeah, but looking at
http://www.oracle.com/technology/tech/xml/xmldb/index.html, I could not
find whether it's free. I might be wrong, but the info is not easily found
there.


On the contrary, IBM's offering at
http://www-306.ibm.com/software/data/db2/express/download.html looks FREE.


The problem is that both of the above databases are beasts in themselves
and I just require 10% of what they do :)


We looked into Berkeley DB also, but their support for PHP is not that
great. We would have to compile the module ourselves, etc. (this is not a
problem though, as we have the technical know-how to do that), but a lot of
people have suggested that it's not a very stable system. I have not done
any benchmarking on it, but I don't want to change my underlying DB a couple
of months down the line just because we found out it's not stable.


Apart from the above big three, the other free and reasonably good
implementations seem to be eXist, Timber and Sedna. I am just saying
this from reading their websites; I have not used them.


So my question is: out of the six systems listed above, which one would
you suggest? Has anybody used any of the above systems before? What are
your experiences?



Bernhard



Ritesh






--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



[PHP] PHP with XML database

2007-01-26 Thread Ritesh Nadhani

Hello all

As part of my research under my professor I have to implement a web 
interface to their benchmarking data.


PHP is the chosen web language but we are a little worried about the 
database. The benchmark data comes to us in XML format (e.g. 
http://www.matf.bg.ac.yu/~filip/ArgoLib/smt-lib-xml/Examples/FolEq1.xml).

We have to implement an interface to query them, get data, update etc.

We can even change the schema in the form of attributes. The data size
would be around 100 MB per XML file, with around 100 different XMLs.


The load would be max 5-10 users at any given time, batch updates once a
month, and heavy load probably 2-3 times a month. Mission criticality is
not important; it can go down sometimes. Which db would you suggest?


I did some Google research and as of now I like eXist, Sedna (they seem to
have good PHP wrapper support) and Timber. Another important thing would be
good documentation and support.


Any suggestions?

Ritesh

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



[PHP] [ANN]Webyog releases FREE edition of SQLyog

2004-12-05 Thread Ritesh Nadhani
Hello,
Webyog, the creator of SQLyog, the most popular GUI for MySQL, has released
SQLyog v4.0.

Starting from v4.0, SQLyog is available in two editions: SQLyog and SQLyog 
Enterprise.

SQLyog is FREE for personal and commercial use.
SQLyog contains all the features of SQLyog Enterprise - except the following
Power Tools:

* HTTP / SSH Tunneling - Manage MySQL even if your ISP disallows remote
connections
* Data Synchronization - Zero install MySQL Replication
* Schema Synchronization - Keep test and production databases in sync
* Notification Services - Send formatted result sets over email at regular
intervals
* ODBC Import - Wizard driven painless migration to MySQL
Please visit the following link to view the feature comparison between
SQLyog and SQLyog Enterprise.
http://www.webyog.com/sqlyog/featurematrix.html
Thanks for your attention!
Regards,
Ritesh
http://www.webyog.com 

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


Re: [PHP] Big table dump stopping in between

2004-11-08 Thread Ritesh Nadhani
Hello,
- Original Message - 
From: "Jason Wong" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Monday, November 08, 2004 9:59 PM
Subject: Re: [PHP] Big table dump stopping in between


> On Monday 08 November 2004 07:14, Ritesh Nadhani wrote:
>
> Please do not top post.

I don't know what you mean by that. I just pressed Reply All in my Outlook.

> > Here is a page that shows phpinfo() on my webserver:
> > http://www.webyog.com/indexsam.php
> > As I can see, the configure command has '--enable-safe-mode', but in the
> > PHP core configuration safe_mode is set to off.
>
> It's probably because safe_mode is disabled in php.ini.
>
> > The max_execution_time is given as 30 by default, but in my PHP I have
> > set set_time_limit(30).
>
> If you want to give your script infinite time to run (i.e. as much time as
> it needs) use:
>
>   set_time_limit(0);
>
> at the beginning of your script.

Sorry for my previous post, but I am actually doing set_time_limit(0) at the
top of my script.

> --
> Jason Wong -> Gremlins Associates -> www.gremlins.biz
> Open Source Software Systems Integrators
> * Web Design & Hosting * Internet & Intranet Applications Development *
> --
> Search the list archives before you post
> http://marc.theaimsgroup.com/?l=php-general
> --
> /*
> /* now make a new head in the exact same spot */
> -- Larry Wall in cons.c from the perl source code
> */
--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


Re: [PHP] Big table dump stopping in between

2004-11-07 Thread Ritesh Nadhani
Hello,
Here is a page that shows phpinfo() on my webserver:
http://www.webyog.com/indexsam.php
As I can see, the configure command has '--enable-safe-mode', but in the PHP
core configuration safe_mode is set to off.

The max_execution_time is given as 30 by default, but in my PHP I have
set set_time_limit(30).

Do you find anything unusual?
Regards,
Ritesh
- Original Message - 
From: <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Saturday, November 06, 2004 11:41 AM
Subject: RE: [PHP] Big table dump stopping in between


Why on earth would you want to echo 80k rows?? :)
Anyways...
1) If you have safe mode on, set_time_limit doesn't work.
2) You might check your max_execution_time value defined in the php.ini
Nate
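A quick way to check both settings Nate mentions from inside a script
(ini_get just reads the loaded configuration):

<?php
var_dump(ini_get('safe_mode'));          // set_time_limit() is a no-op when on
var_dump(ini_get('max_execution_time')); // per-script limit from php.ini
?>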
-Original Message-
From: Ritesh Nadhani [mailto:[EMAIL PROTECTED]
Sent: Friday, November 05, 2004 9:43 PM
To: [EMAIL PROTECTED]
Subject: [PHP] Big table dump stopping in between
Hello,
I have a PHP page.
In the script I connect to one of our tables having more than 80K rows and
dump the data using the echo command.

When I run this page from my browser, the process always stops halfway
through and only half the data is dumped. I can reproduce this problem
every time. I am on a 512Kbps DSL line.

By stopping I mean that from IE, about 50K rows are dumped and then the
process stops.

I have even tried set_time_limit(0) so that we have an unlimited timeout,
but it's still giving the same error.

Do I need to configure something on the server so that the complete data is
dumped?
Ritesh
--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


Re: Re: [PHP] Big table dump stopping in between

2004-11-06 Thread Ritesh Nadhani
Hello,
> 
> From: Jason Wong <[EMAIL PROTECTED]>
> Date: 2004/11/06 Sat PM 06:01:28 EST
> To: [EMAIL PROTECTED]
> Subject: Re: [PHP] Big table dump stopping in between
> 
> On Saturday 06 November 2004 05:43, Ritesh Nadhani wrote:
> 
> > In the script I connect to one of our tables having more than 80K rows
> > and dump the data using the echo command.
> >
> > When I run this page from my browser, the process always stops halfway
> > through and only half the data is dumped. I can reproduce this problem
> > every time. I am on a 512Kbps DSL line.
> >
> > By stopping I mean that from IE, about 50K rows are dumped and then the
> > process stops.
> >
> > I have even tried set_time_limit(0) so that we have an unlimited timeout,
> > but it's still giving the same error.
> 
> Do you actually get an error message?
> 

No.

> > Do I need to configure something on the server so that the complete data
> > is dumped?
> 
> Have you ruled out the possibility that it's IE that's barfing? Try a 
> different browser and/or try using a download utility to just download the 
> page to file, or link to that page from another page so you can right-click 
> "Save link as...".
> 

I even tried an HTTP utility, but it also stops halfway through. I think it's
a problem with the ISP's server. It looks like some server option has to be
configured.

> -- 
> Jason Wong -> Gremlins Associates -> www.gremlins.biz
> Open Source Software Systems Integrators
> * Web Design & Hosting * Internet & Intranet Applications Development *
> --
> Search the list archives before you post
> http://marc.theaimsgroup.com/?l=php-general
> --
> /*
> Accuracy, n.:
>  The vice of being right
> */
> 
> -- 
> PHP General Mailing List (http://www.php.net/)
> To unsubscribe, visit: http://www.php.net/unsub.php
> 
> 

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] Big table dump stopping in between

2004-11-05 Thread Ritesh Nadhani
Hello,
I won't be doing it every time. I just need to stress test something and
wanted to know how my app behaves when 80K+ rows are dumped.

I will look into the options you have given and let you know the result.
Regards,
Ritesh
- Original Message - 
From: <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Saturday, November 06, 2004 11:41 AM
Subject: RE: [PHP] Big table dump stopping in between


Why on earth would you want to echo 80k rows?? :)
Anyways...
1) If you have safe mode on, set_time_limit doesn't work.
2) You might check your max_execution_time value defined in the php.ini
Nate
-Original Message-
From: Ritesh Nadhani [mailto:[EMAIL PROTECTED]
Sent: Friday, November 05, 2004 9:43 PM
To: [EMAIL PROTECTED]
Subject: [PHP] Big table dump stopping in between
Hello,
I have a PHP page.
In the script I connect to one of our tables having more than 80K rows and
dump the data using the echo command.

When I run this page from my browser, the process always stops halfway
through and only half the data is dumped. I can reproduce this problem
every time. I am on a 512Kbps DSL line.

By stopping I mean that from IE, about 50K rows are dumped and then the
process stops.

I have even tried set_time_limit(0) so that we have an unlimited timeout,
but it's still giving the same error.

Do I need to configure something on the server so that the complete data is
dumped?
Ritesh
--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


[PHP] Big table dump stopping in between

2004-11-05 Thread Ritesh Nadhani
Hello,
I have a PHP page.
In the script I connect to one of our tables having more than 80K rows and
dump the data using the echo command.

When I run this page from my browser, the process always stops halfway
through and only half the data is dumped. I can reproduce this problem
every time. I am on a 512Kbps DSL line.

By stopping I mean that from IE, about 50K rows are dumped and then the
process stops.

I have even tried set_time_limit(0) so that we have an unlimited timeout,
but it's still giving the same error.

Do I need to configure something on the server so that the complete data is
dumped?
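A sketch of a streaming approach for a dump this size, using the mysql_*
API current in 2004 (connection details and table name are illustrative):
an unbuffered query avoids holding all 80K rows in memory, and periodic
flushes keep output moving, though a proxy or the browser itself can still
give up on a response this large.

<?php
set_time_limit(0);           // no PHP time limit (ignored when safe mode is on)
$conn = mysql_connect('localhost', 'user', 'pass');
mysql_select_db('mydb', $conn);

// Unbuffered: rows come from MySQL as we read them, not all at once.
$res = mysql_unbuffered_query('SELECT * FROM bigtable', $conn);
$n = 0;
while ($row = mysql_fetch_row($res)) {
    echo implode("\t", $row), "\n";
    if (++$n % 1000 == 0) {
        flush();             // push what we have so far to the client
    }
}
flush();
?>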

Ritesh 

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


[PHP] A new GUI client for mySQL

2002-05-19 Thread Ritesh Nadhani

greetings...
Webyog has released mySQLyog, a Win32-based query
analyzer. It's FREE and gives all the features that a
MySQL developer can ask for from a query analyzer.
Please visit www.webyog.com
You can execute queries with results of more than 10
records, it supports multiple query execution, and you can
export your data to XML, HTML and CSV and can also
import from a text file.
It also allows you to execute the last query again and lets
you add your favourite query to a personal folder with one
click of the mouse, so you don't have to save it and open
it.
Please take a look at it and feel free to send your
suggestions to [EMAIL PROTECTED]
Ritesh


__
Do You Yahoo!?
LAUNCH - Your Yahoo! Music Experience
http://launch.yahoo.com

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php