php-general Digest 1 Feb 2010 00:28:47 -0000 Issue 6568

Topics (messages 301702 through 301704):

Re: In need of PHP to JS collapsible array printing routine?
        301702 by: Rene Veerman

flush() in conjunction with gzipped content
        301703 by: Richard

Appalling Dreamweaver performance
        301704 by: clancy_1.cybec.com.au

Administrivia:

To subscribe to the digest, e-mail:
        [email protected]

To unsubscribe from the digest, e-mail:
        [email protected]

To post to the list, e-mail:
        [email protected]


----------------------------------------------------------------------
--- Begin Message ---
http://mediabeez.ws/htmlMicroscope/

I'll be cleaning up & releasing the 1.3.0 code today / early next week.

On Sat, Jan 30, 2010 at 12:54 AM, Daevid Vincent <[email protected]> wrote:
> I'm wondering if anyone has a PHP debug-type routine that will take a PHP
> array and output it to the web page, but make all the dimensions of the
> array collapsible, ideally showing each sub-key (or index) as the name to
> click to expand it again.
>
> I'm dealing with some rather huge datasets in multi-dimensional hashes and
> printing them out with a beautified print_r() is just not cutting it
> anymore. I need to collapse them down to wrap my head around them. Some of
> them have 6 or more dimensions!
>
> I use jQuery for JS if that helps (in case your routine requires that too).
>
>
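A minimal sketch of the kind of routine being asked for (my own illustration,
not part of htmlMicroscope): recurse over the array and emit collapsible HTML
with <details>/<summary>, so each sub-key becomes a clickable label. jQuery
could drive the toggling instead; the nested markup is the point.

```php
<?php
// Hypothetical helper (not from this thread): render a nested array as
// collapsible HTML. Each array level becomes a <details> block whose
// <summary> shows the key and element count; scalars are printed inline.
function dump_collapsible($value, $key = 'root') {
    $label = htmlspecialchars((string)$key, ENT_QUOTES);
    if (is_array($value)) {
        $html = "<details><summary>$label (" . count($value) . ")</summary><ul>";
        foreach ($value as $k => $v) {
            $html .= '<li>' . dump_collapsible($v, $k) . '</li>';
        }
        return $html . '</ul></details>';
    }
    return "<b>$label:</b> " . htmlspecialchars(var_export($value, true), ENT_QUOTES);
}

// Example: a three-dimensional hash, collapsed by default in the browser.
echo dump_collapsible(['a' => ['b' => 1, 'c' => [2, 3]]]);
```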

--- End Message ---
--- Begin Message ---
Hi,

This page: http://developer.yahoo.com/performance/rules.html#flush
recommends using a flush() call just after the </head> tag. However,
does this even have an effect when using ob_gzhandler() ?

Cheers.

-- 
Richard Heyes
HTML5 canvas graphing: RGraph - http://www.rgraph.net (updated 31st January)
Lots of PHP and Javascript code - http://www.phpguru.org

--- End Message ---
--- Begin Message ---
I use Dreamweaver as my editor, mainly because I'm familiar with it, although I only use about 1% of its capabilities. However, it generally handles long files well. The other day I downloaded the two shortest of Brian Dunning's sets of test data *. I opened the shortest in Dreamweaver, had a quick look at it, and realised I would have to replace the quote, comma, quote separators with semicolons as part of converting the files to my format.

So I thought I would do that while I was working out what else I had to do. I entered the old separator and the replacement in the 'Find and replace' window, then hit 'Replace all', expecting the job to be done in a few seconds. At first I thought nothing was happening, but then I realised it was trudging through the file as if it were wading through waist-high molasses.

So I closed the results window and opened another file, but a few seconds later focus switched back to the original file. I tried a couple more times, but each time it returned to the original window. I watched in morbid fascination for a bit, then decided I would let it go, just to see how long it took.

The file contained 500 lines and was about 80 K. It was taking five seconds to process each line, and eventually finished in about 40 minutes.

The problem appeared to be the results processing. I have only looked at the results list about twice, out of idle curiosity, and never saw anything that I thought could be remotely useful. I would like to be able to turn results logging off altogether, as it wastes real estate (and time!), but this appears to be impossible.

On this occasion the program was apparently writing a new line every time it replaced a separator (9 times in each line), and then, when it finished processing a line, it would erase all the intermediate result lines and write a new one for the whole line. At the same time it reopened the results window if I had closed it, and returned focus to the file being processed.

I then wrote a PHP program to read the file, split it, clean up and re-arrange the various elements, enter them into an array in my format, and finally save it as a file my program could handle.

After I had got this running on the 500-line file I used it to process the 5000-line file. The whole process was done in the blink of an eye -- literally a fraction of a second.
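A rough reconstruction of that kind of converter (the original program isn't shown, so the details here are my guesses): read the quote-comma-quote CSV with fgetcsv(), which handles the quoting itself, rebuild each record with semicolon separators, and write it back out. A tiny inline sample stands in for the downloaded 500-line file.

```php
<?php
// Hypothetical stand-in for the downloaded test data (two records).
file_put_contents('sample.csv', "\"Rebbecca\",\"Didio\",\"AU\"\n\"Stevie\",\"Hallo\",\"US\"\n");

$in  = fopen('sample.csv', 'r');
$out = fopen('converted.txt', 'w');
while (($fields = fgetcsv($in)) !== false) {
    // clean up / re-arrange the fields here as required
    fwrite($out, implode(';', $fields) . "\n");
}
fclose($in);
fclose($out);

echo file_get_contents('converted.txt');
```

Because fgetcsv() parses the quoting for you, no literal find-and-replace on the `","` separator is needed at all, and a run over thousands of lines is effectively instantaneous.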


* http://www.briandunning.com/sample-data/


--- End Message ---
