Is there a reason why the entire dataset is needed all at once?  Some
sort of pagination scheme would help.
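
To make the idea concrete, here is a minimal sketch of server-side pagination (in Python for brevity; the real endpoint would be PHP, and the record fields here are invented): the client asks for one page at a time instead of pulling all 5,000 records up front.

```python
def get_page(records, page, page_size=100):
    """Return one page of records plus paging metadata."""
    start = page * page_size
    chunk = records[start:start + page_size]
    return {
        "page": page,
        "page_size": page_size,
        "total": len(records),
        "records": chunk,
    }

# Illustrative data: 5,000 small records, as in the original post.
records = [{"id": i, "name": "item %d" % i} for i in range(5000)]
first = get_page(records, page=0)
# Only 100 records cross the wire; the UI requests later pages on demand.
```

The Flex side would then request the next page as the user scrolls or pages through the grid, keeping both the transfer and the object-creation cost per request small.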
 
Jeff
 
-----Original Message-----
From: flexcoders@yahoogroups.com [mailto:[EMAIL PROTECTED] On
Behalf Of Tracy Spratt
Sent: Wednesday, May 21, 2008 3:40 PM
To: flexcoders@yahoogroups.com
Subject: RE: [flexcoders] 5,000 record dataset needs love



        Are you certain the bottleneck is the "processing" as opposed to
the rendering?

        Tracy

         

        
________________________________


        From: flexcoders@yahoogroups.com
[mailto:[EMAIL PROTECTED] On Behalf Of Tom Longson
        Sent: Tuesday, May 20, 2008 10:53 PM
        To: flexcoders@yahoogroups.com
        Subject: [flexcoders] 5,000 record dataset needs love

         

        Dear Super Smart Flex Developer Mailing List,
        
        We are currently having major issues processing a dataset that
is
        essential for our skunkworks web site. The dataset is stored as
JSON,
        consists of 5,000 records, and has numerous strings. It is 1.4 MB
        uncompressed / 85 KB compressed. Processing the data, which involves
        creating a custom object to hold it, currently takes as much as
        60 seconds.
        
        We are in the process of attacking this beast to make it run
faster,
        and we are considering the following approaches:
        
        1. Create an index of the repetitive strings within the dataset and
        store integer keys in each record instead, since integer assignment
        should be faster than string assignment (we assume).
        2. Try substituting XML for JSON (no idea if this would be
faster or not).
        3. Attempt to deliver an ActionScript binary blob to the application
        from PHP (not even sure if that's possible... ASON?).
        4. Create a compiled SWF with our data postprocessed and attempt to
        access it (again, not sure if that's possible).
        5. <insert your solution here>
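
        For what it's worth, approach 1 can be sketched quickly (Python here
        just to show the shape; the field name is invented):

```python
def intern_strings(records, field):
    """Replace record[field] with an index into a shared string table,
    so each repeated string is stored once and records carry small ints."""
    table = []
    seen = {}
    for rec in records:
        value = rec[field]
        if value not in seen:
            seen[value] = len(table)
            table.append(value)
        rec[field] = seen[value]
    return table

# Illustrative: 5,000 records that all repeat the same string.
records = [{"city": "San Francisco"} for _ in range(5000)]
table = intern_strings(records, "city")
# The payload now carries the string once, plus 5,000 small integers.
```

        Whether this actually beats plain string assignment in the Flash
        player is the open question; the win that is certain is the smaller
        uncompressed payload.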
        
        Your expert, snarky, helpful advice is much appreciated,
        Tom

        

         
