This will be significantly faster: push the rows into an array and append the joined string once.

var html = [];
$.each(data, function(i, row) {
    // build every row in memory first, instead of concatenating one
    // growing string or touching the DOM on each iteration
    html.push('<tr><td>data from json</td></tr>');
});

// one join + one append = a single DOM insertion
$("tbody").append(html.join(''));
$("table").formatTable();

On Feb 5, 12:56 pm, James <james.gp....@gmail.com> wrote:
> I see. Thanks for the tip. I'll try to work on that!
>
> On Feb 5, 10:38 am, Ricardo Tomasi <ricardob...@gmail.com> wrote:
>
> > Concatenating into a string is already much faster than appending
> > inside the loop, so there is not much room for improvement left. What
> > you can do to improve the user experience, though, is split that into
> > a recursive function over a setTimeout, so that the browser doesn't
> > freeze and you can display a nice loading animation.
>
> > James wrote:
> > > I need tips on optimizing a large DOM insert to lessen the "freeze" on
> > > the browser.
>
> > > Scenario:
> > > I receive a large amount of JSON 'data' through AJAX from a database
> > > (sorted the way I want viewed), and loop through them to add to a JS
> > > string, and insert that chunk of string into a tbody of a table. Then,
> > > I run a plug-in that formats the table (with pagination, etc.).
> > > Simplified sample code:
>
> > > var html = '';
> > > $.each(data, function(i, row) {
> > >      html += '<tr><td>data from json</td></tr>';
> > > });
> > > $("tbody").append(html);
> > > $("table").formatTable();
>
> > > formatTable() requires that the table be "completed" before it can
> > > be executed.
> > > Is there any way I can optimize this further? I think I've read
> > > somewhere that building a very long string is not good, but I've
> > > also read that updating the DOM on each iteration is even worse.
>
> > > Any advice would be appreciated!
