$.each() isn't the best choice in this case; a good old-fashioned for
loop will be significantly faster.

Also, build an array of strings instead of concatenating onto a single
one. Calling array.push() is faster than using the += operator on a
string. Once the loop finishes, use join('') to get the array's
contents as a single string.

var myArray = [];

// Cache the length so it isn't re-read on every iteration.
for (var i = 0, len = data.length; i < len; i++)
{
    myArray.push('<tr><td>data from json</td></tr>');
}

// One single DOM update instead of one per row.
$('tbody').append(myArray.join(''));
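To make the technique concrete, here is a minimal, self-contained sketch of the build-then-join step using a made-up sample `data` array and field names (`name`, `score` are illustrative; your real rows come from the AJAX response). The jQuery calls are left as comments since only the string-building part runs on its own:

```javascript
// Hypothetical sample standing in for the JSON from the server.
var data = [
    { name: 'Alice', score: 10 },
    { name: 'Bob',   score: 20 }
];

// Collect each row's markup in an array rather than appending to a string.
var rows = [];
for (var i = 0, len = data.length; i < len; i++) {
    rows.push('<tr><td>' + data[i].name + '</td><td>' + data[i].score + '</td></tr>');
}

// Join once at the end to produce the final chunk of HTML.
var html = rows.join('');

// Then a single DOM insert, and format only after the table is complete:
// $('tbody').append(html);
// $('table').formatTable();
```

The key point is that the DOM is touched once, after the loop, which avoids the per-iteration reflow cost the original poster was worried about.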



On Feb 5, 7:03 pm, James <james.gp....@gmail.com> wrote:
> I need tips on optimizing a large DOM insert to lessen the "freeze" on
> the browser.
>
> Scenario:
> I receive a large amount of JSON 'data' through AJAX from a database
> (sorted the way I want viewed), and loop through them to add to a JS
> string, and insert that chunk of string into a tbody of a table. Then,
> I run a plug-in that formats the table (with pagination, etc.).
> Simplified sample code:
>
> var html = '';
> $.each(data, function(i, row) {
>      html += '<tr><td>data from json</td></tr>';});
>
> $("tbody").append(html);
> $("table").formatTable();
>
> formatTable() requires that the table has to be "completed" before it
> can be executed.
> Is there any way I can optimize this better? I think I've read
> somewhere that making a string too long is not good, but I've also
> read that updating the DOM on each iteration is even worse.
>
> Any advice would be appreciated!
