I did a little search and came across this page:

http://www.soapatterns.org/asynchronous_queuing.asp

While it may not be an exact answer, it does suggest some possibilities. For instance, each of the Ajax requests could be added to a "queue" for processing, and the queue would handle everything and fire an event/callback when it's done. This is similar in some ways to what I suggested earlier, but maybe a little more robust in its approach.
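To make that idea a little more concrete, here's a rough, untested sketch of what such a queue might look like in jQuery terms (ajaxQueue, add, and run are just names I'm making up):

//Sketch of a queue: collect $.ajax settings objects, run them one
//after another, and fire a single callback when the last one is done.
var ajaxQueue = {
  items: [],
  add: function (settings) {
    this.items.push(settings);
  },
  run: function (onDone) {
    var queue = this;
    function next() {
      if (queue.items.length === 0) { onDone(); return; }
      var settings = queue.items.shift();
      var userSuccess = settings.success;
      settings.success = function (data) {
        if (userSuccess) { userSuccess(data); }
        next(); //kick off the next queued request
      };
      $.ajax(settings);
    }
    next();
  }
};

Then somewhere you'd call ajaxQueue.add({...}) for each request and ajaxQueue.run(finalCallback) at the end. Requests run one at a time here, not in parallel, but the "fire an event when the queue is empty" part is the interesting bit.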

Still digging.

Shawn

Shawn wrote:

I have a similar situation and am watching this thread in hopes something useful may come of it.

What I've done for now is the nested calls. I use a common errorHandler() function that takes the element in which to show the message and the message itself. More often than not the message is just the .responseText of the XHR object. But this still feels clumsy.
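For what it's worth, such a handler can be as simple as something roughly like this (the selector and markup here are only placeholders):

//shared error handler: show a message inside the given element,
//typically called from an Ajax error callback with xhr.responseText
function errorHandler(el, message) {
  $(el).html('<div class="error">' + message + '</div>');
}

//e.g.  error: function (xhr) { errorHandler("#employeeMsg", xhr.responseText); }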

What I've considered doing is to create a temporary structure that contains a flag for each of the sub-processes. The success function for each sub-process would set its flag to "complete" and then call the final update function. The final update function would check each of the flags and, once all are set, do whatever final processing (if any) is needed.

For instance, I need to a) get a list of employees, b) get a list of tasks for all employees, and c) render a table of the employees' tasks and apply some code to give the table a fixed header and column. I already have existing routines to get the employees and tasks, so I can reuse those here. So my code might potentially look like this:

var mydata = {
  flags: { employees: false, tasks: false },
  employees: [],
  tasks: [],
  render: function () {
    //don't continue if we don't have enough data
    if ( !mydata.flags.employees || !mydata.flags.tasks ) { return; }

    //code to render the output goes here.
  }
};

$.ajax({
  url: "getEmployees.php",
  dataType: "json",
  success: function (json) {
    mydata.employees = json;
    mydata.flags.employees = true;
    mydata.render();
  }
});

$.ajax({
  url: "getTasks.php",
  dataType: "json",
  success: function (json) {
    mydata.tasks = json;
    mydata.flags.tasks = true;
    mydata.render();
  }
});


Obviously this is not thoroughly thought out yet, and has room for optimization (I don't think the flags are really required if you just check the data properties instead). But it would allow the two Ajax requests to run concurrently and still produce the same output. Whichever Ajax request finishes last fires the rendering (and it doesn't matter which).
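For instance, the guard at the top of render() could probably just become something like this (assuming the arrays start out empty and are only populated by the success callbacks):

if ( !mydata.employees.length || !mydata.tasks.length ) { return; }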

The nested approach would take roughly twice as long (but may be more reliable?), since the second request isn't made until AFTER the first has completed. This may be a moot point for this sample problem, but I know I have a couple of places where 4 or 5 Ajax requests are needed to complete one task.
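For comparison, the nested version of the same two requests would look roughly like this; the second request doesn't even start until the first one's success fires:

$.ajax({
  url: "getEmployees.php",
  dataType: "json",
  success: function (employees) {
    mydata.employees = employees;
    //only now does the second request start
    $.ajax({
      url: "getTasks.php",
      dataType: "json",
      success: function (tasks) {
        mydata.tasks = tasks;
        mydata.render();
      }
    });
  }
});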

The problem with the concurrent approach is that the "all done" handling is not taken care of automagically; each Ajax request has to set its flag and call the render function itself. But this entire process could in theory be wrapped up into a single function. I guess the real question is whether that function can be abstracted enough to be generic. Right now, I don't see how.
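Actually, thinking about it a bit more, maybe something like this would be a start (whenAll is just a name I'm inventing, and this is an untested sketch with no error handling yet):

//run several $.ajax calls in parallel and invoke `done` once with all
//results, in the order the requests were passed in
function whenAll(requests, done) {
  var results = [];
  var remaining = requests.length;
  $.each(requests, function (i, settings) {
    var userSuccess = settings.success;
    settings.success = function (data) {
      if (userSuccess) { userSuccess(data); }
      results[i] = data;
      remaining--;
      if (remaining === 0) { done(results); }
    };
    $.ajax(settings);
  });
}

//which would make the employee/task example above look like:
whenAll([
  { url: "getEmployees.php", dataType: "json" },
  { url: "getTasks.php", dataType: "json" }
], function (results) {
  mydata.employees = results[0];
  mydata.tasks = results[1];
  mydata.render();
});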

I don't think this idea is revolutionary or anything. Probably kinda obvious and/or flawed. But perhaps it'll get you (us?) moving in a positive direction.

Now I'm curious though and want to try out some things... I'll report back if I have any success...

Shawn


h0tzen wrote:


On 5 Mar., 15:40, J Moore <[EMAIL PROTECTED]> wrote:
wouldn't nesting the methods work? e.g.
Unfortunately not, as some methods have to be invoked in parallel.
Generally, exactly this kind of nesting looks fine with no real code
behind it, but it is just cruel once you imagine having error handling,
rollbacks, business logic, etc. inside it.

What I'm looking for is some pattern that abstracts the async callbacks,
or just a real-world example/solution from someone who has had the same
issues with logic involving multiple, dependent Ajax calls.

thanks,
kai
