While developing rich internet applications with a lot of Ajax requests, sooner or later you come to the idea of controlling the request flow. Users continuously interact with the site's GUI, which generates massive numbers of Ajax requests passing through load balancers, firewalls, virtual servers, etc. A problem may happen on any layer - from a deadlock in the database to an exception in the firewall; at the same time, the user neither knows nor cares about these complex issues in the background.
There are several ways to manage the Ajax request flow, and I will show one of them: controlling the request queue with JavaScript and jQuery.
The idea is not to bomb an overloaded server with tons of requests. Instead, we assume that if requests take too long to process, there is some issue on the server side and we should react somehow. With the described approach we organize a queue of requests, while quick ones can still be processed simultaneously.
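The queueing half of this approach can be sketched without jQuery, using plain Promises: each queued request is chained onto the previous one's promise, so they run strictly one after another while anything outside the queue stays concurrent. The names below (makeQueue, enqueue, tail) are illustrative only, not part of the controller shown later.

```javascript
// Minimal sketch of promise-based request serialization (plain JS, no jQuery).
// makeQueue/enqueue/tail are illustrative names, not the article's API.
function makeQueue() {
    let tail = Promise.resolve();            // promise of the last queued request
    return function enqueue(requestFn) {     // requestFn: () => Promise
        const result = tail.then(requestFn); // start only after the previous one settles
        tail = result.catch(() => {});       // a failed request must not block the queue
        return result;
    };
}

// Usage: the second "request" waits for the first, slower one.
const enqueue = makeQueue();
const order = [];
const first = enqueue(() => new Promise(resolve =>
    setTimeout(() => { order.push("first"); resolve(); }, 20)));
const second = enqueue(() => { order.push("second"); return Promise.resolve(); });
```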
First, let's express the idea with tests. Imagine we have a web service that processes one request per two seconds, and our web client continuously sends requests to that service.
module("Test.RequestController.js");

asyncTest("Test: sending requests", 6, function () {
    var url = "/TestWebService";
    var data = { "testData": "data" };
    var controller = new RequestController();

    controller.ajax({ 'url': url, 'data': data }).done(function () { ok("done1"); });
    controller.ajax({ 'url': url, 'data': data }).done(function () { ok("done2"); });

    setTimeout(function () {
        controller.ajax({ 'url': url, 'data': data }).done(function () { ok("done3"); });
        controller.ajax({ 'url': url, 'data': data }).done(function () { ok("done4"); });
    }, 1000);

    setTimeout(function () {
        controller.ajax({ 'url': url, 'data': data }).done(function () { ok("done5"); });
        controller.ajax({ 'url': url, 'data': data }).done(function () {
            ok("done6");
            start();
        });
    }, 2000);
});
Our test service has a two-second delay. But let's say that after a three-second delay the user should be warned about the problem and/or the Ajax requests should be redirected. We can wrap the jQuery.ajax function with our RequestController, which encapsulates the required logic.
First, let's define the constructor with preset data.
RequestController = function () {
    this.requestCounter = 0;     // Number of too long requests
    this.timeout = 3000;         // Default timeout for "long" requests - 3 sec
    this.maxLongRequests = 3;    // Max amount of simultaneous long requests
    this.requestPromise = jQuery(this).promise();
};
Then we should wrap jQuery ajax with our queue:
RequestController.prototype.ajax = function (ajaxParams) {
    var ajaxRequest = function () {
        var isLongRequest = false;
        var ajaxPromise = jQuery.ajax(ajaxParams).always(function () {
            if (isLongRequest) {
                this.requestCounter--;
            }
        }.bind(this));
        setTimeout(function () {
            if (ajaxPromise.state() === "pending") {
                this.requestCounter++;
                isLongRequest = true;
            }
        }.bind(this), this.timeout);
        return ajaxPromise;
    }.bind(this);

    if (this.requestCounter >= this.maxLongRequests) {
        // show warning to the user
        // and warn system admin or redirect request
        return this.requestPromise = this.requestPromise.pipe(ajaxRequest);
    } else {
        return this.requestPromise = ajaxRequest();
    }
};
The idea is simple enough: we have a wrapper function that calls jQuery.ajax. When the request completes or fails, we check whether it was a "long" one and, if so, decrease the pending-requests counter. In the setTimeout callback we check whether the request has finished; if not, the request can be considered long: we increase the counter and set the long-request flag.
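The counter mechanics can be illustrated in isolation with plain Promises (makeTracker and its names are hypothetical, they only mirror the logic above): a timer marks a still-pending request as long, and settling a long request releases its slot.

```javascript
// Sketch of the "long request" counter from the wrapper above, in plain JS.
// makeTracker/track are illustrative names, not part of RequestController.
function makeTracker(timeoutMs) {
    let longCount = 0;
    return {
        track(promise) {
            let settled = false;
            let isLong = false;
            promise.finally(() => {
                settled = true;
                if (isLong) longCount--;   // a long request finished: free its slot
            });
            setTimeout(() => {
                if (!settled) {            // still pending after timeoutMs => "long"
                    isLong = true;
                    longCount++;
                }
            }, timeoutMs);
            return promise;
        },
        get count() { return longCount; }
    };
}

// Usage: a 50 ms "request" against a 10 ms threshold is counted as long
// while it is pending, and released once it resolves.
const tracker = makeTracker(10);
const slow = tracker.track(new Promise(resolve => setTimeout(resolve, 50)));
```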
Finally, we check whether the flow has reached the maximum allowed number of long requests. If not, we just return the request promise to the caller. If it has, we can show a warning and put the request into the queue described earlier.
The request flow looks like this: