I'm wondering what the consensus is on how many simultaneous asynchronous ajax requests is generally allowable.
The reason I ask is that I'm working on a personal web app. For the most part I keep my requests down to one. However, there are a few situations where I send up to 4 requests simultaneously. This causes a bit of delay, as the browser will only handle 2 at a time.
The delay is not a problem in terms of usability, for now. And it'll be a while before I have to worry about scalability, if ever. But I am trying to adhere to best practices, as much as is reasonable. What are your thoughts? Is 4 requests a reasonable number?
I'm pretty sure the browser limits the number of connections you can have anyway.
If you have Firefox, type in about:config and look for network.http.max-connections-per-server and that will tell you your maximum. I'm almost positive that this will be the limit for AJAX connections as well. I think IE is limited to 2. I'm not sure about Chrome or Opera.
Edit:
In Firefox 23 the preference named network.http.max-connections-per-server doesn't exist, but there is a network.http.max-persistent-connections-per-server, whose default value is 6.
That really depends on whether it works properly that way. If the application's logic is built so that 4 simultaneous requests make sense, do it that way. If the logic isn't disturbed by packing multiple requests into one, you can do that instead, but only if it won't make the code more complicated. Keep it simple and straightforward until you actually have problems; then you can start to optimize.
But you can also ask yourself whether the design of the app can be improved so that there is no need for multiple requests.
Also test it on a really slow connection. Simultaneous HTTP requests are not necessarily executed on the server in the order they were sent, and they might also return in a different order. That can cause problems you'll only see on slower lines.
It's tough to answer without knowing some details. If you're just firing the requests and forgetting about them, then 4 requests could be fine, as could 20 - as long as the user experience isn't harmed by slow performance. But if you have to collect information back from the service, then coordinating those responses could get tricky. That may be something to consider.
The previous answer from Christian has a good point - check it on a slow connection. Fiddler can help with that as it allows you to test slow connections by simulating different connection speeds (56K and up).
You could also consider firing a single async request that could contain one or more messages to a controlling service, which could then hand the messages off to the appropriate services, collect the results and then return back to the client. Having multiple async requests being fired and then returning at different times could present a choppy experience for the user as each response is then rendered on the page at different times.
In my experience 1 is the best number, but I'll agree there may be some rare situations that might require simultaneous calls.
If I recall, IE is the only browser that still limits connections to 2. This causes your requests to be queued, and if your first 2 requests take longer than expected or time out, the other two requests will automatically fail. In some cases you also get the annoying "allow script to continue" dialog in IE.
If your user can't really do anything until all 4 requests come back (especially given IE's bogged-down JavaScript performance), I would create a single transport object that contains the data for all requests, and a returning transport object that can be parsed and delegated on return.
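For illustration, here is a rough sketch of that transport-object idea, assuming jQuery is available; the /batch endpoint, the envelope format, and the handlers map are placeholders, not an established API:

// delegate each bundled result to its own handler on return
var handlers = {
    profile: function (data) { /* render profile */ },
    messages: function (data) { /* render messages */ }
};

// the outgoing transport object carries the data for all requests
var transport = {
    requests: [
        { id: 'profile', action: 'getProfile', params: { userId: 42 } },
        { id: 'messages', action: 'getMessages', params: { unreadOnly: true } }
    ]
};

// one round trip instead of four; the server answers with one entry per request
$.post('/batch', JSON.stringify(transport), function (response) {
    response.results.forEach(function (result) {
        handlers[result.id](result.data);
    });
}, 'json');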
I'm not an expert in networking, but four probably wouldn't be much of a problem for a small to medium application. However, the larger it gets, the higher the server load, which could eventually cause problems. This doesn't really answer your question, but here is a suggestion: if delay is not a problem, why don't you use a queue?
var request = []; // a queue of the request payloads waiting to be sent to the server

// enqueue whatever you want to send to the server, then kick off sending
request.push({ example: 'data' });
startSend();

function startSend() { // if the new item is the only one queued, nothing is in flight, so send it
    if (request.length === 1) {
        send();
    }
}

function send() { // the ajax call to the server using the first request in the queue
    var sendData = request[0];
    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/your/endpoint', true); // placeholder endpoint
    xhr.setRequestHeader('Content-Type', 'application/json');
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            // when the response arrives, hand it to a function to process the data
            process(JSON.parse(xhr.responseText));
        }
    };
    xhr.send(JSON.stringify(sendData));
}

function process(data) {
    request.splice(0, 1); // remove the request that just finished
    if (request.length > 0) { // check to see if you need to do another ajax call
        send();
    }
    // process data
}
This probably isn't the best way to do it, but that's the idea; you could modify it to allow 2 requests in flight instead of just one. You could also modify it to send everything currently in the queue as a single request. The server then splits the batch up, processes each item, and sends the data back, either all at once or even as it goes, since the server can flush the data several times. You just have to make sure you are parsing the response text correctly.
At the risk of getting roasted for not posting code, what is the best way for getting around the 6 concurrent call limitation for ajax requests?
So currently I'm working with an application that can have up to 40 or so ajax requests on page load. For background, most of these requests are for various graphs, hidden behind tabs. There are also some requests that can be triggered by the user (such as updating the name of an entity without refreshing the page). This means that, with the limitation on concurrent requests, the user won't be able to change anything until there are only 5 other requests running, and that's an unacceptable user experience.
This may mean that the app is structured badly, but most of the things loading are not required right away.
Anyway, I've looked a bit into fetch() and webworkers but can't find any information on whether these would help get around the limitation.
One solution would be to put resources on different subdomains, but this makes the backend API unnecessarily complicated (and it's a browser issue, not a server issue).
I've considered these approaches:
delay requests until the user actively needs them (IMO this is a bad user experience, because they will have to wait a little bit very often, which is annoying)
create a queuing system that leaves open one spot for user-initiated requests (I'm not sure how to implement this, but it should be doable - see the sketch after this list)
restructure the API so that more data is returned per request (this again is mainly a backend solution that feels a little dirty and unRESTful. Also it won't necessarily improve the load time)
chaining calls as in Multiple Async AJAX Calls Best Practice (however, given there is an unpredictable number of calls to unrelated endpoints, I don't think this is all that practical here)
webworkers? (again, not sure if this could help, since this is used to multithread js)
fetch()? (I can't find info on whether this is subject to the same limitation)
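For what it's worth, here is a rough sketch of the queuing idea from the second bullet; MAX_PARALLEL, queueBackground, and userRequest are made-up names, not an existing API:

// Keep background traffic below the browser limit so one slot stays free
// for user-initiated requests.
var MAX_PARALLEL = 6;   // typical per-host connection limit
var RESERVED = 1;       // slots kept free for user-initiated calls
var active = 0;
var backlog = [];       // queued background requests (functions returning a promise)

function run(makeRequest) {
    active++;
    return makeRequest().then(
        function (result) { active--; drain(); return result; },
        function (err) { active--; drain(); throw err; }
    );
}

function drain() {
    while (backlog.length > 0 && active < MAX_PARALLEL - RESERVED) {
        run(backlog.shift());
    }
}

// background graph requests go through the queue...
function queueBackground(makeRequest) {
    if (active < MAX_PARALLEL - RESERVED) {
        return run(makeRequest);
    }
    return new Promise(function (resolve, reject) {
        backlog.push(function () { return makeRequest().then(resolve, reject); });
    });
}

// ...while user-initiated requests skip the backlog and use the reserved slot
function userRequest(makeRequest) {
    return run(makeRequest);
}

// usage:
// queueBackground(function () { return fetch('/api/graphs/1').then(function (r) { return r.json(); }); });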
This is very much opinion based.
40 requests is not unreasonable but depending on your server and site setup it can take quite a while.
With that many calls I would bundle some of them together in an initializePage=X call. This does involve some server-side work.
Delaying requests is not necessarily bad, depending on your estimated time to deliver. If possible you could present some kind of animation or "expected result" until the response ticks in, to keep the user entertained. The same applies to queuing your requests.
Restructuring your code to return everything in a bundle could also greatly speed up your site if you run a lot of initialization on your server (like security checks).
If performance is still a concern you can look into connections that provide faster results such as EventSource or WebSocket. Such a faster connection also allows for a more flexible approach to chaining. EventSource, for instance, supports events, so you could set several events on a single, bundled request and fire them as the server returns data.
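A rough sketch of that EventSource idea; the /init-page endpoint, the event names, and renderGraph are assumptions about how the bundled stream might look:

// One streamed connection delivers every graph as a named event,
// leaving the remaining connection slots free for user actions.
var source = new EventSource('/init-page?tabs=sales,traffic,errors');

source.addEventListener('sales', function (e) {
    renderGraph('sales', JSON.parse(e.data));   // renderGraph is assumed to exist
});
source.addEventListener('traffic', function (e) {
    renderGraph('traffic', JSON.parse(e.data));
});
source.addEventListener('done', function () {
    source.close();   // the server signals it has sent everything
});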
Web workers are not the answer, as the problem here is connection speed and concurrent connection limits.
I don't think we can answer this question directly. Several of the solutions you have mentioned are viable but vary in level of effort. If you are willing to adjust the architecture, you can consider a GraphQL approach, which can handle the bundling for you. You can also keep REST but add a special proxy service that bundles data for you. And don't let RESTfulness dictate or constrain how you develop.
Also, delaying requests until the user needs them seems like the appropriate choice to me. It's the basis for why we have "above the fold" CSS styling and infinite scrolling: load what is needed right now first, and defer the stuff that might not actually be needed until it is.
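For example, here is a rough sketch of deferring each graph request until its tab is actually opened; the selectors, data attributes, and renderGraph are assumptions:

// fetch each hidden graph only when its tab is first clicked
document.querySelectorAll('.tab').forEach(function (tab) {
    tab.addEventListener('click', function () {
        if (tab.dataset.loaded) return;        // load each graph only once
        tab.dataset.loaded = 'true';
        fetch('/api/graphs/' + tab.dataset.graphId)
            .then(function (res) { return res.json(); })
            .then(function (data) { renderGraph(tab.dataset.graphId, data); });
    });
});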
Concurrency of AJAX calls only comes into the picture when the requests are made from one thread. If a WebWorker is used with AJAX, then there are no issues at all, the reason being that each instance of a web worker is isolated in a thread separate from the main thread.
I call that JaxWeb, and I will be pushing a git repo in the coming week where you may find pure JS code that takes care of it. It is being tested right now, but yes, it does solve the problem.
Example:
Add the code below in JaxWeb.js
onmessage = function (e) {
    var JaxWeb = function (e) {
        var jax = {
            requestChannel: null,
            _csrf_token: null,
            get_csrf_token: function () {
                return this._csrf_token;
            },
            set_csrf_token: function (_csrf_token) {
                this._csrf_token = _csrf_token || null;
            },
            // build and send the XHR, then post the parsed response back to the main thread
            prepare: function (e) {
                this.requestChannel = new XMLHttpRequest();
                this.requestChannel.onreadystatechange = function () {
                    if (this.readyState == 4 && this.status == 200) {
                        postMessage(JSON.parse(this.responseText));
                    }
                };
                this.requestChannel.open(e.data.method, e.data.callname, true);
                if (e.data.token) { // only send the CSRF header if a token was supplied
                    this.requestChannel.setRequestHeader("X-CSRF-TOKEN", e.data.token);
                }
                var postData = '';
                if (e.data.data) {
                    postData = JSON.stringify(e.data.data);
                }
                this.requestChannel.send(postData);
            }
        };
        jax.prepare(e); // fire the request as soon as the worker receives a message
        return jax;
    };
    JaxWeb(e);
};
Usage:
var jaxWebGetServerResponse = function () {
    var wk2 = new Worker('path_to_jaxweb_js/JaxWeb.js');
    wk2.onmessage = function (serverResponse) {
        // process results with the data received from the server,
        // available as serverResponse.data
    };
    wk2.postMessage({
        "callname": '<url end point>',
        "method": '<your http method>',
        "data": ''
    });
};
// Invoke the function
jaxWebGetServerResponse();
I need some advice on handling an issue programmatically.
I have a web interface in PHP and JavaScript (jQuery) on which an authorized user needs to control multiple remote entities by performing some actions.
Each entity involves 10 steps, and the progress is represented by a progress bar: each step is a web service call which needs around 0.5-1.5 seconds to execute. So for the first entity, for example, to be completed, 10 different web services are called, and as each one is fulfilled the progress bar advances by 10%. Each action is performed with an ajax request.
The problem is that, in the same interface, I have to give the user the option to control around 800 different entities simultaneously, and each of them consists of 10 steps, which makes approximately 10x800 = 8000 ajax calls.
Performing 800 requests per step doesn't seem like a good idea, because the browser struggles to serve them and often hangs due to the excessive load.
I've thought of making some kind of limited batch action, but haven't settled on which option would serve me better.
For instance, shall I use a counter and perform a setTimeout/setInterval every X number of calls? Shall I abandon this approach and use JavaScript workers?
I've read similar threads on Stack Overflow, suggesting for example handling them on the server side. This doesn't seem like a good option in my case, because on the one hand there has to be visible progress, and on the other hand, performing 800 requests (~0.5-1.5 sec each) on the server side would mean that the user would have to wait at least ~6 minutes without any information.
Also, others suggest using $.when, but I doubt whether this would serve this case as well, since I need to limit the total batch, and not just handle the response of each request.
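For reference, a rough sketch of the "limited batch" idea using $.when over fixed-size chunks; performStep, entities, and currentStep are placeholders for the question's own step calls:

// Run the 800 calls for one step in chunks of 5, waiting for each chunk
// to finish before firing the next. performStep is assumed to return the
// jqXHR/promise from $.post and to update the progress bar itself.
function runInChunks(calls, chunkSize, done) {
    if (calls.length === 0) { return done(); }
    var chunk = calls.splice(0, chunkSize);
    $.when.apply($, chunk.map(function (call) { return call(); }))
        .always(function () { runInChunks(calls, chunkSize, done); });
}

// usage for one step across all entities:
var calls = entities.map(function (entity) {
    return function () { return performStep(entity, currentStep); };
});
runInChunks(calls, 5, function () { /* this step finished for all entities */ });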
Say A sends a message to B and waits for a callback, and then A probably sends new messages, while B also sends many messages to A.
What I mean is that other message exchanges happen during this time, before the callback fires.
Does this create a race condition, or block other message sending until the first callback is complete? Or does it enforce the order of callbacks, so that the callbacks for messages 1, 2, 3, 4, 5 always arrive in the same order the messages were sent out?
Assistance would be much appreciated.
Well, the question involves a number of concepts, so it is hard to answer fully. I will try to at least give a partial response or some insight.
If you had provided details on why it matters for your purpose, it would help to better target the answer.
One of the advantages of Node.js is its single-threaded, event-driven, non-blocking I/O model, which means there is almost no or minimal blocking (at least theoretically). See the conceptual model here.
However, some trivial blocking will still happen due to transport, consistency, etc. But it shouldn't be a problem, as this is extremely insignificant and happens in all programs no matter what language they use.
Secondly, about sockets: a socket can be blocking or non-blocking depending on your purpose. See Blocking and Non-blocking sockets.
Blocking doesn't necessarily mean it is bad.
Thirdly, even if there is no blocking, events still don't really happen in parallel. Even if A and B send messages to each other very frequently, there is a time gap between them, however trivial for humans; the difference can be expressed even in millionths of a second. Can you really send over a million messages in a second? So even if the callback has some impact, you should ignore it for the purposes of your program. Also, even if messages arrive at the same time, JavaScript can only do one thing at a time, so when you receive them you will handle them one at a time. For example, if you want to display or alert a message, the alerts will appear one at a time.
As to the ordering of the messages, Node.js runs a single event loop. So my understanding is that it runs a non-stop loop, waits for events, and emits information in the order the events occur (see, for example, Understanding the node.js event loop). A busy-wait like the following would block that loop entirely:
var now = new Date().getTime();
while (new Date().getTime() < now + 1000) { /* do nothing - this busy-wait blocks the event loop for a full second */ }
So, for your purpose, I would say that unless B sends a message between A sending its message and the server receiving it, you should receive the callback before anything else. Simply put, ordering happens in the order the Node.js server receives things. Hope it helps.
I have an unpleasant situation where one of my "long" responses somehow blocks other AJAX requests.
I call 3 different resources simultaneously:
var list = ['/api/filters', '/api/criteria/brands', '/api/criteria/genders'];
list.forEach(function (item) { $.post(item); });
On server side I could see the following times in logfile:
GET /api/filters 304 51ms
GET /api/criteria/genders 200 1ms
GET /api/criteria/brands 200 0ms
That looks fine to me, but in the browser the picture is completely different.
[picture of the Google Chrome network tab]
So it looks like the browser waits for the answer to the first (long) request and only afterwards receives the last 2 results.
What could be the reason for this behavior?
Every browser only handles a specific number of simultaneous requests at a time. If you fire 10 ajax requests at the same time, the browser puts them in a queue and handles them one after the other.
You can find more information about concurrent requests in browsers (the limit covers images, JavaScript, etc. as well) in this question.
The node server is single-threaded, and any piece of code that uses CPU cycles blocks the entire process.
As a result, if GET /api/filters does a lot of CPU intensive computations, it will block any other requests till it completes. Adding some more information about what it actually does can help in putting together a better answer.
If you have IO operations in there, try to make them asynchronous. That will allow node to serve the other URLs while the first one is doing IO.
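For illustration, a minimal sketch assuming an Express-style app and that /api/filters happens to read a file (both are assumptions about your setup):

var express = require('express');
var fs = require('fs');
var app = express();

// Blocking version: synchronous IO stalls the whole process,
// so /api/criteria/* has to wait until the read finishes.
// app.get('/api/filters', function (req, res) {
//     var data = fs.readFileSync('filters.json', 'utf8');
//     res.json(JSON.parse(data));
// });

// Non-blocking version: node keeps serving the other URLs while the read is in flight.
app.get('/api/filters', function (req, res) {
    fs.readFile('filters.json', 'utf8', function (err, data) {
        if (err) { return res.status(500).end(); }
        res.json(JSON.parse(data));
    });
});

app.listen(3000);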
I had to develop a newsletter manager with JS + PHP + MySQL, and I would like to know a few things about the browser timing out JS functions. If I'm running a recursive function that delays the call to itself (while PHP returns a list of emails), how can I be sure that the browser won't time out this JS function?
I'm asking this because I remember using a similar newsletter manager that, while doing the ajax requests, stopped after a few calls without any apparent reason. I know JS is not meant for this, and I should use crontab on the server, but I can't assume the user's server handles cron, so I had to stick with JS + PHP.
PS - This hasn't happened on this app yet; I'm just trying to prevent the worst-case scenario (since I've tested a newsletter manager that worked the same way as the one I'm developing). Since my dummy email list is small and the delays between calls are also small, this works just fine, but let's imagine a 1,000-contact list with a delay between sends of 120 seconds: sending 30 emails every 2 minutes.
By the way, why do this? Well, many hosting servers have a limit on emails sent per day or hour, and this helps avoid violating that policy.
From the MooTools standpoint, there are several possible solutions here.
request.periodical - http://mootools.net/docs/more/Request/Request.Periodical
It has plenty of options that allow for handling batches of jobs. Look at it like a more complex .periodical (setInterval) that understands the async nature of the result and can compensate for lag, etc. I think it can literally do what you set out in your requirements out of the box; all you need is an onComplete callback that clears the finished job from your pending array (for example).
request.queue - http://mootools.net/docs/more/Request/Request.Queue
Basically, set up all your requests to handle the chunks of data and pass them on to Request.Queue to handle sequentially. This is probably less sophisticated from the point of view of controlling the sending rate.
How about a meta refresh? That will not cause a timeout in your JavaScript function. You just reload your page after a specific time and then send the next emails out. By adding a parameter to the URL, you can keep track of which "round" you are on.
Can this do the job for you?
You need to use setTimeout. The code needs to yield control to the UI thread and let the browser stay responsive, to keep the script from being stopped.
Read this post by Nick Z.
http://www.nczonline.net/blog/2009/01/13/speed-up-your-javascript-part-1/
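A minimal sketch of that setTimeout pattern, assuming jQuery is available and a hypothetical send_batch.php script that mails the next batch and reports whether anything is left:

function sendNextBatch() {
    $.post('send_batch.php', { batchSize: 30 }, function (result) {
        if (!result.done) {
            // yield back to the browser, then continue after the two-minute delay
            setTimeout(sendNextBatch, 120000);
        }
    }, 'json');
}
sendNextBatch();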
There is also something in the W3C spec about this called "Efficient Script Yielding". I'm not sure how far along it is or whether any browsers support it.
https://dvcs.w3.org/hg/webperf/raw-file/tip/specs/setImmediate/Overview.html
You could also try HTML5 Web Workers.