I'm writing a webapp (Firefox-compatible only) which uses long polling (via jQuery's ajax abilities) to send more-or-less constant updates from the server to the client. I'm concerned about the effects of leaving this running for long periods of time, say, all day or overnight. The basic code skeleton is this:
function processResults(xml)
{
    // do stuff with the xml from the server
}

function fetch()
{
    setTimeout(function ()
    {
        $.ajax({
            type: 'GET',
            url: 'foo/bar/baz',
            dataType: 'xml',
            success: function (xml)
            {
                processResults(xml);
                fetch();
            },
            error: function (xhr, type, exception)
            {
                if (xhr.status === 0)
                {
                    console.log('XMLHttpRequest cancelled');
                }
                else
                {
                    console.debug(xhr);
                    fetch();
                }
            }
        });
    }, 500);
}
(The half-second "sleep" is so that the client doesn't hammer the server if the updates are coming back to the client quickly - which they usually are.)
After leaving this running overnight, it tends to make Firefox crawl. I'd been thinking that this could be partially caused by a large stack depth since I've basically written an infinitely recursive function. However, if I use Firebug and throw a breakpoint into fetch, it looks like this is not the case. The stack that Firebug shows me is only about 4 or 5 frames deep, even after an hour.
One of the solutions I'm considering is changing my recursive function to an iterative one, but I can't figure out how I would insert the delay in between Ajax requests without spinning. I've looked at the JS 1.7 "yield" keyword but I can't quite wrap my head around it, to figure out if it's what I need here.
Is the best solution just to do a hard refresh on the page periodically, say, once every hour? Is there a better/leaner long-polling design pattern that won't put a hurt on the browser even after running for 8 or 12 hours? Or should I just skip the long polling altogether and use a different "constant update" pattern since I usually know how frequently the server will have a response for me?
It's also possible that it's Firebug. You're console.logging stuff, which means you probably have the network monitor tab open, etc., which means every request is stored in memory.
Try disabling it, see if that helps.
I suspect that memory is leaking from processResults().
I have been using very similar code to yours in a long-polling web application, which is able to run uninterrupted for weeks without a page refresh.
Your stack should not be deep, because fetch() returns immediately. You do not have an infinitely recursive loop.
You may want to use the Firefox Leak Monitor Add-on to assist you in finding memory leaks.
The stack depth of 4-5 is correct. setTimeout and $.ajax are asynchronous calls, which return immediately. The callback is later called by the browser with an empty call stack. Since you cannot implement long polling in a synchronous way, you must use this recursive approach. There is no way to make it iterative.
I suspect the reason for this slowdown is that your code has a memory leak. The leak could be either in $.ajax in jQuery (very unlikely) or in your processResults call.
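For example, here is a hypothetical illustration (not the poster's actual code; the #log selector and the status element are made up) of how processResults could leak over a long run, and a variant that caps memory use:

var allUpdates = []; // grows forever: memory climbs all night

function leakyProcessResults(xml) {
    allUpdates.push(xml); // every response is retained
    $('#log').append('<div>' + $(xml).find('status').text() + '</div>'); // DOM grows without bound
}

function cappedProcessResults(xml) {
    var $log = $('#log');
    $log.append('<div>' + $(xml).find('status').text() + '</div>');
    if ($log.children().length > 100) {
        $log.children().first().remove(); // keep at most 100 entries in the DOM
    }
}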
It is a bad idea to call fetch() from inside the method itself. Recursion is better used when you expect that at some point the method will reach an end and the results will start to be sent back to the caller. The thing is, a genuinely recursive call keeps the caller method open and using memory. If you are only 3-4 frames deep, it is because jQuery or the browser is somehow "fixing" what you've done.
Recent releases of jQuery support long polling by default. This way you can be sure that you are not depending on the browser's intelligence to deal with your infinite recursive call. When calling the $.ajax() method you could use the code below to do a long poll combined with a safe two-second wait before each new call.
function myLongPoll(){
    setTimeout(function(){
        $.ajax({
            type: 'POST',
            dataType: 'JSON',
            url: 'http://my.domain.com/action',
            data: {},
            cache: false,
            success: function(data){
                //do something with the result
            },
            complete: myLongPoll,
            async: false,
            timeout: 5000
        });
        //Doesn't matter how long the ajax call took, 1 millisecond or
        //5 seconds (timeout), the next call will only happen 2 seconds later
    }, 2000);
}
This way you can be sure that the $.ajax() call has finished before the next one starts. This can be verified by adding a simple console.log() before your $.ajax() call and another after it.
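For example, a minimal sketch of that check (the log labels are illustrative):

console.log('before $.ajax');
$.ajax({ /* ...same options as in myLongPoll above... */ });
console.log('after $.ajax'); // with async: false, this line only runs after the response arrives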
Disclaimer. A friend of mine looking for a job as a senior JS programmer sent this question to me. It's not a real problem, then, but since I can imagine where and how it could become real, I've decided to post it here.
The question (a test task). It follows in my own words; I can quote it here if you think I got it wrong. How do you write a function which asynchronously sends requests to a given array of URLs, concatenates the result of each request, and returns the concatenated string? Oh, and there is another limitation: IE9+, current FF, current Chrome. The friend's answer (as polite as possible): no can do.
My answer was the same. Since there are no threads in browser JS (it's not NodeJS) and there is no sleep function, you cannot wait until all requests are processed. Web workers? They aren't supported in IE9, and they wouldn't help anyway. You can send the requests one by one using the sync flag of XMLHttpRequest.open, but (here is my reasoning) if all the requests are being sent to the same server, which does some math that can be executed on a single CPU core only, your penalty is x4/x8/x16. Anyway, synchronous requests are prohibited by the test task. Of course, you can concatenate the results in a callback function, but that's prohibited as well, since you must return the result.
But I'm not a JS guru, so I forwarded the question to another friend of mine who is one (I think). He suggested creating additional browser tabs, one per URL, each of which would send its request and write the result into its title. The main tab would loop through the tabs, waiting until none of the titles was empty. Since the tabs execute independently, it should work. Then he tried the solution and said it works in IE only (with some side effects). In other words, no solution.
But the employer replied to my first friend with a statement that the solution exists, though they refused to send the JS code of the function.
So, is the question some kind of trolling? Or is there a solution I would be able to use if I ever face a situation where I really MUST concatenate async request results (I know it's a bad idea in JS)?
Here's a discussion of various options:
Use synchronous Ajax and return the result. Synchronous ajax is a horrible idea and the challenge said to use async requests so presumably this is a no-go, but I include it here because it does let you directly return the result.
If you use async ajax in the same window, then you simply cannot return the result directly. You can call a callback when the result is done, or you can return a promise which will then call a .then() handler callback when the result is available. You cannot spin and wait for the async ajax to finish because the Ajax completion can't get back to you until you return and let the event queue get to the next events.
If you put Ajax into a webWorker (either synchronous or asynchronous), you can code the webWorker however you want, but the only way it can communicate back to the main thread is via a message and that message can't be received by the main thread until you return back from your original function to get to the next messages in the event queue. Again, you can't spin and wait for the message from the webWorker because it won't get back to you until AFTER the current thread of execution finishes. So, you have to return from your function BEFORE you can get the result from the webWorker.
You can put the Ajax into an iFrame or another window and then communicate back to the current window from the other window when it is done. This has all the same issues as the previous solutions in that you won't be able to receive communication back from the iFrame or other window until after the current function has finished and returned so that events can get processed off the event queue. So, you have to return from your function BEFORE you can get the result from another window.
You can put the Ajax into an iFrame or another window and then poll some variable in that window from your main window. There is a possibility that this might work in some browsers, but I was unable to build a successful test to prove it could work.
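For completeness, here is a sketch of option 2 with jQuery (which works in IE9 with jQuery 1.x); the fetchAll name is made up, and note it still delivers the string to a callback rather than returning it, which is exactly why the challenge as stated cannot be satisfied:

function fetchAll(urls, done) {
    var requests = $.map(urls, function (url) {
        return $.get(url);
    });
    // assumes urls.length > 1 ($.when unwraps a single deferred's arguments)
    $.when.apply($, requests).done(function () {
        var pieces = $.map(arguments, function (args) {
            return args[0]; // each argument is [data, statusText, jqXHR]
        });
        done(pieces.join(''));
    });
}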
1st thought:
function concatenatesResults(urls, cb) {
    var temp = [], i = urls.length
    urls.forEach(function (url, key) {
        // it's async, so it doesn't block the forEach
        $.ajax({
            url: url,
            success: function (data) {
                temp[key] = data // store by index, so the order is preserved
                i--
                if (i === 0) cb(temp.join("")) // if this was the last one, hand back the data
            }
        })
    })
}
concatenatesResults([/* URLS*/], function(data){console.log(data)})
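(Note that this sketch only covers successful requests; in practice you would also want an error handler that decrements i, or aborts and reports the failure, otherwise cb is never called if any single request fails.)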
So I have a web application which is making around 14-15 AJAX calls to some APIs. The problem is that the total time for all the AJAX calls is nearly 3x the time in which each individual API shows me a response when I type its URL in the browser.
I am making all the AJAX calls instantly inside the DOM Ready event.
The thing is, how can I speed up this process of making 15 AJAX calls together, getting the responses as fast as possible, and manipulating the DOM accordingly?
A few points which I have in mind:
All the AJAX calls should be ASYNC in nature. (Already doing it).
Don't make all the AJAX calls at the same time. Introduce some sort of timeout, as making all the AJAX calls at once may saturate the bandwidth and slow down the turnaround time of the whole process.
Reducing the number of API calls by any means. (Already doing it).
Manipulate the DOM as minimal as possible. (Already doing it).
Setting cache:true in the AJAX setup. I don't think that will really help, but I am still doing it wherever I am sure the content updates really slowly.
Any suggestions will be valuable! Thanks.
The way I am making the AJAX calls:
$(document).ready(function(){
    loadSentimentModule();
    loadCountModule();
    loadEntitiesModule();
    // Some more function calls.
});

function loadSentimentModule(){
    $.ajax({
        url: someurl,
        cache: true,
        dataType: "json",
        success: function(data){
            // Based on the data, manipulate the DOM.
        }
    });
}
// Same kind of function definitions for all the other functions.
You could avoid issuing the ajax calls directly, and instead queue them and let a manager control the queue; see here: Queue ajax requests using jQuery.queue()
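A minimal sketch of that idea (the queuedAjax helper name and someotherurl are made up here; the linked answer has a more complete version):

var ajaxQueue = $({}); // an empty jQuery object used purely as a queue holder

function queuedAjax(options) {
    ajaxQueue.queue(function (next) {
        // start the next queued request only once this one has settled
        $.ajax(options).always(next);
    });
}

// usage: these now run one after another instead of all at once
queuedAjax({ url: someurl, dataType: "json", success: function (data) { /* ... */ } });
queuedAjax({ url: someotherurl, dataType: "json", success: function (data) { /* ... */ } });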
I recommend you use the async.js module on the client. Maybe it is what you are looking for.
I have a notification zone that I want to be updated when someone sends a message, for example, using jQuery or Ajax (my database is behind a SOAP server). I want to make the SOAP call every second or so; how can I do that?
You could use a simple setInterval structure to execute AJAX calls at predefined intervals. Something like this:
setInterval(function(){
    // dataObj is assumed to hold whatever parameters your responder needs
    $.get('ajax_responder.php', dataObj, function(){
        // ajax callback
        // here is where you would update the user with any new notifications.
    });
}, 5000);
The previous code will execute an AJAX request every 5000 milliseconds (every 5 seconds).
References:
$.get()
setInterval()
Instead of setInterval(), I would strongly suggest using setTimeout().
MDN explanation:
If there is a possibility that your logic could take longer to execute than the interval time, it is recommended that you recursively call a named function using window.setTimeout. For example, if using setInterval to poll a remote server every 5 seconds, network latency, an unresponsive server, and a host of other issues could prevent the request from completing in its allotted time. As such, you may find yourself with queued up XHR requests that won't necessarily return in order.
For such cases, a recursive setTimeout pattern is preferred:
(function loop(){
    setTimeout(function(){
        // logic here

        // recurse
        loop();
    }, 1000);
})();
In the above snippet, a named function loop is declared and is immediately executed. loop is recursively called inside setTimeout after the logic has completed executing. While this pattern does not guarantee execution on a fixed interval, it does guarantee that the previous interval has completed before recursing.
The best way to do real-time web is Node.js.
Node.js is a platform built on Chrome's JavaScript runtime for easily building fast, scalable network applications. Node.js uses an event-driven, non-blocking I/O model that makes it lightweight and efficient, perfect for data-intensive real-time applications that run across distributed devices.
But you can do it with setInterval or setTimeout; put an Ajax call inside your interval:
var intval = setInterval(function()
{
    $.get('url.php', {data1: "value1", data2: "value2"},
        function(response)
        {
            // response
        });
}, 1000);
I have a script that pings a series of urls with a GET method. I only want to ping them each once and do not expect a response. My script works in Chrome and Safari, but Firefox won't complete the later requests.
Is there a way to trigger Firefox to make a series of calls (five, to be precise) once each, and not care if they fail? It seems that Firefox won't complete the series of requests when the first ones fail.
I'm working in JavaScript and jQuery, with a little jQuery.ajax() thrown in. I've searched to no avail and have reached the limit of my beginner's skill set. Any insight would be appreciated.
(If you're interested in the full scope, there's code at jquery-based standalone port knocker)
Thank you.
Update:
After further research, I believe the issue is that Firefox isn't handling the calls truly asynchronously. I have versions of the code that make the pings with img calls, iframe url calls, and ajax calls, and they work in Chrome and Safari, but in Firefox they're not behaving as I need them to.
Our server monitoring for the knock sequence should see requests come sequentially to ports 1, 2, 3, 4, 5 (as it does when using Chrome or Safari) but in Firefox, no matter which method I've tried, I see the first attempt ping port 1 twice, then port 2, and on subsequent attempts I only see it ping port 1. My status updates appear as expected, but the server isn't receiving the calls in the order it needs them. It seems that Firefox is retrying failed calls rather than executing each one once, in sequence, which is what I need it to do.
Here is a sample of my script using a simple jQuery.ajax call method. It works in Safari and Chrome, but doesn't achieve the desired result in Firefox. While all my code runs and I can see the status updates (generated with the jQuery append function), the requests aren't sent once each, sequentially, to my server.
<script src="http://code.jquery.com/jquery-latest.js"></script>
<script type="text/javascript">
$(document).ready(function(){
    $('button').click(function(){
        $('#knocks').append('<p>Knocking...</p>');
        setTimeout(function(){
            $.ajax({url: 'https://example.sample.com:1111'});
            $('#knocks').append("<p>Knock 1 of 5 complete...</p>");
        }, 500);
        setTimeout(function(){
            $.ajax({url: 'https://example.sample.com:2222'});
            $('#knocks').append("<p>Knock 2 of 5 complete...</p>");
        }, 3500);
        setTimeout(function(){
            $.ajax({url: 'https://example.sample.com:3333'});
            $('#knocks').append("<p>Knock 3 of 5 complete...</p>");
        }, 6500);
        setTimeout(function(){
            $.ajax({url: 'https://example.sample.com:4444'});
            $('#knocks').append("<p>Knock 4 of 5 complete...</p>");
        }, 9500);
        setTimeout(function(){
            $.ajax({url: 'https://example.sample.com:5555'});
            $('#knocks').append("<p>Knock 5 of 5 complete...</p>");
        }, 12000);
        setTimeout(function(){
            $('#knocks').append("<p>Knocking is complete... <br>Proceed to site: <a href='http://example-url.sample-url.com'>http://example-url.sample-url.com</a></p>");
        }, 13000);
    });
});
</script>
Seeing as there's no real answer to your question and you'd probably want to move on, I thought I'd give you some suggestions as a starting point.
For your function calls to truly execute in sequential order (synchronous, in-order, blocking, ...), you will have to make sure you issue each subsequent function call (AJAX request, in your case) only once the preceding request has completed (either succeeded or failed; in the latter case you might not want to proceed with the next in-order call and instead issue a completely separate response).
The way you're doing it now isn't synchronous; it is actually asynchronous and merely delayed (running 'in the background' after a timeout). This can cause all kinds of problems when you expect your AJAX calls to arrive in a strict sequence at your server end: browsers may re-issue failed or timed-out requests (for various reasons, depending on their feature set and how they handle failed requests, caching, ...), or preemptively issue requests and cache the results when some pre-fetcher is enabled (or whatever they're calling it in FF), then re-issue them again if the pre-fetch failed. I believe this is similar to what you observed in Firefox and might be the main culprit for this unexpected behavior. As you can't control which features end users enable or disable in their browser, or what new features will be implemented in future versions, you can't really expect your server calls to arrive in order just because you delayed them with setTimeout, even if they appear to do so in other browsers (probably because your server responds fast enough for them to appear as such).
In your code, the second call would only appear to be executing synchronously (waiting for the first one to complete) for up to half a second, the third request for up to three and a half seconds, and so on. But even if setTimeout blocked execution (which it doesn't), which external request would it be waiting for? The first one, or the second one? I think you get what I'm trying to say, and why your code doesn't work as expected.
Instead, you should either issue each subsequent AJAX call from your server's response (which is actually the point of using AJAX; otherwise there's no need for it), or preferably create an external listener function that issues these calls according to the status and/or return values of the previous external calls, as sketched below. If you need to handle failed requests as well and continue execution regardless, then the external listener (with a preset execution timeout) is the way to go, as you obviously wouldn't be able to depend on the response of a failed request.
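For illustration, a minimal sketch of that listener idea using the question's own URLs (the 3-second timeout value is an assumption): each knock is issued only from the completion handler of the previous one, so the server sees them strictly in order whether or not the pings succeed:

var knockUrls = [
    'https://example.sample.com:1111',
    'https://example.sample.com:2222',
    'https://example.sample.com:3333',
    'https://example.sample.com:4444',
    'https://example.sample.com:5555'
];

function knock(i) {
    if (i >= knockUrls.length) {
        $('#knocks').append('<p>Knocking is complete...</p>');
        return;
    }
    $.ajax({ url: knockUrls[i], timeout: 3000 })
        .always(function () { // continue whether the ping succeeded or failed
            $('#knocks').append('<p>Knock ' + (i + 1) + ' of 5 complete...</p>');
            knock(i + 1); // only now issue the next knock
        });
}

knock(0);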
You see, browsers have no problem issuing multiple concurrent requests, and delaying them with setTimeout doesn't stop pre-fetchers from trying to cache their responses for later use either. Nor does it issue the requests in a blocking manner, each one waiting for the previous one to finish, as you expected. Most browsers will happily use up to a certain number of concurrent connections (~10 on client machines, a lot more on servers) in an effort to speed up the download and/or page-rendering process, and some obviously have even more advanced caching mechanisms in place for this very same reason, Firefox being merely one of them.
I hope this clears things up a bit and you'll be able to rewrite your code to work as expected. As we have no knowledge of how your server-side code is supposed to work, you'll have to write it yourself, though. There are however plenty of threads on SE discussing similar techniques that you might decide on using, and you can always ask another question if you get stuck and we'll be glad to help.
Cheers!
Well, first I want to say I'm a bit new to the world of Internet dev.
Anyway, I'm trying to find out whether it's possible to run two pieces of code in parallel using JavaScript.
What I really need is to call two methods that are on a remote server. To both, I pass a callback function that will be executed as soon as the data I want is ready. As the server running these functions takes some time to answer, I'm trying to find a way to call both methods at the same time without needing to wait for the first to finish before calling the second.
Do methods like setTimeout run concurrently? For example:
setTimeout(func1, 0);
setTimeout(func2, 0);
...

function func1()
{
    webMethod1(function() { alert("function 1 returned"); });
}

function func2()
{
    webMethod2(function() { alert("function 2 returned"); });
}
Edited
I've just found this article that may be very cool for the release of the next browsers: Javascript web workers
There is a single thread of execution in JavaScript in normal web browsers: your timer handlers will be called serially. Your approach using timers will work in the case you present.
There is a nice piece of documentation on timers by John Resig (author of the very popular jQuery javascript framework - if you are new to Web development, I would suggest you look it up).
Now, if you are referring to HTML5 based browsers, at some point, they should have threading support.
Yes, that's exactly how web requests through AJAX work. There's no need for a setTimeout of 0; you can just call them one by one, and each AJAX request will be executed asynchronously, allowing you to pass a callback function to be invoked when the request completes.
The means of creating an AJAX request differs somewhat depending on which browser you're running. If you're going to build something that depends considerably upon AJAX, and you want it to work across multiple browsers, you're best off with a library. Here's how it's done in jQuery, for instance:
$.ajax({ url: '/webrequesturl', success: function(result) {
    // this will be called upon a successful request
} });

$.ajax({ url: '/webrequest2url', success: function(result) {
    // this will be called upon a successful request
    // this may or may not be called before the above one, depending on how long it takes for the requests to finish.
} });
Well, JavaScript is single-threaded: the two timers will run sequentially, one after the other, even if you don't notice it.
I would recommend you take a look at the following article. It really explains how timers and asynchronous events work, and it will also help you understand the single-threaded nature of JavaScript:
How JavaScript Timers Work
And as an alternative you could take a look at Web Workers, which are a way to run scripts in separate background threads; however, they are only supported by modern browsers.
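For example, a minimal sketch (the file name worker.js and the doubling logic are made-up placeholders):

// main page: spawn a background thread and exchange messages with it
var worker = new Worker('worker.js');
worker.onmessage = function (e) {
    console.log('result from worker:', e.data);
};
worker.postMessage(21); // send the worker some input

// worker.js would contain something like:
// onmessage = function (e) {
//     postMessage(e.data * 2); // do the heavy work off the main thread
// };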
What you are looking for is asynchronous client-server communication (keyword: async). Asynchronous functions return straight away, but the provided callback will be executed after the specified condition is satisfied.
So, if the function that sends a request to the server is asynchronous, this would let you send both requests to the server without waiting for one to respond.
Using setTimeout may work, as this will schedule both request-sending functions to be called. However, browsers run only one thread of JavaScript at a time, so if the scheduled functions issued their requests synchronously, one would run and block (waiting for a reply) and the other would have to wait until the first was done before starting.
It is advisable to use the async support from your server communication library. For instance, jQuery uses async by default.
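As a small sketch of that (the /webMethod1 and /webMethod2 URLs are assumptions standing in for the question's remote methods): both requests are fired immediately, and neither waits for the other.

// fire both remote calls at once
var call1 = $.get('/webMethod1');
var call2 = $.get('/webMethod2');

// react as each one returns...
call1.done(function () { alert('function 1 returned'); });
call2.done(function () { alert('function 2 returned'); });

// ...and, if needed, once both have returned
$.when(call1, call2).done(function () {
    alert('both web methods returned');
});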
It depends on the JavaScript engine.