So I have a web application which makes around 14-15 AJAX calls to some APIs. The problem is that the total time all the AJAX calls take is nearly 3x the time each individual API takes to respond when I type its URL into the browser.
I am making all the AJAX calls immediately inside the DOM ready event.
The question is: how can I speed up the process of making 15 AJAX calls together, getting the responses as fast as possible, and manipulating the DOM accordingly?
A few points I have in mind:
All the AJAX calls should be async in nature. (Already doing it.)
Don't fire all the AJAX calls at the same time. Introduce some sort of timeout, as making all the calls at once may saturate the bandwidth and slow down the overall turnaround time (see the sketch after this list).
Reduce the number of API calls by any means. (Already doing it.)
Manipulate the DOM as little as possible. (Already doing it.)
Set cache:true in the AJAX setup. I don't think that will really help, but I am still doing it wherever I am sure the content updates really slowly.
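For point 2, a rough sketch of staggering the calls might look like this (the 100 ms spacing is an arbitrary choice, and the loader functions are the ones from the snippet below):

var loaders = [loadSentimentModule, loadCountModule, loadEntitiesModule];
loaders.forEach(function (load, i) {
    setTimeout(load, i * 100); // space the requests out instead of firing all at once
});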
Any suggestions will be valuable! Thanks.
The way I am making the AJAX calls:
$(document).ready(function(){
    loadSentimentModule();
    loadCountModule();
    loadEntitiesModule();
    // Some more function calls.
});

function loadSentimentModule(){
    $.ajax({
        url: someurl,
        cache: true,
        dataType: "json",
        success: function(data){
            // Based on the data, manipulate the DOM.
        }
    });
}
// Same kind of function definitions for all the other functions.
Instead of issuing the AJAX calls directly, you can queue them and let a manager control the queue; see here: Queue ajax requests using jQuery.queue()
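A minimal sketch of that idea, assuming jQuery (the URLs are placeholders): an empty jQuery object holds the queue, and each request starts only once the previous one has settled.

var ajaxQueue = $({}); // dummy object whose default "fx" queue serializes the requests
function enqueueAjax(options) {
    ajaxQueue.queue(function (next) {
        $.ajax(options).always(next); // kick off the next request when this one settles
    });
}
enqueueAjax({ url: '/api/one' }); // placeholder URL
enqueueAjax({ url: '/api/two' }); // runs only after /api/one has settled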
I recommend you use the async.js module on the client. Maybe it is what you are looking for.
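For instance, a hedged sketch using async.parallelLimit, which caps how many requests run at once (the URLs are placeholders and the limit of 4 is arbitrary):

async.parallelLimit([
    function (cb) { $.getJSON(sentimentUrl, function (d) { cb(null, d); }); }, // placeholder URL
    function (cb) { $.getJSON(countUrl, function (d) { cb(null, d); }); }      // placeholder URL
    // ...one task per API call
], 4, function (err, results) {
    // all responses are available here; update the DOM in one pass
});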
I have some slow OData calls which need to present some sort of visual indicator to the user that something is happening.
I've read the API reference and seen functions like attachRequestSent(), setBusy(), BusyDialog, BusyIndicator, etc.
I tried using them accordingly, but they did not work for me. The problem seems to be oModel.create causing the whole app to hang while it executes: no loading indicator can run, since the app is frozen until the create function has returned.
Edit: I have set up an asynchronous batch read OData call. I have then wrapped the code for handling the received data in a function and called that function inside the success function in the batch call.
This works: the view loads and I see a busy indicator before the fields are populated with the data.
oModel.submitBatch(/*fnSuccess*/);
Is this a good way to do it, or is there a better way which is more standard?
Before the OData call, display the busy indicator (which locks the entire app screen) with
sap.ui.core.BusyIndicator.show(0);
Then, in both the success and failure handlers of the OData call, hide it with
sap.ui.core.BusyIndicator.hide();
This does not work with implicit calls (when, for instance, you bind to an OData model); for those you can use the request-sent events, but the idea is the same.
Edit: You also need to give the indicator a small delay to appear, so use setTimeout to invoke your OData call after that delay.
doStuffWithIndicator: function(){
    sap.ui.core.BusyIndicator.show(0); // show the indicator immediately
    setTimeout(function(){
        doStuff(); // the long-running work / OData call
        sap.ui.core.BusyIndicator.hide();
    }, 20); // the small delay lets the browser render the indicator first
},
Check out this thread: SAPUI5 Wait for a Deferred-Object // wait for .done() function
JavaScript is asynchronous: code keeps executing regardless of whether a call you made (which might take longer) has finished. Therefore, before continuing after an OData service call, you need to tell your JS to wait for its result via a deferred object.
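A rough sketch of that pattern, assuming a v2 ODataModel instance named oModel (the readEmployees wrapper and the /Employees path are made up):

function readEmployees() {
    var deferred = $.Deferred();
    oModel.read("/Employees", { // hypothetical entity set
        success: deferred.resolve,
        error: deferred.reject
    });
    return deferred.promise();
}

readEmployees().done(function (oData) {
    // continue here only once the OData call has returned
});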
The main problem seems to be oModel.create causing the whole app to hang while it executes. No loading indicators or anything can run since the app is frozen until the create function has returned.
Sounds like you've been using the now-deprecated sap.ui.model.odata.ODataModel the whole time, which sends mostly synchronous XHRs. Synchronous XHRs block the main thread (i.e. the UI thread) until the browser receives the response, so the browser is unable to update the DOM to display the busy indicator during the round trip.
If anyone reading this has the same issue, please migrate to the newer equivalent model: sap/ui/model/odata/v2/ODataModel. It sends only asynchronous requests (AJAX), allowing the UI thread to handle other tasks (such as adding the busy indicator to the DOM) while the browser waits for the response.
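A minimal sketch of the migrated model (the service root and entity set are placeholders):

sap.ui.define(["sap/ui/model/odata/v2/ODataModel"], function (ODataModel) {
    // The v2 model issues asynchronous requests only, so the UI thread
    // stays free to render the busy indicator during the round trip.
    var oModel = new ODataModel("/odata/service/"); // placeholder service root
    oModel.read("/Employees", { // placeholder entity set
        success: function (oData) { /* update the UI */ },
        error: function (oError) { /* handle the failure */ }
    });
});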
I have a web application that is growing more complex. It makes heavy use of JavaScript based HTML generation and AJAX calls, and herein lies my problem:
Since I can't know how long an AJAX call might take to get back to the client side, I don't know when the callback will actually execute. The user might by then have navigated away from the element that originally triggered the AJAX event, in which case the callback can cause some havoc. Is there a way to "expire" old callbacks?
Are there any libraries that would offer that functionality? (I am using jQuery now but am not 100% familiar with it).
Thanks,
You might want to look into Ajax Queue Manager. There are params you can set to abort old requests before sending a new one. I think that might be what you're looking for.
Well, the simple answer is to check for the proper state of your app within your callback functions, before they do whatever it is they are doing that causes problems. For example, you could make sure that certain elements are still being hovered over.
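One simple way to do that is to keep a request counter and ignore any response that is no longer the latest; a sketch with a hypothetical loadPanel helper:

var latestRequest = 0;
function loadPanel(url) { // hypothetical helper
    var requestId = ++latestRequest;
    $.getJSON(url, function (data) {
        if (requestId !== latestRequest) return; // superseded by a newer request; drop it
        // still current: safe to manipulate the DOM here
    });
}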
I was just reading another question about jQuery's synchronous ajax call, and I got to wondering:
What circumstances make a synchronous version of an ajax call beneficial/necessary?
Ideally I'd like an example, and why synchronous is better than standard ajax.
The only reasonable example I can think of (that can't be worked around another way) is making a call in window.onbeforeunload, where you need it to be synchronous, or the page will move on and the request will never complete.
In this specific case using standard/asynchronous behavior, you're all but assured the request will die too early to have any impact, or ever contact the server.
I'm not saying I'm in favor of doing this, quite the opposite (as it negatively impacts the user's browsing speed). But...there's not much option here.
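For reference, a bare sketch of that case (the /log endpoint is a placeholder):

window.onbeforeunload = function () {
    var xhr = new XMLHttpRequest();
    xhr.open("POST", "/log", false); // false = synchronous, blocks the unload
    xhr.send("user is leaving the page");
};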
In sum, please do not use synchronous requests; as #Brandon says, they are just a cheap/easy/quick way to avoid writing a callback. In addition, modern browsers show warnings when synchronous requests are made, and we do not like that. Make your world asynchronous.
Synchronous AJAX is often used to retrieve a value from the server that is required before client-side processing can continue. In such a case, the AJAX call blocks until it returns with the desired value. Example:
A JavaScript function needs to compute the salary for an employee:
Step 1: get the employee ID from the form.
Step 2: make a sync server call passing the employee ID to get their salary per hour.
Step 3: multiply the salary rate by the number of working hours.
As you can see, the total salary cannot be computed until the server call has finished, so this should be a sync call. With jQuery you could instead compute the salary in the success handler, but processing would continue in the meantime: if you have a message box that displays the salary, it would appear empty...
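A hedged sketch of those steps in jQuery (the /salaryRate endpoint and its response fields are made up):

function computeSalary(empId, hoursWorked) {
    var rate;
    $.ajax({
        url: '/salaryRate',        // hypothetical endpoint
        data: { id: empId },
        async: false,              // block until the server answers
        success: function (data) { rate = data.rate; }
    });
    return rate * hoursWorked;     // rate is guaranteed to be set by now
}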
I would venture a guess that it'd be good in a scenario where you want to perform several AJAX calls but one call relies on the results of another. If you perform them synchronously, you can wait for the independent call to finish before the dependent call fires.
I'm writing a webapp (Firefox-compatible only) which uses long polling (via jQuery's ajax abilities) to send more-or-less constant updates from the server to the client. I'm concerned about the effects of leaving this running for long periods of time, say, all day or overnight. The basic code skeleton is this:
function processResults(xml)
{
// do stuff with the xml from the server
}
function fetch()
{
setTimeout(function ()
{
$.ajax({
type: 'GET',
url: 'foo/bar/baz',
dataType: 'xml',
success: function (xml)
{
processResults(xml);
fetch();
},
error: function (xhr, type, exception)
{
if (xhr.status === 0)
{
console.log('XMLHttpRequest cancelled');
}
else
{
console.debug(xhr);
fetch();
}
}
});
}, 500);
}
(The half-second "sleep" is so that the client doesn't hammer the server if the updates are coming back to the client quickly - which they usually are.)
After leaving this running overnight, it tends to make Firefox crawl. I'd been thinking that this could be partially caused by a large stack depth since I've basically written an infinitely recursive function. However, if I use Firebug and throw a breakpoint into fetch, it looks like this is not the case. The stack that Firebug shows me is only about 4 or 5 frames deep, even after an hour.
One of the solutions I'm considering is changing my recursive function to an iterative one, but I can't figure out how I would insert the delay between AJAX requests without spinning. I've looked at the JS 1.7 "yield" keyword, but I can't quite wrap my head around it enough to figure out whether it's what I need here.
Is the best solution just to do a hard refresh on the page periodically, say, once every hour? Is there a better/leaner long-polling design pattern that won't put a hurt on the browser even after running for 8 or 12 hours? Or should I just skip the long polling altogether and use a different "constant update" pattern since I usually know how frequently the server will have a response for me?
It's also possible that it's Firebug. You're console.logging stuff, which means you probably have a network monitor tab open, etc., which means every request is stored in memory.
Try disabling it, see if that helps.
I suspect that memory is leaking from processResults().
I have been using very similar code to yours in a long-polling web application, which is able to run uninterrupted for weeks without a page refresh.
Your stack should not be deep, because fetch() returns immediately. You do not have an infinitely recursive loop.
You may want to use the Firefox Leak Monitor Add-on to assist you in finding memory leaks.
The stack depth of 4-5 is correct. setTimeout and $.ajax are asynchronous calls, which return immediately. The callback is later called by the browser with an empty call stack. Since you cannot implement long polling in a synchronous way, you must use this recursive approach. There is no way to make it iterative.
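A two-line illustration of why the stack stays shallow: the timer callback only runs after the current stack has unwound.

setTimeout(function () { console.log('runs second, on an empty stack'); }, 0);
console.log('runs first: the stack unwinds before the callback fires');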
I suspect the reason for this slow down is that your code has a memory leak. The leak could either be in $.ajax by jQuery (very unlikely) or in your processResults call.
It is a bad idea to call fetch() from inside the method itself. Recursion is better used when you expect that at some point the method will reach an end and results will start to be sent back to the caller. The thing is, when you call a method recursively, it keeps the calling method open and using memory. If you are only 3-4 frames deep, it is because jQuery or the browser is somehow "fixing" what you've done.
Recent releases of jQuery support long polling by default. This way you can be sure that you are not depending on the browser's intelligence to deal with your infinite recursive call. When calling the $.ajax() method, you could use the code below to do a long poll combined with a safe wait of two seconds before each new call.
function myLongPoll(){
    setTimeout(function(){
        $.ajax({
            type: 'POST',
            dataType: 'JSON',
            url: 'http://my.domain.com/action',
            data: {},
            cache: false,
            success: function(data){
                // do something with the result
            },
            complete: myLongPoll,
            async: false,
            timeout: 5000
        });
        // No matter how long the ajax call took, 1 millisecond or
        // 5 seconds (timeout), the next call will only happen after 2 seconds.
    }, 2000);
}
This way you can be sure that the $.ajax() call has completed before the next one starts. You can verify this by adding a simple console.log() just before and another just after your $.ajax() call.
Well, first I want to say I'm a bit new to the world of web dev.
Anyway, I'm trying to find out whether it's possible to run two pieces of code in parallel using JavaScript.
What I really need is to call two methods that live on a remote server. For both, I pass a callback function that will be executed as soon as the data I want is ready. Since the server running these functions takes some time to answer, I'm trying to find a way to call both methods at the same time without having to wait for the first to finish before calling the second.
Do methods like setTimeout run concurrently? For example:
setTimeout(func1, 0);
setTimeout(func2, 0);
...
function func1()
{
webMethod1(function() {alert("function 1 returned"); } );
}
function func2()
{
webMethod2(function() {alert("function 2 returned"); } );
}
Edited
I've just found this article that may be very cool for the release of upcoming browsers: JavaScript web workers
There is a single thread of execution in JavaScript in normal web browsers: your timer handlers will be called serially. Your approach using timers will work in the case you present.
There is a nice piece of documentation on timers by John Resig (author of the very popular jQuery JavaScript framework; if you are new to web development, I would suggest you look it up).
Now, if you are referring to HTML5-based browsers, at some point they should have threading support.
Yes, that's exactly how web requests through AJAX work. There's no need for setTimeout with 0: you can just call them one by one, and each AJAX request will be executed asynchronously, allowing you to pass a callback function to be invoked when the request completes.
The means of creating an AJAX request differs some depending on what browser you're running. If you're going to build something that depends considerably upon AJAX, and you want it to work across multiple browsers, you're best off with a library. Here's how it's done in jQuery, for instance:
$.ajax({ url: '/webrequesturl', success: function(result) {
// this will be called upon a successful request
} });
$.ajax({ url: '/webrequest2url', success: function(result) {
// this will be called upon a successful request
// this may or may not be called before the above one, depending on how long it takes for the requests to finish.
} });
Well, JavaScript is single-threaded; the two timers will run sequentially, one after the other, even if you don't notice it.
I recommend you take a look at the following article; it really explains how timers and asynchronous events work, and it will also help you understand the single-threaded nature of JavaScript:
How JavaScript Timers Work
And as an alternative, you could take a look at Web Workers, which are a way to run scripts in separate background threads, though they are only supported by modern browsers.
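A minimal sketch of a Web Worker, built from a Blob so that no separate script file is needed:

var src = 'onmessage = function (e) { postMessage(e.data * 2); };';
var worker = new Worker(URL.createObjectURL(new Blob([src])));
worker.onmessage = function (e) {
    console.log('worker replied with', e.data); // logs 42
};
worker.postMessage(21); // the doubling happens on a background thread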
What you are looking for is asynchronous client-server communication (keyword: async). Asynchronous functions return straight away, but the provided callback will be executed after the specified condition is satisfied.
So, if the function that sends a request to the server is asynchronous, this would let you send both requests to the server without waiting for one to respond.
Using setTimeout may work, as it will schedule both request-sending functions to be called. However, browsers only run one thread of JavaScript at a time, so the result would be that one of the scheduled functions would run and block (waiting for a reply) while the other waited for the first to finish before starting.
It is advisable to use the async support of your server-communication library. For instance, jQuery uses async by default.
It depends on the JavaScript engine.