I've got a custom JavaScript autocomplete script that hits the server with multiple asynchronous AJAX requests (one every time a key is pressed).
I've noticed that sometimes an earlier AJAX request will be returned after a later request, which messes things up.
The way I handle this now is with a counter that increments for each AJAX request; responses that come back with a lower count get ignored.
I'm wondering: Is this proper? Or is there a better way of dealing with this issue?
Thanks in advance,
Travis
You can store the current XHR object in a "global" variable, e.g. currentAjaxRequest, which holds the last request made. Then you can abort that request whenever you make a new one.
For example:
var currentAjaxRequest = null;

function autoCompleteStuff() {
    if (currentAjaxRequest !== null) {
        currentAjaxRequest.abort();
    }
    currentAjaxRequest = $.get(..., function(...) {
        currentAjaxRequest = null;
        ...
    });
}
To avoid naming conflicts, wrap that in an immediately-invoked anonymous function if needed.
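The counter approach from the question is also a valid pattern (sometimes called a request token or sequence guard), and it works even with transports you can't abort. A minimal sketch in plain JavaScript; createRequestGuard and its method names are my own choosing, not from any library:

```javascript
// Each request gets a monotonically increasing id; a response is applied
// only if its id still matches the newest request issued.
function createRequestGuard() {
    var latest = 0;
    return {
        // call when issuing a request; returns this request's id
        issue: function () { return ++latest; },
        // call when a response arrives; true only for the newest request
        isCurrent: function (id) { return id === latest; }
    };
}

// usage: simulate two overlapping requests where the first returns last
var guard = createRequestGuard();
var first = guard.issue();   // request for "fo"
var second = guard.issue();  // request for "foo"

guard.isCurrent(second); // true: apply this response
guard.isCurrent(first);  // false: ignore the stale response
```

In the success callback you would simply `return` early when `isCurrent` is false, exactly as the question's counter does.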
I have a third-party library whose events I'm listening to. I get a chance to modify the data that the library is going to append to the UI. This works fine as long as the data modification is synchronous; as soon as I involve AJAX callbacks/promises, it stops working. Let me give an example to showcase the problem.
Below is how I'm listening to the event:
d.on('gotResults', function (data) {
    // If I alter data directly, it works fine:
    data.title = 'newTitle';
    // The code above alters the text correctly.

    // I want some properties to be grabbed from elsewhere, so I make an AJAX call.
    $.ajax('http://someurl...', { id: data.id }, function (res) {
        data.someProperty = res.thatProperty;
    });
    // The code above doesn't wait for the AJAX call to complete; the library
    // just goes ahead and renders the page without the data change.

    // Yes, I tried promises, but that doesn't help either:
    return fetch('http://someurl...').then(function (res) {
        data.someProperty = res.thatProperty;
        return true;
    });
    // This also fires the request and moves on; it doesn't wait for then() to complete.
});
I cannot change/alter the third-party library; all I can do is listen to the event and alter the data.
Are there any better solutions? I can't use async/await or generators, because I need this to work in ES5 browsers.
You cannot make a synchronous function wait for an asynchronous response; it's simply not possible by definition. Your options pretty much are:
BAD IDEA: Make a synchronous AJAX request. Again: BAD IDEA. Not only will this block the entire browser, it is also a deprecated practice that should not be used in new code, or indeed ever.
Fetch the asynchronous data first and store it locally, so it's available synchronously when needed. That obviously only works if you have an idea ahead of time of what data you'll need.
Alter the third-party library to add support for asynchronous callbacks, or request that from the vendor.
Find some workaround where you let the library work with incomplete data first and then update it when the asynchronous data becomes available. That obviously depends a lot on the specifics of the library and the task being done.
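For illustration, the second option (fetch the data ahead of time and store it locally) might look like this minimal sketch. The names propertyCache, prefetch, and applyCached are hypothetical; in real code prefetch would be an AJAX call completed before the event can fire:

```javascript
// id -> prefetched property value (assumed shape)
var propertyCache = {};

// in real code this would be the callback of an AJAX call made up front
function prefetch(id, value) {
    propertyCache[id] = value;
}

// later, inside the synchronous 'gotResults' handler, the lookup is
// a plain synchronous property read:
function applyCached(data) {
    if (propertyCache.hasOwnProperty(data.id)) {
        data.someProperty = propertyCache[data.id];
    }
    return data;
}
```

The obvious limitation is the one stated above: you must know which ids you will need before the synchronous event fires.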
Does the gotResults callback really need to return anything other than true? If not, you can just write regular asynchronous code without the library knowing about it. Let me explain by rewriting your pseudocode:
d.on('gotResults', function (data) {
    // Altering data directly still works fine:
    data.title = 'newTitle';

    // Grab the extra properties asynchronously. The library renders first with
    // the incomplete data, and we patch it when the response arrives.
    $.ajax('http://someurl...', { id: data.id }).then(function (res) {
        data.someProperty = res.thatProperty;
        // now it should render properly; maybe render again here?
    }).catch(function (err) {
        handleError(err); // handle errors so they don't disappear silently
    });

    return true; // this line runs before any of the asynchronous code above, but do we care?
});
I am working on a network architecture which gives a late response to a GET query. I want to draw the image once I receive it from the server. I am trying to delay the display function so that it runs only after the image has been fetched from the server. I am using a canvas to display a picture from a particular URI. Here is the portion of code I need to delay:
var canvas = document.getElementById('myCanvas');
var context = canvas.getContext('2d');
var img = new Image();
var strDataURI = nameStr;

img.onload = function () {
    context.drawImage(img, 0, 0, 150, 150); // or at whatever offset you like
};
img.src = strDataURI;
Please help. Thanks in advance.
There are several options for this. If you insist on making this a timer, you can use setTimeout():
window.setTimeout(function() {
    // function code here
}, 3000);
You could also set your AJAX call to be synchronous instead of asynchronous. This will cause other functions to wait until it is complete before running (it also freezes the browser in the meantime, so it's generally discouraged).
$.ajax({
    async: false
});
Finally, you could put the draw function in the complete callback of your AJAX call. This function runs after the AJAX call has finished:
$.ajax({
    complete: function(result) {
        // code to perform the draw
    }
});
Try setTimeout (reference here http://www.w3schools.com/js/js_timing.asp)
Suggestion
My suggestion is to NOT delay it at all. As chockleyc said, waiting for the response is the best option.
Cases
There are the following possible scenarios:
You make the GET request manually
You don't make the request manually; it is simply loaded with the page
You insist on using a timer
If you are making the GET query yourself, my strong recommendation is to use a Promise, like this:
var getRequest = new Promise(function (resolve, reject) {
    // We call resolve(...) when the async request succeeds, and reject(...) when it fails.
    var oReq = new XMLHttpRequest();
    oReq.onload = function (e) {
        resolve(oReq.response);
    };
    oReq.onerror = function (e) {
        reject(e);
    };
    oReq.open("GET", "www.bananas.com");
    oReq.send();
});
and then you would use it like:
getRequest.then(console.log);
Which in this case would print the response. If you are not familiar with Promises, I recommend the MDN Promises documentation.
Another alternative is to simply use XMLHttpRequest directly, without promises. You can read more examples in the MDN documentation as well, or, if that is too confusing for you, give the kirupa tutorial a try.
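A small sketch of that promise-free, callback-based alternative. getUrl is a hypothetical helper, and the optional Transport parameter exists only so the wrapper can be exercised without a browser; by default it uses the real XMLHttpRequest:

```javascript
// Wrap XMLHttpRequest behind a Node-style (err, result) callback.
function getUrl(url, callback, Transport) {
    var XhrCtor = Transport || XMLHttpRequest;
    var req = new XhrCtor();
    req.onload = function () { callback(null, req.response); };
    req.onerror = function () { callback(new Error("request failed")); };
    req.open("GET", url);
    req.send();
}

// usage (in a browser):
// getUrl("www.bananas.com", function (err, res) {
//     if (err) { console.error(err); return; }
//     console.log(res);
// });
```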
You don't make the GET request manually
In this case, I recommend you listen for the GET request, and then perform a specific action once its response arrives. A good solution for this can be found in this answer:
https://stackoverflow.com/a/3597640/1337392
There you will find a mini library that listens to ALL GET requests. This way, every time a response arrives, you can filter for the one you want and execute your code.
If the previous code is too complex for you, you can also have a look at this much simpler alternative:
Listen to server response JavaScript
You really insist on using a timer
This is the worst solution by far. Why? Try answering these questions:
What happens if the image doesn't arrive before you expect? Instead of 1 second, what if it takes 2?
How do you know for sure exactly how long the GET request will take? Will you make a median of 100 requests? What if you get that one request that is outside the median?
Assuming you always wait the longest possible time (to ensure everything works) why should the majority of your users have a weaker experience?
But if you insist on it, then both answers from "chockleyc" and from "Johannes P" should clarify all your questions.
That's all for now, I hope it helps !
I'm writing a browser extension for Chrome that uses Web SQL for local storage. The code for both of these components relies heavily on async operations. I have a good understanding of asynchronous operations, but not a whole lot of experience writing code that relies on them this heavily.
For Example:
var CSUID = "";

// this is an async callback for handling browser tab updates
function checkForValidUrl(tabId, changeInfo, tab) {
    getCookies("http://www.cleansnipe.com/", "CSUID", handleCookie);
    if (CSUID != "") { // this could be in handleCookie if I could access the tab
        // do stuff with the tab
    }
}

function handleCookie(cookie) {
    if (cookie != "" && cookie != null) {
        CSUID = cookie;
    }
}
To overcome the inability to pass variables into (or return them from) these handlers, I find myself creating global variables and setting them in the handlers. Of course this doesn't work as expected, because the variable is often accessed before the callback has executed.
What is the best practice for handling this situation? I thought of using global flags/counters with a while loop to pause execution, but that seems messy and prone to hanging the application.
If jQuery is an option, it has a beautiful system of what it calls deferred objects. It allows for graceful and effective management of asynchronous situations, or indeed synchronous ones, as the case may be.
(Deferreds aren't limited to jQuery, but jQuery has a nice API for them.)
Here's a simple example purely to demonstrate the concept.
//get-data function
function get_data(immediate) {
    //if immediate, return something synchronously
    if (immediate)
        return 'some static data';
    //else get the data from something over AJAX
    else
        return $.get('some_url');
}

//two requests to get data - one synchronous, one asynchronous
var data1 = get_data(true), data2 = get_data();

//do something when both are resolved
$.when(data1, data2).done(function(data1, data2) {
    //callback code here...
});
Deferreds don't have to involve AJAX; you can create your own deferred objects (jQuery's AJAX requests automatically create and return them) and resolve/reject them manually. I wrote a blog post on this a few months back that might help.
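To illustrate the manual resolve idea without depending on jQuery, here is a stripped-down sketch of a deferred. makeDeferred is a hypothetical name; jQuery's $.Deferred() offers this resolve/done shape plus much more (reject, progress, chaining):

```javascript
// A minimal deferred: an object you can hand out now and resolve later.
// Callbacks registered before resolution are queued; callbacks registered
// after resolution fire immediately with the stored value.
function makeDeferred() {
    var d = { state: "pending", value: undefined, callbacks: [] };
    d.done = function (fn) {
        if (d.state === "resolved") fn(d.value);
        else d.callbacks.push(fn);
        return d; // allow chaining, jQuery-style
    };
    d.resolve = function (value) {
        if (d.state !== "pending") return; // resolving twice is a no-op
        d.state = "resolved";
        d.value = value;
        d.callbacks.forEach(function (fn) { fn(value); });
    };
    return d;
}
```

In the cookie example above, handleCookie would call d.resolve(cookie), and the tab-update handler would attach its tab-specific work with d.done(...) instead of polling a global.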
I have a series of consecutively-numbered pages (URLs like http://example.com/book/1, http://example.com/book/2, etc.), but I have no way of knowing in advance how many pages there are. I need to retrieve (a particular part of) each page, keep the obtained info in order, miss no pages, and request a minimal number of null pages.
Currently, I have a recursive asynchronous function which is a bit like this:
pages = []

getPage = (page = 1) ->
  xhr.get "http://example.com/book/#{page}", (response) ->
    if isValid response
      pages.push response
      getPage page + 1
    else
      event.trigger "haveallpages"

getPage()
xhr.get and event.trigger are pseudo-code (currently jQuery methods, but that may change). isValid is also pseudo-code; in reality the test is defined within the function, but it's complex and not relevant to the question.
This works well but is slow as only one request is processed at a time. What I'm looking for is a way to make better use of the asynchronous nature of XHRs and retrieve the complete list in less time. Is there a pattern which could help me here? Or a better algorithm?
Just fire simultaneous requests while keeping count of them. There is no need to guess the upper bound; simply stop when requests start to fail, as in your original code.
This will generate at most concurrency-1 wasted requests:
pages = []
concurrency = 5
currentPage = 0
haveAllPages = false

getPage = (p) ->
  xhr.get "http://example.com/book/#{p}", (response) ->
    if isValid response
      pages[p - 1] = response  # store by page index so order is preserved
      getPage ++currentPage if not haveAllPages
    else
      haveAllPages = true

while concurrency--
  getPage ++currentPage
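The same idea sketched in JavaScript, with responses stored by page index so they stay ordered even when they arrive out of order. getAllPages and fetchPage are hypothetical names; fetchPage stands in for the real XHR and should call back with null for a missing page:

```javascript
// Fetch consecutive pages with up to `concurrency` requests in flight.
// Stops launching new requests once a page comes back missing, then
// calls onDone with the valid pages in page order.
function getAllPages(fetchPage, concurrency, onDone) {
    var pages = [];
    var nextPage = 1;
    var inFlight = 0;
    var haveAllPages = false;
    var notified = false;

    function launch() {
        var p = nextPage++;
        inFlight++;
        fetchPage(p, function (response) {
            inFlight--;
            if (response !== null) {
                pages[p - 1] = response; // index by page number, not arrival order
                if (!haveAllPages) launch();
            } else {
                haveAllPages = true;
            }
            if (haveAllPages && inFlight === 0 && !notified) {
                notified = true;
                // drop holes left by wasted requests past the last real page
                onDone(pages.filter(function (x) { return x !== undefined; }));
            }
        });
    }

    while (concurrency--) launch();
}
```

As with the CoffeeScript version, at most concurrency-1 requests are wasted past the end of the book.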
I have a simple Javascript function:
makeRequest();
It does a bunch of stuff and places a bunch of content into the DOM.
I make a few calls like so:
makeRequest('food');
makeRequest('shopping');
However, they both fire so quickly that they step on each other's toes. Ultimately I need the following behaviour:
makeRequest('food');
// wait...
makeRequest('shopping'); // only once makeRequest('food') has finished
Any thoughts on getting these to execute only one at a time?
Thanks!
If these functions actually make an AJAX request, you are better off keeping them asynchronous. You can make a synchronous AJAX request, but it will stop the browser from responding and lead to a bad user experience.
If what you require is that these AJAX requests are made one after the other because they depend on each other, you should check whether your function provides a callback mechanism:
makeRequest('food', function() {
    // called when food request is done
    makeRequest('shopping');
});
Using jQuery, it looks something like this:
$.get("/food", function(food) {
    // do something with food
    $.get("/shopping", function(shopping) {
        // do something with shopping
    });
});
I would recommend that you simply write them asynchronously--for example, call makeRequest('shopping'); from the AJAX completion handler of the first call.
If you do not want to write your code asynchronously, see Javascript Strands
I suppose you have a callback method that takes care of the response to the request? Once it has done that, let it make the next request.
Declare an array for the queue, and a flag to keep track of the status:
var queue = [], requestRunning = false;
In the makeRequest method:
if (requestRunning) {
    queue.push(requestParameter);
} else {
    requestRunning = true;
    // do the request
}
In the callback method, after taking care of the response:
if (queue.length > 0) {
    var requestParameter = queue.shift();
    // do the request
} else {
    requestRunning = false;
}