ExtJS: two parallel AJAX calls - JavaScript

My code fires two AJAX calls at the same time (I assume the parallelism is more efficient). I want to load a table once both calls succeed. What's the proper way of doing this?

var succeeded = {};
function callBackOne() {
    succeeded.one = true;
    // your other stuff
    if (succeeded.two) { bothHaveSucceeded(); }
}
function callBackTwo() {
    succeeded.two = true;
    // your other stuff
    if (succeeded.one) { bothHaveSucceeded(); }
}

I'd use a delayed task personally:
var success = {
one: false,
two: false
};
// Task
var task = new Ext.util.DelayedTask(function(){
// Check for success
if (success.one && success.two) {
// Callback
doCallback();
} else {
task.delay(500);
}
});
task.delay(500);
// First
Ext.Ajax.request({
...
success: function() {
success.one = true;
}
...
});
// Second
Ext.Ajax.request({
...
success: function() {
success.two = true;
}
...
});
The task acts like a polling thread: it checks the status of the requests and sleeps for 500ms between checks until they both complete.

Old question, but well, as I stumbled upon it...
I'd use the excellent async library by Caolan; in particular, here you'll want async.parallel.
The examples written on the GitHub doc are worth a read.
https://github.com/caolan/async#parallel
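As a minimal sketch of how async.parallel could wrap the two Ext.Ajax requests (the URLs and the loadTable callback are placeholders, not part of the original question):

// Sketch only: assumes the async library is loaded; URLs and loadTable are placeholders.
async.parallel([
    function (done) {
        Ext.Ajax.request({
            url: '/first/call',
            success: function (response) { done(null, response); },
            failure: function (response) { done(response); }
        });
    },
    function (done) {
        Ext.Ajax.request({
            url: '/second/call',
            success: function (response) { done(null, response); },
            failure: function (response) { done(response); }
        });
    }
], function (err, results) {
    if (!err) {
        loadTable(results[0], results[1]); // both requests succeeded
    }
});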

Share an integer variable that each callback checks:
// count variable
var numReturns = 0;
// same callback used for each Ajax request:
function callback() {
    numReturns++;
    if (numReturns === 2) {
        progress();
    }
}
If you need different callbacks, have each callback fire an event which does the same thing.
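If the two requests need different handlers, a rough sketch (the per-request handler names here are made up for illustration) is to have each callback do its own work and then bump the shared counter:

var numReturns = 0;

function checkDone() {
    numReturns++;
    if (numReturns === 2) {
        progress(); // both requests have finished
    }
}

// hypothetical per-request callbacks
function callbackOne(response) {
    // handle the first response here
    checkDone();
}

function callbackTwo(response) {
    // handle the second response here
    checkDone();
}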

Related

Timing a set of functions with asynchronous subroutines

I have two functions periodically called via setInterval. The goal is to defer Function B until Function A is done (and vice versa). Currently, Function A will start, complete some of its subroutines, but not reach the end before Function B begins.
I've tried passing Function B as an argument of Function A. I am not sure if that was sufficient to create a callback. I also tried jQuery's $.when(setInterval(functionA, 10000)).then(setInterval(functionB, 5000)).
How do I ask JavaScript to wait for functions/blocks of code to finish? Thank you in advance.
Edit: Below is code very similar to my original. Sorry for not being concise.
Function A, getFruits(): There is a remote JSON that changes on its own (fruits.json). getFruits() does two things: 1) It empties an array, [allFruits] (just in case); 2) It adds all the names of fruit currently in the remote JSON to [allFruits]. Now, [allFruits] is an instanced copy of the remote JSON. Before this question, I only called getFruits() once, at startup; in other words, I did not use setInterval for getFruits().
Function B, checkFruits(): Now checkFruits() periodically (setInterval(checkFruits, 5000)) compares [allFruits] to the remote version. If any fruit was added to the remote version, checkFruits appends [allFruits] with those fruits' names; it also runs useful code (i.e. pushes the new names to an array [queue]).
For this implementation, it is important to create an initial list so only new (post-startup) fruit trigger the useful code of checkFruits(). Moreover, it is important only to add (never subtract) names from [allFruits] within a session. This is to prevent a new fruit from triggering the useful code more than once per session.
Problem: Now I want to make getFruits() (Function A) periodic. Because getFruits() empties [allFruits], it will allow the names that built up to again trigger useful code (but only once in between invocations of getFruits()). However, when I use setInterval(getFruits, 10000), there are times (in this example, always) when getFruits() overlaps with checkFruits(). When that happens, I notice only part of getFruits() finishes before checkFruits() starts. The console.log() messages appear in this order: 'getFruits() start:', 'checkFruits():', 'getFruits() end:'. Furthermore, my useful code is run before getFruits() finishes (this is what is really undesired), and [allFruits] gets duplicates. This would not occur if getFruits() completely finished before checkFruits() jumped in.
debugging = true;
var debug = function() {
if (debugging){
console.log.apply(console, arguments)
};
}
var allFruits = [];
var queue = [];
var getFruits = function() {
allFruits = []; // Empty the list
debug('getFruits() start:', 'allFruits =', allFruits, 'queue =', queue);
$.ajax({
url: 'fruits.json',
dataType: 'json',
success: function(data) {
data.fruits.forEach(function(element) {
allFruits.push(element.name);
});
debug('getFruits() end:', 'data =', data, 'allFruits =', allFruits, 'queue =', queue);
},
});
}
var checkFruits = function() {
$.ajax({
url: 'fruits.json',
dataType: 'json',
success: function(data) {
data.fruits.forEach(function(element) {
if (allFruits.indexOf(element.name) === -1) {
queue.push(['fruit', element.name]);
allFruits.push(element.name);
}
});
debug('checkFruits():', 'data =', data, 'allFruits =', allFruits, 'queue =', queue);
}
});
}
getFruits();
setInterval(checkFruits, 5000);
// setInterval(getFruits, 10000); // When I try this, checkFruits() does not wait for getFruits() to finish.
The analogy of my actual remote resource is fruits.json. fruits.json can simply be the following:
{"fruits":[{"name":"apple","color":"red"},{"name":"banana","color":"yellow"},{"name":"tangerine","color":"orange"}]}
Again, the actual, remote JSON changes independently.
What you have here are two methods that each do asynchronous stuff. Here are some good Stack Overflow posts on what that means.
Easy to understand definition of "asynchronous event"?
Does async programming mean multi-threading?
Are JavaScript functions asynchronous?
We have no idea how long an asynchronous call will take to finish. In your case, the AJAX request could take up to a few seconds depending on network speeds, so regardless of when each of these methods is executed you CANNOT know which one will finish first.

So what to do? Well, generally when you write or use an asynchronous method (like $.ajax) you give it a callback that will be executed when the asynchronous work is finished, and you have done this in the form of the success callback. And here is the good news: the success callbacks themselves run SYNCHRONOUSLY (note the missing "a"). Because JavaScript runs each callback to completion, the "useful code" in one success callback (so long as none of it is itself async) will finish before the "other useful code" in the other success callback is executed at all. This works no matter which request finishes first; each success callback will always wait for the other.

So I think what was confusing you was your debug statements. If you add the following statements to your code, the execution flow may make more sense:
debugging = true;
var debug = function() {
if (debugging) {
console.log.apply(console, arguments)
};
}
var allFruits = [];
var queue = [];
var getFruits = function() {
debug("getFruits: make request");
$.ajax({
url: 'fruits.json',
dataType: 'json',
success: function(data) {
debug("getFruits: start processing");
allFruits = []; // Empty the list
data.fruits.forEach(function(element) {
allFruits.push(element.name);
});
debug('getFruits: finished processing');
},
});
debug("getFruits: request sent, now we wait for a response.");
}
var checkFruits = function() {
debug("checkFruits: make request");
$.ajax({
url: 'fruits.json',
dataType: 'json',
success: function(data) {
debug("checkFruits: start processing");
data.fruits.forEach(function(element) {
if (allFruits.indexOf(element.name) === -1) {
queue.push(['fruit', element.name]);
allFruits.push(element.name);
}
});
debug("checkFruits: finished processing");
}
});
debug("checkFruits: request sent, now we wait for a response.");
}
getFruits();
setInterval(checkFruits, 5000);
// setInterval(getFruits, 10000); // When I try this, checkFruits() does not wait for getFruits() to finish.
After thinking about it I believe the only reason things may not have been behaving as expected is because you're emptying the allFruits array outside of the callback. If you move it as I have done I would think everything should work fine.
Now, I don't know why you need to re-initialize the data, since each time you make the request you're getting the latest information, but let's roll with it. Since both methods make the same request, let's consolidate that into a single method. No need to duplicate code ;). And since all of your examples have getFruits running half as often as checkFruits, we could easily add a counter to accomplish the same sequence of events, like so:
debugging = true;
var debug = function() {
if (debugging) {
console.log.apply(console, arguments)
};
}
var allFruits = [];
var queue = [];
var count = 0;
var doOneThing = function(data) {
//do stuff
}
var doAnotherThing= function(data) {
//do other stuff
}
var requestFruits = function() {
$.ajax({
url: 'fruits.json',
dataType: 'json',
success: function(data) {
// if count is even... do this every other time
if (count % 2 === 0) {
doOneThing(data);
}
// increment the counter on every call (outside the if, so the check alternates)
count++;
// do this every time
doAnotherThing(data);
},
});
}
setInterval(requestFruits, 5000);
Hope this helps. Cheers.
Your last code example first executes setInterval(functionA, 10000), and once the deferred execution of functionA is set up, executes setInterval(functionB, 5000), meaning that functionB will be called about 5 seconds after that line is executed, while functionA is called about 10 seconds after.
Edit to reflect your additional information:
setInterval(function(){
functionA();
functionB();
}, 10000)
setTimeout(function(){
setInterval(functionB, 10000)
}, 5000)
This is a crude answer. I sense that callbacks can achieve this, but I am not sure how to code them, especially involving setInterval.
I create two global variables, getFruitsIsBusy = false and checkFruitsIsBusy = false, and add an if check to both getFruits() and checkFruits(). Here is getFruits():
var getFruits = function() {
if (checkFruitsIsBusy) { // New
setTimeout(getFruits, 100); // New
return; // New
} else { // New
getFruitsIsBusy = true // New
allFruits = []; // Empty the list
debug('getFruits() start:', 'allFruits =', allFruits, 'queue =', queue);
$.ajax({
url: 'fruits.json',
dataType: 'json',
success: function(data) {
data.fruits.forEach(function(element) {
allFruits.push(element.name);
});
getFruitsIsBusy = false // New; in the success function
debug('getFruits() end:', 'data =', data, 'allFruits =', allFruits, 'queue =', queue)
},
});
}
}
If checkFruits() uses the same paradigm, it seems both functions will wait for each other to finish.
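For completeness, a sketch of a matching checkFruits() under the same assumptions (the busy flags are the globals described above, and the AJAX handling mirrors the original code):

var checkFruits = function() {
    if (getFruitsIsBusy) {            // wait while getFruits() is running
        setTimeout(checkFruits, 100); // try again shortly
        return;
    }
    checkFruitsIsBusy = true;
    $.ajax({
        url: 'fruits.json',
        dataType: 'json',
        success: function(data) {
            data.fruits.forEach(function(element) {
                if (allFruits.indexOf(element.name) === -1) {
                    queue.push(['fruit', element.name]);
                    allFruits.push(element.name);
                }
            });
            checkFruitsIsBusy = false; // release the flag once processing is done
        }
    });
};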
Based on an analysis of the timing of two functions (A and B), consider the following solution (Chionglo, 2016):
Keep state information for each of function A and function B. The state of each function should be set within each of the respective functions.
Create a wrapper function for each of function A and function B. The wrapper function calls on the respective function, and then checks for the state of the respective function.
a. The check in wrapper function A: if function A has reached its final state, clear the interval associated with wrapper function A and schedule an interval for wrapper function B.
b. The check in wrapper function B: if function B has reached its final state, clear the interval associated with wrapper function B.
To begin the process, schedule an interval for wrapper function A.
Sample code:
var ac = Math.round(4*Math.random())+4;
var bc = Math.round(6*Math.random())+6;
var ai;
var Astate = false;
var Bstate = false;
function A() {
// Do your thing for A here.
// The following changes the “state of A” and then determines if the final state has been reached.
ac -= 1;
if (ac<1) Astate = true;
else Astate = false;
}
function B() {
// Do your thing for B here.
// The following changes the “state of B” and then determines if the final state has been reached.
bc -= 1;
if (bc<1) Bstate = true;
else Bstate = false;
}
ai = setInterval("processA()", 1000);
function processA() {
A();
if (Astate) {
clearInterval(ai);
ai = setInterval("processB()", 500);
}
}
function processB() {
B();
if (Bstate) {
clearInterval(ai);
ai = undefined;
}
}
Reference
Chionglo, J. F. (2016). An analysis for timing a set of functions. Available at http://www.aespen.ca/AEnswers/1458200332.pdf.

jQuery Deferred/Promises dynamic array not executing callbacks in correct order

Grateful for any insight into what I'm misunderstanding here. My requirement is as follows:
I have an array of URLs. I want to fire off an AJAX request for each URL simultaneously, and as soon as the first request completes, call the first callback. Then, if and when the second request completes, call that callback, and so on.
Option 1:
for (var i = 0; i < myUrlArray.length; i++) {
$.ajax({
url: myUrlArray[i]
}).done(function(response) {
// Do something with response
});
}
Obviously this doesn't work, as there is no guarantee the responses will complete in the correct order.
Option 2:
var promises = [];
for (var i = 0; i < myUrlArray.length; i++) {
promises.push($.ajax({
url: myUrlArray[i]
}));
}
$.when.apply($, promises).then(function() {
// Do something with each response
});
This should work, but the downside is that it waits until all AJAX requests have completed, before firing any of the callbacks.
Ideally, I should be able to call the first callback as soon as it's complete, then chain the second callback to execute whenever that response is received (or immediately if it's already resolved), then the third, and so on.
The array length is completely variable and could contain any number of requests at any given time, so just hard coding the callback chain isn't an option.
My attempt:
var promises = [];
for (var i = 0; i < myUrlArray.length; i++) {
promises.push($.ajax({
url: myUrlArray[i] // Add each AJAX Deferred to the promises array
}));
}
(function handleAJAX() {
var promise;
if (promises.length) {
promise = promises.shift(); // Grab the first one in the stack
promise.then(function(response) { // Set up 'done' callback
// Do something with response
if (promises.length) {
handleAJAX(); // Move onto the next one
}
});
}
}());
The problem is that the callbacks execute in a completely random order! For example, if I add 'home.html', 'page2.html', 'page3.html' to the array, the order of responses won't necessarily be 'home.html', 'page2.html', 'page3.html'.
I'm obviously fundamentally misunderstanding something about the way promises work. Any help gratefully appreciated!
Cheers
EDIT
OK, now I'm even more confused. I made this JSFiddle with one array using Alnitak's answer and another using JoeFletch's answer and neither of them work as I would expect! Can anyone see what is going on here?
EDIT 2
Got it working! Based on JoeFletch's answer below, I adapted the solution as follows:
var i, responseArr = [];
for (i = 0; i < myUrlArray.length; i++) {
responseArr.push('0'); // <-- Add 'unprocessed' flag for each pending request
(function(ii) {
$.ajax({
url: myUrlArray[ii]
}).done(function(response) {
responseArr[ii] = response; // <-- Store response in array
}).fail(function(xhr, status, error) {
responseArr[ii] = 'ERROR';
}).always(function(response) {
for (var iii = 0; iii < responseArr.length; iii++) { // <-- Loop through entire response array from the beginning
if (responseArr[iii] === '0') {
return; // As soon as we hit an 'unprocessed' request, exit loop
}
else if (responseArr[iii] !== 'done') {
$('#target').append(responseArr[iii]); // <-- Do actual callback DOM append stuff
responseArr[iii] = 'done'; // <-- Set 'complete' flag for this request
}
}
});
}(i)); // <-- pass current value of i into closure to encapsulate
}
TL;DR: I don't understand jQuery promises, got it working without them. :)
Don't forget that you don't need to register the callbacks straight away.
I think this would work, the main difference with your code being that I've used .done rather than .then and refactored a few lines.
var promises = myUrlArray.map(function(url) {
return $.ajax({url: url});
});
(function serialize() {
var def = promises.shift();
if (def) {
def.done(function() {
callback.apply(null, arguments);
serialize();
});
}
})();
Here's my attempt at solving this. I updated my answer to include error handling for a failed .ajax call. I also moved some code to the complete method of the .ajax call.
var urlArr = ["url1", "url2"];
var responseArr = [];
for(var i = 0; i < urlArr.length; i++) {
responseArr.push("0");//0 meaning unprocessed to the DOM
}
$.each(urlArr, function(i, url){
$.ajax({
url: url,
success: function(data){
responseArr[i] = data;
},
error: function (xhr, status, error) {
responseArr[i] = "Failed Response";//enter whatever you want to place here to notify the end user
},
complete: function() {
$.each(responseArr, function(i, element){
if (responseArr[i] == "0") {
return;
}
else if (responseArr[i] != "done")
{
//do something with the response
responseArr[i] = "done";
}
});
}
});
})
Asynchronous requests aren't guaranteed to finish in the same order that they are sent. Some may take longer than others depending on server load and the amount of data being transferred.
The only options are either to wait until they are all done, only send one at a time, or just deal with them being called possibly out of order.
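If sending them one at a time is acceptable, a minimal sketch of that option (assuming the same myUrlArray, with a hypothetical handleResponse callback standing in for the per-response work) would be:

// Sequential variant: each request starts only after the previous one finishes.
(function requestNext(i) {
    if (i >= myUrlArray.length) {
        return; // all done
    }
    $.ajax({ url: myUrlArray[i] }).always(function(response) {
        // handle this response, then move on to the next URL
        handleResponse(response); // hypothetical callback
        requestNext(i + 1);
    });
})(0);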

Any way to do a synchronous PageMethods call?

I'm trying to do this:
function DelBatch()
{
    var userInfo = get_cookie("UserInfo");
    PageMethods.DeleteBatchJSWM(userInfo, function(result)
    {
        window.location = "BatchOperations.aspx";
    });
}
But it still runs asynchronously. I need the browser to actually wait until my code-behind has finished executing; only then should it refresh.
There's a listbox loaded with values that were just deleted from the database; they shouldn't be visible. The problem I have is that the window location refreshes before the code-behind has finished executing, so to the user nothing appears to have been deleted.
Call it using jQuery ajax instead? It features an option (async) where you can select sync/async mode: http://api.jquery.com/jQuery.ajax/
This excellent article tells you how best to call PageMethods from jQuery: http://encosia.com/using-jquery-to-directly-call-aspnet-ajax-page-methods/
Essentially, all you will need to do is this:
$.ajax({
type: "POST",
async: false,
url: "yourpage.aspx/DeleteBatchJSWM",
data: "{ put json representation of userInfo here }",
contentType: "application/json; charset=utf-8",
dataType: "json",
success: function(msg) {
window.location = "BatchOperations.aspx";
}
});
Look at Crockford's JSON stringify for a JSON formatting solution.
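For instance, a sketch of building the request body with JSON.stringify (the userInfo key is an assumption; it must match the parameter name of the DeleteBatchJSWM web method):

var userInfo = get_cookie("UserInfo");

$.ajax({
    type: "POST",
    async: false,
    url: "yourpage.aspx/DeleteBatchJSWM",
    // JSON.stringify builds the JSON body; "userInfo" is assumed to match
    // the code-behind parameter name.
    data: JSON.stringify({ userInfo: userInfo }),
    contentType: "application/json; charset=utf-8",
    dataType: "json",
    success: function (msg) {
        window.location = "BatchOperations.aspx";
    }
});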
If you want to avoid using jQuery, a workaround would be to use another PageMethod in which you check the status of the operation using the JavaScript setInterval function. It is a little messy, but it does the job if you want zero jQuery, and it mimics the synchronicity you seek. I use it for large operations when I want to report progress to the client, for example with a progress bar. Here is an example of how you would do this given the code you posted:
function DelBatch()
{
var userInfo = get_cookie("UserInfo");
PageMethods.DeleteBatchJSWM(userInfo, function(result) {window.location = "BatchOperations.aspx";});
var status;
//Check to see if it has completed every second
var myInterval = setInterval(function ()
{
PageMethods.CheckDeleteBatchStatus(OnSuccess);
if (status == "Finished")
{
clearInterval(myInterval);
//Finished Deleting. Call your window refresh here
WindowRefresh();
}
}, 1000);
function OnSuccess(result)
{
status = result;
}
}
Code Behind:
[WebMethod]
public static string CheckDeleteBatchStatus()
{
string status = GetDeleteBatchStatus(); //some function to get the status of your operation
return status;
}
I came across this site:
http://abhijit-j-shetty.blogspot.com/2011/04/aspnet-ajax-calling-pagemethods.html
that had a great method for handling synchronous PageMethod calls.
The JavaScript code is as follows:
// Make sure page methods operate synchronously
XMLHttpRequest.prototype.original_open = XMLHttpRequest.prototype.open;
XMLHttpRequest.prototype.open = function (method, url, async, user, password) {
async = false;
var eventArgs = Array.prototype.slice.call(arguments);
var q = 0;
return this.original_open.apply(this, eventArgs);
}
// Make a generic WebMethod caller:
function WebMethodCall(FunctionName, callingobj) {
var OnSuccess = function (result, userContext, methodName) {
callingobj.push(result);
}
var OnFailure = function (error, userContext, methodName) {
callingobj.push(error.get_message());
}
PageMethods[FunctionName](OnSuccess, OnFailure);
}
// OK, this is kludgy, but here goes. In order to have a synchronous PageMethod call
// we need an object that persists in the namespace to stuff the result value into (like an array)
// Essentially I'm emulating a ByRef call.
// ThisResult is an empty list. The WebMethodCall function sticks a value into that list.
// The code that makes the PageMethods get called synchronously is in Common.js
// Use the functions
var ThisResult = []; // This must be of a type which persists in the namespace
WebMethodCall('HelloWorld', ThisResult);
return ThisResult[0];
Using jQuery was first recommended back in 2009.
Another (extremely verbose) option is implementing a synchronous WebRequestExecutor as shown here (2007-07-04), and perfected here (2007-10-30). The gist of the technique is to copy the ASP.NET AJAX Sys.Net.XMLHttpExecutor as a new class named Sys.Net.XMLHttpSyncExecutor and change the call to xmlHttpRequest.open to pass false as the last parameter to force synchronous operation.
The synchronous executor can be plugged into all requests using WebRequestManager like this:
Sys.Net.WebRequestManager.set_defaultExecutorType('Sys.Net.XMLHttpSyncExecutor');
or you may want to switch it up per-request just before it is invoked:
Sys.Net.WebRequestManager.add_invokingRequest(function(sender, args) {
if (iFeelLikeRunningThisRequestSynchronously) {
args.get_webRequest().set_executor(new Sys.Net.XMLHttpSyncExecutor());
}});
This discussion is the source for most of these links and a few more.
I wrote this, which lets you call a PageMethod synchronously. It simply returns the result of the method, and throws an error that can be caught in a try/catch block, so you don't need to worry about supplying onSuccess and onError functions.
function synchronousPageMethod(method) {
XMLHttpRequest.prototype.original_open = XMLHttpRequest.prototype.open;
XMLHttpRequest.prototype.open = function (method, url, async, user, password) {
async = false;
var eventArgs = Array.prototype.slice.call(arguments);
return this.original_open.apply(this, eventArgs);
};
var result;
var error;
var args = Array.prototype.slice.call(arguments).slice(1);
args.push(function (res) {
result = res;
});
args.push(function (err) {
error = err;
});
method.apply(null, args);
XMLHttpRequest.prototype.open = XMLHttpRequest.prototype.original_open;
if (error !== undefined) {
throw error;
} else {
return result;
}
}
Use it like this:
try {
var result = synchronousPageMethod(PageMethods.myMethod, argument0, argument1);
console.log(result);
} catch(error) {
console.log(error);
}

Good way of avoiding a second ajax call

So I'm doing an AJAX call in this function, somewhat like this:
function getCount() {
    $.get("/People/getCount", function (data) {
        if (data && data != "") {
            // lots of code in here
        }
    });
}
What I'm doing in another function is making a second call like this:
function worldPeople() {
return $.get("/People/getCount", function (data) {
if (data != 0) {
var target = $("#worldNumbers").find("span");
target.html(data.length).digits();
}
})
}
So I really would like to avoid making that second call. Is there any good way of avoiding that? Maybe do some chaining or such, reusing the callback from the first one? I've heard that it's bad practice to do several calls.
Regards
I'd like to thank all who answered. In the end I did not use any of the solutions; I solved it in another way. I'm sure most of the examples you gave me were really good. I don't know what to do about accepting answers. Accept all or none?! Thanks!
You could create a simple data store:
App.store = function () {
    this.people = null;
    this.count = null;
    var self = this; // keep a reference for use inside the AJAX callback
    this.loadPeople = function () {
        if (self.people === null) {
            $.get("/People/getCount", function (data) {
                if (data != 0) {
                    self.count = data.length;
                    self.people = data;
                }
            });
        }
    };
};
What about storing the count of people in a hidden field, and then checking that field before sending the request?
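A rough sketch of that idea, assuming a hidden input such as <input type="hidden" id="peopleCount"> exists on the page (the #peopleCount id and the useCount handler are made up for illustration):

function getCount() {
    var cached = $('#peopleCount').val();
    if (cached) {
        // reuse the stored count instead of firing a second request
        useCount(cached); // hypothetical handler
        return;
    }
    $.get("/People/getCount", function (data) {
        if (data && data != "") {
            $('#peopleCount').val(data.length); // store for later callers
            useCount(data.length);
        }
    });
}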
You can achieve this by handling your AJAX requests using some sort of cache. I use a cache that saves the information retrieved based on the URL it called. If another function sets off the same request, the cache returns the already fetched data.
What you do need to do as well, though, is check whether the data is outdated so you can refetch it if necessary.
Well, you can just pass the callback function to the function that executes $.get.
Basically you would then do this:
function worldPeople() {
getCountFromServer(function(data){
//do sth with data
});
}
function getCount() {
getCountFromServer(function(data){
//do sth with data
});
}
function getCountFromServer(callback) {
return $.get("/People/getCount", function (data) {
if (data)
callback(data);
});
}
I generally use a caching module pattern for this kind of thing:
// create a quick singleton to store cached data
var People = (function() {
    // private variable to act as cache
    var count;
    // function to get cached data
    // note: You have to assume it's always asynchronous
    function getCount(callback) {
        // have we loaded the data yet?
        if (count === undefined) {
            // cache miss: load the data, store it, do the callback
            $.get("/People/getCount", function (data) {
                count = data;
                callback(data);
            });
        } else {
            // cache hit - no need to reload
            callback(count);
        }
    }
    // provide access to the getter function
    return {
        getCount: getCount
    };
}());
The first time you hit the cache, it'll load from the server; the second time it will load from the private variable.
// will load the data asynchronously
People.getCount(function(count) {
alert("First hit: " + count);
});
// will use the cached data
People.getCount(function(count) {
alert("Second hit: " + count);
});
Depending on the complexity you want to support, you could add additional features like expiring the cache after a particular interval, caching multiple calls (potentially keyed to the AJAX URL), etc. I like to keep the API simple and not reference the AJAX URLs - that way your cache acts like an abstracted service layer, and you can create other cache implementations to work with different data sources - useful for things like stubbing out data before you've implemented your server-side AJAX handlers.
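As a sketch of the URL-keyed, expiring variant (the five-minute expiry and the cache shape are assumptions, not part of the answer above):

var ajaxCache = (function () {
    var store = {};               // url -> { time, data }
    var maxAge = 5 * 60 * 1000;   // assumed five-minute expiry

    function get(url, callback) {
        var entry = store[url];
        if (entry && (Date.now() - entry.time) < maxAge) {
            callback(entry.data);        // cache hit, still fresh
            return;
        }
        $.get(url, function (data) {     // cache miss or stale: reload
            store[url] = { time: Date.now(), data: data };
            callback(data);
        });
    }

    return { get: get };
}());

// usage: repeated calls share one request while the cached copy is fresh
ajaxCache.get("/People/getCount", function (data) { /* ... */ });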

Returning only the latest callback in Javascript/jQuery?

I'm trying to build a simple autocomplete list:
DOM:
<input id="example"/>
<div id="results"></div>
Javascript:
$('#example').keyup(function(e) {
$('#results').empty();
$.getJSON('Search?input=' + $('#example').val(), function(response) {
// This attaches the results to the results div
updateAutocomplete(response);
});
});
This works, except that as the user is typing, I might receive the callbacks in a different order. Is there any way around this? I thought about attaching a timestamp to the results and doing a quick comparison (that way if an earlier response comes later, it'll get rejected). This must be a common problem; what's the best way around it?
You can store and cancel the previous request as you go, like this:
var xhr;
$('#example').keyup(function(e) {
$('#results').empty();
if(xhr) {
xhr.abort();
xhr = null; //cleanup
}
xhr = $.getJSON('Search?input=' + $('#example').val(), function(response) {
updateAutocomplete(response);
});
});
$.getJSON() returns the jqXHR (XMLHttpRequest wrapper) it creates, so we're just hanging onto a reference to it and aborting it if needed.
A delay will be helpful. So, let's see what this post says: jquery keyup delay?
var delay = (function(){
var timer = 0;
return function(callback, ms){
clearTimeout (timer);
timer = setTimeout(callback, ms);
};
})();
Usage:
$('input').keyup(function() {
delay(function(){
alert('Time elapsed!');
}, 1000 );
});
(code by CMS)
I've typically opted for a few things...
Have a small delay (200-250ms) before sending (e.g. queue the queries); if another keystroke comes in, ignore the old queries.
Store the latest keyword you are querying on; in your results, be sure to return the keyword, and before displaying, ensure that the keyword matches the latest (see the sketch below).
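A minimal sketch of the second idea, assuming the server echoes the query term back in its response (the term field is an assumption, not part of the original code):

var latestKeyword = '';

$('#example').keyup(function () {
    latestKeyword = $('#example').val(); // remember the most recent query
    $.getJSON('Search?input=' + latestKeyword, function (response) {
        // only display results that belong to the latest keyword;
        // assumes the server includes the query term in the response
        if (response.term === latestKeyword) {
            updateAutocomplete(response);
        }
    });
});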
Just use a closure for the callback function, with an incrementing identifier for each callback instance. Something like this:
var id = 0;
function next_autocomplete_handler() {
var handler_id = ++id;
return function(response) {
if (handler_id == id) // is this the latest?
updateAutocomplete(response);
};
}
$('#example').keyup(
function(e) {
$('#results').empty();
$.getJSON('Search?input=' + $('#example').val(),
next_autocomplete_handler());
});
