Hello, I have a sequence like this:
$(document).ready(function() {
    $("#ctl00_txtsearch").autocomplete({
        source: function(request, response) {
            $.ajax({
                // Here is the code of the autocomplete, which requests
                // data and binds it as autocomplete suggestions
            });
        }
    });
    var aa = bindonload();
});
Here is the other function, which I want to call on page load:
function bindonload() {
    $.get( "minicart.aspx#mydatacontent", function( data ) {
        var resourceContent = data;
        var mini = $(resourceContent).find('div#pnlminicart');
        $('#smallcart').html(mini);
    });
    return false;
}
So, my actual problem is this: when the page is loaded, bindonload() is called first, and then the autocomplete runs if the textbox has some value, right?
But when the page has loaded and I immediately start typing into the autocomplete textbox, the autocomplete does not work until bindonload() has finished executing.
I have no idea how to handle this. I have used async: true, but it is not working, and I don't want to wait for the second process.
Thanks in advance.
Well, what I guess is that your loaddata() should not be taking too much time to load.
If there is any way to optimize it, look into that.
If one ajax request has a dependency on the other, then you cannot make them parallel.
If you really intend to make parallel ajax requests, you have to make use of the following:
$.when($.ajax("URL1"), $.ajax("URL2"))
.then(myFunc, myFailure);
Hope it helps.
Note: the Ajax calls should not be dependent on each other.
Updated:
$.when( $.ajax( "/page1.php" ), $.ajax( "/page2.php" ) ).done(function( a1, a2 ) {
    // a1 and a2 are arguments resolved for the page1 and page2 ajax requests, respectively.
    // Each argument is an array with the following structure: [ data, statusText, jqXHR ]
    var data = a1[ 0 ] + a2[ 0 ]; // a1[ 0 ] = "Whip", a2[ 0 ] = " It"
    if ( /Whip It/.test( data ) ) {
        alert( "We got what we came for!" );
    }
});
In the above example, you can see that the two ajax requests execute in parallel.
After the two requests are done, i.e. on success of both, the add operation is performed.
Similarly, you can replace $.ajax("/page1.php") with your loaddata() and
$.ajax("/page2.php") with your autocomplete request.
Both of them will execute in parallel.
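As a rough sketch of how that could look with the code from the question (this assumes bindonload() is changed to return the jqXHR from $.get instead of false; fetchAutocompleteData() and searchTerm are hypothetical stand-ins for however the autocomplete lookup is issued):
function bindonload() {
    return $.get("minicart.aspx#mydatacontent", function(data) {
        var mini = $(data).find('div#pnlminicart');
        $('#smallcart').html(mini);
    });
}
$.when( bindonload(), fetchAutocompleteData(searchTerm) ) // fetchAutocompleteData and searchTerm are hypothetical
    .done(function( cartResult, acResult ) {
        // Both requests have finished; each argument is [ data, statusText, jqXHR ]
    });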
Related
I want to implement multiple ajax post requests. Suppose there are 3 posts. The 2nd post depends on the result from the first, and the third post depends on the result received from the 2nd post.
Where do I place the 2nd ajax post? Should it be done in the success handler?
jQuery.ajax({
    type : "post",
    dataType : "json",
    url : ajaxurl,
    data : form_data,
    async: false,
    success: function(response) {
        //2nd ajax post call to be placed here?
    }
});
//or should the 2nd ajax post call be placed after this?
I have seen some people also using jQuery.when(), but I am not sure whether I could use that, since here I would have to check the when condition 3 times.
Thanks in advance.
Something like this?
From https://api.jquery.com/jQuery.when/
$.when( $.ajax( "/page1.php" ), $.ajax( "/page2.php" ) ).done(function( a1, a2 ) {
    // a1 and a2 are arguments resolved for the page1 and page2 ajax requests, respectively.
    // Each argument is an array with the following structure: [ data, statusText, jqXHR ]
    var data = a1[ 0 ] + a2[ 0 ]; // a1[ 0 ] = "Whip", a2[ 0 ] = " It"
    if ( /Whip It/.test( data ) ) {
        alert( "We got what we came for!" );
    }
});
a1 and a2 are the results returned from the respective callbacks.
(This will, however, execute your three calls asynchronously, but return the results of all three.)
Otherwise, if you've got a dependency from request1 to request2, you can do something like this (see https://api.jquery.com/jQuery.ajax/):
$.ajax("page1.php").done(function(a1) {
if (a1 == "something") { // if 2nd call dependent on results from 1st
$.ajax("page2.php").done(function(a2) {
}).fail(function() {
// handle with grace
});
}
}).fail(function() {
// handle with grace
});
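For three posts where each depends on the previous response, a chained sketch along these lines is also possible (jQuery 1.8+; the URLs /step1.php, /step2.php, /step3.php and the id fields are hypothetical stand-ins):
jQuery.ajax({ type: "post", dataType: "json", url: "/step1.php", data: form_data })
    .then(function( r1 ) {
        // build the 2nd post from the 1st response (r1.id is assumed for illustration)
        return jQuery.ajax({ type: "post", dataType: "json", url: "/step2.php", data: { id: r1.id } });
    })
    .then(function( r2 ) {
        // build the 3rd post from the 2nd response
        return jQuery.ajax({ type: "post", dataType: "json", url: "/step3.php", data: { id: r2.id } });
    })
    .done(function( r3 ) {
        // all three posts have completed, in order
    })
    .fail(function() {
        // a failure anywhere in the chain lands here
    });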
I have a JavaScript/jQuery script that makes multiple getJSON requests to different APIs, which goes something like this:
var BTC_Value = 0;
var LTC_Value = 0;
var loadCoinValues = function() {
    $.getJSON( "http://data.mtgox.com/api/2/BTCUSD/money/ticker_fast", function( info ) {
        BTC_Value = info.data['last_local']['value'];
    });
    $.getJSON( "https://btc-e.com/api/2/ltc_usd/ticker", function( info ) {
        LTC_Value = info.ticker['avg'];
    });
};
loadCoinValues();
$("h1").text(BTC_Value); //This returns the correct value.
$("h2").text(LTC_Value); //This returns nothing.
Why does the second getJSON not display a value? Is there a rule I do not know about affecting the results of my code?
$.getJSON is an asynchronous call. You should do something like this instead:
$.getJSON(... ,function(info) {
$('h1').text(info.data['last_local']['value']);
});
When you do $('h1').text(BTC_Value);, BTC_Value doesn't have the value you want yet.
When the AJAX request completes it does, but not before.
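If you want to update both headings only once both responses have arrived, a sketch with $.when (jQuery 1.5+, using the URLs from the question) could look like this:
$.when(
    $.getJSON( "http://data.mtgox.com/api/2/BTCUSD/money/ticker_fast" ),
    $.getJSON( "https://btc-e.com/api/2/ltc_usd/ticker" )
).done(function( btc, ltc ) {
    // each argument is [ data, statusText, jqXHR ]
    $("h1").text( btc[0].data['last_local']['value'] );
    $("h2").text( ltc[0].ticker['avg'] );
});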
I am using jQuery UI autocomplete for a city search on my site. It starts to search after the user has entered 3 characters.
I'm wondering how to change this script to abort the last query if the user continues typing.
function enableCitiesAutocomplete() {
    var url = LIST.urls.api_cities_search;
    $('#txt_search_city').autocomplete({
        source: url,
        minLength: 3,
        select: function( event, ui ) {
            $(this).val( ui.item.value );
            $( "#id_city" ).val( ui.item.id );
            $(this).closest('form').submit();
        }
    });
}
I don't know if you can "abort" a completion that has already begun. But you may be able to get what you want by interrupting the render part of the autocomplete. This might work if the autocomplete consists of a query and a render, and the query involves a network transaction.
If there's a query and a render, and the query is just a search through a locally-stored list, then this probably won't work, because the latency is too low.
This SO question describes how to monkey-patch the renderMenu and renderItem fns in the jQuery autocomplete package. I'm thinking you can use the same monkey-patch approach.
You'd need to specify a function, not a URL, as the source; this SO question describes how. Then, in that function, increment a count variable that says "a search is in progress" and perform the search with an ajax GET. Also patch renderMenu to (a) only render if no newer search is still in progress, and (b) decrement the search count.
It will probably look something like this:
var queriesInProcess = 0;
var ac = $('#txt_search_city').autocomplete({
    minLength: 3,
    select : .... ,
    // The source option can be a function that performs the search,
    // and calls a response function with the matched entries.
    source: function(req, responseFn) {
        var url = baseUrl + req.term;
        queriesInProcess++;
        $.ajax({
            url: url,
            //data: data,
            success: function(json, xhr) {
                queriesInProcess--;
                var a = ....retrieve from the json ....
                responseFn( a );
            },
            error : function () { queriesInProcess--; },
            dataType: 'json'
        });
    }
});
ac.data( "autocomplete" )._renderMenu = function( ul, items ) {
    if (queriesInProcess > 0) return;
    var self = this;
    $.each( items, function( index, item ) {
        self._renderItem( ul, item );
    });
};
If you want to get more sophisticated, you can also try to abort the pending ajax request.
This would need to happen in the source function: you'd cache/stash the jqXHR returned from the $.ajax() call and call abort() on it when a new query comes through, roughly like the sketch below.
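A minimal sketch of that idea, reusing the hypothetical baseUrl from the example above:
var pendingXhr = null;
// A source function that cancels the lookup that is still in flight
// before issuing a new one.
function citySource(req, responseFn) {
    if (pendingXhr) {
        pendingXhr.abort(); // the aborted request's error callback fires with status 'abort'
    }
    pendingXhr = $.ajax({
        url: baseUrl + req.term,
        dataType: 'json',
        success: function(json) {
            pendingXhr = null;
            responseFn(json);
        },
        error: function(xhr, status) {
            pendingXhr = null;
        }
    });
}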
As a learning exercise I've hacked together a script for an SO feature request (for the purposes of this question please ignore the merits or otherwise of that request). In the script I've encountered a technical issue that my limited javascript knowledge can't get past and I'd appreciate suggestions on how to resolve it.
To avoid spamming the server I use some search hacks to determine the number of answers and accepted answers for a tag. This involves using window.setTimeout() to callback to a function that sends a get request for each tag, increasing the timeout on each call to stagger the requests.
To get the results in a single request involves appending &pagesize=1 to the end of the url in the get request, so that the number of pages in the results gives you the total number of results without having to make any further requests.
A side effect of this approach is that subsequent page views use &pagesize=1 and I only see a single entry. I attempt to resolve this by firing another query with &pagesize=30 to reset it afterwards, but as it is all asynchronous, the timing of the last query can result in the pagesize being either 1 or 30, depending on which request completes first. I've tried adding a further timeout and callback for this "reset" query, but it hasn't really helped.
Is there a way to monitor the queries, wait until all of them have completed, and only then send the reset request? Or is there another approach that I could take?
You could make a call chain
Based on my previous idea of a ParallelAjaxExecuter, here's a SerialAjaxExecuter
$(function(){
    var se = new SerialAjaxExecuter( function( results )
    {
        console.log( results );
    }, 1000 );
    se.addRequest( $.get, 'test.php', {n:1}, function( d ){ console.log( '1 done', d ); }, 'text' );
    se.addRequest( $.get, 'test.php', {n:2}, function( d ){ console.log( '2 done', d ); }, 'text' );
    se.addRequest( $.get, 'test.php', {n:3}, function( d ){ console.log( '3 done', d ); }, 'text' );
    se.addRequest( $.get, 'test.php', {n:4}, function( d ){ console.log( '4 done', d ); }, 'text' );
    se.execute();
});
var SerialAjaxExecuter = function( onComplete, delay )
{
    this.requests = [];
    this.results = [];
    this.delay = delay || 1;
    this.onComplete = onComplete;
}
SerialAjaxExecuter.prototype.addRequest = function( method, url, data, callback, format )
{
    var self = this;
    this.requests.push( {
        "method"   : method
        , "url"      : url
        , "data"     : data
        , "format"   : format
        , "callback" : callback
    } );
    var numRequests = this.requests.length;
    if ( numRequests > 1 )
    {
        // Wrap the previous request's callback so that, once it completes,
        // it schedules the next request after the configured delay.
        this.requests[numRequests-2].callback = function( nextRequest, completionCallback )
        {
            return function( data )
            {
                completionCallback( data );
                setTimeout( function(){ self.execute( nextRequest ); }, self.delay );
            }
        }( this.requests[numRequests-1], this.requests[numRequests-2].callback )
    }
}
SerialAjaxExecuter.prototype.execute = function( request )
{
    var self = this;
    if ( 'undefined' == typeof request )
    {
        // First call: start with the first request and wrap the last request's
        // callback so it fires onComplete with the collected results.
        request = this.requests[0];
        var lastRequest = this.requests[this.requests.length-1];
        lastRequest.callback = function( completionCallback )
        {
            return function( data )
            {
                completionCallback( data );
                self.onComplete( self.results );
            }
        }( lastRequest.callback )
    }
    request.method( request.url, request.data, function( r )
    {
        return function( data )
        {
            self.results.push( data );
            r.callback( data );
        }
    }( request ), request.format ) // pass the stored format through as the dataType
}
I didn't originally bake in a sleep period between requests, but that has since been added via the delay passed to setTimeout.
Note: this example is littered with console.log() calls, for which you need Firebug; otherwise just remove them.
I'm not sure if I fully understand the problem, but why not chain the requests rather than using setTimeout? At the end of the response handler of one request, fire off the next request, roughly as in the sketch below.
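Something along these lines, where tagUrls and resetUrl are hypothetical stand-ins for the search URLs you are building:
function queryTags( tagUrls, index ) {
    if ( index >= tagUrls.length ) {
        // every tag query has completed, so it is now safe to reset the pagesize
        $.get( resetUrl + '&pagesize=30' );
        return;
    }
    $.get( tagUrls[index] + '&pagesize=1', function( data ) {
        // record the answer counts from this response, then fire the next query
        queryTags( tagUrls, index + 1 );
    });
}
queryTags( tagUrls, 0 );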
Alternatively, append &pagesize= (with the pagesize you're currently using) to every link on the page that would need it, for example:
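A small sketch of that idea; the selector and the value 30 are assumptions, not taken from the question:
$('a[href*="/questions"]').each(function() {
    var href = $(this).attr('href');
    var sep = href.indexOf('?') === -1 ? '?' : '&';
    $(this).attr('href', href + sep + 'pagesize=30');
});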
I have tightly coupled JavaScript in which there is a series of if-else checks and multiple ajax calls. The ajax calls are nested. My problem is that I am deep inside a nested ajax callback and I want to get out of there gracefully.
The snippet of the code is:
function showSubscriptionLightBox() {
    $.get("/ajax/get_subscription_lightbox_content.php?feed_id=" + feedid, function(data) {
        //Work on the data we receive... and check whether the user is logged in.
        if(userLoggedIn) {
            //Make one more ajax call
            $.get("/ajax/is_user_subscribed.php?feed_id=" + feedid, function(data) {
                //Work on data again.... and check if the user is subscribed.
                if(userSubscribed) {
                    //Then a popup comes up, part of the same page, and it has a button named "task".
                    document.getElementById('task').onclick = function() {
                        if(document.getElementById('email_mode').checked) {
                            $.ajax({
                                url : "ajax/is_user_email_verified.php?user_id=" + userID,
                                success : function(data) {
                                    if(!data)
                                        return;
                                    var response;
                                    response = eval("response = " + data);
                                    if(!response)
                                        return;
                                    if(response['email_status'] == 0) {
                                        //Exit from here
                                    }
                                }
                            }
                            ......
                            other part of code..
I want to exit gracefully from the JavaScript when response['email_status'] == 0.
Please tell me how to do this.
I tried the return statement, but it only took me back to the enclosing function, not out of the script.
Thanks,
Amit
For what it is worth, here is some code from one of my applications. It syncs records using JSONP and AJAX. It first gets an array of object ids from a remote server. It then fetches the record for the object id at the zero index from the host server, and sends the record it receives to the remote server. At that point it repeats the process with an incremented index into the array of ids, and it terminates when the index reaches the end of the array.
(function( $ ) {
    $.getJSON( 'http://remote.com/admin/record_ids.js?callback=?', function( data ) {
        var set_record = function( index ) {
            if ( index < data.length ) {
                $.get( 'record_get.json', { contact_id: data[ index ] }, function( data ) {
                    $.getJSON( 'http://remote.com/admin/record_save.js?callback=?', data, function() {
                        set_record( index + 1 );
                    });
                }, 'json');
            }
        };
        set_record( 0 );
    });
})( jQuery );
As you can see, when you want to get out gracefully, you simply don't make the next call. I can't imagine why you couldn't just return to stop your code.
There's a funny trick you can always use in JavaScript to escape the call stack: setTimeout(). It's useful in many situations, not just this one; it is often used to work around DOM-event-related bugs in browsers as well.
$.ajax(
{
    url: 'lol.php',
    success: function(data)
    {
        setTimeout(function()
        {
            // Your code comes here
        }, 0); // a 0 ms delay runs this as soon as the current call stack has cleared
    }
});
I know that with Prototype you could do this with try/catch blocks. You could throw an object from within one of the inner functions and it will travel up the call stack for other functions to intercept.
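A minimal sketch of that technique in plain JavaScript (not Prototype-specific; innermostCheck and outerFlow are illustrative names). Note that a thrown object only unwinds synchronous calls, so it will not cross an asynchronous boundary such as an ajax success callback:
function innermostCheck(response) {
    if (response['email_status'] == 0) {
        throw { reason: 'email_not_verified' }; // bail out of the nested calls
    }
    // ... continue normal processing ...
}
function outerFlow(response) {
    try {
        innermostCheck(response);
        // ... further synchronous work that should be skipped on bail-out ...
    } catch (e) {
        if (e && e.reason === 'email_not_verified') {
            // exit gracefully; the signal has been handled
        } else {
            throw e; // re-throw anything unexpected
        }
    }
}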