I'm trying to craft a simple way to defer subsequent jQuery/JavaScript code execution until after this statement has run and the variable cache[url] holds the object returned to it by the .load() method:
cache[url] = $('<div class="bbq-item"/>').appendTo('.bbq-content').load(url);
This statement occurs in the middle of a hashchange listener function:
$(window).on( 'hashchange', function(e) {
etc.
...and I cannot move the code dependent on the success of the .load() outside of it.
It does not rely on external PHP, JSON, or anything that the typical AJAX "deferred" and "when" operators seem to thrive upon in the examples I've found online; this is pure DOM interrogation and manipulation via JavaScript/jQuery.
I've tried wrapping the code that needs to follow it (and is dependent on its success) in a simple "if" clause, like this:
if (cache[url] = $('<div class="bbq-item"/>').appendTo('.bbq-content').load(url)) {
[...code that is dependent on success of the .load()...]
}
...but that doesn't always work, as the loading takes longer than the evaluation in some cases, it seems.
What would be the best strategy to accomplish this?
According to the documentation (http://api.jquery.com/load/), .load() accepts a completion callback, .load( url [, data ] [, complete ] ), so you can run the dependent code from inside it. It would look like this:
$('<div class="bbq-item"/>').appendTo('.bbq-content').load(url, function(responseText, textStatus){
    cache[url] = responseText;
    if (textStatus === "success" || textStatus === "notmodified") {
        [...code that is dependent on success of the .load()...]
    }
});
Edit: this is about as close as you can get, since .load() assigns something to cache[url] whether or not the request succeeds, so checking that value alone won't tell you whether it worked.
As a note: the if test will always pass, because an assignment evaluates to the assigned value, and the jQuery object returned by .load() is always truthy, so if (cache[url] = $(...).load(url)) will always take the true branch.
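If you want cache[url] to keep holding the jQuery-wrapped element (as in the original statement) rather than the raw response text, a minimal sketch of the same callback approach might look like this; the dependent code is just a placeholder:

// Keep a reference to the element so it can be cached, then run the
// dependent code only once the load has finished.
var $item = $('<div class="bbq-item"/>').appendTo('.bbq-content');
cache[url] = $item;

$item.load(url, function(responseText, textStatus) {
    if (textStatus === "success" || textStatus === "notmodified") {
        // ...code that is dependent on the success of the .load()...
    }
});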
Since I'm using this type of call often I wish to make this reusable:
function getJSON(cmd){
$.getJSON(cmd, function(data) {
}).done(function(data) {
return data;
}).fail(function() { console.log('Server failed!') });
}
I hoped to use it like this:
function subscribe(id){
var cmd = 'subscribe.php?='+id;
var obj = getJSON(cmd);
console.log(obj);
}
But JavaScript runs console.log before the async JSON call can return anything to obj.
Just to be clear - I know I can execute code inside of .done(), but because I use this often I wish to forgo rewriting the same function over and over.
So the question is: is there a way to make JS stop and wait for getJSON to finish and return something to obj?
As things currently stand, you will have to at least write the done function every time. You can't escape callback hell by pretending it doesn't exist.
There are ways to avoid some of it by using promises cleverly, but for simple things, this is pretty much as simple as it gets. Until we get support for generators/iterators some time in 2025.
You could set the fail function as a global "ajax event" handler to avoid having to type error handling every time.
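A rough sketch of both ideas: return the jqXHR promise from the helper so each caller only writes its own done handler, and register the failure handling once through jQuery's global ajaxError event. The helper and the callback body are illustrative, not a drop-in implementation:

// Register the failure handling once, for every jQuery AJAX call on the page.
$(document).ajaxError(function() {
    console.log('Server failed!');
});

// Return the jqXHR promise instead of trying to return the data itself.
function getJSON(cmd) {
    return $.getJSON(cmd);
}

function subscribe(id) {
    var cmd = 'subscribe.php?=' + id;
    getJSON(cmd).done(function(data) {
        console.log(data); // the data only exists inside the callback
    });
}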
So I've got these functions:
function UrlExists(url){
$.ajax({
url: url,
success: function(data){
alert('exists');
},
error: function(data){
alert('fail');
}
});
}
function addScript(filepath, callback){
if (filepath) {
var fileref = document.createElement('script');
fileref.setAttribute("type","text/javascript");
fileref.setAttribute("src", filepath);
if (typeof fileref!="undefined")
document.getElementsByTagName("head")[0].appendChild(fileref);
}
if (callback) {
callback();
}
}
And then in my $(document).ready() I've got a bunch of these:
addScript('roofPathMrtu.js');
addScript('roofPathTrtu.js');
addScript('lowerPathMrtu.js');
etc...
Which I then need to check if they were successfully loaded or not, so I call:
UrlExists('roofPathMrtu.js');
The problem is that this UrlExists function is not working, and I think it's because it is running before all the addScript functions are done.
How can I have my UrlExists function run only after all the addScript functions are done? I was going to use the callback parameter of the addScript function on the last one, but I don't think that is gonna work.
A way that I have been doing this is not to use the JavaScript setTimeout() method, but the jQuery $.when feature. If not, then I would use a queue. The syntax is
$.when(function1()).then(function2);
or
$.when(function1()).done(function2);
You could overlap these if you wanted to, but that is not ideal in terms of either elegance or efficiency. Using a queue would probably be the next step if $.when alone does not accomplish what you want.
http://api.jquery.com/jQuery.when/
Your addScript() function inserts the <script> tag into the DOM and returns immediately. At that point, the browser still needs to fetch the JavaScript file specified in the src attribute. I suspect UrlExists() is being called after the addScript() functions have executed, but before the browser has had a chance to fetch the JavaScript files.
Use a $.Deferred object to "listen" for the various done events. You might want to combine a few $.Deferred objects and use the $.when function to wait for multiple resolved promises.
Check out this link, it might help: http://thiswildorchid.com/jquery-progress-and-promises. It sounds like it might be what you need; it helps a lot with async functions, so see if it is a good fit for you.
What you want to do is define an onload function on the script element. It's not hard, but the implementation starts to look ugly. For the particular problem you're dealing with, I would recommend you look at Require JS.
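If you want to stay with plain jQuery, a rough sketch combining the two previous suggestions (an onload handler on the script element plus a $.Deferred that $.when can wait on) might look like this; the file names are just the ones from the question and the structure is illustrative, not a finished implementation:

function addScript(filepath) {
    var loaded = $.Deferred();
    var fileref = document.createElement('script');
    fileref.type = 'text/javascript';
    fileref.onload = function() { loaded.resolve(filepath); };
    fileref.onerror = function() { loaded.reject(filepath); };
    fileref.src = filepath;
    document.getElementsByTagName('head')[0].appendChild(fileref);
    return loaded.promise();
}

$.when(
    addScript('roofPathMrtu.js'),
    addScript('roofPathTrtu.js'),
    addScript('lowerPathMrtu.js')
).done(function() {
    // All scripts have been fetched and executed at this point,
    // so there is no longer any need for a separate UrlExists() check.
}).fail(function(path) {
    console.log('Failed to load ' + path);
});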
I am attempting to do some dynamic loading that includes javascript, css, and html files.
I would like to do it like this:
$.when($.ajax(htmlPath), $.get(cssPath), $.ajax({
url: javascriptPath,
dataType: "text"
}))
.done(function(response){
// i want to pass the data for each of these to the respective functions
appendHtml(what goes here??);
appendCss(what goes here??);
executeJs(what goes here??);
})
.fail(function(){
console.log("failed");
});
So I'm confused about how to separate out the response callbacks. Currently, the response object you see in my .done function is ONLY the HTML file I requested. This code is making the correct AJAX calls, and the server responds with the correct files, but how do I access all of them once ALL the calls are complete? I need this so I won't be applying CSS/JS to HTML that isn't there yet, etc.
Also, I have the JavaScript file returned as a string and then eval() it within the executeJs function. My understanding is that this is an okay use of eval because it's a file returned by our own server, so I don't see how it could be tampered with. Is this assumption correct?
Furthermore, in my appendCss function, I'm just adding it to a "style" element in the head. Is there a big issue with this? I am using all this to make a "widget/app based" functionality where I have a js,css,and html for each "app", and I just want to query the server for them when they are needed and the app is loading.
If your downloaded data is being retrieved from the same server as your original web page, then yes, generally, you would have the same level of trust in that code as you do in the code that's already running in the browser.
The problem with eval() in a context like this isn't necessarily that you don't trust the code coming back from your own server; it's that someone might be able to alter the running javascript so that the javascriptPath variable points somewhere you didn't expect it to.
As far as your actual question goes, your done callback will actually be passed three parameters, because your when call included three promises.
Because of the way that you defined your callback (as function(response)), you are only seeing the first one -- the return value from the HTML call. The other two parameters are being ignored.
Each of the three parameters that you are being passed will be an array of three elements: [data, textStatus, jqXHR]. To do something useful with them, you could structure your callback something like this:
$.when($.ajax(htmlPath), $.get(cssPath), $.ajax({
url: javascriptPath,
dataType: "text"
}))
.done(function(htmlResponse, cssResponse, jsResponse){
if (htmlResponse[0]) {
appendHtml(htmlResponse[2].responseText);
}
if (cssResponse[0]) {
appendCss(cssResponse[2].responseText);
}
if (jsResponse[0]) {
executeJs(jsResponse[2].responseText);
}
})
(Assuming that you have the appropriate appendHtml, appendCss, and executeJs functions written already)
There are some good examples on this page: http://api.jquery.com/jQuery.when/
And this page has the documentation on the jqxhr object (the third element in each of the arrays that are passed to your done function): http://api.jquery.com/jQuery.ajax/#jqXHR
To access all the responses, just pass three arguments to the done() callback. Try this:
$.when($.ajax(htmlPath), $.get(cssPath), $.ajax({
url: javascriptPath,
dataType: "text"
}))
.done(function(responseHTML, responseCSS, responseJS){
console.log(responseHTML[0]);
console.log(responseCSS[0]);
console.log(responseJS[0]);
})
If you print the arguments object inside done(), you can clearly see that all the responses are passed into the callback.
Regarding the use of eval, consider using JSONP instead (dataType: 'jsonp'). This way jQuery takes care of executing the code for you. I suppose jQuery also uses eval() under the hood, but then at least you know that it is done in a proper manner. With respect to safety, also see the related question on when eval() is evil if you haven't already.
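For reference, jQuery also has the $.getScript shorthand (equivalent to $.ajax with dataType: "script"), which fetches and executes a script file for you. A minimal sketch of the $.when call using it, assuming the HTML and CSS are still wanted as text:

$.when($.ajax(htmlPath), $.get(cssPath), $.getScript(javascriptPath))
    .done(function(htmlResponse, cssResponse) {
        appendHtml(htmlResponse[0]);  // each argument is [data, textStatus, jqXHR]
        appendCss(cssResponse[0]);
        // The script has already been executed by $.getScript at this point.
    })
    .fail(function() {
        console.log("failed");
    });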
In jQuery, I iterate over an XML list of areas and do a POST request to get detailed information about each area. Because sending thousands of requests at once is debilitating for the client and the server, I would like to set a flag so that I wait for a request to finish before sending the next one.
if the xml looks like this:
<area>5717</area>
<area>5287</area>
<area>5376</area>
then the xml parsing kinda looks like:
$(xml).find("area").each( function() {
doPost();
}
and the doPost() function looks like
doPost : function () {
$.post( ... )
}
Basically, I would like to add a toggling "wait" but I'm not sure how to achieve this. Is there a way I can keep the essential ".each" iteration or is another type of loop better for this?
Thanks in advance.
A general algorithm off the top of my head:
You could put the whole list into an array. Take the first item of the array and post it. In the success handler of your post you could recursively call the function with the next index in the list.
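Something along these lines; the URL and POST data are placeholders for whatever doPost() actually sends:

// Collect the area ids first, then post them strictly one at a time.
var areaIds = $(xml).find("area").map(function() {
    return $(this).text();
}).get();

function postNext(index) {
    if (index >= areaIds.length) {
        return; // every area has been processed
    }
    $.post("getAreaDetails.php", { area: areaIds[index] }, function(data) {
        // ...handle the detailed information for this area...
        postNext(index + 1); // only now start the next request
    });
}

postNext(0);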
I wouldn't use async: false because it would then be a blocking operation, which I assume the OP doesn't want.
You can use:
$.ajaxSetup({async:false});
at the top of your script to make your AJAX calls synchronous.
Alternately, you can replace $.post() with $.ajax() and set the async flag to false.
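For illustration only (a synchronous request blocks the browser while it runs, which is why the comment above advises against this), the per-call version might look like the sketch below; the URL and data are placeholders:

$.ajax({
    url: "getAreaDetails.php",   // placeholder for whatever doPost() targets
    type: "POST",
    data: { area: areaId },
    async: false,                // blocks until this request completes
    success: function(data) {
        // ...handle the detailed information for this area...
    }
});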
Can you do a setTimeout? That will allow the function to still process asynchronously and allow you to wait for some time in there too.
http://www.w3schools.com/js/js_timing.asp
setTimeout(function() {}, 5000)
You can refactor your doPost() function to take the <area> element to process as an argument, and chain into the next element from your success callback. Something like:
(function doPost($area) {
if ($area.length > 0) {
$.post({
// your options,
success: function() {
// your success handling...
doPost($area.next("area"));
}
});
}
})($(xml).find("area").first());
EDIT: Maybe the code above was a little too compact indeed.
Basically, the aim is to refactor your function so that it takes a jQuery object containing the next <area> element to process, or nothing if processing should stop:
function doPost($area) {
if ($area.length > 0) {
// Perform POST request and call ourselves from success callback
// with next <area> element (or nothing if there's no such element).
}
}
Then call this function with the first <area> element to process:
doPost($(xml).find("area").first());
The first code fragment in my answer does both at the same time. Functions are first-class objects in JavaScript, and you can call a function you've just defined by enclosing its definition in parentheses and providing the usual argument list, also surrounded by parentheses.
I could find help on how to redirect pages, but I could not find anything that matched my situation.
I would like to fade out the popup (which has a form inside it) before redirecting to the form's 'action' link.
I have the fade-out in a separate function, as that function is more complicated and is called a number of times.
My jQuery code:
$('form').submit(function(e){
function(e){
hidePopup()
,function(e){
window.location.href = $('form').attr('action');
};};
});
//hiding popup
function hidePopup(){
$(".popup").fadeOut("slow");
};
fadeOut accepts a callback function as its second parameter. That callback is executed after the element has completely faded out. So just use:
$(".popup").fadeOut("slow", function(){//Executes after faded out//});
In your code, I'd suggest you rewrite your hidePopup function
function hidePopup(callback){
$(".popup").fadeOut("slow",callback);
};
And execute it like
hidePopup(function(){
window.location.href = $('form').attr('action');
});
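For context, the submit handler itself might then look something like this (assuming the default submission should be suppressed so the fade can finish before the redirect):

$('form').submit(function(e) {
    e.preventDefault(); // stop the normal submit so the fade-out can finish first
    var action = $(this).attr('action');
    hidePopup(function() {
        window.location.href = action;
    });
});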
Update: You can check whether you have passed a valid callback with the following code:
function hidePopup(callback){
if (typeof(callback) =='function')
$(".popup").fadeOut("slow",callback);
else
$(".popup").fadeOut("slow");
};
It is a robust check that guarantees you have passed a valid callback (well, as JS is not a strictly-typed language you can't be sure that the function has the correct signature, but that's outside the scope of this question).
If you are absolutely sure you will always pass either a callback or undefined, and you want to type about 20 characters less, you could simplify if (typeof(callback) == 'function') to just if (callback). It will check that you have passed something. But that's not good practice (although it is popular). :)
Update 2: the length property of a function returns the number of parameters it declares. See MDN.
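A quick illustration of that property:

function example(a, b) { }
console.log(example.length); // 2, the number of parameters it declares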