How to continue the remaining ajax requests if a file is not found? - javascript

I'm trying to write a function that takes all the css/js files marked with a data attribute and refreshes them if any have been modified on the server side. My initial attempt involved PHP and jQuery/JavaScript; this new attempt is based on JavaScript/jQuery only.
My problem is that while chaining the ajax requests to these files (to get the modification date), all of the ajax requests stop if a file is not found. For example, if I rename the (existing) style.css to the (non-existent) style_.css, all the chained ajax requests get aborted and the code doesn't continue.
var file_url = [url1, url2, url3, url4, url5];

function getLatestModificationDate(file_url) {
    $.when.apply($, file_url.map(function(url) {
        return $.ajax({
            type: 'HEAD',
            url: url,
            beforeSend: function(jqXHR, settings) { jqXHR.url = settings.url; }
        });
    })).done(function() {
        var fileArray = [], lastModified, file_jqXHR;
        // each argument passed to this callback is of the form [data, statusText, jqXHR]
        for (var i = 0; i < arguments.length; i++) {
            var obj = {};
            file_jqXHR = arguments[i][2]; // jqXHR
            lastModified = file_jqXHR.getResponseHeader('Last-Modified');
            obj['file'] = file_jqXHR.url;
            obj['modDate'] = lastModified;
            fileArray.push(obj);
        }
        mainFunction(fileArray); // the main function, not in scope for this question
    });
}
I tried adding an error option to the ajax call after beforeSend, but that didn't allow the remaining ajax requests to continue. I don't know whether the ajax call returned inside apply(.., ..) could return false to skip the current request on a 404, because I don't know how to skip or return false for that ajax call. Is there any quick way to check whether a file exists, so that I only add existing files to the file_url array that's passed to getLatestModificationDate(file_url) {...}?
EDIT: Here's a screenshot from the Chrome console.
EDIT: I found this question's answer that uses a new deferred for the ajax complete... could someone explain how that code can be adapted to my question? Thanks!
var myDeferred = $.Deferred();
var origDeferred = $.ajax(...);

// if the request is OK, resolve my deferred
origDeferred.done(function() {
    myDeferred.resolve.apply(this, arguments);
});

// if the request failed, also resolve my deferred
origDeferred.fail(function() {
    myDeferred.resolve.apply(this, arguments);
});
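For context, here is one way that resolve-on-fail pattern could be applied to the map above, sketched only as an illustration (the skip-on-404 handling in the done callback is my assumption, and the names follow the question's code):

function getLatestModificationDate(file_url) {
    $.when.apply($, file_url.map(function(url) {
        var wrapper = $.Deferred();
        $.ajax({ type: 'HEAD', url: url })
            .done(function(data, status, jqXHR) {
                // keep the jqXHR so Last-Modified can be read later
                wrapper.resolve({ url: url, jqXHR: jqXHR });
            })
            .fail(function() {
                // resolve (not reject) so $.when still reaches its done handler
                wrapper.resolve({ url: url, jqXHR: null });
            });
        return wrapper.promise();
    })).done(function() {
        var fileArray = [];
        for (var i = 0; i < arguments.length; i++) {
            var result = arguments[i];
            if (result.jqXHR) { // skip files that came back 404
                fileArray.push({ file: result.url, modDate: result.jqXHR.getResponseHeader('Last-Modified') });
            }
        }
        mainFunction(fileArray);
    });
}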

You can use a try-catch block:
for (var i = 0; i < arguments.length; i++) {
    try {
        var obj = {};
        file_jqXHR = arguments[i][2]; // jqXHR
        lastModified = file_jqXHR.getResponseHeader('Last-Modified');
        obj['file'] = file_jqXHR.url;
        obj['modDate'] = lastModified;
        fileArray.push(obj);
    } catch (e) {
        // ignore entries whose request failed
    }
}

Related

How can an array of pages be loaded?

Instead of requesting each module with a separate line of code, I'm storing all the module page names in an array of modules,
var modules = ['mod1.html','mod2.html', ... , 'modN.html'];
and then passing it to a function that is supposed to put all the loaded modules into the modules-panel div at once.
function loadModules(modules, callback) {
    var content = [];
    for (var i = 0; i < modules.length; i++) {
        $('#modules-panel').load(modules[i], function() {
            content.push($('#modules-panel').html());
        });
    }
    callback();
}
The issue is that only the last module appears where it should be.
The function should repeatedly save each loaded module (page) into a stack and then append all the modules to the page at once.
Given that you want to get the HTML from these pages, it would be better to use $.ajax directly and then read the response from that, instead of repeatedly filling an extraneous element.
Also, your callback() logic is flawed. You need to call it once all the AJAX requests have completed. For that you can use $.when() with .done(). Try this:
function loadModules(modules, callback) {
    var content = [], requests = [];
    for (var i = 0; i < modules.length; i++) {
        requests.push($.ajax({
            url: modules[i],
            success: function(html) {
                content.push(html);
            }
        }));
    }
    $.when.apply($, requests).done(function() {
        callback(content);
    });
}
It should be noted that this may not be the best pattern to use, as it means your server may be deluged with requests. If you can, I would look into using server-side includes instead.
This is the place where the loadModules function is called:
var modules = ['loadfile.html', 'getFBinfo.html'];
loadModules(modules, function(data) {
    var total = '';
    for (var i = 0; i < data.length; i++) {
        total += data[i];
    }
    $('#modules-panel').html(total);
});
Thanks for clarifying the issue! However, the answer I ended up with was pretty simple.
I've taken the suggestion about $.ajax. The function became this:
function loadModules(modules, callback) {
    var content = [];
    for (var i = 0; i < modules.length; i++) {
        $.ajax({
            url: modules[i],
            success: function(html) {
                content.push(html);
                if (content.length == modules.length) {
                    callback(content);
                }
            }
        });
    }
}
As can be seen, I can invoke the callback by comparing the number of responses collected so far with the total number of modules (modules.length).
The problem now is knowing what happens if an ajax request gets an error from the server.
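One way to handle that, offered here only as a sketch, is to count completions in jQuery's complete callback, which fires on both success and error, so the callback still runs even if some modules fail to load:

function loadModules(modules, callback) {
    var content = [], finished = 0;
    for (var i = 0; i < modules.length; i++) {
        $.ajax({
            url: modules[i],
            success: function(html) {
                content.push(html);
            },
            complete: function() {
                // runs for both success and error, so failed modules don't stall the callback
                finished++;
                if (finished == modules.length) {
                    callback(content);
                }
            }
        });
    }
}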

Dojo ajax request with big JSON locking the browser

I have big JSON data, about 40000 items. When I send a request to get all of it, the browser locks up until the response comes back.
So I am sending the request by index and chunk, like the following.
var index = 0;
var chunk = 500;
var repeat = true;
document.getElementById('loading').style.display = 'inline-block';
while (repeat == true) {
    var requestOptions = {
        handleAs: "json",
        sync: true,
        query: {
            page: index,
            chunk: chunk
        },
    };
    request.get("domain.com/getdata", requestOptions).then(
        function(response) {
            array.forEach(response.data, function(item) {
                //do something
            });
            if (response.data.length < chunk) {
                repeat = false;
                document.getElementById('loading').style.display = 'inline-block';
            }
            index = index + 1;
        },
        function(error) {
            repeat = false;
        }
    );
}
I am sending a request to get the first 500 records, then the second 500 records, and so on.
When I start the process, the browser locks up. I want to show a loading indicator, but it never appears.
I see in the comments on your question that you've been recommended to use async: true, to which you respond that it keeps sending requests without getting any response, always with the same request parameters.
I think, then, that you're perhaps a bit unfamiliar with the asynchronous paradigm in JavaScript (remember, Ajax means Asynchronous JavaScript and XML).
First off: async: true is the right way to solve your problem. However, as you've noticed, that alone doesn't fix anything in your code.
Here's a simplified and modified version of your code (don't try this, it doesn't work, it's for explanation purposes only).
var index = 0;
var chunk = 500;
var repeat = true;

while (repeat == true) {
    var requestOptions = {
        handleAs: "json",
        sync: false, // false is the default, so this line is redundant
        query: { page: index, chunk: chunk },
    };
    request.get("domain.com/getdata", requestOptions).then(
        responseOk, responseError);
}

function responseOk(response) {
    //do something..
    if (response.data.length < chunk) {
        repeat = false;
    }
    index = index + 1;
}

function responseError(error) {
    repeat = false;
}
Here's the kicker: the responseOk function is never run. Therefore, index is never updated and repeat is never set to false, in effect making your while loop infinite!
Why is this? The reason is that JavaScript's "Ajax" functions (which are wrapped by dojo's request.get() and friends) are asynchronous.
What you are saying in your code (or rather, in my simplified version above) is effectively:
Hey, JavaScript, do a GET request to the server. When you are done, sometime in the future, run this responseOk function (or responseError on error). In the meantime, while you are doing that, I'll continue with my while loop.
So the while loop keeps churning out GET requests to the server, with the same index! Since the never-ending loop keeps your JavaScript thread busy (you only have one!), the responseOk function isn't allowed to execute (even though the server may have responded).
That said, how can you split your huge JSON array into multiple, subsequent requests?
You can try something like this:
var index = 0,
    chunk = 500,
    requestOptions = {....};

function handleResponseAndGetNextChunk(response) {
    response && array.forEach(response.data, function(item) {
        //do something
    });
    if (response && response.data.length < chunk) {
        return;
    } else {
        requestOptions.query.page = index++;
        request.get("domain.com/getdata", requestOptions).then(
            handleResponseAndGetNextChunk, responseError);
    }
}

// To start off the sequence of requests:
handleResponseAndGetNextChunk(null);

How to capture all requests and responses on the page and perform an action according to each response status

Is there any way to monitor all the requests made on the page, whether triggered by a script, a click or anything else? It should not depend on any particular script block or code; it should just monitor whatever requests are made, using jQuery and JavaScript.
example:
// monitor all the requests made on the page.
monitor {
    success: function() {
    },
    error: function() {
    }
}
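For jQuery-initiated requests specifically, something close to that monitor shape can be sketched with jQuery's global ajax events (this is only an illustration; it assumes global events are not disabled and it does not cover requests made outside jQuery):

// these handlers fire for every jQuery ajax request on the page
$(document)
    .ajaxSuccess(function(event, jqXHR, settings) {
        console.log('succeeded:', settings.url, jqXHR.status);
    })
    .ajaxError(function(event, jqXHR, settings, error) {
        console.log('failed:', settings.url, jqXHR.status);
    });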
You cannot track all the requests made on the webpage. However, you can track the requests that were made using jQuery by replacing $.ajax with a wrapper.
Sample replacement plugin:
(function($, undefined) {
    // a private variable which stores the currently active monitors
    var monitors = [];

    // a public API to add a monitor.
    $.monitorAjax = function(monitor) {
        monitors.push(monitor);
    };

    // here starts the implementation.
    // a function to wrap a callback (error or success) so that the monitors' functions get called.
    var wrapCallback = function(name, settings) {
        return function() {
            for (var i = 0; i < monitors.length; i++) {
                var monitor = monitors[i];
                if (monitor[name] != null) monitor[name].apply(this, arguments);
            }
            if (settings[name] != null) settings[name].apply(this, arguments);
        };
    };

    // replace $.ajax by a wrapped version which replaces the success and error callbacks with wrappers.
    // note that you may also track calls and their settings if you want.
    var unwrappedAjax = $.ajax;
    $.ajax = function(url, settings) {
        // support both $.ajax(settings) and $.ajax(url, settings)
        if (typeof url === "object") {
            settings = url;
            url = undefined;
        }
        if (settings == null) settings = {};
        var wrappedSuccess = wrapCallback("success", settings);
        var wrappedError = wrapCallback("error", settings);
        var wrappedSettings = $.extend({}, settings, { success: wrappedSuccess, error: wrappedError });
        return url === undefined ? unwrappedAjax(wrappedSettings) : unwrappedAjax(url, wrappedSettings);
    };
})(jQuery);
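A minimal usage sketch for the plugin above; the console messages are placeholders:

// register a monitor that is notified of every jQuery ajax request
$.monitorAjax({
    success: function(data, textStatus, jqXHR) {
        console.log('request succeeded with status', jqXHR.status);
    },
    error: function(jqXHR, textStatus, errorThrown) {
        console.log('request failed with status', jqXHR.status);
    }
});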
In jQuery, maybe this:
var original_jquery_ajax = $.ajax;
$.ajax = function() {
    var a_fn, a_url;
    var cb = function(data, status, settings) {
        a_fn(data, status, settings);
        console.log(a_url, data); // <-- here
    };
    for (var i = 0; i < arguments.length; i++)
        if (arguments[i] instanceof Object) {
            if (arguments[i].success) {
                a_fn = arguments[i].success;
                arguments[i].success = cb;
            }
            if (arguments[i].url) a_url = arguments[i].url;
        }
    if (typeof(arguments[0]) == "string") a_url = arguments[0];
    var aj = original_jquery_ajax.apply(null, arguments);
    var done_original = aj.done;
    aj.done = function(cb_fn) {
        a_fn = cb_fn;
        done_original(cb);
        return aj;
    };
    return aj;
};
Now, when you use $.ajax(url), the console shows the url and the returned data.
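For illustration, with that wrapper in place a call could look like the sketch below (the URL is just a placeholder):

// the wrapper logs '/api/data.json' and the response automatically
$.ajax('/api/data.json').done(function(data) {
    // normal success handling goes here
});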

jQuery .push into an Array in a .get call gives an empty result

Can anyone tell me why the code below gives me an empty string? When I console.log(contentArray) in the $.get() callback function, it shows the data, but when I try to do it where it is in the code below, the result is empty.
sectionArray = [];
contentArray = [];

$(function () {
    if (index == 1) {
        $('menu:eq(' + (section - 1) + ') li a').each(function () {
            sectionArray.push($(this).attr('href'));
        });
        var len = sectionArray.length;
        for (var i = 0; i < len; i++) {
            href2 = sectionArray[i];
            $.get(href2, function (data) {
                string = data.toString();
                contentArray.push(string);
            });
        }
        content = contentArray.toString();
        console.log(content);
    }
});
Because the ajax request ends after you call console.log(), try this:
$.get(href2, function (data) {
    string = data.toString();
    contentArray.push(string);
    content = contentArray.toString();
    console.log(content);
});
Also, making ajax requests in a loop is not the best thing to do; it won't work the way you want.
UPDATE:
jQuery also has an async option; set it to false and your code should work, but it will be slow. Synchronous requests may temporarily lock the browser.
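For illustration only, a synchronous rewrite of the loop from the question might look like the sketch below; it blocks the browser while each request runs, which is why this approach is discouraged:

for (var i = 0; i < len; i++) {
    // async: false makes $.ajax return only after the response has arrived
    var res = $.ajax({ url: sectionArray[i], async: false });
    contentArray.push(res.responseText);
}
console.log(contentArray.toString());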
UPDATE 2:
Maybe try something like this (maybe not such a good idea :D):
var countRequests = len;
$.get(href2, function (data) {
    string = data.toString();
    contentArray.push(string);
    countRequests = countRequests - 1;
    if (countRequests == 0) {
        content = contentArray.toString();
        console.log(content);
        // or create a callback
    }
});
The problem is that your $.get() ajax requests are executed asynchronously.
That is, the $.get() function returns immediately without waiting for the response, your entire for loop completes (queueing up multiple ajax requests), and then your console.log() runs, at which point the array is still empty. Only after that do any of the ajax success handlers get called, regardless of how fast the ajax responses come back.
EDIT: Here is an answer from another question that shows how to do something after all the ajax calls have completed: https://stackoverflow.com/a/6250103/615754
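For reference, a minimal sketch of that linked $.when-based approach applied to the code above (assuming sectionArray has already been filled):

var requests = sectionArray.map(function (href) {
    return $.get(href, function (data) {
        contentArray.push(data.toString());
    });
});

// runs only after every $.get above has completed successfully
$.when.apply($, requests).done(function () {
    console.log(contentArray.toString());
});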

Javascript variable not being updated - possibly out of scope [duplicate]

Possible Duplicate:
AJAX- response data not saved to global scope?
I basically have a loop that contacts a script on my site using AJAX and then updates a string with the response from that script. Here's the code:
// Image code array
var result_url = 'http://localhost/view/';
for (i = 0; i < urls.length; i++) {
    // Add URL to queue
    $('#url_queue').append('<div class="uploadifyQueueItem"><div class="cancel"><img src="/assets/img/cancel.png" /></div><span class="fileName">' + image_name_from_url(urls[i]) + '</span><div class="uploadifyProgress"><div class="uploadifyProgressBar"></div></div></div>');
    // Make a request to the upload script
    $.post('/upload', { url: urls[i], username: username }, function(response) {
        var response = jQuery.parseJSON(response);
        if (response.error) {
            alert(response.error);
            return;
        }
        if (response.img_code) {
            result_url += response.img_code + '&';
        }
    });
}
console.log(result_url);
The Firebug console just shows http://localhost/view/ when the string is logged. It's as if the img_code from my upload script isn't being appended to the string at all. I have tried logging the value of result_url within the $.post() callback, and that works fine, but the value is not being saved properly because it doesn't show up later in my code. Is this a scope problem? Will I have to define result_url as a global variable?
Thanks for any help.
You are checking console.log(result_url); before the AJAX requests complete.
AJAX requests are (by default) run asynchronously. What that means is that your script continues to run while the request is still being made to the server.
Your callback function (provided to $.post as the 3rd parameter) is the one that gets executed after your AJAX request has completed.
Also note that your AJAX request callbacks are called as each request finishes, and your requests might not finish in the same order they started. You could prevent all this by setting async: false, but that would halt all of your JavaScript execution.
Another option would be to collect the jqXHR objects being returned by $.post, and then call $.when().done(), so that your console.log(result_url) happens only when all the AJAX requests are resolved:
// Image code array
var result_url = 'http://localhost/view/',
    jqXHRs = [];
for (i = 0; i < urls.length; i++) {
    // Add URL to queue
    $('#url_queue').append('<div class="uploadifyQueueItem"><div class="cancel"><img src="/assets/img/cancel.png" /></div><span class="fileName">' + image_name_from_url(urls[i]) + '</span><div class="uploadifyProgress"><div class="uploadifyProgressBar"></div></div></div>');
    // Make a request to the upload script
    jqXHRs.push($.post('/upload', { url: urls[i], username: username }, function(response) {
        var response = jQuery.parseJSON(response);
        if (response.error) {
            alert(response.error);
            return;
        }
        if (response.img_code) {
            result_url += response.img_code + '&';
        }
    }));
}
$.when.apply(this, jqXHRs).done(function() {
    console.log(result_url);
});
This is because you're doing the console.log immediately after firing the Ajax. Since Ajax is asynchronous, the success function will not necessarily be called before the code that follows your ajax code.
jQuery's Ajax tools provide a way of calling ajax synchronously by including the async: false option. Try replacing your ajax call with:
$.ajax({
    url: '/upload',
    data: { url: urls[i], username: username },
    success: function(response) {
        var response = jQuery.parseJSON(response);
        if (response.error) {
            alert(response.error);
            return;
        }
        if (response.img_code) {
            result_url += response.img_code + '&';
        }
    },
    type: "post",
    async: false
});
That way, the code that follows your Ajax call would only be executed after the ajax completes.
Remember, though, that this will lock up your page for the duration of the Ajax call. Maybe it would be easier to just put the console.log(result_url); at the end of the success callback.
You're logging result_url after the loop, but the $.post upload requests may not have completed yet. What I would recommend is to put the code that uses result_url inside a continuation callback and call it once you know that the last post request has completed.
e.g.
function continuation_code(result_url) {
    // all your code that uses result_url goes here.
}

var result_url = 'http://localhost/view/';
var num_results_returned = 0;
for (i = 0; i < urls.length; i++) {
    // Add URL to queue
    $('#url_queue').append('<div class="uploadifyQueueItem"><div class="cancel"><img src="/assets/img/cancel.png" /></div><span class="fileName">' + image_name_from_url(urls[i]) + '</span><div class="uploadifyProgress"><div class="uploadifyProgressBar"></div></div></div>');
    // Make a request to the upload script
    $.post('/upload', { url: urls[i], username: username }, function(response) {
        var response = jQuery.parseJSON(response);
        // count the response first, so an error response doesn't prevent the continuation from running
        num_results_returned += 1;
        if (response.error) {
            alert(response.error);
        } else if (response.img_code) {
            result_url += response.img_code + '&';
        }
        if (num_results_returned == urls.length) {
            continuation_code(result_url);
        }
    });
}
