I've noticed that the size of a requested file affects how long the response takes for Ajax calls, so if I fire three Ajax GET requests for files of varying size, they may arrive in any order. What I want to do is guarantee the ordering when I append the files to the DOM.
How can I set up a queue system so that when I fire A1->A2->A3, I can guarantee that they are appended as A1->A2->A3, in that order?
For example, suppose A2 arrives before A1. I would want the action to wait upon the arrival and loading of A1.
One idea is to create a status checker using a timed callback, like this:
// pseudo-code
function check(ready, func) {
    // test readiness somehow
    if (ready) {
        func();
    } else {
        setTimeout(function () {
            check(ready, func);
        }, 1); // check every msec
    }
}
but this seems like a resource-heavy approach, as I fire the same function every millisecond until the resource is loaded.
Is this the right path to complete this problem?
status checker using a 1msec-timed callback - but this seems like a resource-heavy way; is this the right path to complete this problem?
No. You should have a look at Promises. That way, you can easily formulate it like this:
var a1 = getPromiseForAjaxResult(resource1url);
var a2 = getPromiseForAjaxResult(resource2url);
var a3 = getPromiseForAjaxResult(resource3url);
a1.then(function(res) {
append(res);
return a2;
}).then(function(res) {
append(res);
return a3;
}).then(append);
For example, jQuery's .ajax function implements this.
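For instance, a sketch of the parallel variant with jQuery (the URLs are placeholders): $.ajax returns a promise-like object, so the three requests can run concurrently while $.when still lets you append the results strictly in order once all of them have arrived.
var a1 = $.ajax("/files/a1.html");
var a2 = $.ajax("/files/a2.html");
var a3 = $.ajax("/files/a3.html");
$.when(a1, a2, a3).then(function(r1, r2, r3) {
    // with multiple deferreds, each argument is [data, statusText, jqXHR]
    append(r1[0]);
    append(r2[0]);
    append(r3[0]);
});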
You can try something like this:
var resourceData = {};
var resourcesLoaded = 0;
function loadResource(resource, callback) {
    var xhr = new XMLHttpRequest();
    xhr.onload = function() {
        var state = this.readyState;
        var responseCode = this.status; // "this" is the xhr itself
        if (state == this.DONE && responseCode == 200) {
            callback(resource, this.responseText);
        }
    };
    xhr.open("get", resource, true);
    xhr.send();
}
//Assuming that resources is an array of path names
function loadResources(resources) {
for(var i = 0; i < resources.length; i++) {
loadResource(resources[i], function(resource, responseText) {
//Store the data of the resource in to the resourceData map,
//using the resource name as the key. Then increment the
//resource counter.
resourceData[resource] = responseText;
resourcesLoaded++;
//If the number of resources that we have loaded is equal
//to the total number of resources, it means that we have
//all our resources.
if(resourcesLoaded === resources.length) {
//Manipulate the data in the order that you desire.
//Everything you need is inside resourceData, keyed
//by the resource url.
...
...
}
});
}
}
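For example, the ellipses above could be filled with an ordered append; a minimal sketch, assuming the responses are HTML fragments and the desired order is the order of the resources array:
//Append the fragments in request order, regardless of arrival order.
resources.forEach(function(resource) {
    document.body.insertAdjacentHTML("beforeend", resourceData[resource]);
});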
If certain components must be loaded and executed before others (like certain JS files), you can queue up your AJAX requests like so:
function loadResource(resource, callback) {
    var xhr = new XMLHttpRequest();
    xhr.onload = function() {
        var state = this.readyState;
        var responseCode = this.status; // "this" is the xhr itself
        if (state == this.DONE && responseCode == 200) {
            //Do whatever you need to do with this.responseText
            ...
            ...
            callback();
        }
    };
    xhr.open("get", resource, true);
    xhr.send();
}
function run() {
var resources = [
"path/to/some/resource.html",
"path/to/some/other/resource.html",
...
"http://example.org/path/to/remote/resource.html"
];
//Function that sequentially loads the resources, so that the next resource
//will not be loaded until first one has finished loading. I accomplish
//this by calling the function itself in the callback to the loadResource
//function. This function is not truly recursive since the callback
//invocation (even though it is the function itself) is an independent call
//and therefore will not be part of the original callstack.
function load(i) {
if (i < resources.length) {
loadResource(resources[i], function () {
load(++i);
});
}
}
load(0);
}
This way, the next file will not be loaded until the previous one has finished loading.
If you cannot use any third-party libraries, you can use my solution. However, your life will probably be much easier if you do what Bergi suggested and use Promises.
There's no need to call check() every millisecond; just run it from the xhr's onreadystatechange handler. If you provide a bit more of your code, I can explain further.
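For illustration, a minimal sketch of that idea; url, ready and func are meant to be the ones from your question:
var xhr = new XMLHttpRequest();
xhr.onreadystatechange = function() {
    // fires on every state change; act only once the response is complete
    if (xhr.readyState === 4 && xhr.status === 200) {
        check(ready, func); // a single readiness check, exactly when the data arrives
    }
};
xhr.open("GET", url, true);
xhr.send();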
I would have a queue of functions to execute and each of them checks the previous result has completed before executing.
var remoteResults = [];
function requestRemoteResource(index, fetchFunction) {
    // the argument fetchFunction is a function that fetches the remote content
    // once the content is ready it calls the passed-in function with the result.
    fetchFunction(
        function(result) {
            // add the remote result to the list of results
            remoteResults[index] = result;
            // write as many results as are ready.
            writeResults(index);
        });
}
function writeResults(index) {
var i;
// Execute all functions at least once
for(i = 0; i < remoteResults.length; i++) {
if(!remoteResults[i]) {
return;
}
// Call the function that is the ith result
// This will modify the dom.
remoteResults[i]();
// Blank the result to ensure we don't double execute
// Store a function so we can do a simple boolean check.
remoteResults[i] = function(){};
}
}
requestRemoteResource(0, [Function to fetch the first resource]);
requestRemoteResource(1, [Function to fetch the second resource]);
requestRemoteResource(2, [Function to fetch the third resource]);
Please note that this is currently O(n^2) for simplicity; it would get faster but more complex if you stored an object at every index of remoteResults which had a hasRendered property. Then you would only scan back until you found a result that had not yet arrived or one that had already been rendered.
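For reference, a sketch of that faster variant using a simple forward-moving cursor instead of re-scanning from the start (the nextToWrite index is my own addition):
var nextToWrite = 0; // index of the next result that must be rendered
function writeResults() {
    // render in order, stopping at the first result that hasn't arrived yet;
    // the cursor only moves forward, so nothing is ever executed twice
    while (nextToWrite < remoteResults.length && remoteResults[nextToWrite]) {
        remoteResults[nextToWrite]();
        nextToWrite++;
    }
}
Each arrival now does a constant amount of work per rendered result, so the total cost drops to O(n).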
Related
I've got a bunch of integration tests using headless chrome. Because restarting the browser on an entirely new profile is so expensive the harness tries to "clean up" the browser state (flush caches, clear cookies and storage, ...) on teardown.
However, there's a recurring issue: during the cleanup phase some async operations resolve and try to do whatever they do in a now-nonsensical state.
There are two issues here:
async stack trace support in CDT is listed as experimental and doesn't appear at all in the response (possibly because it has to be enabled via a hidden flag somehow)
I have no idea what's still running at that point, and can't really even debug what breaks due to (1)
Is there any way to improve the situation except by trawling through heisenbugs as they occur, trying to slowly make my way up the async call stacks through ever more logging until the root cause is found?
First we make a hook to be able to capture all XHR packets. You'll have to execute this before any of your other scripts load. Probably put this in your boot/prepare script before running tests.
I have implemented below a start and a stop button. Start makes 300 XHR requests, just the "normal" way. If you press stop, you can cancel them all. Ideally you'd put the stop event handler's code in a beforeunload event.
If you don't want to stop them, you can analyze their state, requested URLs, etc. from one neat array where you keep track of everything within code.
This example works because only so many requests can be made at the same time by the browser; the rest in the queue wait as pending until a slot comes free. I used 300 requests because I don't know a large/slow source to request from that isn't CORS protected, and this gives us humans enough time to press the stop button (I hope).
function addXMLRequestCallback(callback){
var oldSend, i;
if( XMLHttpRequest.callbacks ) {
// we've already overridden send() so just add the callback
XMLHttpRequest.callbacks.push( callback );
} else {
// create a callback queue
XMLHttpRequest.callbacks = [callback];
// store the native send()
oldSend = XMLHttpRequest.prototype.send;
// override the native send()
XMLHttpRequest.prototype.send = function(){
// process the callback queue
// the xhr instance is passed into each callback but seems pretty useless
// you can't tell what its destination is or call abort() without an error
// so only really good for logging that a request has happened
// I could be wrong, I hope so...
// EDIT: I suppose you could override the onreadystatechange handler though
for( i = 0; i < XMLHttpRequest.callbacks.length; i++ ) {
XMLHttpRequest.callbacks[i]( this );
}
// call the native send()
oldSend.apply(this, arguments);
}
}
}
/**
* adding some debug data to the XHR objects. Note, don't depend on this,
* this is against good practises, ideally you'll have your own wrapper
* to deal with xhr objects and meta data.
* The same way you can extend the XHR object to catch post data etc...
*/
var xhrProto = XMLHttpRequest.prototype,
    origOpen = xhrProto.open,
    origSend = xhrProto.send;
xhrProto.open = function (method, url) {
this._url = url;
return origOpen.apply(this, arguments);
};
xhrProto.send = function (data) {
this._data = data;
return origSend.apply(this, arguments);
};
+function() {
var xhrs = [],
i,
statuscount = 0,
status = document.getElementById('status'),
DONE = 4;
addXMLRequestCallback((xhr) => {
xhrs.push(xhr);
});
document.getElementById('start').addEventListener('click',(e) => {
statuscount = 0;
var data = JSON.stringify({
'user': 'person',
'pwd': 'password',
'organization': 'place',
'requiredkey': 'key'
});
for(var i = 0;i < 300; i++) {
var oReq = new XMLHttpRequest();
oReq.addEventListener("load", (e) => {
statuscount++;
status.value=statuscount;
});
oReq.open("GET", 'https://code.jquery.com/jquery-3.4.1.js');
oReq.send(data);
}
});
document.getElementById('cancel').addEventListener('click', (event) => {
for(i = 0; i < xhrs.length; i++) {
if(xhrs[i].readyState !== DONE) {
console.log(xhrs[i]._url, xhrs[i]._data , 'is not done');
}
}
/** Cancel everything */
for(i = 0; i < xhrs.length; i++) {
if(xhrs[i]) {
xhrs[i].abort();
}
}
});
}();
<button id="start">start requests</button>
<button id="cancel">cancel requests</button>
<progress id="status" value="0" max="300"></progress>
Code of addXMLRequestCallback courtesy of meouw from this answer
Code of xhrProto keeping debug variables courtesy of Joel Richard from this answer
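Applied back to the integration-test problem, the teardown hook could abort whatever is still in flight; a sketch, assuming the xhrs array from the snippet above is reachable from your harness:
function teardownPendingXhrs() {
    var DONE = 4;
    for (var i = 0; i < xhrs.length; i++) {
        if (xhrs[i].readyState !== DONE) {
            console.log('aborting', xhrs[i]._url, 'before cleanup');
            xhrs[i].abort(); // keeps the load handler from running against torn-down state
        }
    }
    xhrs.length = 0; // start the next test with a clean slate
}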
I use Parse in iOS to run a cloud code method that gets an ID in its request and receives a number in the response.
The purpose of the cloud code function is to take the request ID and add it to a field of 3 different users.
Here is the cloud code method in Javascript:
amount = 3;
// Use Parse.Cloud.define to define as many cloud functions as you want.
// For example:
Parse.Cloud.define("addToIDs", function(request, response) {
var value = request.params.itemId;
var query = new Parse.Query(Parse.User);
query.ascending("createdAt");
query.limit(100);
query.find({
success: function(results) {
var sent = 0;
for (var i = 0; i < results.length; i++) {
var idlst = results[i].get("idString");
if (idlst != null && idlst.indexOf(value) <= -1) {
idlst += value+"|";
results[i].set("idString", idlst);
results[i].save();
sent = sent+1;
}
if (sent >= amount) {
break;
}
}
response.success(sent);
},
error: function() {
response.error("Test failed");
}
});
});
When running this cloud code method I get a response of '3', meaning it called .save for 3 users. The problem is that when I go back to look in the Data Browser on the Parse website, it actually only updated a single user (it's always the same user). No matter how many times I run this code, it will only actually update the first user.
Anyone know why this is happening?
Both save and saveAll are asynchronous, so you should make sure the saving process is done before you respond.
Also note that a user object can only be updated by its owner, or by a request that uses the master key.
The following code should work:
var amount = 3;
Parse.Cloud.define("addToIDs", function(request, response) {
var value = request.params.itemId;
var query = new Parse.Query(Parse.User);
query.ascending("createdAt");
query.limit(100);
return query.find()
.then(function(results) { // success
var toSave = [];
var promise = new Parse.Promise();
for (var i = 0; i < results.length; i++) {
var idlst = results[i].get("idString");
if (idlst != null && idlst.indexOf(value) <= -1) {
idlst += value+"|";
results[i].set("idString", idlst);
toSave.push(results[i]);
}
if (toSave.length >= amount) {
break;
}
}
// use saveAll to save multiple objects without sending multiple requests
Parse.Object.saveAll(toSave, {
useMasterKey: true,
success: function(list) {
promise.resolve(list.length);
},
error: function() {
promise.reject();
}
});
return promise;
}).then(function(length) { // success
response.success(length);
}, function() { // error
response.error("Test failed");
});
});
The reason this is happening is two-fold:
save() is an asynchronous method, and
response.success() will immediately kill your running code as soon as it's called.
So what's happening is that inside your for loop you're calling save() several times, but since it's asynchronous, those calls are simply thrown into the processing queue and your for loop continues on through. So it quickly throws all of your save() calls into the processing queue and then reaches your response.success() call, but by the time that's reached, only one of the save() calls has had a chance to process successfully.
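To make the fix concrete in miniature: collect the promises that save() returns and only call response.success() once they have all resolved. A sketch, assuming the old Parse SDK's Parse.Promise.when and the same query results as in the question:
var saves = [];
for (var i = 0; i < results.length && saves.length < amount; i++) {
    var idlst = results[i].get("idString");
    if (idlst != null && idlst.indexOf(value) <= -1) {
        results[i].set("idString", idlst + value + "|");
        // save() returns a promise; keep it instead of ignoring it
        saves.push(results[i].save(null, { useMasterKey: true }));
    }
}
Parse.Promise.when(saves).then(function() {
    response.success(saves.length); // only runs after every save has finished
}, function() {
    response.error("Test failed");
});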
I need a little help. I'm trying to run my second function likeLinks() but only after my first function getLikeURLs() has finished, because my second function relies on the links array to execute. It seems like they are trying to run at the same time.
Any help would be appreciated.
var links = [];
var url = '/' + window.location.pathname.split('/')[1] + '/' + window.location.pathname.split('/')[2] + '/'
getLikeURLs();
likeLinks();
function getLikeURLs() {
for (i = 1; i < parseInt(document.getElementsByClassName('PageNav')[0].getAttribute('data-last')) + 2; i++) {
var link = $.get(url + 'page-' + i, function(data) {
//gets the like links from current page
$(data).find('a[class="LikeLink item control like"]').each(function() {
links.push($(this).attr('href')); // Puts the links in the Array
});
});
}
}
function likeLinks() {
for (t = 0; t <= links.length; t++) {
var token = document.getElementsByName('_xfToken')[0].getAttribute('value')
$.post(links[t], {
_xfToken: token,
_xfNoRedirect: 1,
_xfResponseType: 'json'
}, function(data) {});
}
}
The link variables are actually jQuery deferred objects. Store them in an array and then you can use $.when() to create a new deferred object that only resolves when all of the previous $.get() operations have completed:
function getLikeURLs(url) { // NB: parameter, not global
var defs = [], links = []; // NB: links no longer global
for (...) {
var link = $.get(...);
defs.push(link);
}
// wait for previous `$.get` to finish, and when they have create a new
// deferred object that will return the entire array of links
return $.when.apply($, defs).then(function() { return links; });
}
Then, to start the chain of functions:
getLikeURLs(url).then(likeLinks);
Note that likeLinks will now be passed the array of links instead of accessing it from the global state. That function should also be rewritten to allow you to wait for its $.post calls to complete, too:
function likeLinks(links) {
// loop invariant - take it outside the loop
var token = document.getElementsByName('_xfToken')[0].getAttribute('value');
// create array of deferreds, one for each link
var defs = links.map(function(link) {
return $.post(link, {
_xfToken: token,
_xfNoRedirect: 1,
_xfResponseType: 'json'
});
});
// and another for when they're all done
return $.when.apply($, defs);
}
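With both functions returning deferred objects, kicking off the whole pipeline and reacting to its completion becomes a single chain:
getLikeURLs(url)
    .then(likeLinks)
    .then(function() {
        console.log('all like requests have completed');
    });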
P.S. Don't put that (relatively) expensive parseInt(document.getAttribute(...)) expression inside the for statement; it'll be evaluated on every iteration. Calculate it once outside the loop and store it in a variable. There are a few other places where you're repeating calls unnecessarily, e.g. window.location.pathname.split()
EDIT: My answer discusses the issue, but see Alnitak's answer for a much better solution.
The get in getLikeURLs and the post in likeLinks are both asynchronous. The calls to both of these functions return immediately. When data is returned from the called server at some indeterminate time later, the callback functions are then called. The posts could return before the gets, which would be a problem in your case. Also note that JavaScript is NOT multi-threaded, so the two methods, getLikeURLs and likeLinks, will never run at the same time. The callback functions, on the other hand, might be called at any time later, with no guarantee as to the callback order. For example, the 3rd get/post might return before the 1st get/post in your loops.
You could use $.ajax to specify that the gets and posts are synchronous, but this is ill-advised because the browser will hang if ANY request doesn't return in a reasonable amount of time (e.g. the server is offline). Plus you don't get the "multi-tasking" benefit of sending out a lot of requests and having the various servers working at the same time; they would do so serially.
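For reference only, this is what the discouraged synchronous form looks like (synchronous XHR on the main thread is deprecated in modern browsers):
$.ajax({
    url: url + 'page-1',
    async: false, // blocks the browser until the response arrives
    success: function(data) {
        // data is available here, and also after the $.ajax call returns
    }
});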
The trick is to simply call likeLinks from the callback function in getLikeURLs. Your case is a little tricky because of the for loop, but this should work:
var links = [];
var url = '/' + window.location.pathname.split('/')[1] + '/' + window.location.pathname.split('/')[2] + '/'
getLikeURLs();
//likeLinks(); // Don't call yet. Wait for gets to all return.
function getLikeURLs() {
var returnCount = 0; // Initialize a callback counter.
var count = parseInt(document.getElementsByClassName('PageNav')[0].getAttribute('data-last')) + 1;
for (i = 0; i < count; i++) {
var link = $.get(url + 'page-' + (i + 1), function(data) {
//gets the like links from current page
$(data).find('a[class="LikeLink item control like"]').each(function() {
links.push($(this).attr('href')); // Puts the links in the Array
});
// If all gets have returned, call likeLinks.
returnCount++;
if (returnCount === count) {
likeLinks();
}
});
}
}
function likeLinks() {
for (t = 0; t < links.length; t++) {
var token = document.getElementsByName('_xfToken')[0].getAttribute('value')
$.post(links[t], {
_xfToken: token,
_xfNoRedirect: 1,
_xfResponseType: 'json'
}, function(data) {});
}
}
NOT A DUPLICATE, AS I HAVE YET TO FIND A SATISFYING ANSWER ON OTHER THREADS:
Load and execute javascript code SYNCHRONOUSLY
Loading HTML and Script order execution
Looking for native Javascript answers, no jQuery, no requireJS, and so forth please :)
SUMMARY OF THE ENTIRE QUESTION:
I want to asynchronously load scripts but have ordered execution
I am trying to enforce that the code in the inserted script elements execute exactly in the same order as they were added to the dom tree.
That is, if I insert two script tags, first and second, any code in first must fire before the second, no matter who finishes loading first.
I have tried with the async attribute and the defer attribute when inserting into the head, but they don't seem to be obeyed.
I have tried with element.setAttribute("defer", "") and element.setAttribute("async", false) and other combinations.
The issue I am experiencing currently occurs when including an external script, but that is also the only test I have performed where there is latency.
The second script, which is a local one is always fired before the first one, even though it is inserted afterwards in the dom tree ( head ).
A) Note that I am still trying to insert both script elements into the DOM. Of course the above could be achieved by inserting the first, letting it finish, and then inserting the second one, but I was hoping there would be another way, because this might be slow.
My understanding is that RequireJS seems to be doing just this, so it should be possible. However, requireJS might be pulling it off by doing it as described in A).
Code, if you would like to try it directly in Firebug; just copy and paste:
function loadScript(path, callback, errorCallback, options) {
var element = document.createElement('script');
element.setAttribute("type", 'text/javascript');
element.setAttribute("src", path);
return loadElement(element, callback, errorCallback, options);
}
function loadElement(element, callback, errorCallback, options) {
element.setAttribute("defer", "");
// element.setAttribute("async", "false");
element.loaded = false;
if (element.readyState){ // IE
element.onreadystatechange = function(){
if (element.readyState == "loaded" || element.readyState == "complete"){
element.onreadystatechange = null;
loadElementOnLoad(element, callback);
}
};
} else { // Others
element.onload = function() {
loadElementOnLoad(element, callback);
};
}
element.onerror = function() {
errorCallback && errorCallback(element);
};
(document.head || document.getElementsByTagName('head')[0] || document.body).appendChild(element);
return element;
}
function loadElementOnLoad(element, callback) {
if (element.loaded != true) {
element.loaded = true;
if ( callback ) callback(element);
}
}
loadScript("http://ajax.googleapis.com/ajax/libs/angularjs/1.0.7/angular.min.js",function() {
alert(1);
})
loadScript("http://ajax.googleapis.com/ajax/libs/chrome-frame/1.0.3/CFInstall.min.js",function() {
alert(2);
})
If you try the above code in, say, Firebug, most often it will fire 2 and then 1. I want to ensure 1 and then 2 while including both in the head.
if I insert two script tags, first and second, any code in first must fire before the second, no matter who finishes loading first. I have tried with the async attribute and defer attribute
No, async and defer won't help you here. Whenever you dynamically insert script elements into the DOM, they are loaded and executed asynchronously. You can't do anything against that.
My understanding is that RequireJS seems to be doing just this
No. Even with RequireJS the scripts are executed asynchronously and not in order. Only the module initialiser functions in those scripts are define()d, not executed. RequireJS then checks when their dependencies are met and executes them later, once the other modules are loaded.
Of course you can reinvent the wheel, but you will have to go with a requirejs-like structure.
Ok, I think I have now come up with a solution.
The trick is that we keep track of each script to be loaded, and their order, as we insert them into the DOM tree. Each callback is then registered against its element.
Then we keep track of when all have finished loading, and when they have, we go through the stack and fire their callbacks.
var stack = [];
stack.loaded = 0;
function loadScriptNew(path, callback) {
var o = { callback: callback };
stack.push(o);
loadScript(path, function() {
o.callbackArgs = arguments;
stack.loaded++;
executeWhenReady();
});
}
function executeWhenReady() {
    if (stack.length == stack.loaded) {
        while (stack.length) {
            // shift (not pop) so the callbacks fire in insertion order
            var o = stack.shift();
            o.callback.apply(undefined, o.callbackArgs);
        }
        stack.loaded = 0;
    }
}
// The above is what has been added to the code in the question.
function loadScript(path, callback) {
var element = document.createElement('script');
element.setAttribute("type", 'text/javascript');
element.setAttribute("src", path);
return loadElement(element, callback);
}
function loadElement(element, callback) {
element.setAttribute("defer", "");
// element.setAttribute("async", "false");
element.loaded = false;
if (element.readyState){ // IE
element.onreadystatechange = function(){
if (element.readyState == "loaded" || element.readyState == "complete"){
element.onreadystatechange = null;
loadElementOnLoad(element, callback);
}
};
} else { // Others
element.onload = function() {
loadElementOnLoad(element, callback);
};
}
(document.head || document.getElementsByTagName('head')[0] || document.body).appendChild(element);
return element;
}
function loadElementOnLoad(element, callback) {
if (element.loaded != true) {
element.loaded = true;
if ( callback ) callback(element);
}
}
loadScriptNew("http://ajax.googleapis.com/ajax/libs/angularjs/1.0.7/angular.js",function() {
alert(1);
});
loadScriptNew("http://ajax.googleapis.com/ajax/libs/chrome-frame/1.0.3/CFInstall.min.js",function() {
alert(2);
});
Ok, some of you might argue that there is missing info in the question, which I will give you: here we are actually just solving the callback ordering. You are right. The code in the scripts is still executed in the wrong order, but the callbacks now are.
But for me this is good enough, as I intend to wrap all code that is loaded in a method call, à la AMD, such as a require or define call, register it on a stack there, and then fire everything in the callbacks instead.
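To sketch that plan (define and moduleBodies are hypothetical names, not part of the code above): each fetched file would only register its body, and the ordered callbacks would then execute the bodies.
// registry filled at load time, in whatever order the scripts arrive
var moduleBodies = {};
function define(url, body) {
    moduleBodies[url] = body;
}
// each file wraps its code like:
//   define("http://example.com/first.js", function () { /* actual code */ });
// and because loadScriptNew fires its callbacks in insertion order,
// the bodies execute in insertion order too:
loadScriptNew("http://example.com/first.js", function() {
    moduleBodies["http://example.com/first.js"]();
});
loadScriptNew("http://example.com/second.js", function() {
    moduleBodies["http://example.com/second.js"]();
});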
I am still holding out for Asad and his iframe solution, which I believe might provide the best answer to this question. For me, though, this solution solves my problems :)
I am posting this here just as a draft.
This does not work across domains because of the same-origin policy.
Here the idea is to obtain all scripts first and when they are in memory, execute them in order.
function loadScript(order, path) {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", path, true);
    xhr.onreadystatechange = function() {
        if (xhr.readyState == 4) {
            if ((xhr.status >= 200 && xhr.status < 300) || xhr.status == 304) {
                loadedScripts[order] = xhr.responseText;
            }
            else {
                // deal with the error
                loadedScripts[order] = 'alert("this is a failure to load script ' + order + '");';
                // or loadedScripts[order] = ''; // this fails smoothly
            }
            loadedCount++;
            alert(order + ' - ' + xhr.status + ' > ' + xhr.responseText); // this shows the completion order. Careful, FF stacks alerts so you see them in reverse.
            // am I the last one?
            executeAllScripts();
        }
    };
    xhr.send();
}
function executeAllScripts() {
    // a length check on loadedScripts would be unreliable: assigning a high
    // index first makes the array "long" while earlier slots are still empty,
    // so count completed requests instead
    if (loadedCount != scriptsToLoad.length) return;
    for (var a = 0; a < loadedScripts.length; a++) eval(loadedScripts[a]);
    scriptsToLoad = [];
}
var loadedScripts = [];
var loadedCount = 0;
var scriptsToLoad = [
"http://ajax.googleapis.com/ajax/libs/angularjs/1.0.7/angular.min.js",
"http://ajax.googleapis.com/ajax/libs/chrome-frame/1.0.3/CFInstall.min.js",
"http://nowhere.existing.real_script.com.ar/return404.js"
];
// load all even in reverse order ... or randomly
for(var a=0; a<scriptsToLoad.length; a++) loadScript(a, scriptsToLoad[a]);
After a while of fiddling around with it, here is what I came up with. Requests for the scripts are sent off immediately, but they are executed only in a specified order.
The algorithm:
The algorithm is to maintain a tree (I didn't have time to implement this; right now it is just the degenerate case of a list) of scripts that need to be executed. Requests for all of these are dispatched nearly simultaneously. Every time a script is loaded, two things happen: 1) the script is added to a flat list of loaded scripts, and 2) going down from the root node, as many scripts in each branch as are loaded but have not yet been executed are executed.
The cool thing about this is that not all scripts need to be loaded in order for execution to begin.
The implementation:
For demonstration purposes, I am iterating backward over the scriptsToExecute array, so that the request for CFInstall is sent off before the request for angularJS. This does not necessarily mean CFInstall will load before angularJS, but there is a better chance of it happening. Regardless of this, angularJS will always be evaluated before CFInstall.
Note that I've used jQuery to make my life easier as far as creating the iframe element and assigning the load handler is concerned, but you can write this without jQuery:
// The array of scripts to load and execute
var scriptsToExecute = [
"http://ajax.googleapis.com/ajax/libs/angularjs/1.0.7/angular.min.js?t=" + Date.now(),
"http://ajax.googleapis.com/ajax/libs/chrome-frame/1.0.3/CFInstall.min.js?t=" + Date.now()
];
// Loaded scripts are stored here
var loadedScripts = {};
// For demonstration purposes, the requests are sent in reverse order.
// They will still be executed in the order specified in the array.
(function start() {
for (var i = scriptsToExecute.length - 1; i >= 0; i--) {
(function () {
var addr = scriptsToExecute[i];
requestData(addr, function () {
console.log("loaded " + addr);
});
})();
}
})();
// This function executes as many scripts as it currently can, by
// inserting script tags with the corresponding src attribute. The
// scripts aren't reloaded, since they are in the cache. You could
// alternatively eval `script.code`
function executeScript(script) {
loadedScripts[script.URL] = script.code
while (loadedScripts.hasOwnProperty(scriptsToExecute[0])) {
var scriptToRun = scriptsToExecute.shift()
var element = document.createElement('script');
element.setAttribute("type", 'text/javascript');
element.setAttribute("src", scriptToRun);
$('head').append(element);
console.log("executed " + scriptToRun);
}
}
// This function fires off a request for a script
function requestData(path, loadCallback) {
var iframe = $("<iframe/>").load(function () {
loadCallback();
executeScript({
URL: $(this).attr("src"),
code: $(this).html()
});
}).attr({"src" : path, "display" : "none"}).appendTo($('body'));
}
You can see a demo here. Observe the console.
Can't you nest the loading using your callbacks?
i.e.:
loadScript("http://ajax.googleapis.com/ajax/libs/angularjs/1.0.7/angular.min.js",function() {
alert(1);
loadScript("http://ajax.googleapis.com/ajax/libs/chrome-frame/1.0.3/CFInstall.min.js",function() {
alert(2);
})
})
I am just getting started with coding for FirefoxOS and am trying to get a list of files in a directory.
The idea is to find the name of each file and add it to the array (which works), but I want to return the populated array, and this is where I come unstuck. It seems that the array gets populated during the function (as I can get it to spit out file names from it), but when I return it to another function it appears to be empty.
Here is the function in question:
function getImageFromDevice (){
var imageHolder = new Array();
var pics = navigator.getDeviceStorage('pictures');
// Let's browse all the images available
var cursor = pics.enumerate();
var imageList = new Array();
var count = 0;
cursor.onsuccess = function () {
var file = this.result;
console.log("File found: " + file.name);
count = count +1;
// Once we found a file we check if there are other results
if (!this.done) {
imageHolder[count] = file.name;
// Then we move to the next result, which call the cursor
// success with the next file as result.
this.continue();
}
console.log("file in array: "+ imageHolder[count]);
// this shows the filename
}
cursor.onerror = function () {
console.warn("No file found: " + this.error);
}
return imageHolder;
}
Thanks for your help!
Enumerating over pictures is an asynchronous call. Essentially what is happening in your code is this:
You are initiating an empty array.
You are telling Firefox OS to look for pictures on the device.
Then in cursor.onsuccess you are telling Firefox OS to append to the array you have created WHEN it gets back a file. The important thing here is that this does not happen right away; it happens at some point in the future.
Then you are returning the empty array you have created. It's empty because the onsuccess function hasn't actually run yet.
After some point in time the onsuccess function will be called. One way to wait until the array is fully populated would be to add in a check afterwards:
if (!this.done) {
imageHolder[count] = file.name;
this.continue();
}
else {
//do something with the fully populated array
}
But then of course your code has to go inside the getImageFromDevice function. You can also pass a callback function into the getImageFromDevice function.
See Getting a better understanding of callback functions in JavaScript
The problem is with the asynchronous nature of the calls you are using.
You are returning (and probably using) the value of imageHolder while it's still empty; since calls to the onsuccess function are deferred, they happen later in time, whereas your function returns immediately with the (still empty) imageHolder value.
You should be doing in this case something along those lines:
function getImageFromDevice (callback){
...
cursor.onsuccess = function () {
...
if (!this.done) {
// next picture
imageHolder[count] = file.name;
this.continue();
} else {
// no more pictures, return with the results
console.log("operation finished:");
callback(imageHolder);
}
}
}
Or use Promises in your code to accomplish the same.
Use the above like so:
getImageFromDevice(function(result) {
console.log(result.length+" pictures found!");
});
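And if you go the Promise route mentioned above, the same function could look like this; a minimal sketch built on the same DeviceStorage cursor API:
function getImageFromDevice() {
    return new Promise(function(resolve, reject) {
        var imageHolder = [];
        var cursor = navigator.getDeviceStorage('pictures').enumerate();
        cursor.onsuccess = function() {
            if (!this.done && this.result) {
                imageHolder.push(this.result.name); // collect one file name
                this.continue();                    // move on to the next result
            } else {
                resolve(imageHolder);               // no more pictures
            }
        };
        cursor.onerror = function() {
            reject(this.error);
        };
    });
}
getImageFromDevice().then(function(result) {
    console.log(result.length + " pictures found!");
}, function(error) {
    console.warn("No file found: " + error);
});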