Waiting for the loading of an element in JavaScript

This might be a stupid question, but I'm asking anyway: I have a question about waiting for callbacks.
I am using Polymer in my web development project. In one of my pages, I have a loop that loads an element inside a loop:
loop
"element-a"
end loop
I am fetching data from the database and content of "element-a" is populated from the database query results.
I only want to load another "element-a" once "element-a" has finished loading.
Right now, I have a forced delay using:
sleepStupidly(usec);

function sleepStupidly(usec) {
    var endtime = new Date().getTime() + usec;
    while (new Date().getTime() < endtime);
}
But I need a better way of doing this, any suggestion will be helpful.

As sheriffderek pointed out in the comments, promises or jQuery's $.ajax() callbacks are probably the best tools for this, combined with good ol' recursion. Below is an example of how to use them within a Polymer (v1) component for your case.
<dom-module id="example-component">
  <template></template>
  <script>
    Polymer({
      is: "example-component",
      properties: {
        //...
      },
      fetchIndex: 0,
      idsToFetch: [1, 2, 3, 4],
      elementData: [],
      ready: function () {
        this.loadElement(this.idsToFetch[this.fetchIndex]);
      },
      loadElement: function (idToFetch) {
        var self = this; // Keep a reference for the callbacks below
        $.ajax({
          url: "http://example.com/getdata",
          data: { "idToFetch": idToFetch }
        }).done(function (data) { // Fires after the response is received, which is where you trigger the next fetch
          self.elementData.push(data); // Save the response data
          self.fetchIndex++; // Move to the next element
          if (self.fetchIndex < self.idsToFetch.length) { // If there's more to fetch
            self.loadElement(self.idsToFetch[self.fetchIndex]); // Recursively call the same method
          }
        }).always(function (data) { // Optional
          console.log("Fetch complete for ID: " + idToFetch);
        }).fail(function (error) { // Optional; note the jqXHR .error() alias was removed in jQuery 3, use .fail()
          console.log(error);
        });
      }
    });
  </script>
</dom-module>
In summary, call the loadElement() method from Polymer's ready() handler and provide it the first element ID to initiate the fetch. Within loadElement(), fetch the data for the element ID passed in. Once the fetch is .done() (preferred over, but similar to, the deprecated .success()), recursively call loadElement() again with the next ID in the list. This continues recursively until fetchIndex reaches the length of the idsToFetch array.
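For readers on newer stacks: the same one-at-a-time idea can be sketched more directly with async/await instead of the recursive .done() chain. This is a minimal sketch, not the answer's code; loadSequentially and fetchOne are hypothetical names, with fetchOne standing in for whatever returns a promise of one element's data (e.g. a fetch() call to your endpoint).

```javascript
// Loads one element's data at a time: "await" pauses the loop until the
// current request resolves, so each element only starts loading after
// the previous one has finished.
async function loadSequentially(idsToFetch, fetchOne) {
  const elementData = [];
  for (const id of idsToFetch) {
    const data = await fetchOne(id); // stands in for the $.ajax call
    elementData.push(data);
  }
  return elementData;
}
```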

Related

Triggering a method after every element in a .each() loop has been processed

You'll have to forgive me if I show any kind of ineptitude here; jQuery and JavaScript aren't my usual area of work. But here goes:
I have a page that shows a random list of items that are pulled from a server using an API call. The idea is that every time the user clicks "generate" a new list is produced and inserted into the page. This works but it's very fast and all the user sees is a list rapidly changing. To spruce things up I've decided to put some nice animations and effects in.
I've got a jQuery function that loops through each element in the list of child elements and toggles the CSS class of the child element so that an effect from the animate.css library is applied. The problem is that the second function, which loads the new list, is called immediately, so all of the CSS transitions are skipped; or rather, they never get a chance to run before the second method fires.
I've tried using a callback and had no joy, I've tried using deferred objects. No luck at all.
Here's the code I have so far:
function removeOldContent() {
    $('#removableContent > div').each(function (index) {
        var elm = $(this);
        setTimeout(function () {
            elm.toggleClass('customAnim', function () {
                $(this).remove();
            });
        }, index * 150);
    });
}

function getList() {
    var rId = $('.tab-content').find('.active').data('d-id');
    var serviceUrl = '/GetRandom/GetList';
    $.ajax({
        type: 'POST',
        url: serviceUrl,
        data: {
            reportId: rId
        },
        success: function (data) {
            $('#reportContainer').html(data).fadeIn('slow');
        }
    });
}
Ideally I'd like to be able to let removeOldContent() finish completely, after all the timeouts have run. And then trigger getList() to update the content. I'll work on making a nice transition for the inbound data but first I just need to get this working.
Any advice or pointers would be greatly appreciated.
Update:
I've made a fiddle. It doesn't give me the same error as my dev environment, but it should be close enough for you to see:
https://jsfiddle.net/rdt1pfhk/9/
Your problem is with the timing of events. Your removeOldContent function uses setTimeout, which in turn animates and removes the items from the DOM, but your getList() function executes before that work has finished. I put a quick, untidy solution together using your fiddle: I return a jQuery deferred object from your removeOldContent method and only call getList once it is resolved (and the older items have been removed from the DOM). It's not the neatest, but it will point you in the right direction. I updated your fiddle here: https://jsfiddle.net/rdt1pfhk/16/
function removeOldContent() {
    var deferred = new jQuery.Deferred();
    ....
    return deferred;
}

$(document).on('click', '.quickPick', function (e) {
    removeOldContent().then(function () {
        getList();
    });
});
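For completeness, the "resolve only after the last staggered timeout" idea can also be sketched without jQuery, using a plain Promise. This is a framework-free sketch, not the fiddle's code: items stands in for the jQuery selection, stepMs for the 150 ms stagger, and marking item.removed stands in for the animate-and-remove work.

```javascript
// Staggers per-item work by stepMs and resolves only once the final
// timeout has run, so a follow-up (like getList) can be chained safely.
function removeItemsStaggered(items, stepMs) {
  return new Promise(function (resolve) {
    items.forEach(function (item, index) {
      setTimeout(function () {
        item.removed = true; // the animation + removal would happen here
        if (index === items.length - 1) {
          resolve(); // last item done: safe to load the new content
        }
      }, index * stepMs);
    });
  });
}
```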
If I understood correctly, you just need to delay some functions for as long as necessary. I think the answer you are looking for can be found here:
How to delay calling of javascript function?
Also, I'd like to mention that I don't see a
$(document).ready(function () {
    //your code here
});
anywhere. Maybe I'm wrong, but since you mentioned that the CSS is ignored, are you sure the page is loaded before your code starts executing?

Need to Populate Javascript Array BEFORE (document).ready()

How can I get some JavaScript to run before a document ready function?
I need the following snippet to finish first...
var applicantlist = [];
$.getJSON("apps.json", function (appsdata) {
    for (var i = 0; i < appsdata.applications.length; i++) {
        var tempapp = [appsdata.applications[i].name, appsdata.applications[i].server];
        applicantlist.push(tempapp);
    }
});
I've tested this, and the data gets pushed into the array just fine. The problem is that I need this array to make some ajax calls that are found in my page ready function, as follows...
$(document).ready(function () {
    window.jsonpCallbacks = {};
    alert(applicantlist.length);
    for (var i = 0; i < applicantlist.length; i++) {
        (function (index) {
            window.jsonpCallbacks["myCallback" + index] = function (data) {
                myCallback(data, index);
            };
        })(i);
        // jQuery/Ajax call to the WoW API for the data.
        $.ajax({
            "url": "http://us.battle.net/api/wow/character/" + applicantlist[i][1] + "/" + applicantlist[i][0] + "?jsonp=jsonpCallbacks.myCallback" + i,
            "type": "GET",
            "data": { fields: "items, talents, progression, professions, audit, guild, stats" },
            "dataType": "jsonp",
            "contentType": "application/json",
            "jsonpCallback": "jsonpCallbacks.myCallback" + i,
            "success": function (data1) {
            }
        });
    }
});
All of this fires off before the first snippet, no matter where I seem to put it. So the array is empty (the alert message just shows "0").
As you can see from the URL of my ajax call, I need that array populated beforehand. I've tried putting the first snippet in a separate .js file and loading it before all other JavaScript files on the actual HTML page...
What am I missing?
Move the code that sends the first request into document.ready; you don't usually want anything happening before the document is ready. Then move the code that sends the next request(s) into the callback of the first request, after you populate the array and do whatever else needs to happen first:
$(document).ready(function () {
    $.getJSON("apps.json", function (appsdata) {
        ...
        // add items to your array
        sendNextRequest();
    });
});

function sendNextRequest() {
    // jQuery/Ajax call to the WoW API for the data.
    ...
}
This guarantees that the calls to the WoW API don't fire until the first $.getJSON call completes and you populate your array.
FYI, this is a common challenge in JavaScript: you need one operation to run only after another finishes. With ajax, you have callbacks like in my example above to achieve this. Outside of ajax requests, you can use jQuery Promises to defer tasks until after something else finishes.
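The "run B only after A finishes" rule the answer describes can be sketched with plain promises as well (a minimal sketch; the step names and values are made up for illustration, with the timeout standing in for a request).

```javascript
// Each step returns a promise; chaining with .then() guarantees that
// stepTwo only runs after stepOne has resolved.
function stepOne() {
  return new Promise(function (resolve) {
    setTimeout(function () { resolve("one"); }, 10); // stands in for a request
  });
}

function stepTwo(previous) {
  return Promise.resolve(previous + ", then two");
}

stepOne().then(stepTwo).then(function (result) {
  console.log(result); // "one, then two"
});
```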

How to add a loading animation when loading a promise object?

I am using JavaScript and jQuery to write my website, but I've run into the following problem.
I want to sleep the thread and display a loading animation until the promise object has completely loaded into my website.
I don't know how to do that; can anyone help?
@A.Wolff
I have this problem while using the PDF.JS plugin, on top of which I am trying to declare a self-defined class.
The following is my self-defined class:
function WhiteBoardPdf3(canvasContext, url) {
    this.canvasContext = canvasContext;
    this.url = url;
    this.pdfOriPromise = PDFJS.getDocument(url);
    this.pdfPromise = Promise.cast(this.pdfOriPromise);
    this.pdfDoc = null;
    /*----Here is the problem------*/
    this.pdfPromise.then(function getPdf(_pdfDoc) {
        this.pdfDoc = _pdfDoc;
    });
    /*----------------------------*/
    this.pageNum = 1;
    this.scale = 1;
    this.renderPage(1);
}

WhiteBoardPdf3.prototype.getPdfPromise = function () {
    return this.pdfPromise;
};

WhiteBoardPdf3.prototype.renderPage = function () {
    var num = this.pageNum;
    var scale = this.scale;
    var canvasContext = this.canvasContext;
    var canvas = canvasContext.canvas;
    var canvasClassName = canvas.className;
    var allCanvas = document.getElementsByClassName(canvasClassName);
    var canvasContainer = document.getElementById("whiteBoardLayerContainer");
    this.pdfPromise.then(function getPdf(_pdfDoc) {
        _pdfDoc.getPage(num).then(function (page) {
            var viewport = page.getViewport(scale);
            for (var i = 0; i < allCanvas.length; i++) {
                allCanvas[i].height = viewport.height;
                allCanvas[i].width = viewport.width;
            }
            canvasContainer.style.width = viewport.width + 'px';
            var renderContext = {
                canvasContext: canvasContext,
                viewport: viewport
            };
            page.render(renderContext);
        });
    });
};

WhiteBoardPdf3.prototype.getPdfNumOfPages = function () {
    return this.pdfDoc.numPages;
};
PDFJS.getDocument(url) returns a promise object.
However, the problem is that when I construct this class and then call getPdfNumOfPages() in the main program, the program calls getPdfNumOfPages() before the pdfDoc promise has finished loading. So I want to sleep the thread and display the loading animation until the promise object has finished loading, so that getPdfNumOfPages() only runs after pdfDoc is loaded.
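JavaScript cannot sleep its single thread; the usual cure is to expose the page count through the promise instead of reading this.pdfDoc synchronously. A minimal sketch of that idea, where loadDocument is a hypothetical stand-in for the PDFJS.getDocument(url) call above:

```javascript
// Instead of reading this.pdfDoc synchronously (it may still be null),
// return a promise that resolves with numPages once the PDF has loaded.
// "loadDocument" is a hypothetical stand-in for PDFJS.getDocument(url).
function getPdfNumOfPages(loadDocument) {
  return loadDocument().then(function (pdfDoc) {
    return pdfDoc.numPages; // the caller chains .then(), never sleeps
  });
}
```

The main program would then show its loader, call getPdfNumOfPages(...).then(...), and hide the loader inside the callback.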
Well, you could show a loading image on your page before sending an ajax request and hide it after a response is received.
HTML CODE:
<img name="loadingImage" id="loadingImg" src="http://www.ppimusic.ie/images/loading_anim.gif" width="100px" height="100px" />

$('#loadingImg').hide();
$.ajax({
    url: "test.html",
    data: { data1: 'smile' },
    beforeSend: function () {
        $('#loadingImg').show();
    },
    success: function (response) {
        $('#loadingImg').hide();
        //process the successful response
    },
    error: function (response) {
        $('#loadingImg').hide();
        //process the error response
    }
});
Happy Coding:)
Edit:
As indicated in comments (thanks A. Wolff and Frédéric Hamidi), this solution is better:
//
// launch loader
// before
//
// make xhr query
var jqxhr = $.ajax({
    url: 'your/action'
});
// call the always promise method, automatically invoked
// when the $.ajax action is completed
jqxhr.always(function () {
    // stop loader in every situation (success or failure)
});
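The native-promise analogue of jqxhr.always() is .finally(), which likewise runs on both success and failure. A minimal sketch with a hypothetical withLoader helper, where loader.shown stands in for showing and hiding a real spinner element:

```javascript
// Runs a request and guarantees the loader is stopped afterwards,
// whether the promise resolved or rejected.
function withLoader(makeRequest, loader) {
  loader.shown = true; // launch loader before the request
  return makeRequest().finally(function () {
    loader.shown = false; // stop loader in every situation
  });
}
```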
The previous solution from this post is below. It is enough only if the xhr query completes without error.
You can use $.when
Description: Provides a way to execute callback functions based on one
or more objects, usually Deferred objects that represent asynchronous
events.
//
// set loader on here
//
$.when($.ajax('/action/to/do')).then(function (response) {
    // reset your loader
    // action completed ;)
});

Pattern to force events to fire serially with jquery

I'm making a bunch of ajax calls from the browser to a service, and I'd like to be nice to the server by not sending them all at once. Is there a standard pattern for serializing the firing of a list of events, so that the next one doesn't fire until the last one finishes?
I think the jQuery.whenSync() plugin (Chaining Asynchronous Callbacks Using Deferred Objects) can help you queue AJAX calls.
I do this now in one of my apps. Basically, I have an array that stores all my ajax call parameters; in the callback I pass the next index until there are no more. Here's a general example (you could use ajax or get, of course; this is just how I do it):
var params = [
        {url: 'someurl', data: 'somedata', callback: doSomething},
        {url: 'anotherurl', data: 'moredata'}
    ],
    process = function (i) {
        if (i < params.length) {
            var item = params[i];
            $.post(
                item.url,
                item.data,
                function (data) {
                    // Could test for a function, etc., but this is how I do it
                    try {
                        item.callback(data);
                    } catch (e) {}
                    process(++i);
                }
            );
        }
    };
process(0);
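The same params-array pattern can also be expressed as a single promise chain built with reduce, so each request fires only after the previous one resolves. A sketch under assumed names: runSerially and runOne are hypothetical, with runOne standing in for the $.post call.

```javascript
// Folds the params array into one promise chain: each item's request
// starts only when the chain so far has resolved, and the results are
// collected in order.
function runSerially(params, runOne) {
  return params.reduce(function (chain, item) {
    return chain.then(function (results) {
      return runOne(item).then(function (r) {
        return results.concat([r]);
      });
    });
  }, Promise.resolve([]));
}
```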

Loop calling an asynchronous function

Introduction to the problem
I need to call an asynchronous function within a loop until a condition is satisfied. This particular function sends a POST request to a website form.php and performs some operations with the response, which is a JSON string representing an object with an id field. So, when that id is null, the outer loop must conclude. The function does something like the following:
function asyncFunction(session) {
    (new Request({
        url: "form.php",
        content: "sess=" + session,
        onComplete: function (response) {
            var response = response.json;
            if (response.id) {
                doStaff(response.msg);
            } else {
                // Break loop
            }
        }
    })).get();
}
Note: Although I found the problem while implementing an add-on for Firefox, I think this is a general JavaScript question.
Implementing the loop recursively
I've tried implementing the loop with recursion, but it didn't work, and I'm not sure this is the right way.
...
if (response.id) {
    doStaff(response.msg);
    asyncFunction(session);
} else {
    // Break loop
}
...
Using jsdeferred
I have also tried the jsdeferred library:
Deferred.define(this);

// Instantiate a new deferred object
var deferred = new Deferred();

// Main loop: stops when we receive the exception
Deferred.loop(1000, function () {
    asyncFunction(session, deferred);
    return deferred;
}).error(function () {
    console.log("Loop finished!");
});
And then calling:
...
if (response.id) {
    doStaff(response.msg);
    d.call();
} else {
    d.fail();
}
...
This achieved serialization, but it started repeating previous calls on every iteration. For example, if it was the third time asyncFunction was called, it would also repeat the calls with the corresponding parameters from iterations 1 and 2.
Your question is not entirely clear, but the basic architecture is that the completion event handler for the asynchronous operation decides whether to try again or simply return. If the results of the operation warrant another attempt, the handler calls the parent function again; if not, simply exiting brings the cycle to an end.
You can't code something like this in JavaScript with anything that looks like a simple "loop" structure, for the very reason that the operations are asynchronous. The results of the operation don't happen in such a way as to allow the looping mechanism to perform a test on the results; the loop may run thousands of iterations before the result is available. To put it another way, you don't "wait" for an asynchronous operation with code. You wait by doing nothing, and allowing the registered event handler to take over when the results are ready.
Thank you guys for your help. This is what I ended up doing:
var sess = ...;
Deferred.define(this);

function asyncFunction(session) {
    Deferred.next(function () {
        var d = new Deferred();
        (new Request({
            url: "form.php",
            content: "sess=" + session,
            onComplete: function (response) {
                d.call(response.json);
            }
        })).get();
        return d;
    }).next(function (resp) {
        if (resp.id) {
            asyncFunction(session);
            console.log(resp.msg);
        }
    });
}

asyncFunction(sess);
Why wouldn't you just use a setInterval loop? In the case of an SDK-based extension, this would look like:
https://builder.addons.mozilla.org/addon/1065247/latest/
The big benefit of promise-like patterns over timers is that you can do things in parallel and express much more complicated dependencies between tasks. A simple loop like this is done just as easily and neatly using setInterval.
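A minimal sketch of that setInterval approach (pollUntil and its arguments are made-up names; check stands in for "did the last response say stop?"):

```javascript
// Re-runs "check" on a fixed interval and clears the timer once it
// returns true, then invokes "done": a timer-based loop rather than
// a promise chain.
function pollUntil(check, intervalMs, done) {
  var timer = setInterval(function () {
    if (check()) {
      clearInterval(timer); // condition met: stop looping
      done();
    }
  }, intervalMs);
}
```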
If I correctly understand what you want to do, Deferred is a good approach. Here's an example using jQuery, which has Deferred functionality built in (jQuery.Deferred).
A timeout is used to simulate an http request. When each timeout completes (or http request finishes), a random number is returned, which is equivalent to the result of your http request.
Based on the result of the request, you can decide whether you need another http request or want to stop.
Try out the snippet below: include the jQuery file and then the snippet. It keeps printing values to the console and stops once a zero is reached.
This could take a while to understand, but it's useful.
$(function () {
    var MAXNUM = 9;

    function newAsyncRequest() {
        var def = $.Deferred(function (defObject) {
            setTimeout(function () {
                defObject.resolve(Math.floor(Math.random() * (MAXNUM + 1)));
            }, 1000);
        });

        def.done(function (val) {
            if (val !== 0)
                newAsyncRequest();
            console.log(val);
        });
    }

    newAsyncRequest();
});
Update after the suggestion from @canuckistani:
@canuckistani is correct in his answer; for this problem the solution is simpler. Without Deferred, the code snippet above becomes the following. Sorry I led you toward a tougher solution.
$(function () {
    var MAXNUM = 9;

    function newAsyncRequest() {
        setTimeout(function () {
            var val = Math.floor(Math.random() * (MAXNUM + 1));
            if (val !== 0)
                newAsyncRequest();
            console.log(val);
        }, 1000);
    }

    newAsyncRequest();
});