I know this kind of question has been asked a thousand times before, I read a ton of posts on this but I still can't wrap my head around it. Tried many examples without success on my own before posting here.
I'm basically trying to return images from different folders but I need to process the images from each directory separately in a specific way. I am not very familiar with json and ajax in general so this might be why I don't understand this too well.
Anyhow, my idea was to simply create a for loop and then for each directory, get the images via ajax and deal with them.
Obviously this isn't working as I expected, and I read somewhere that I need a JavaScript "closure" to get it to work, but I can't get that to work either. Here is my basic code without a closure:
$.getJSON('/img/content/galleries/', function(directories) {
for (var dirnum = 2; dirnum < directories.length - 1; dirnum++) {
var folder = '/img/content/galleries/' + directories[dirnum] + '/';
$.ajax({
url: folder,
success: function(data) {
$("#gallery").text("");
$(data).find("a").attr("href", function(i, val) {
if (val.match(/\.jpg|\.png|\.gif/)) {
$("#gallery").append("<img src='" + folder + val + "'>");
}
}); // end data.find
} // end success
}); // end ajax
} // end for loop
}); // end getJSON
This code runs but always gives me the results of gallery2 before gallery1 (the names of my directories), and then gives an error saying it can't find the images from gallery1 inside the gallery2 folder...
If anybody can help me insert some kind of closure in here that would be great. A few hours already wasted with no results so far. I just don't get that concept I guess.
Also note that I previously get the "directories" values from a PHP file in the parent folder of gallery1 and gallery2, which contains this code:
<?php
$directories = scandir('.');
header('Content-Type: application/json');
echo json_encode($directories);
?>
Or if you guys think there is a simpler way to approach this in javascript I'm all ears! I know I could have done it a different way with php intermixed with some javascript but I wanted to use javascript only here.
Thanks a bunch in advance.
Erick P.
The major flaw is that each request empties the container, so only the content from the last request received will be displayed.
Fix that by emptying the container before the request loop starts.
You can't control the order in which responses are received, so if order is important you will need to use promises for the requests to make sure the data is populated in the correct order.
$.getJSON('/img/content/galleries/', function(directories) {
//empty container before starting request loop
$("#gallery").empty();
for (var dirnum = 2; dirnum < directories.length - 1; dirnum++) {
// wrap ajax in IIFE closure
(function(dirnum){
var folder = '/img/content/galleries/' + directories[dirnum] + '/';
$.ajax({
url: folder,
success: function(data) {
//$("#gallery").text("");// move up above loop
$(data).find("a").attr("href", function(i, val) {
if (val.match(/\.jpg|\.png|\.gif/)) {
$("#gallery").append("<img src='" + folder + val + "'>");
}
}); // end data.find
} // end success
}); // end ajax
})(dirnum);//end closure in for loop
} // end for loop
}); // end getJSON
Not quite sure why you start at the third index for dirnum... I assume you have unwanted paths in the directories array (scandir('.') returns "." and ".." first). It would be cleaner to filter those out at the server.
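If you do end up needing the promise route to keep the gallery order, here is a rough, untested sketch of how it could look, reusing the same directory-listing endpoint and anchor scraping as above:
$.getJSON('/img/content/galleries/', function(directories) {
    $("#gallery").empty();

    // Build one request per directory, keeping each folder path with its promise
    var requests = $.map(directories.slice(2, directories.length - 1), function(dir) {
        var folder = '/img/content/galleries/' + dir + '/';
        return $.ajax({ url: folder }).then(function(data) {
            return { folder: folder, data: data };
        });
    });

    // Wait for all listings, then append in the original directory order
    $.when.apply($, requests).then(function() {
        $.each(arguments, function(i, result) {
            $(result.data).find("a").each(function() {
                var val = $(this).attr("href");
                if (/\.jpg|\.png|\.gif/.test(val)) {
                    $("#gallery").append("<img src='" + result.folder + val + "'>");
                }
            });
        });
    });
});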
Thanks to Charlie for the help! I fixed it via the closure as everyone told me:
$.getJSON('/img/content/galleries/', function(directories) {
$("#gallery").text("");
for (var dirnum = 2; dirnum < directories.length - 1; dirnum++) {
(function(dirnum) {
var folder = '/img/content/galleries/' + directories[dirnum] + '/';
$.ajax({
url: folder,
success: function(data) {
$(data).find("a").attr("href", function(i, val) {
if (val.match(/\.jpg|\.png|\.gif/)) {
$("#gallery").append("<img src='" + folder + val + "'>");
}
}); // end data.find
} // end success
}); // end ajax
})(dirnum);
} // end for loop
}); // end getJSON
Charlie, I'll also try your solution as another alternative. Thanks again!
Erick
Related
I am currently having an issue using jQuery with JSON data to append a calculated number of <li> elements to a <ul>. Here is my code:
$.getJSON("http://api.hivemc.com/v1/game/timv", function(data) {
$.each(data.achievements, function(key,value){
var unlocked = "Locked";
$.each(maindata.achievements, function(key2,value2){
if(value.name == key2){
unlocked = "Unlocked";
}
});
$("#achs").append("<li><p>" + value.publicname + "</p><span>"+ unlocked + "</span></li>");
});
});
As you can see, I am getting JSON data from a URL. In this, there is an array achievements. The variable maindata was set earlier from another $.getJSON().
For each achievement, I have to append a <li> element with the data of the achievement to a <ul> which has the id of #achs. However, in order to see if the achievement is unlocked, I have to check for the achievements name in the maindata JSON, meaning another $.each() loop inside the current $.each() loop.
Without the extra loop, the code works fine, and successfully forms a list of achievements and whether they are unlocked or not. However, whenever I add the extra $.each() back in again, it only works when I reload the page or navigate away and back to it again.
Does anyone know why this is happening? Any help would be greatly appreciated as I am sure you can see I am not very experienced with jQuery. Also, I have been able to do this with just PHP, retrieving JSON data, but I wanted to see if using jQuery would load more quickly than PHP.
The maindata JSON is retrieved from this code:
var maindata;
$.getJSON("http://api.hivemc.com/v1/player/" + $user + "/timv", function(data) {
$('#1').text(data.total_points);
$('#2').text(data.i_points);
$('#3').text(data.t_points);
$('#4').text(data.d_points);
$('#5').text(data.role_points);
$('#6').text(data.most_points);
maindata = data;
if(data.detectivebook == true)
$('#7').text("Yes");
else
$('#7').text("No");
$flare = data.active_flareupgrade;
$flare = $flare.charAt(0).toUpperCase() + $flare.slice(1).toLowerCase();
$('#8').text($flare);
$('#9').text(data.title);
var d = new Date(data.lastlogin * 1000);
var n = d.toISOString();
$('#10').text(d.getDate() + "/" + d.getMonth() + "/" + d.getFullYear());
$.getJSON("http://api.hivemc.com/v1/game/timv", function(data2) {
$.each(data2.achievements, function(key,value){
var unlocked = "Locked";
$.each(maindata.achievements, function(key2,value2){
if(value.name == key2){
unlocked = "Unlocked";
}
});
$("#achs").append("<li><p>" + value.publicname + "</p><span>"+ unlocked + "</span></li>");
});
});
});
Thanks.
You should wait for your earlier AJAX call to complete before doing any processing that depends on the results of both calls:
var promise1 = $.getJSON(...); // get maindata
var promise2 = $.getJSON(...); // get data
$.when(promise1, promise2).then(function(maindata, data) {
// do your processing here
...
});
NB: there's no need to supply callbacks to the $.getJSON calls - do the processing within the .then callback.
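Applied to the two calls from the question, a sketch could look like this ($user comes from the question's code; the achievement-name check mirrors the original nested loop and assumes the player's achievements object is keyed by achievement name):
var playerPromise = $.getJSON("http://api.hivemc.com/v1/player/" + $user + "/timv");
var gamePromise = $.getJSON("http://api.hivemc.com/v1/game/timv");

// When raw $.getJSON promises go through $.when, each argument arrives as an
// array of [data, statusText, jqXHR], hence the [0] below.
$.when(playerPromise, gamePromise).then(function (playerArgs, gameArgs) {
    var maindata = playerArgs[0]; // player data, including unlocked achievements
    var gamedata = gameArgs[0];   // full achievement list for the game

    // Assumption: maindata.achievements is keyed by achievement name,
    // as the key comparison in the question suggests.
    var unlockedByName = maindata.achievements || {};

    $.each(gamedata.achievements, function (key, value) {
        var unlocked = unlockedByName.hasOwnProperty(value.name) ? "Unlocked" : "Locked";
        $("#achs").append("<li><p>" + value.publicname + "</p><span>" + unlocked + "</span></li>");
    });
});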
I am trying to upload a list of images. I have the images stored in an array (called images).
I have the previews displayed on the screen.
What I want to do is upload them sequentially, and as they complete their upload, I want to set a flag. When this flag is set (thanks to the power of Knockout), the image disappears from the list.
However, I think that due to the async nature of the post command, I'm not achieving the desired results.
Below is what I am trying to do:
for(var i = 0; i < self.images().length; i++) {
var photo = self.images()[i];
var thisItem = photo;
var object = JSON.stringify({
Image: thisItem.Image,
AlbumID: albumId,
Filesize: thisItem.Filesize,
Filetype: thisItem.Filetype,
Description: thisItem.Description,
UniqueID: thisItem.UniqueID
});
var uri = "/api/Photo/Upload";
$.post({
url: uri,
contentType: "application/json"
}, object).done(function(data) {
if(data.IsSuccess) {
photo.Status(1);
}
}).fail(function() {
// Handle here
}).always(function() {
remainingImages = remainingImages - 1;
if(remainingImages == 0) self.isUploading(false);
});
}
self.isUploading(false);
But I think what's happening is that the for loop ends before all the posts have received a reply, because only one image is removed.
I tried an async: false ajax post, but that locked up the screen and then they all disappeared.
I thought the 'done' method would only execute once the post is completed, but I think the whole method just ends once the post commands have been sent, and I never get the done callback.
How can I achieve what I'm trying to do... Set each image's status once the post gets a reply?
Your first problem is that you are losing the reference you think you have to each photo object because the loop finishes before your AJAX calls return, so that when they do return, photo is a reference to the last item in self.images().
What we need to do to solve this is to create a new scope for each iteration of the loop and each of those scopes will have its own reference to a particular photo. JavaScript has function scopes, so we can achieve our goal by passing each photo to a function. I will use an Immediately Invoked Function Expression (IIFE) as an example:
for (var i = 0; i < self.images().length; i++) {
var photo = self.images()[i];
(function (thisItem) {
/* Everything else from within your for loop goes here. */
/* Note: the done handler must reference `thisItem`, not `photo`. */
})(photo);
}
Note that you should remove self.isUploading(false); from the last line. This should not be set to false until all of the POST requests have returned.
I have created a functioning fiddle that you can see here.
However, this solution will not perform the POST requests "sequentially". I am not sure why you would want to wait for one POST to return before sending the next as this will only increase the time the user must wait. But for the sake of completeness, I will explain how to do it.
To fire a POST after the previous POST returns you will need to remove the for loop. You will need a way to call the next POST in the always handler of the previous POST. This is a good candidate for a recursive function.
In my solution, I use an index to track which item from images was last POSTed. I recursively call the function to perform the POST on the next item in images until we have POSTed all items in images. The following code replaces the for loop:
(function postNextImage (index) {
var photo = self.images()[index];
var thisItem = photo;
var object = JSON.stringify({
Image: thisItem.Image,
AlbumID: albumId,
Filesize: thisItem.Filesize,
Filetype: thisItem.Filetype,
Description: thisItem.Description,
UniqueID: thisItem.UniqueID
});
var uri = "/api/Photo/Upload";
$.post({
url: uri,
contentType: "application/json"
}, object)
.done(function (data) {
if(data.IsSuccess) {
thisItem.Status(1);
}
})
.fail(function () {
// Handle here
})
.always(function () {
if (index < (self.images().length - 1)) {
index += 1;
postNextImage(index);
} else {
self.isUploading(false);
}
});
})(0);
I have created a fiddle of this solution also, and it can be found here.
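As an aside: if you do not actually need the uploads to run one at a time and only want isUploading cleared once everything has finished, another option (a sketch, not the code from either fiddle, reusing the $.post call from the question) is to keep the parallel loop from the first half of this answer, collect the jqXHR promises, and clear the flag in one place:
var uploadPromises = [];

for (var i = 0; i < self.images().length; i++) {
    (function (thisItem) {
        var object = JSON.stringify({
            Image: thisItem.Image,
            AlbumID: albumId,
            Filesize: thisItem.Filesize,
            Filetype: thisItem.Filetype,
            Description: thisItem.Description,
            UniqueID: thisItem.UniqueID
        });

        uploadPromises.push(
            $.post({ url: "/api/Photo/Upload", contentType: "application/json" }, object)
                .done(function (data) {
                    if (data.IsSuccess) {
                        thisItem.Status(1); // hide this photo as soon as its own upload succeeds
                    }
                })
        );
    })(self.images()[i]);
}

// $.when resolves once every request has succeeded, but rejects as soon as any
// single request fails; if you need to wait for all of them even on failure,
// keep a remainingImages counter in .always() instead.
$.when.apply($, uploadPromises).always(function () {
    self.isUploading(false);
});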
So, I am trying to use the Twitch API:
https://codepen.io/sterg/pen/yJmzrN
If you check my codepen page you'll see that each time I refresh the page the status order changes, and I can't figure out why this is happening.
Here is my javascript:
$(document).ready(function(){
var ur="";
var tw=["freecodecamp","nightblue3","imaqtpie","bunnyfufuu","mushisgosu","tsm_dyrus","esl_sc2"];
var j=0;
for(var i=0;i<tw.length;i++){
ur="https://api.twitch.tv/kraken/streams/"+tw[i];
$.getJSON(ur,function(json) {
$(".tst").append(JSON.stringify(json));
$(".name").append("<li> "+tw[j]+"<p>"+""+"</p></li>");
if(json.stream==null){
$(".stat").append("<li>"+"Offline"+"</li>");
}
else{
$(".stat").append("<li>"+json.stream.game+"</li>");
}
j++;
})
}
});
$.getJSON() works asynchronously: the JSON isn't available until the response comes back, and the API can return responses in a different order than the requests were made, so you have to handle this.
One way to do this is use the promise API, along with $.when() to bundle up all requests as one big promise, which will succeed or fail as one whole block. This also ensures that the response data is returned to your code in the expected order.
Try this:
var channelIds = ['freecodecamp', 'nightblue3', 'imaqtpie', 'bunnyfufuu', 'mushisgosu', 'tsm_dyrus', 'esl_sc2'];
$(function () {
$.when.apply(
$,
$.map(channelIds, function (channelId) {
return $.getJSON(
'https://api.twitch.tv/kraken/streams/' + encodeURIComponent(channelId)
).then(function (res) {
return {
channelId: channelId,
stream: res.stream
}
});
})
).then(function () {
console.log(arguments);
var $playersBody = $('table.players tbody');
$.each(arguments, function (index, data) {
$playersBody.append(
$('<tr>').append([
$('<td>'),
$('<td>').append(
$('<a>')
.text(data.channelId)
.attr('href', 'https://www.twitch.tv/' + encodeURIComponent(data.channelId))
),
$('<td>').text(data.stream ? data.stream.game : 'Offline')
])
)
})
})
});
https://codepen.io/anon/pen/KrOxwo
Here, I'm using $.when.apply() to use $.when with an array, rather than a list of parameters. Next, I'm using $.map() to convert the array of channel IDs into an array of promises for each ID. After that, I have a simple helper function which handles the normal response (res), pulls out the relevant stream data, and attaches the channelId for use later on. (Without this, we would have to go back to the original array to get the ID. You can do this, but in my opinion, that isn't the best practice. I'd much prefer to keep the data with the response so that later refactoring is less likely to break something. This is a matter of preference.)
Next, I have a .then() handler which takes all of the data and loops through them. This data is returned as arguments to the function, so I simply use $.each() to iterate over each argument rather than having to name them out.
I made some changes in how I'm handling the HTML as well. You'll note that I'm using $.text() and $.attr() to set the dynamic values. This ensures that your HTML is valid (as you're not really using HTML for the dynamic bit at all). Otherwise, someone might have the username of <script src="somethingEvil.js"></script> and it'd run on your page. This avoids that problem entirely.
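For example (the username here is a made-up malicious value, and the table selector is the one from the snippet above):
var username = '<img src=x onerror="alert(\'xss\')">'; // hypothetical malicious channel name

// With string concatenation the markup in the data is parsed as HTML and runs:
$('table.players tbody').append('<tr><td>' + username + '</td></tr>');

// With .text() the same value is rendered as literal text and nothing runs:
$('table.players tbody').append($('<tr>').append($('<td>').text(username)));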
It looks like you're appending the "Display Name" in the same order every time you refresh, by using the j counter variable.
However, you're appending the "Status" as each request returns. Since these HTTP requests are asynchronous, the order in which they are appended to the document will vary each time you reload the page.
If you want the statuses to remain in the same order (matching the order of the Display Names), you'll need to store the response data from each API call as they return, and order it yourself before appending it to the body.
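A rough, untested sketch of that approach, reusing the .name / .stat containers from your code:
$(document).ready(function() {
    var tw = ["freecodecamp", "nightblue3", "imaqtpie", "bunnyfufuu", "mushisgosu", "tsm_dyrus", "esl_sc2"];
    var statuses = new Array(tw.length); // one slot per channel, filled as responses arrive
    var remaining = tw.length;

    $.each(tw, function(i, channel) {
        $.getJSON("https://api.twitch.tv/kraken/streams/" + channel, function(json) {
            statuses[i] = (json.stream == null) ? "Offline" : json.stream.game;
        }).always(function() {
            if (statuses[i] === undefined) statuses[i] = "Offline"; // request failed
            if (--remaining === 0) {
                // every response is in: append names and statuses in the original order
                $.each(tw, function(j, name) {
                    $(".name").append("<li>" + name + "</li>");
                    $(".stat").append("<li>" + statuses[j] + "</li>");
                });
            }
        });
    });
});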
At first, I changed the last else condition (the one that prints out the streamed game) to $(".stat").append("<li>"+tw[j]+": "+json.stream.game+"</li>"); - it was identical in meaning to what you tried to achieve, yet produced the same error.
There's a discrepancy between the list you've created and the data you receive; they are not directly associated.
A preferable way is to use $(".stat").append("<li>"+json.stream._links.self+": "+json.stream.game+"</li>"); you may even extract the name of the user with a regex or substr in the worst case.
As long as you don't run separate loops to append the "Display Name" and "Status" columns, you could even keep them separate, in case you do not want to write them on the same line, as my example does.
Whichever way you choose, in the end the problem is that the order in which the "Status" column is appended is not the same as the order in which you append "Display Name".
This code will not preserve the order, but will preserve which array entry is being processed
$(document).ready(function() {
var ur = "";
var tw = ["freecodecamp", "nightblue3", "imaqtpie", "bunnyfufuu", "mushisgosu", "tsm_dyrus", "esl_sc2"];
for (var i = 0; i < tw.length; i++) {
ur = "https://api.twitch.tv/kraken/streams/" + tw[i];
(function(j) {
$.getJSON(ur, function(json) {
$(".tst").append(JSON.stringify(json));
$(".name").append("<li> " + tw[j] + "<p>" + "" + "</p></li>");
if (json.stream == null) {
$(".stat").append("<li>" + "Offline" + "</li>");
} else {
$(".stat").append("<li>" + json.stream.game + "</li>");
}
})
}(i));
}
});
This code will preserve the order fully - the layout needs tweaking though
$(document).ready(function() {
var ur = "";
var tw = ["freecodecamp", "nightblue3", "imaqtpie", "bunnyfufuu", "mushisgosu", "tsm_dyrus", "esl_sc2"];
for (var i = 0; i < tw.length; i++) {
ur = "https://api.twitch.tv/kraken/streams/" + tw[i];
(function(j) {
var name = $(".name").append("<li> " + tw[j] + "<p>" + "" + "</p></li>");
var stat = $(".stat").append("<li></li>")[0].lastElementChild;
console.log(stat);
$.getJSON(ur, function(json) {
$(".tst").append(JSON.stringify(json));
if (json.stream == null) {
$(stat).text("Offline");
} else {
$(stat).text(json.stream.game);
}
}).then(function(e) {
console.log(e);
}, function(e) {
console.error(e);
});
}(i));
}
});
I have a function in JS that contains a loop, which makes an AJAX call on every iteration. The call inserts the checked elements into a DB and returns the results of those elements in the next section of the same page.
The problem I have is that when I check, for example, 4 checkboxes across 3 groups, only the checkboxes of the last group get added to the page. However, when I use alert(), I can see all the elements.
I used setTimeout, but I got an error in the code. I also added lines to give more time to the AJAX call, but the problem remains. So I wonder if there is a solution to slow down the code without using alert().
This is my script:
addAptitudeField : function(currentAutocompleteField, idChamp) {
var currentAutocompleteFieldBind = currentAutocompleteField;
var idChampBind = idChamp;
window.setTimeout(function() {
// Code ...
var paramDwr = {};
var newDivName = "div" + idChamp + lastValueId;
paramDwr[attributs.r_divId] = newDivName;
paramDwr[attributs.r_currentValue] = currentValue;
paramDwr[attributs.r_hiddenIdsField] = hiddenIdsField.id;
paramDwr[attributs.r_lastValueId] = lastValueId;
paramDwr[attributs.r_itemmod] = nbAptitudesCat % 2 == 0;
// setTimeout ( RepertoireDwr.ligneSuppEtSpanMessage, 1000 ) doesn't work
RepertoireDwr.ligneSuppEtSpanMessage(paramDwr, function(ajaxPage) {
divCategorie.update(divCategorie.innerHTML + ajaxPage.texte);
aptitudeAvecDetail.remetsValeursStockees();
var btnSuppression = $(newDivName).getElementsByTagName('img')[0];
btnSuppression.setAttribute("onclick", "formulaireFiche.updateCSS('" + newDivName + "');" + btnSuppression.getAttribute("onclick") + "fiche.updateCategorieSuppressionAptLieeUo(\'divCat" + currentCategorie + "\');"); });
}
//
// alert() : It works in this case.
//
// for (var i=0; i<5000000; i++) ; it doesn't work
}, 400);
}
Thank you in advance for your help and time.
I will likely be downvoted for mentioning this, because it is not a recommended procedure, but I believe every coder should have all the facts.
In the jQuery AJAX construct, there is an option async: false, which will stop the script from continuing UNTIL the AJAX call has completed processing. Needless to say, if things go wrong in the AJAX call the browser could freeze. A lot depends on who your users are and the amount of traffic -- on a few of my ten-user in-house projects it was an acceptable solution.
$.ajax({
async: false,
type: 'post',
url: 'ajax/ax.php',
data: 'request=',
success: function(d){
if (d.length) alert(d);
}
});
Ref:
What does "async: false" do in jQuery.ajax()?
The better idea, however, is to look into the Promises interface, with methods like .when() and .then()
References:
https://jsfiddle.net/v86bc028/2/
http://jqfundamentals.com/chapter/ajax-deferreds#
http://digitizor.com/jquery-html-callback-function-using-promise/#
how does jquery's promise method really work?
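For the callback-style call in the question, the promise route could look roughly like this (a sketch: RepertoireDwr.ligneSuppEtSpanMessage and ajaxPage come from the question, while firstParams / secondParams are placeholder parameter objects):
// Wrap the callback-style DWR call so it returns a jQuery promise
function ligneSuppEtSpanMessagePromise(paramDwr) {
    var deferred = $.Deferred();
    RepertoireDwr.ligneSuppEtSpanMessage(paramDwr, function (ajaxPage) {
        deferred.resolve(ajaxPage);
    });
    return deferred.promise();
}

// Sequence two calls so the second only starts after the first has finished
// (firstParams / secondParams are illustrative placeholders)
ligneSuppEtSpanMessagePromise(firstParams)
    .then(function (firstPage) {
        // append firstPage.texte here, then kick off the next request
        return ligneSuppEtSpanMessagePromise(secondParams);
    })
    .then(function (secondPage) {
        // append secondPage.texte here
    });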
The problem you're running into deals with asynchronous functions, or the A in AJAX. If you don't know what an asynchronous function is, there are many others who can explain it better than I can, so give that a google.
What's happening without the alert() in there is that your code makes 4 server calls, but all 4 get sent out before you get a response to any of them. With the alert() (or setTimeout), you're giving the code time to receive each response before the next call is made.
There are several ways you can approach this: the first is to make the next call only after the previous one receives a response; the second is to fire all the calls at once and handle each response asynchronously as it arrives. I'm not the best at explaining this part, but there's plenty of example code to be found on SO and online.
I think you have a more general problem in your code: you seem to need to delay execution to wait until something else has finished, instead of being notified when it is done.
The line that annoys me most is this one
divCategorie.update(divCategorie.innerHTML + ajaxPage.texte);
what exactly is update doing? How is it implemented?
I assume it does something like divCategorie.innerHTML += ajaxPage.texte;
Which is highly unfavorable, since the browser has to re-parse and rebuild whatever is already in divCategorie.innerHTML.
Just appending the new markup would be better.
Long story short: a good hack might be to insert a hidden node as a placeholder (so you can keep the order, even though the AJAX requests may return in a different order) and replace that node with the real content as soon as it arrives.
Kind of like this:
addAptitudeField : function(currentAutocompleteField, idChamp) {
var currentAutocompleteFieldBind = currentAutocompleteField;
var idChampBind = idChamp;
//this is done immediately, and therefore preserves the order of the loop,
//without any delays/timeouts
var placeholder = document.createElement("div");
placeholder.className = "placeholder";
placeholder.style.display = "none";
divCategorie.appendChild(placeholder);
window.setTimeout(function() {
// Code ...
var paramDwr = {};
var newDivName = "div" + idChamp + lastValueId;
paramDwr[attributs.r_divId] = newDivName;
paramDwr[attributs.r_currentValue] = currentValue;
paramDwr[attributs.r_hiddenIdsField] = hiddenIdsField.id;
paramDwr[attributs.r_lastValueId] = lastValueId;
paramDwr[attributs.r_itemmod] = nbAptitudesCat % 2 == 0;
// setTimeout ( RepertoireDwr.ligneSuppEtSpanMessage, 1000 ) doesn't work
RepertoireDwr.ligneSuppEtSpanMessage(paramDwr, function(ajaxPage) {
//convert the passed text into a DocumentFragment
var frag = fragment(ajaxPage.texte);
//replacing the placeholder with the fragment
divCategorie.insertBefore(frag, placeholder);
divCategorie.removeChild(placeholder);
aptitudeAvecDetail.remetsValeursStockees();
var btnSuppression = $(newDivName).getElementsByTagName('img')[0];
//this is also pretty horrible to me:
btnSuppression.setAttribute("onclick", "formulaireFiche.updateCSS('" + newDivName + "');" + btnSuppression.getAttribute("onclick") + "fiche.updateCategorieSuppressionAptLieeUo(\'divCat" + currentCategorie + "\');"); });
}
}, 400);
}
I think you should do some major refactoring. And take a look into Promises.
// * -> DocumentFragment
//strings/primitives are parsed as HTML-markup,
//null / undefined is ignored
//Arraylike structures are parsed recursively
var fragment = (function(container){
return function(src){
return reducer(document.createDocumentFragment(), src);
}
function reducer(frag, node){
var i, len, fc, c, r;
if(node === Object(node)){
if("nodeType" in node){
//dom nodes
frag.appendChild(node);
}else{
//Arraylike structures, like NodeLists or jQuery-Objects, or just plain Arrays
for(i = 0, len = ("length" in node && node.length)|0, r = reducer; i < len; (i in node) && r(frag, node[i]));
}
}else if(node != null) {
//strings (all primitives)
for((c=container).innerHTML = node; fc = c.firstChild; frag.appendChild(fc));
}
return frag;
}
})(document.createElement("div"));
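Used, for example, like this (divCategorie and ajaxPage.texte are from the code above; the jQuery object is just to show the array-like case):
// one DOM insertion, markup parsed off-document first
divCategorie.appendChild(fragment(ajaxPage.texte));

// also works with nodes and array-likes such as jQuery objects
divCategorie.appendChild(fragment($("<span>a</span><span>b</span>")));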
I am using an AJAX call on my page which returns table rows and on success appends them to a table on the page. Below is the code currently used:
function GetDPRecords(Perso) {
//alert(Perso);
$Records = $('#DPRecords');
//alert($PersoFileName.val()+" | "+$ProcFromDate.val()+" | "+$ProcToDate.val());
$.ajax({
type: "GET",
url: "SelectDPRecords.jsp",
data: $('form#Search_Form').serialize() + "&Personalized=" + Perso,
beforeSend: function () {
$Records.find("tr:gt(0)").remove();
$("<tr><td colspan='4'><h3 style='margin: 4px 10px'> Loading... </h3></td></tr>").hide().appendTo($Records).show(400);
},
success: function (data) {
$Records.find("tr:gt(0)").remove();
$(data).hide().appendTo($Records).show(400);
}
});
}
The issue is that at times I expect a large number of rows to be returned (1,000-5,000). I did a test run with 4,000 rows of data returned and it caused the browser to be unresponsive for about 20 seconds.
Any way to optimize the code and reduce the loading time?
One possible solution is to use a paging system: instead of returning 1,000-5,000 rows, you break the results into pages of, say, 50 results each, and only return one page at a time. You would then give the user buttons to load other pages at the top/bottom of the table.
For an example of what I am talking about, see http://luis-almeida.github.io/jPages/defaults.html. It uses pictures instead of rows, but it is the same basic concept.
Just to add to @aj_r's suggestion: if you do not want to hit the server again to retrieve the data for the next range, you can store the result in a JavaScript variable (a JSON object array) and then paginate it locally.
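A rough sketch of that idea, assuming the endpoint is changed to return a JSON array instead of ready-made rows (the column names here are made up):
var allRows = [];   // filled once from the AJAX response
var pageSize = 50;

function showPage(page) {
    var $records = $('#DPRecords');
    $records.find("tr:gt(0)").remove();
    $.each(allRows.slice(page * pageSize, (page + 1) * pageSize), function (i, row) {
        // "name" and "date" are illustrative field names
        $records.append("<tr><td>" + row.name + "</td><td>" + row.date + "</td></tr>");
    });
}

// in the AJAX success handler:
// success: function (data) { allRows = data; showPage(0); }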
I've been facing the same problem recently using huge datagrids in jqGrid and I've been able to find a tricky solution which turned out to be working very well.
The thing is, you cannot render all this data at once - it is just too much, especially considering how slow DOM manipulations are. So you need some sort of queue. You could split the data into chunks and render them sequentially using setTimeout() or setInterval(), but those don't have a great reputation in terms of performance either.
I ended up using requestAnimationFrame, splitting my huge data set into pieces and rendering them whenever an animation frame was available. You need to start with a polyfill to make sure what you are going to do will actually work; I'm using a great one by Paul Irish:
Here's a working jsFiddle: http://jsfiddle.net/cjw5eota/1/
And here's the JS:
var dataRows = 5000; // enter the amount of returned rows
var chunkSize = 200; // single chunk size; tune it by trying different values - if your DOM manipulation is heavy use less, if it is lightweight you can use more; use 1 to watch in slow motion how it works
// We are simulating big returned object here
var data = {}
for (var i = 0; i < dataRows; i++) {
data[i] = {
'id': i,
'name': 'I am data for row ' + i
};
}
// shim layer with setTimeout fallback
window.requestAnimFrame = (function () {
return window.requestAnimationFrame || window.webkitRequestAnimationFrame || window.mozRequestAnimationFrame || function (callback) {
window.setTimeout(callback, 1000 / 60);
};
})();
function renderRecords(data) {
var dataLength = Object.keys(data).length;
var i = 0;
function renderInQueue() {
console.time('Rendering in queue');
for (var t = 0; t < chunkSize; t++) {
if (i < dataLength) {
var row = '<tr><td>' + data[i].id +
'</td><td>' + data[i].name + '</td></tr>'
$('table').append(row);
}
i++;
}
if (i < dataLength) {
requestAnimFrame(renderInQueue);
} else {
console.log('Done rendering');
console.timeEnd('Rendering in queue');
}
}
renderInQueue();
}
// run the script of rendering
renderRecords(data);
I have included a simple performance benchmark so you can see in the console how long the entire rendering process takes. Play with chunkSize to see how it changes, and try to find a value that best suits your needs - one that gives you a decent rendering time without putting a heavy load on the browser.
The more you render at once, the faster the overall rendering will finish, but the more resources each iteration will require.
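One further refinement worth trying, if the per-row appends are still the bottleneck: build each chunk as a single string and touch the DOM once per frame. A drop-in variant of renderInQueue from the snippet above (it uses the same i, dataLength, chunkSize and data variables):
function renderInQueue() {
    var rows = [];
    // collect this chunk's markup in memory first
    for (var t = 0; t < chunkSize && i < dataLength; t++, i++) {
        rows.push('<tr><td>' + data[i].id + '</td><td>' + data[i].name + '</td></tr>');
    }
    $('table').append(rows.join('')); // one DOM insertion per chunk

    if (i < dataLength) {
        requestAnimFrame(renderInQueue);
    }
}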