The following piece of code works, but it makes the browser hiccup just a bit. Nothing major. I'm wondering if there is a way to make this more efficient. Can I use caching, or somehow populate one select and then just copy it to the other 5? (There are 6 dropdowns with a class of 'mask' on the page.)
Any help would be greatly appreciated!
$('.mask').each(function () {
    $(this).append($('<option/>').val("").text(""));
    for (var i = 1; i < 256; i++) {
        $(this).append($('<option/>').val(i).text(i));
    }
});
You can create the nodes once then clone them, like this:
var temp = $('<select/>');
$('<option/>').val("").text("").appendTo(temp);
for (var i = 1; i < 256; i++) {
$('<option/>').val(i).text(i).appendTo(temp);
}
temp.children().clone().appendTo('.mask');
Instead of doing a lot of individual appends to the live DOM (which is costly), this builds all the options once on a detached element, then clones them and appends them in batches (one batch per select).
It's much faster (around 3-10 times, per the jsperf test linked below) to build up the HTML yourself in a single string:
var html = "<option value=''></option>";
for (var i = 1; i < 256; i++) {
html += "<option value='" + i + "'>" + i + "</option>";
}
$(".mask").append(html);
See performance test comparing the current options in this thread: http://jsperf.com/appending-options-jquery
Related
When I need to add about 2,000 elements with jQuery's append(), the browser freezes and everything seems stuck for quite a while.
Is this an issue with jQuery itself, or a logic issue in my code?
What I am doing is the following:
for (var i = 0; i < data.length; i++) {
    add_elem(data[i]);
}

function add_elem(data) {
    $("#wrapper").append('<div class="row" id="elem_' + data['id'] + '">' + data['htmlv'] + '</div>');
    uniform_elem($('#elem_' + data['id']));
}
uniform_elem() does some UI processing to make things look good.
Any good ideas?
Instead of calling add_elem in the loop, store all of the HTML for the elements in a single variable as you loop through data, then invoke add_elem once, passing in that variable (plus the ids, so uniform_elem can still be run on each new element).
var elements = "";
var ids = [];
// Build all of the HTML (and remember the ids) in the loop
for (var i = 0; i < data.length; i++) {
    elements += '<div class="row" id="elem_' + data[i]['id'] + '">' + data[i]['htmlv'] + '</div>';
    ids.push(data[i]['id']);
}
// Invoke add_elem once and only once using the new variables
add_elem(elements, ids);

function add_elem(html, ids) {
    // Append once to the DOM
    $("#wrapper").append(html);
    for (var j = 0; j < ids.length; j++) {
        uniform_elem($('#elem_' + ids[j]));
    }
}
If you want to work with each object, you may do:
for (let i = 0; i < data.length; i++) {
    setTimeout(() => {
        add_elem(data[i]);
    }, 0);
}
This pushes the calls onto the browser's task queue, which is processed when the main thread is free, so the page isn't blocked in one long stretch.
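A variant of the same idea that also cuts down the number of DOM writes: append the rows in batches per timeout instead of one element per timeout. This is only a sketch; the batch size of 100 is an arbitrary assumption.
var batchSize = 100;
(function processBatch(start) {
    // Build one string for this batch and append it in a single call
    var html = '';
    for (var i = start; i < start + batchSize && i < data.length; i++) {
        html += '<div class="row" id="elem_' + data[i]['id'] + '">' + data[i]['htmlv'] + '</div>';
    }
    $("#wrapper").append(html);
    // Run the UI processing for the batch we just inserted
    for (var j = start; j < start + batchSize && j < data.length; j++) {
        uniform_elem($('#elem_' + data[j]['id']));
    }
    // Schedule the next batch so the browser can repaint in between
    if (start + batchSize < data.length) {
        setTimeout(function () { processBatch(start + batchSize); }, 0);
    }
})(0);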
I am populating a list with about 25,000 items, using code like this:
var html = "";
for(var i = 0; i < reallyLongArray.length; i++) {
html += "<li><a href='#'>Hi</a></li>";
}
$("#list ol").html(html);
Somewhat to my surprise, I used a profiler and found out that the bottleneck in my code was not the loop that iterated thousands of times, but setting the html of the list to the string. This usually takes about 5-10 seconds on my computer, which is an order of magnitude too slow.
Is there a way to do this that is significantly faster, i.e., at least 10 times faster?
Wrap the HTML in a single item. When jQuery builds elements from a string, it adds all the top-level items by iterating over them. If you wrap the list items in a single element, it should go much faster because it only has to add one top-level element to the DOM.
var html = "<ul>";
// your loop...
html += "</ul>";
// add list html to list container
Aside from directly using innerHTML:
$("#list ol").get(0).innerHTML = html;
...and trying the "stringbuffer" technique:
var html = [];
for(i = 0; i < reallyLongArray.length; i++) {
html.push("<li><a href='#'>Hi</a></li>");
}
$("#list ol").html(html.join(''));
...not really.
Using DOM methods to create it should work faster:
var list = $("#list ol");
for(i = 0; i < reallyLongArray.length; i++) {
$(document.createElement('li'))
.append($(document.createElement('a'))
.text('Hi')
.attr({href: 'foobar'})
)
.appendTo(list);
}
edit: Actually, using DocumentFragment should make this even faster:
var fragment = document.createDocumentFragment();
for(i = 0; i < reallyLongArray.length; i++) {
fragment.appendChild($(document.createElement('li'))
.append($(document.createElement('a'))
.text('Hi')
.attr({href: 'foobar'})
)
.get(0)
);
}
$('#list ol').append(fragment);
You might also want to empty() the <ol> before adding the elements to it
Another edit: I've created a jsperf test at http://jsperf.com/insert-html-vs-dom-manipulation - both of those versions are slower than setting the innerHTML (because jQuery is used to create the elements). DOM manipulation with native methods is much faster than setting the HTML, but the fastest way, by a large margin, is DOM manipulation with a DocumentFragment and no jQuery at all, using only native methods.
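For completeness, the native-DOM-plus-DocumentFragment variant described above looks roughly like this (a sketch, reusing the same #list ol target and reallyLongArray as the other examples):
var list = document.getElementById('list').getElementsByTagName('ol')[0];
var fragment = document.createDocumentFragment();
for (var i = 0; i < reallyLongArray.length; i++) {
    // Build each <li><a>Hi</a></li> with native methods only
    var li = document.createElement('li');
    var a = document.createElement('a');
    a.href = '#';
    a.appendChild(document.createTextNode('Hi'));
    li.appendChild(a);
    fragment.appendChild(li);
}
// One insertion into the live DOM for all 25,000 items
list.appendChild(fragment);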
Array joins are faster than string manipulation.
var html = [];
for(i = 0; i < reallyLongArray.length; i++) {
html.push(string);
}
$(selector).html(html.join(''));
This will speed it up by quite a bit. String concatenation can take a long time when used a lot because of what happens "under the hood".
$(document).ready(function(){
var html = new Array();
for(i = 0; i < reallyLongArray.length; i++) {
html[i] = "<li><a href='#'>Hi</a></li>";
}
document.getElementById('list').getElementsByTagName('ol')[0].innerHTML = html.join("");
});
One thing to note here is that with 25,000 iterations it will be difficult to get the time down to milliseconds when you are inserting that many records. This is especially true in IE, which parses each new element before inserting it. Building a "pager" and caching the items you are going to insert, so only a page's worth hits the DOM at a time, will speed that up significantly.
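A rough sketch of that pager/chunking idea, assuming the rows are already built as strings in an array (htmlArray and the chunk size of 500 below are placeholders, not names from the original code):
function appendInChunks(items, $container, chunkSize) {
    var index = 0;
    (function nextChunk() {
        // Append one page's worth of rows in a single call
        $container.append(items.slice(index, index + chunkSize).join(''));
        index += chunkSize;
        if (index < items.length) {
            setTimeout(nextChunk, 0); // yield to the UI between batches
        }
    })();
}
// usage: appendInChunks(htmlArray, $("#list ol"), 500);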
A good example of mine, which builds all the rows into a single string inside $.grep:
$.grep(ComboBoxData, function (e) {
    if (e.Text.toLowerCase().indexOf(TextBox) != -1 || e.ID.toLowerCase().indexOf(TextBox) != -1) {
        Show = true;
        Result += "<li hdnTextID='" + hdTextID + "' onclick='Select(this," + TextBoxID + ")' onmouseover='triggerTheli(this)' HdntriggerID='" + e.HdntriggerID + "' postBack='" + e.TriggerPostBack + "' Event='" + e.EventName + "' HdnID='" + e.hdnID + "' SelectedValue='" + e.ID + "' style='background:" + Color + "'><a>" + e.Text + "<p> " + e.Description + " </p></a></li>";
        if (Color == "#eff3ec")
            Color = "#fff";
        else
            Color = "#eff3ec";
    }
});
I am using the code below in my program, but these few lines seem to take far too long to execute. For 100 iterations it takes roughly one minute; for 200+ iterations my browser shows a warning that the script is taking too long. In this scenario, 500+ ids can be pushed into the array.
for (var i = 0; i < arrid.length; i++)
{
    $("#" + outerDiv + " > div[id=" + arrid[i] + "]").attr("class", "selected");
}
arrid is an array of div ids. outerDiv is the parent div of all the divs whose ids are in arrid. Those divs cannot be accessed directly; they have to be referenced through the parent div, i.e. outerDiv.
One quick thing you could do is cache your selector so jQuery does not have to query the dom 500+ times.
var $div = $("#" + outerDiv);
for (var i = 0; i < arrid.length; i++)
{
    $div.children("div[id=" + arrid[i] + "]").attr("class", "selected");
}
On second thought, since you have a list of id's, you shouldn't need any of that as the id should be unique per the dom.
for (var i = 0; i < arrid.length; i++)
{
    $("#" + arrid[i]).attr("class", "selected");
}
If outerDiv is an element, you can write
for (var i = 0; i < arrid.length; i++)
{
    $("#" + arrid[i], outerDiv).attr("class", "selected");
}
But assuming the id's are unique, you shouldn't need to reference the outer div at all. It might even be faster not to.
Also, if this is all you're doing and you're concerned about performance, why not just use plain ol' javascript?
for (var i = 0; i < arrid.length; i++)
{
    document.getElementById(arrid[i]).className = "selected";
}
Using jQuery functions is expensive - as is manipulating the DOM.
You can reduce to one call to jQuery like this:
var divSelector = [];
for (var i = 0; i < arrid.length; i++)
{ // add selector to array
    divSelector.push("#" + outerDiv + " > div[id=" + arrid[i] + "]");
}
// execute jquery once with many selectors
$(divSelector.join(',')).attr("class", "selected");
This code is untested, but it should work in principle.
Don't call the jQuery selector function too many times.
Instead, select your elements once and do the rest a different way. For this to work it is better to convert your arrid array of IDs into an associative array, which makes the lookups much, much faster.
// convert arrid = ["idOne", "idTwo", ...]
// into an associative array/object
// a = { idOne: true, idTwo: true, ... }
var a = {};
$.each(arrid, function (index, el) {
    a[el] = true;
});
// do the rest
$("#" + outerDiv + " > div[id]").each(function () {
    if (a[this.id] === true)
    {
        $(this).addClass("selected");
    }
});
The most inner call can be as well replaced with:
this.className = "selected"
when you can be sure no other classes will be added to the element.
But when you want to set selected on all child div elements (if your IDs cover all elements) then a simple:
$("#" + outerDiv + " > div[id]").addClass("selected");
would do the trick just fine.
Store the jQuery instance in a variable ($div), and use another variable to store the maximum length so the for loop doesn't re-read it on every iteration.
var $div = $("#" + outerDiv);
for (var i = 0, max = arrid.length; i < max; i++)
{
    $div.children("div[id=" + arrid[i] + "]").attr("class", "selected");
}
How about this, since you have the ids of all the elements you are trying to change the class on:
for (var i = 0; i < arrid.length; i++)
{
    $("#" + arrid[i]).addClass("selected");
}
I know the selector div[id=val] performs very slowly in certain browsers, based on a speed test I ran; it's worth a look as it is pretty interesting:
http://mootools.net/slickspeed/
I think you could push the jQuery objects directly into the array instead; then you can just call arrid[i].attr('class', 'selected').
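For example (a sketch, assuming you control the code that fills the array; someId stands for whatever id you were pushing before):
// Wherever arrid currently gets an id pushed, push the wrapped element instead
arrid.push($("#" + someId));

// ...then the loop needs no selector look-ups at all:
for (var i = 0; i < arrid.length; i++) {
    arrid[i].attr("class", "selected");
}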
Currently I have a loop that updates the DOM in each iteration; I have learned this is a bad practice & you should update the DOM as little as possible for better speed.
So I was wondering how to edit the code below so that I can collect all the new elements in one place and then make a single DOM addition once the loop ends.
Here is the loop..
for (var i = spot; i < spot + batchSize && i < cats.options.length; i++) {
// Check if the cat is selected
if (cats.options[i].selected == true) {
// Set this category's values to some variables
var cat_id = cats.options[i].getAttribute('value');
var cat_name = cats.options[i].text;
if (checkCatSICAdd(cat_id) === false) {
// Now we create the new element
var new_option = document.createElement('option');
// Add attribute
new_option.setAttribute('value',cat_id);
// Create text node
var new_text_node = document.createTextNode(cat_name);
// Append new text node to new option element we created
new_option.appendChild(new_text_node);
// Append new option tag to select list
sel_cats.appendChild(new_option);
} else {
failed++;
}
}
}
Working with DOM elements in a loop is slow - whether you attach them to the document or not. Attaching them at the end is a bit faster, since only one redraw is required, but it's still slow.
The proper way is generating a plain old string containing HTML and attaching this string to the DOM using the innerHTML property of a DOM element.
Your code should be ok. The DOM won't actually redraw until the Javascript has finished executing. However, if you've encountered a problem browser that does perform badly, you could try creating a new select before your loop that is not yet attached to the DOM, populating it as you are now, then replacing sel_cats with that new select at the end. That way, the DOM is only updated once.
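A rough sketch of that swap-at-the-end suggestion, assuming sel_cats is a plain DOM <select> element already in the document (cloneNode keeps whatever options it already has; failed is the counter from the question):
var detached = sel_cats.cloneNode(true); // work on a detached copy
for (var i = spot; i < spot + batchSize && i < cats.options.length; i++) {
    var opt = cats.options[i];
    if (opt.selected) {
        if (checkCatSICAdd(opt.value) === false) {
            detached.appendChild(new Option(opt.text, opt.value));
        } else {
            failed++;
        }
    }
}
// One swap touches the live DOM exactly once
sel_cats.parentNode.replaceChild(detached, sel_cats);
sel_cats = detached; // keep the variable pointing at the element now in the page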
Your way is good enough unless you have a great many items being added to sel_cats - you only touch the DOM once per new option.
The only way to improve efficiency might be to build the options as raw HTML, then assign that after the loop:
var arrHTML = [];
for (var i = spot; i < spot + batchSize && i < cats.options.length; i++) {
// Check if the cat is selected
if (cats.options[i].selected == true) {
// Set this category's values to some variables
var cat_id = cats.options[i].value;
var cat_name = cats.options[i].text;
if (checkCatSICAdd(cat_id) === false) {
arrHTML.push("<option value=\"" + cat_id + "\">" + cat_name + "</option>");
}
else {
failed++;
}
}
}
sel_cats.innerHTML = arrHTML.join("");
Once you have the select list assigned to a variable, remove it from the DOM using removeChild on its parent node. You can then use appendChild in the loop, and add the select list back into the DOM when you're done.
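A minimal sketch of that approach, again assuming sel_cats is a plain DOM <select> already in the document:
var parent = sel_cats.parentNode;
var next = sel_cats.nextSibling;   // remember its position
parent.removeChild(sel_cats);      // detach, so the appends don't touch the live DOM
for (var i = spot; i < spot + batchSize && i < cats.options.length; i++) {
    var opt = cats.options[i];
    if (opt.selected) {
        if (checkCatSICAdd(opt.value) === false) {
            sel_cats.appendChild(new Option(opt.text, opt.value));
        } else {
            failed++;
        }
    }
}
parent.insertBefore(sel_cats, next); // put it back in one operation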
Your code is way bloated; DOM 0 methods will be much faster.
If speed really matters, store spot + batchSize and cats.options.length in variables so they aren't re-calculated on each pass through the loop (modern browsers probably don't re-evaluate them anyway, but older ones did):
for (var i = spot, j = spot + batchSize, k = cats.options.length; i < j && i < k; i++) {
    // Store a reference to the option element
    var opt = cats.options[i];
    // The selected property is boolean, no need to compare
    if (opt.selected) {
        // if checkCatSICAdd() returns a boolean, just use it,
        // but maybe you need the strict comparison
        if (checkCatSICAdd(opt.value) === false) {
            // Wrapped for posting; note new Option(text, value)
            sel_cats.options[sel_cats.options.length] =
                new Option(opt.text, opt.value);
        } else {
            failed++;
        }
    }
}
I'm using a function that uses jQuery to grab information from a JSON feed. The problem is that I must pick 10 items from the feed that fall within the last year (roughly 31.5 billion milliseconds before the request, for argument's sake), and I have to tell the feed how many results I want via a 'maxRows' variable that is inserted into the URL. Here's the function...
function topTen(rows) {
$.getJSON("http://ws.geonames.org/earthquakesJSON?north=90&south=-90&east=-180&west=180&maxRows=" + rows,
function(json) {
var topTen = new Array();
var now = new Date();
var i;
for(i = 0; i < json.earthquakes.length; i++)
{
var time = json.earthquakes[i].datetime;
var then = new Date(time.replace(" ", "T"));
if(now - then < 31536000000) { topTen.push(json.earthquakes[i].eqid); }
}
if(topTen.length >= 10)
{
var html = "The Top Ten Earthquakes Of The Past Year<ol>";
for(i = 1; i <= 10; i++)
{
html += "<li id='number" + i + "' >" + topTen[i - 1] + "</li>";
}
html += "</ol>";
$('#top_ten').html(html);
}
});
}
Now the problem is that I likely won't get 10 results that meet my criteria from the first request. To counteract this, I tried putting the function in a loop until another condition is met. However, this always winds up failing because the getJSON call (or perhaps the callback) is asynchronous, meaning if I try something like
var rows = 10;
do {
    topTen(rows);
    rows += 10;
} while (!document.getElementById("number10"));
The problem then becomes, however, that the function doing the actual work is not bound by the line-by-line timing of the loop and so the loop itself runs many, many, MANY times before any of the functions actually finish and the loop condition is met. So right now I'm trying to devise another approach that goes something like this
topTen(rows);
rows += 10;
pause(1000);
topTen(rows);
rows += 10;
pause(1000);
topTen(rows);
rows += 10;
pause(1000);
if(document.getElementById("number10"))
alert("There's 10!");
else
alert("There's not 10!");
The pause is basically what it sounds like and takes milliseconds: a loop I copied and pasted that keeps comparing an initial Date object against new ones. This keeps the functions from firing off immediately after one another, but then the problem becomes that the if condition is NEVER met. No matter how much time I allow for pausing, getElementById never seems to find the element with an id of 'number10', even though I can see it very clearly in Firebug.
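It looks roughly like this (a busy-wait, so it keeps the main thread occupied while it runs):
function pause(ms) {
    var start = new Date();
    while (new Date() - start < ms) {
        // spin until the requested number of milliseconds has elapsed
    }
}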
I have crashed my browser SEVERAL times because of this problem and I am seriously getting PO'd and sick of it. If anyone could find a solution to this problem, or even suggest an easier, more elegant approach, I would be eternally grateful.
PS - I've tried things like global variables and using recursion to call topTen() from inside the callback function with a larger 'rows' value, but those don't work because it seems like the callback functions live in their own contained little world where 90% of the rest of my JavaScript doesn't exist.
You are doing this the wrong way...
You need to wait for one call to return before calling again. Lucky for you, you already have a function that is called when it returns, so a simple change to that function and you are done.
var topTenList = new Array();
function topTen(rows) {
$.getJSON("http://ws.geonames.org/earthquakesJSON?north=90&south=-90&east=-180&west=180&maxRows=" + rows,
function(json) {
topTenList.length = 0; // reset, so rows re-fetched by a retry aren't pushed twice
var now = new Date();
var i;
for(i = 0; i < json.earthquakes.length; i++)
{
var time = json.earthquakes[i].datetime;
var then = new Date(time.replace(" ", "T"));
if(now - then < 31536000000) { topTenList.push(json.earthquakes[i].eqid); }
}
if (topTenList.length < 10)
{
topTen(rows+10);
return;
}
else
{
var html = "The Top Ten Earthquakes Of The Past Year<ol>";
for(i = 1; i <= 10; i++)
{
html += "<li id='number" + i + "' >" + topTenList[i - 1] + "</li>";
}
html += "</ol>";
$('#top_ten').html(html);
}
});
}
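Kick it off once with your starting row count and the callback takes care of the rest:
// first request; each response either renders the list or asks for ten more rows
topTen(10);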