Performant append to array of existing DOM elements - javascript

I have the following piece of code, which loops over an Array of DOM objects, runs a test, and appends some text to the node if the test returns true.
$.each(selections, function (i, e) {
    var test = some_test(e);
    if (test) {
        $(e).append('passed');
    }
});
Whilst this code works fine, on a large set it is obviously going to perform a lot of appends to the DOM. This article on reasons to use append correctly demonstrates how building up the content first and appending it to the DOM in a single operation is far more performant:
var arr = reallyLongArray;
var textToInsert = [];
var i = 0;

$.each(arr, function (count, item) {
    textToInsert[i++] = '<tr><td name="pieTD">';
    textToInsert[i++] = item;
    textToInsert[i++] = '</td></tr>';
});

$('table').append(textToInsert.join(''));
My question is: what is the most performant way to make changes to a set of existing DOM elements without having to call .append() on each element? The above example demonstrates creating a new element. Is this the only way?

What makes live DOM manipulation slow is mainly the fact that it causes DOM reflows for most of the manipulations that are made. Therefore, you should strive to reduce the number of DOM reflows by reducing the number of live DOM manipulations.
If you want to manipulate multiple elements that are already part of the DOM, one strategy is to temporarily remove the parent of those nodes from the DOM, manipulate the elements, and then re-attach the parent node where it was.
In the example below, I detach the table before manipulating its rows and then reattach it to the DOM. That's 2 reflows rather than n reflows.
var data = [1, 2, 3, 4, 5, 6, 7];

populateCells(document.querySelector('table'), data);

function populateCells(table, data) {
    var rows = table.rows,
        reAttachTable = temporarilyDetachEl(table);

    data.forEach(function (num, i) {
        rows[i].cells[0].innerHTML = num;
    });

    reAttachTable();
}

function temporarilyDetachEl(el) {
    var parent = el.parentNode,
        nextSibling = el.nextSibling;

    parent.removeChild(el);

    return function () {
        if (nextSibling) parent.insertBefore(el, nextSibling);
        else parent.appendChild(el);
    };
}
<table>
    <tr><td></td></tr>
    <tr><td></td></tr>
    <tr><td></td></tr>
    <tr><td></td></tr>
    <tr><td></td></tr>
    <tr><td></td></tr>
    <tr><td></td></tr>
</table>

You could completely rebuild the set of DOM elements using one of the techniques from the linked article, then clear the existing set and .append() the new one. But given that you already have the set and just want to adjust a subset of its items, there's nothing wrong with your existing code.
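If you did want to go the rebuild route, a minimal sketch might look like the following, assuming the items all live in one wrapper (the #container id is hypothetical, and selections/some_test are the names from the question):

// Hypothetical sketch: build the replacement markup as strings, then clear
// the container and append everything in a single operation.
var html = [];
$.each(selections, function (i, e) {
    var passed = some_test(e);
    html.push('<div class="item">' + $(e).html() + (passed ? ' passed' : '') + '</div>');
});
$('#container').empty().append(html.join(''));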

When talking about performance it is a good idea to profile your code so that you don't have to guess.
Here is a simple test case around this example:
http://jsperf.com/most-performant-way-to-append-to-dom-elements
(and related fiddle to demo the visual side of this: http://jsfiddle.net/f62ptjbf/)
This test compares four possible methods (it does not by any means cover all solutions):
1. Append "passed" as a text node (like your example code).
2. Append "passed" as a SPAN node (a slight variation on your example).
3. Build a DOM fragment that renders the nodes and records the "passed" text, then add it to the DOM as a single append operation from an HTML string.
4. Remove the selection's elements from the DOM, manipulate them, then re-add them to the DOM.
It shows that (at least in my copy of Chrome) the fastest method is #4: removing the elements before processing, then re-adding them to the DOM after processing.
Here is another test that shows [at least in my copy of Chrome] it is faster to append SPAN elements with the text "passed" than it is to append text nodes.
http://jsperf.com/append-variations
Based on these findings, I can recommend two potential performance improvements to your code:
1. Remove the DOM elements in the selection before manipulating them, then re-add them when finished.
2. Append a SPAN with the text "passed" instead of appending the text directly.
However, #1 works best if the elements have a shared parent, and the performance will be a function of the number of append operations. A rough sketch combining both ideas follows below.
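The sketch below combines both suggestions, assuming the selected elements share a single parent (selections and some_test are the names from the original snippet):

var parent = $(selections).first().parent(),
    next = parent.next(),        // remember where to put the parent back
    container = parent.parent();

parent.detach();                 // one reflow to remove the shared parent

$.each(selections, function (i, e) {
    if (some_test(e)) {
        // append a SPAN rather than a bare text node
        $(e).append('<span>passed</span>');
    }
});

// one reflow to re-insert the parent at its original position
if (next.length) next.before(parent);
else container.append(parent);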

Related

Does document.createElement keep triggering DOMContentLoaded event listener?

When I run this script, the page loads continuously and eventually freezes. Is this because every time I create an element, the main DOMContentLoaded listener is being called?
If so, how can I stop this recursive behaviour and just add one node to every pre-existing node?
// Waits for page to load
document.addEventListener('DOMContentLoaded', function() {
    // Get all elements
    var items = document.getElementsByTagName("*");
    // Loop through entire DOM
    for (var i = 0; i < items.length; i++) {
        // If it is not a text node
        if (!(items[i].nodeType == 3)) {
            // Create a div
            var newDiv = document.createElement("div");
            // Add div to current object
            items[i].appendChild(newDiv);
        }
    }
});
It's because items is referencing a "live list". This means that any updates to the DOM are going to be reflected in your list if they match the original selector.
Because you're appending a div, and your selector selects all elements, it gets added to the list, pushing any subsequent members up an index, and so the iteration continues.
To avoid this, make a non-live copy of the collection before iterating.
var items = Array.from(document.getElementsByTagName("*"));
And FYI, the if (!(items[i].nodeType == 3)) check can be removed, because getElementsByTagName will never return text nodes.
If you're supporting very old versions of IE, you may want to check that the .nodeType === 1, since some of those old versions included comment nodes when using "*".
Lastly, you can use modern features to clean this up a bit.
document.addEventListener('DOMContentLoaded', () => {
    for (const el of [...document.getElementsByTagName("*")]) {
        const newDiv = el.appendChild(document.createElement("div"));
        // Work with newDiv
    }
});
You get a continuously loading page because you have this:
var items = document.getElementsByTagName("*");
This returns a "live" node list - meaning a list that can/will change as the matched elements change. Since your selector matches everything and you then create new elements, the set of matched elements changes (it gets bigger). That, in turn, causes the length of the node list to change, which keeps your loop running.
You should not try to match all the elements, or you should use another method for getting the elements that doesn't return a live node list, such as querySelectorAll().
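For example, a minimal sketch of the same loop over a static NodeList:

document.addEventListener('DOMContentLoaded', function () {
    // querySelectorAll returns a static NodeList, so appending divs inside
    // the loop does not grow the collection being iterated
    var items = document.querySelectorAll('*');
    for (var i = 0; i < items.length; i++) {
        items[i].appendChild(document.createElement('div'));
    }
});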

Efficiently replacing strings within an HTML block in Javascript

I am using Javascript (with Mootools) to dynamically build a large page using HTML "template" elements, copying the same template many times to populate the page. Within each template I use string keywords that need to be replaced to create unique IDs. However, I'm having serious performance issues: it takes multiple seconds to perform all these replacements, especially in IE. The code looks like this:
var fieldTemplate = $$('.fieldTemplate')[0];
var fieldTr = fieldTemplate.clone(true, true);
fieldTr.removeClass('fieldTemplate');
replaceIdsHelper(fieldTr, ':FIELD_NODE_ID:', fieldNodeId);
parentTable.grab(fieldTr);
replaceIdsHelper() is the problem method according to IE9's profiler. I've tried two implementations of this method:
// Retrieve the entire HTML body of the element, replace the string and set the HTML back.
var html = rootElem.get('html').replace(new RegExp(replaceStr, 'g'), id);
rootElem.set('html', html);
and
// Load the child elements and replace just their IDs selectively
rootElem.getElements('*').each(function(elem) {
    var elemId = elem.get('id');
    if (elemId != null) elemId = elemId.replace(replaceStr, id);
    elem.set('id', elemId);
});
However, both of these approaches are extremely slow given how many times this method gets called (about 200...). Everything else runs fine; it's only replacing these IDs that seems to be the major performance bottleneck. Does anyone know if there's a way to do this efficiently, or a reason it might be running so slow? The elements start hidden and aren't attached to the DOM until after they're created, so there's no redrawing happening.
By the way, the reason I'm building the page this way is to keep the code clean, since we need to be able to create new elements dynamically after loading as well. Doing this from the server side would make things much more complicated.
I'm not 100% sure, but it sounds to me like the problem is with the indexing of the DOM tree.
First of all, must you use IDs, or can you manage with classes? You say that the replacement of the ID is the main issue.
Also, why do you clone part of the DOM tree instead of just inserting new HTML?
You can use the substitute method of String (when using MooTools), like so:
var template = '<div id="{ID}" class="{CLASSES}">{CONTENT}</div>';
var html = template.substitute({ID: "id1", CLASSES: "c1 c2", CONTENT: "this is the content"});
you can read more about it here http://mootools.net/docs/core/Types/String#String:substitute
Then, just take that string and put it as html inside a container, let's say:
$("container_id").set("html", template);
I think that it might improve the efficiency since it does not clone and then index it again, but I can't be sure. give it a go and see what happens.
There are some things you can do to optimise it - and what @nizan tomer said is very good, the pseudo-templating is a good pattern.
First of all.
var fieldTemplate = $$('.fieldTemplate')[0];
var fieldTr = fieldTemplate.clone(true, true);
you should do this as:
var templateHTML = somenode.getElement(".fieldTemplate").get("html"); // no need to clone it.
the template itself should/can be like suggested, eg:
<td id="{id}">{something}</td>
Only read it once - there is no need to clone it for every item. Instead, use the new Element constructor and just set the innerHTML. Notice that the template lacks the <tr></tr> wrapper.
If you have an array of data objects, eg:
var rows = [{
    id: "row1",
    something: "hello"
}, {
    id: "row2",
    something: "there"
}];
Array.each(rows, function(obj, index) {
    var newel = new Element("tr", {
        html: templateHTML.substitute(obj)
    });
    // defer the inject so it's non-blocking of the UI thread:
    newel.inject.delay(10, newel, parentTable);
    // if you need to know when done, use a counter + index
    // in a function and fire a ready event (see the sketch below).
});
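If you do need the "counter + fire a ready event" part, a rough sketch could look like this (the rowsReady event name is made up for the example):

var remaining = rows.length;

Array.each(rows, function(obj, index) {
    var newel = new Element("tr", {
        html: templateHTML.substitute(obj)
    });
    // stagger the injects, then fire a custom event once the last row lands
    (function() {
        newel.inject(parentTable);
        if (--remaining === 0) parentTable.fireEvent("rowsReady");
    }).delay(10 * index);
});

// elsewhere: parentTable.addEvent("rowsReady", function() { /* all rows injected */ });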
Alternatively, use document fragments:
Element.implement({
    docFragment: function() {
        return document.createDocumentFragment();
    }
});

(function() {
    var fragment = Element.docFragment();

    Array.each(rows, function(obj) {
        fragment.appendChild(new Element("tr", {
            html: templateHTML.substitute(obj)
        }));
    });

    // inject all in one go, single dom access
    parentTable.appendChild(fragment);
})();
I did a jsperf test on both of these methods:
http://jsperf.com/inject-vs-fragment-in-mootools
A surprising win for Chrome by a HUGE margin vs Firefox and IE9. Also surprising: in Firefox, individual injects are faster than fragments. Perhaps the bottleneck is that it's TRs in a table, which has always been dodgy.
For templating: you can also look at using something like mustache or underscore.js templates.
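For instance, with underscore.js the row building could look roughly like this (reusing the rows array and parentTable element assumed above):

// _.template compiles the string once; calling the compiled function with a
// data object returns the interpolated HTML string.
var rowTemplate = _.template('<tr><td id="<%= id %>"><%= something %></td></tr>');

var html = rows.map(function (obj) {
    return rowTemplate(obj);
}).join('');

parentTable.set('html', html);   // MooTools; plain DOM would be parentTable.innerHTML = html;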

Seeking for faster $.(':data(key)')

I'm writing an extension to jQuery that adds data to DOM elements using
el.data('lalala', my_data);
and then uses that data to update elements dynamically.
Each time I get new data from the server I need to update all elements having
el.data('lalala') != null;
To get all needed elements I use an extension by James Padolsey:
$(':data(lalala)').each(...);
Everything was great until I came to a situation where I need to run that code 50 times - it is very slow! It takes about 8 seconds to execute on my page with 3640 DOM elements:
var x = 0, t = (new Date).getTime();
for (var n = 0; n < 50; n++) {
    jQuery(':data(lalala)').each(function() {
        x++;
    });
}
console.log(((new Date).getTime() - t) / 1000);
Since I don't need a RegExp as the parameter of the :data selector, I've tried replacing this with
var x = 0, t = (new Date).getTime();
for (var n = 0; n < 50; n++) {
    jQuery('*').each(function() {
        if ($(this).data('lalala'))
            x++;
    });
}
console.log(((new Date).getTime() - t) / 1000);
This code is faster (5 sec), but I want to get more.
Q: Is there any faster way to get all elements with this data key?
In fact, I can keep an array of all the elements I need, since I execute .data('key') in my module. Checking 100 elements that have the desired .data('lalala') is better than checking 3640 :)
So the solution would be like
for (i in elements) {
    el = elements[i];
    ....
But sometimes elements are removed from the page (using jQuery's .remove()). Both solutions described above [the $(':data(lalala)') solution and if ($(this).data('lalala'))] will skip removed items (as I need), while the solution with the array will still point to removed elements (in fact, the element would not really be deleted - it would only be removed from the DOM tree - because my array would still hold a reference).
I found that .remove() also removes data from the node, so my solution will change into
var toRemove = [];
for (var i in elements) {
    var el = elements[i];
    if ($(el).data('lalala'))
        ....
    else
        toRemove.push(i);
}
// splice from the end so earlier removals don't shift the remaining indexes
for (var ii = toRemove.length - 1; ii >= 0; ii--)
    elements.splice(toRemove[ii], 1); // remove element from array
This solution is 100 times faster!
Q: Will the garbage collector release the memory taken by DOM elements once they are deleted from that array?
Remember, the elements were referenced by the DOM tree, we made a new reference in our array, then removed them with .remove(), and then removed them from the array.
Is there a better way to do this?
Is there any faster way to get all elements with this data key?
Sure! Loop over the data store directly instead of going via the elements. For example, if you wanted a count:
var x = 0;
for (var key in $.cache) {
    if (typeof $.cache[key]["lalala"] != "undefined") x++;
}
This will be nearly instant, since elements only have an entry in $.cache if they have data and/or events, and there's no DOM traversal happening.
For the other piece, yes this will skip removed elements, since their cache is cleaned up as well, provided you don't remove them via .innerHTML directly.
Since v1.4.3, jQuery supports the HTML5 data-* attributes.
This means: if you set an HTML attribute data-lalala, you can also access that attribute using element.data('lalala'). This should be much faster, because you can use the
native attribute selector $('*[data-lalala]') instead of a workaround.
So you have to use:
el.attr('data-lalala', my_data);
instead of
el.data('lalala', my_data);
Note: as those data-* attributes only allow strings, you'll need to store objects there as stringified JSON if you need to work with objects.
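A quick sketch of that variant, with the object stored as a JSON string (same attribute name as above):

// write: store the object as a JSON string in the attribute
el.attr('data-lalala', JSON.stringify(my_data));

// read: the native attribute selector only matches elements carrying the
// attribute, so there is no need to scan every element or the data cache
$('[data-lalala]').each(function () {
    var data = JSON.parse($(this).attr('data-lalala'));
    // ... update the element with the fresh server data
});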

Get all DOM block elements for selected texts

When selecting text in an HTML document, the selection can start inside one DOM element and end in another, possibly passing over several other elements on the way. Using the DOM API, it is possible to get the range of the selection, the selected text, and even the parent element of all the selected DOM elements (using commonAncestorContainer or parentElement(), depending on the browser). However, I am not aware of a way to list all the elements containing the selected text, other than getting the single parent element that contains them all. Using the parent and traversing its child nodes won't do it, as there might be other siblings inside this parent that are not selected.
So, is there a way I can get all the elements that contain the selected text? I am mainly interested in getting the block elements (p, h1, h2, h3, ...etc), but I believe that if there is a way to get all the elements, I can go through them and filter out what I want. I welcome any ideas and suggestions.
Thank you.
The key is window.getSelection().getRangeAt(0) - https://developer.mozilla.org/en/DOM/range
Here's some sample code that you can play with to do what you want. Mentioning what you really want this for in the question will help people provide better answers.
var selection = window.getSelection();
var range = selection.getRangeAt(0);
var allWithinRangeParent = range.commonAncestorContainer.getElementsByTagName("*");

var allSelected = [];
for (var i = 0, el; el = allWithinRangeParent[i]; i++) {
    // The second parameter says to include the element
    // even if it's not fully selected
    if (selection.containsNode(el, true)) {
        allSelected.push(el);
    }
}

console.log('All selected =', allSelected);
This is not the most efficient way, you could traverse the DOM yourself using the Range's startContainer/endContainer, along with nextSibling/previousSibling and childNodes.
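One possible sketch of doing the walk yourself uses a TreeWalker plus Range.intersectsNode (a slightly different, simpler traversal than the sibling-walking described above; intersectsNode is not available in old IE):

function elementsInRange(range) {
    var root = range.commonAncestorContainer;
    // commonAncestorContainer may be a text node; start from its element parent
    if (root.nodeType !== 1) root = root.parentNode;

    var walker = document.createTreeWalker(root, NodeFilter.SHOW_ELEMENT, null, false);
    var result = [];
    while (walker.nextNode()) {
        if (range.intersectsNode(walker.currentNode)) {
            result.push(walker.currentNode);
        }
    }
    return result;
}

var sel = window.getSelection();
if (sel.rangeCount) {
    console.log(elementsInRange(sel.getRangeAt(0)));
}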
You can use my Rangy library to do this. It provides an implementation of DOM Range and Selection objects for all browsers, including IE, and has extra Range methods. One of these is getNodes():
function isBlockElement(el) {
    // You may want to add a more complete list of block-level element
    // names on the next line
    return /h[1-6]|div|p/i.test(el.tagName);
}

var sel = rangy.getSelection();
if (sel.rangeCount) {
    var range = sel.getRangeAt(0);
    var blockElements = range.getNodes([1], isBlockElement);
    console.log(blockElements);
}
It sounds like you could use Traversing from the jQuery API.
Possibly .contents()
Hope that helps!
Here is an ES6 approach based on @Juan Mendes' response:
const selection = window.getSelection();
const range = selection.getRangeAt(0);
const elementsFromAncestorSelections = range.commonAncestorContainer.getElementsByTagName("*");

const allSelectedElements = Array.from(elementsFromAncestorSelections).reduce(
    (elements, element) =>
        selection.containsNode(element, true)
            ? [...elements, element]
            : elements,
    [],
);

Speed up selectors and method

Before I dive into the jQuery/Sizzle source, I thought I'd ask here about ways to speed up the method below.
This is a standard "Check All" checkbox scenario. A header cell (<thead><tr><th>) has a checkbox that, when checked, checks all the other checkboxes in its table's tbody that are in the same column.
This works:
// We want to check/uncheck all columns in a table when the "select all"
// header checkbox is changed (even if the table/rows/checkboxes were
// added after page load).
(function () {
    // use this dummy div so we can reattach our table later.
    var dummy = $("<div style=display:none />");

    // hook it all up!
    body.delegate(".js_checkAll", "change", function () {
        // cache selectors as much as possible...
        var self = $(this),
            // use closest() instead of parent() because
            // the checkbox may be in a containing element(s)
            cell = self.closest("th"),
            // Use "cell" here to make the "closest()" call 1 iteration
            // shorter. I wonder if "parent().parent()" would be faster
            // (and still safe for use if "thead" is omitted)?
            table = cell.closest("table"),
            isChecked,
            index;

        // detaching speeds up the checkbox loop below.
        // we have to insert the "dummy" div so we know
        // where to put the table back after the loop.
        table.before(dummy).detach();

        index = cell.index();
        isChecked = this.checked;

        // I'm sure this chain is slow as molasses
        table
            // get only _this_ table's tbody
            .children("tbody")
            // get _this_ table's trs
            .children()
            // get _this_ table's checkboxes in the specified column
            .children(":eq(" + index + ") :checkbox")
            // finally...
            .each(function () {
                this.checked = isChecked;
            });

        // put the table back and detach the dummy for
        // later use
        dummy.before(table).detach();
    });
}());
However, for 250+ rows it starts to become slow (at least on my machine). Users may need to have up to 500 rows of data, so paging the data isn't the solution (items are already paged at 500 per page).
Any ideas how to speed it up some more?
I wouldn't use all those calls to .children() like that. You'd be much better off just using .find() to find the checkboxes, and then check the parents:
table.find('input:checkbox').each(function(_, cb) {
    var $cb = $(cb);
    if ($cb.parent().index() === index) cb.checked = isChecked;
});
By just calling .find() like that with a tag name ('input'), Sizzle will just use the native getElementsByTagName (if not querySelectorAll) to get the inputs, then filter through those for the checkboxes. I really suspect that'd be faster.
If finding the parent's index gets expensive, you could always precompute that and store it in a .data() element on the parent (or right on the checkbox for that matter).
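A rough sketch of that precomputation (colIndex is just an arbitrary data key chosen for the example):

// once, e.g. when the rows are first rendered: remember each cell's column
table.find('td').each(function () {
    $(this).data('colIndex', this.cellIndex);
});

// inside the change handler: read the stored index instead of computing it
table.find('input:checkbox').each(function (_, cb) {
    if ($(cb).parent().data('colIndex') === index) cb.checked = isChecked;
});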
// I wonder if "parent().parent()" would be faster
// (and still safe for use if "thead" is omitted)?
No. If <thead> is omitted then in HTML a <tbody> element will be automatically added, because in HTML4 both the start-tag and the end-tag are ‘optional’. So in HTML, it would be parent().parent().parent(), but in XHTML-served-as-XML, which doesn't have the nonsense that is optional tags, it would be parent().parent().
It's probably best to stick with closest(). It's clearer, it's not particularly slow and you're only using it once, so it's not critical anyway.
index = cell.index();
Although, again, this is only once per table so not critical, there is a standard DOM property to get the index of a table cell directly, which will be faster than asking jQuery to search and count previous siblings: index = cell[0].cellIndex.
// we have to insert the "dummy" div so we know
// where to put the table back after the loop.
That's a bit ugly. Standard DOM has a better answer to this: remember the element's parentNode and nextSibling (which may be null if this is the last sibling) and when you're done you can parent.insertBefore(table, sibling).
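Applied to the table above, that could look something like this sketch:

var tableEl = table[0],
    parent = tableEl.parentNode,
    sibling = tableEl.nextSibling;   // may be null if the table is the last child

parent.removeChild(tableEl);
// ... manipulate the detached table here ...
parent.insertBefore(tableEl, sibling);  // insertBefore(el, null) appends at the end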
.children("tbody")
.children()
.children(":eq(" + index + ") :checkbox")
.each(function () {
this.checked = isChecked;
});
You should consider using .children().eq(index) rather than hiding that away in a selector. Won't make a big difference, but it's a bit clearer.
In any case, you can save jQuery's selector engine a bunch of work by using some more standard DOM to traverse the table:
$.each(table[0].tBodies[0].rows, function() {
    $(this.cells[index]).find('input[type=checkbox]').each(function() {
        this.checked = isChecked;
    });
});
Selector queries can be fast, when they're operating against the document and using only standard CSS selectors. In this case jQuery can pass the work onto the browser's fast document.querySelectorAll method. But scoped selectors (find and the second argument to $()) can't be optimised due to a disagreement between jQuery and Selectors-API over what they mean, and non-standard Sizzle selectors like :eq and :checkbox will just get rejected. So this:
$('#tableid>tbody>tr>td:nth-child('+(index+1)+') input[type=checkbox]')
could actually be faster, on modern browsers with querySelectorAll!
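As a rough sketch only, the handler could then hand the whole column update to one document-level query (the #myTable id is hypothetical; the original code works on anonymous tables):

body.delegate(".js_checkAll", "change", function () {
    // '#myTable' is a made-up id for this sketch
    var index = $(this).closest("th")[0].cellIndex,
        isChecked = this.checked;

    // :nth-child() is 1-based, hence index + 1
    $('#myTable>tbody>tr>td:nth-child(' + (index + 1) + ') input[type=checkbox]')
        .each(function () {
            this.checked = isChecked;
        });
});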
