Document listener jQuery.each - javascript

We have a web app that renders some elements server-side and some through Backbone, and a simple jQuery.each() statement that loops through elements on the page and adds/removes certain classes based on the size of each component. I'm calling this within an "init" function, but it obviously doesn't work for any elements that aren't on the page at load time, i.e. the ones rendered by Backbone.
I'd rather not re-call this function in each Backbone view (in case we end up adding more views that render other elements this function needs to handle), and wanted something like $(document).on() that would cover future elements the way it does for normal clicks, hovers, etc. I haven't found any examples of how to make that work with jQuery.each(), so I'd appreciate any insight on how I'd go about this.

Mutation Observers are what I'll go with.
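Roughly, that looks like the sketch below (the selector, class name, and width threshold are made up; the each() pass stands in for the real one):

function refreshSizeClasses() {
    $('.component').each(function () {
        var $el = $(this);
        // Illustrative rule: mark narrow components
        $el.toggleClass('is-narrow', $el.width() < 300);
    });
}

// Re-run the pass whenever nodes are added or removed anywhere under <body>,
// e.g. when a Backbone view renders after page load
var observer = new MutationObserver(refreshSizeClasses);
observer.observe(document.body, { childList: true, subtree: true });

$(refreshSizeClasses); // initial pass on DOM ready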

Related

Angular and calling a JS event only once after document.ready()?

One of the things I'm still hung up on with Angular is understanding all of the lifecycle hooks and when to use which.
I often need to stick a little plain ol' JS into a component to deal with some DOM issue that, alas, I can't handle via Angular (usually because I'm working within a component that needs to access elements from a parent component's rendered DOM that we have no access to the Angular code for... I realize this isn't the 'proper' Angular way, but...).
An example right now: a few pages I'm working on use a component that needs to hide a DOM element on the page that isn't part of the component. I need to use JS for this (it's a whole other story why CSS isn't the solution here).
But I only want to do this once the DOM is fully rendered.
Sometimes this seems to work when inserted into ngAfterViewInit -- but sometimes not. It seems there's no guarantee the full DOM is ready within that lifecycle hook.
Moving that logic into ngAfterViewChecked does work. However, the issue with ngAfterViewChecked is that it gets called dozens of times on some pages -- and the first few times it's called, the DOM isn't even ready. Not the end of the world, but there's no reason for me to be attempting to grab the same DOM object 40 times per page render. I somewhat remedy this by adding a boolean flag that tells this bit of JS to stop running once it finds the DOM elements, but that's hacky.
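For concreteness, the flag workaround looks roughly like this (a sketch: the selector and component are hypothetical, and the @Component decorator is omitted):

export class SomePageComponent {
    constructor() {
        this.domPatched = false;
    }

    ngAfterViewChecked() {
        if (this.domPatched) return;
        // Element rendered outside this component's template
        var el = document.querySelector('.legacy-sidebar'); // hypothetical selector
        if (el) {
            el.style.display = 'none';
            this.domPatched = true; // stop re-checking on every change-detection pass
        }
    }
}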
So my question is: what is the proper way (if there is one) to handle JS manipulation of the DOM after the DOM is fully rendered, in the context of an Angular component? Is it to use one of the Angular lifecycle events? Something else? Or is this whole idea of manipulating DOM objects outside the component I'm working in just anathema to the 'Angular way', and so isn't something Angular accommodates?

Continue modifying DOM as user scrolls

I have a Chrome extension that modifies the DOM based on keywords. The problem is, for websites like Twitter that have an infinite scroll, I need a way for my function to keep firing as the user scrolls through the page.
Is .livequery() the only way to do this or is there a better way?
Right now all of the logic is plain JavaScript/jQuery, but I'm open to using a framework like Angular if that's the best way to do it.
I have several functions that interact:
1) a hide() function that adds a class to divs containing words I want hidden
2) a walk() function that walks the DOM and identifies divs to call hide() on
3) a walkWithFilter() function that gets the words to filter from localStorage and calls walk()
The last function, walkWithFilter(), is called in a window.onload event.
It seems like the onScroll event would be a natural match for this. The trick would be that you'd need to keep track of what's already been processed to avoid reprocessing old content. If you're assuming that the user is always exposing new content below the existing content, that could be as simple as keeping a pointer to the last processed item and restarting the walkWithFilter method from there. That doesn't seem like an entirely safe assumption to me, though.
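Under that assumption, the pointer approach might look roughly like the sketch below (the container selector is hypothetical, and walkWithFilter is assumed to accept a subtree root, which differs from the original signature):

var container = document.querySelector('#timeline'); // hypothetical feed container
var lastProcessed = null;

window.addEventListener('scroll', function () {
    // Resume from the node after the last one we processed
    var node = lastProcessed ? lastProcessed.nextElementSibling
                             : container.firstElementChild;
    for (; node; node = node.nextElementSibling) {
        walkWithFilter(node); // filter just this newly added subtree
        lastProcessed = node;
    }
});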
If you want to be more robust in that regard, you could try a virtual DOM approach: you maintain a copy of the DOM as you last saw it, compare it to the DOM as it currently exists, and take a diff. I know there are a bunch of premade libraries for this kind of thing, but I haven't used any and can't recommend a specific one (the link just goes to the first example that showed up in Google). It also doesn't appear to be overly burdensome to roll your own, if you're so inclined.

Polymer 1.0 Lifecycle Event for Complete Rendered DOM

Background
I've been working with Polymer for a while. I've been converting from 0.5 and building new elements for a production app. We are currently using Polymer 1.0.6, and this particular element also uses jQuery 2.x.x and typeahead.js.
Issue
We have an element that builds a dynamic list of labels and inputs provided by a data source. In the ready function we get a list of input data and set it to a local list variable that is bound to a foreach template to create the labels and inputs.
I was unable to find a Polymer element I really liked for typeahead in Polymer 1.0, so I defaulted to using typeahead.js. My problem is that I cannot find a lifecycle event, or a workaround, that lets me call the typeahead function after the DOM has processed the bound list set in the ready function.
Code
The easiest way to demonstrate this issue was to create a HEAVILY trimmed-down version in a jsbin. I know the element looks bad; it was cut down as much as possible to demo the core issue I'm facing.
http://jsbin.com/zivano/edit?html,output
What Have I Tried?
I've tried using the attached event, and while it does fire after the ready function, the DOM changes from ready have not taken effect yet. I found similar issues on SO (domReady vs ready - Migrating to Polymer 1.0). I've tried both suggestions; the second is still being used in the jsbin, without success.
I have also bound the click event of my inputs to a function calling the typeahead setup code, to prove that if the calls are made after the dom is rendered it will work correctly.
Summary
If I update a data-bound local variable in the ready function, is there a lifecycle event I can use that will guarantee those DOM changes have been rendered, so I can make a DOM query against the new items? Or is there a workaround that will let me call a JS function on a DOM element, one time, after the element's DOM fully renders?
"My problem is that I cannot find a lifecycle event, or a workaround, that lets me call the typeahead function after the DOM has processed the bound list set in the ready function."
I think I had a problem like this. For my problem I found a solution using the following:
var self = this;
window.addEventListener('WebComponentsReady', function (e) {
    // Imports are loaded and elements have been registered
    console.log('Components are ready');

    // Example: paper-item elements created dynamically are now accessible
    var p = self.getElementsByTagName('paper-item');
    console.log(p);

    // Here you can call typeahead, because the DOM has been processed
});
Sorry for my English, or if I don't understand your question; my English is bad.
The issue I had was that the data-bound list was populated through an AJAX call, which completed after the attached function ran. Even if I made an async call inside the attached function, it would still fail because of race conditions.
It's worth noting that the answer by Flavio Ochoa will work. I personally preferred not to have my custom elements add listeners to the window, so I went a different route.
Since my issue was predicated on guaranteeing that the bound list had been updated, I wrapped the AJAX call in a Promise and added the typeahead init logic to the then clause. That solution appears to be working.
I do have some concerns about whether the Promise can guarantee that the bound list will have propagated to the DOM by the time the then clause is processed. But so far it has worked consistently; I'll edit this answer if I can prove otherwise.
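Roughly, that looks like the sketch below (the endpoint, property name, and dataset config are hypothetical):

attached: function () {
    var self = this;
    new Promise(function (resolve, reject) {
        $.ajax('/api/inputs').done(resolve).fail(reject); // hypothetical endpoint
    }).then(function (items) {
        self.set('list', items); // update the bound list
        // Defer so the template has a chance to stamp the new inputs
        self.async(function () {
            $(self).find('input').typeahead({ minLength: 1 }, {
                source: function (query, cb) {
                    cb(items.filter(function (s) { return s.indexOf(query) !== -1; }));
                }
            });
        });
    });
}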

Garbage collection of unneeded event listeners in javascript

I am building a single-page web app. This means that over a period of time I get new DOM elements and remove unneeded ones. For example, when I fetch a new form I just replace the contents of a specific div with that form's HTML and also set up listeners unique to that form's elements. After some period I replace the contents of this form with a new instance of the form (having different IDs).
I set up the event listeners again for this new form. Now the previous form is no longer part of the DOM, so its DOM elements should be automatically garbage collected. I am also expecting the listener functions pointing to the removed elements to disappear.
However, the following profile gathered from Chrome suggests that my listener count is increasing over time. Can you tell me why this is so? I tried clicking on the "Collect Garbage" button, but this is still the profile I get. Is there something wrong with the way I am building my application? Is there a problem, and if so, how should I fix it?
In case it matters, I am using the JSP templating language with jQuery, jQuery UI and some other plugins.
This is what the dynamic fragments that I add/remove on my page look like:
<script>
$(document).ready(function () {
    $("#unique_id").find(".myFormButton").button().click(function () {
        $.ajax({
            url: "myurl.html",
            success: function (response) {
                console.log(response);
            }
        });
    });
});
</script>
<div id="unique_id">
    <form>
        <input name="myvar" />
        <button class="myFormButton">Submit</button>
    </form>
</div>
Update
If you want to have a look at the actual code here is the relevant portion.
This link shows that when the clear button is pressed, the function clearFindForm is called, which effectively refetches content (an HTML fragment) using an AJAX request and replaces the entire div in this JSP with the fetched content.
The refetchContent function works as below; here is the link to the code in case that helps in giving a better answer.
function refetchContent(url, replaceTarget) {
    $.ajax({
        url: url,
        data: {},
        type: "GET",
        success: function (response) {
            replaceTarget.replaceWith(response);
        },
        error: function (response) {
            showErrorMessage("Something went wrong. Please try again.");
        }
    });
}
While jQuery is very good at removing event listeners from DOM elements that are removed via its methods (including .html() - see the API: http://api.jquery.com/html/), it won't remove event listeners from DOM elements that still have a reference to them in a detached DOM tree.
For example, if you do something like this:
$.ajax({
    ....
})
.done(function (response, status, jqXHR) {
    // Create a detached DOM tree
    form = $(response);
    // Add an event listener to the detached tree
    form.find('#someIDInTheResponse').on('submit', function () {
    });
    // Add the form to the html
    $('#someID').html(form);
});

// At some other point in the code
$('#someIDInTheResponse').remove();
Note that in the above example, despite the fact that you removed the element from the DOM, the listener will not be removed from memory. This is because the element still exists in memory, in a detached DOM tree accessible via the global variable "form" (I didn't use "var" to scope the initial detached DOM tree to the done function). There are some nuances here; jQuery can't fix bad code, it can only do its best.
2 other things:
Doing everything inside callbacks or event listeners (like "do this on a button click") turns into bad spaghetti code fast and quickly becomes unmanageable. Try to separate application logic from UI interaction. For example, don't use click callbacks to perform a bunch of logic; use click callbacks to call functions that perform a bunch of logic.
Second, and somewhat less important (I welcome feedback on this perspective via comments), I would deem 30MB of memory to be a fairly high baseline for a web app. I've got a pretty intensive Google Maps web app that hits 30MB after an hour or so of intensive use, and you really start to notice its sluggishness when it does. Lord knows how it would act if it ever hit 60MB. I'm thinking IE<9 would become virtually unusable at that point, although, like I said, I welcome other people's feedback on this idea.
I wonder if you are simply not unbinding/removing the previously bound event listeners when you replace fragments?
I briefly looked at the specific sections of code you linked to in your updated question, but didn't see any event listener binding other than what you are doing in document ready, so I'm guessing you are doing some additional binding when you replace the document fragments. I'm not a jQuery expert, but in general binding or assigning additional event listeners does not replace previously bound/assigned event listeners automatically.
My point is that you should look to see if you are doing binding via "click()" (or via some other approach) to existing elements without unbinding the existing event listener first.
You might take a look at moff's answer to this question, which provides an example for click, specifically.
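In jQuery terms that typically means unbinding before rebinding, e.g. (a generic illustration, not the asker's exact code):

// Drop any click handlers bound earlier before attaching the new one
$(".myFormButton").off("click").on("click", function () {
    // handler logic for the freshly inserted fragment
});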
I can't add a comment because of reputation but to respond to what Adam is saying...
To summarise the case Adam presents: it's potentially nothing to do with jQuery; the problem may be in plain JavaScript. However, you don't present enough code for anyone to really get to the bottom of the problem. Your usage of scoped encapsulation may be perfectly fine and the problem may be elsewhere.
I would recommend that you search for tools for finding the cause of memory leaks (for example, visualising/traversing the entire object/scope/reference/function tree, etc).
One thing to watch out for with jQuery is plugins and global insertions into the DOM! I've seen many JS libs, not just jQuery plugins, fail to provide destructors and cleanup methods. The worst offenders are often things with popups and popouts, such as date pickers and dialogs, that have a nasty habit of appending layer divs and the like to the body without removing them afterwards.
Something to keep in mind is that a lot of people only get as far as making things construct, but don't handle destruction, especially in JS, because even in this day and age they expect you to be serving normal webpages. You should also check plugins for destroy methods, because not all of them hook onto a remove event. Events are also used in a messy fashion in jQuery by others, so a rogue handler might be halting the execution of subsequent cleanup events.
In summary, jQuery is a nice, robust library, but be warned: just because something depends on it does not mean it inherits jQuery's quality.
Out of curiosity... have you checked the listeners on document.ready? Maybe you need to manually GC those.

AJAX - Element / Class / Timer cleanup on replacing content

As many developers will be, I'm producing web-based applications that use AJAX to retrieve data and HTML.
I'm new to web development and javascript but have a couple of decades experience in programming in other languages.
I'm using MooTools, which is a great framework, but I have been battling with the lack of destructors in JavaScript, or even onDestroy/unload events for DOM elements.
I've written a number of UI classes (mostly to learn), and a lot of them use setInterval timers to periodically get data from the web server and update elements on the page (mostly images from cameras).
Most issues occur when another page is requested from the menu and the content div is reloaded with new HTML and JavaScript (using Request.HTML). This simply replaces all the elements already in the div with the new ones and runs the new scripts. Any timers in the old scripts, or old objects created, will continue to run. This was leaving me with lots of orphaned classes, elements and timers.
I've been reading more on the MooTools site, have realized a number of mistakes I've been making, and have started to correct a lot of the issues. The biggest of these was not using Element.store and Element.retrieve instead of linking my classes directly to the elements.
I've already found that the contents of the div being reloaded need to be freed by calling destroy on all its child elements before calling Request.HTML, but that will not remove (clear) any timers that are running.
So I've made a JSFiddle here (deinitialize classes) to show what I've been trying. It appears to work fine, but here is what I want to know:
Is it a good idea?
Are there any other issues I might have missed?
Can you see any problem with this type of implementation?
Or am I reinventing the wheel and missed something?
Explanation
When the class is initialized it stores itself with the element.
It also appends itself (creating the array if necessary) to an AssocClasses array, also stored with the element.
I've created a ClearElement function that is called whenever the contents of an element are about to be replaced by an AJAX call or other method. It gets all elements within the div and, if they have an AssocClasses array attached, calls deinitialize on each of the classes in the array; it then calls destroy on each of the element's direct children to free the elements/storage. A condensed sketch of the pattern is below.
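(The class name and timer interval here are hypothetical; the method names mirror the explanation above, but the real code will differ.)

// A UI class that registers itself with its element
var CameraView = new Class({
    initialize: function (element) {
        this.element = document.id(element);
        var assoc = this.element.retrieve('AssocClasses') || [];
        this.element.store('AssocClasses', assoc.include(this));
        this.timer = this.refresh.periodical(5000, this); // illustrative poll
    },
    refresh: function () { /* fetch a new camera image, etc. */ },
    deinitialize: function () {
        clearInterval(this.timer);
    }
});

// Called before replacing an element's contents via Request.HTML
function clearElement(container) {
    // Deinitialize every class instance stored on descendant elements...
    container.getElements('*').each(function (child) {
        var assoc = child.retrieve('AssocClasses');
        if (assoc) assoc.each(function (cls) { cls.deinitialize(); });
    });
    // ...then destroy the children to free the elements and their storage
    container.getChildren().destroy();
}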
Any information, pointers etc. would be most gratefully received.
"Most issues occur when another page is requested from the menu and the content div is reloaded with new HTML and JavaScript (using Request.HTML). This simply replaces all the elements already in the div with the new ones and runs the new scripts. Any timers in the old scripts, or old objects created, will continue to run. This was leaving me with lots of orphaned classes, elements and timers."
I would rethink your timer storage and use of evalScripts in your ajax calls.
Keep these outside of your AJAX requests. When doing peer code reviews I have rarely seen an instance where they were needed; there is usually a better way.
Maybe have the link that is clicked trigger a callback function onComplete or onSuccess.
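As a generic illustration of that advice: keep timer handles in one place, outside the loaded fragments, so they can be cleared before each replacement (all names here are hypothetical):

var activeTimers = [];

function registerInterval(fn, ms) {
    var id = setInterval(fn, ms);
    activeTimers.push(id);
    return id;
}

function clearAllIntervals() {
    activeTimers.forEach(function (id) { clearInterval(id); });
    activeTimers = [];
}

// Before each Request.HTML call that replaces the content div:
clearAllIntervals();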
Without seeing your exact code it will be hard to advise further.
