Like many developers, I'm producing web-based applications that use AJAX to retrieve data and HTML.
I'm new to web development and JavaScript, but I have a couple of decades of programming experience in other languages.
I'm using MooTools, which is a great framework, but I have been battling with the lack of destructors in JavaScript, or even onDestroy/unload hooks for DOM elements.
I've written a number of UI classes (mostly to learn), and a lot of them use setInterval timers to periodically fetch data from the web server and update elements on the page (mostly images from cameras).
Most issues occur when another page is requested from the menu and the content div is reloaded with new HTML and JavaScript (using Request.HTML). This simply replaces all the elements already in the div with the new ones and runs the new scripts. Any timers in the old scripts, and any old objects that were created, continue to run. This was leaving me with lots of orphaned classes, elements and timers.
I've been reading more on the MooTools site, have realized a number of mistakes I've been making, and have started to correct a lot of the issues. The biggest of these was linking my classes directly to the elements instead of using Element.store and Element.retrieve.
I've already found that the contents of the div being reloaded need to be freed by calling destroy on all its child elements before making the Request.HTML call, but that does not clear any timers that are still running.
So I've put together a JSFiddle here ("deinitialize classes") to show what I've been trying. It appears to work fine, but what I want to know is:
Is it a good idea?
Are there any other issues I might have missed?
Can you see any problems with this type of implementation?
Or am I reinventing the wheel and have missed something?
Explanation
When a class is initialized, it stores itself with the element.
It also appends itself to an AssocClasses array (creating it if necessary) that is also stored with the element.
I've created a ClearElement function that is called whenever the contents of an element are about to be replaced by an AJAX call or any other method. It gets all the elements within the div and, if they have an AssocClasses array attached, calls deinitialize on each of the classes in the array; it then calls destroy on each of the div's direct children to free the elements and their storage.
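A minimal sketch of that pattern, assuming MooTools' Class, Element.store/retrieve and Function.periodical; the class name, storage keys and refresh interval are illustrative, not the exact ones from the JSFiddle:

var CameraView = new Class({
    initialize: function(element){
        this.element = document.id(element);
        // Store the instance on the element and register it in an
        // AssocClasses array so ClearElement can find it later.
        this.element.store('CameraView', this);
        var assoc = this.element.retrieve('AssocClasses', []);
        assoc.include(this);
        // Periodic refresh (e.g. reloading a camera image).
        this.timer = this.refresh.periodical(5000, this);
    },
    refresh: function(){ /* update the image src, etc. */ },
    deinitialize: function(){
        clearInterval(this.timer);
        this.timer = null;
    }
});

// Called whenever a container's contents are about to be replaced.
function ClearElement(container){
    container.getElements('*').each(function(el){
        var assoc = el.retrieve('AssocClasses');
        if (assoc) assoc.each(function(cls){ cls.deinitialize(); });
    });
    container.getChildren().each(function(child){ child.destroy(); });
}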
Any information, pointers, etc. would be most gratefully received.
I would rethink your timer storage and your use of evalScripts in your AJAX calls.
Keep these outside of your AJAX requests. When doing peer code reviews, I have rarely seen an instance where they were needed and couldn't be handled in a better way.
Maybe have the link that is clicked trigger a callback in onComplete or onSuccess, as sketched below.
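A rough sketch of that idea, reusing the ClearElement helper from the question (the URL and element ID are hypothetical):

var content = document.id('content');
new Request.HTML({
    url: '/some/page',
    update: content,
    evalScripts: false,   // don't rely on scripts embedded in the response
    onRequest: function(){
        // deinitialize old classes and clear their timers before the swap
        ClearElement(content);
    },
    onSuccess: function(){
        // set up the new content here instead of via inline scripts
    }
}).send();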
Without seeing your exact code it will be hard to advise further.
Related
I am a lowly operations employee without authorization to change the programs and permissions on my machine, and I would like to automate some highly repetitive data entry. I know there are a lot of programs that can do that, however, for the sake of this discussion we'll assume that I'm not allowed to have any of them and I can only script through the debug F12 menu in Chrome. I also probably don't understand half of these words as well as I should.
I have to run test cases on a third-party vendor's highly dynamic website, and I've already successfully written JavaScript which adds text to elements in the DOM and presses the "next" button.
The problem is, upon .click()ing the "next" button, it takes time for the page to update, and the update creates new elements which weren't in the DOM when the script was initialized. I need to find a way to delay the execution of the script until the DOM contains all the elements I need to update.
As a really, really crude proof of concept I wrote the pre-filler for each page as a function, and I called each function serially at the end of the previous one, using setTimeout(nextfunct, 10000) to let the page update before executing the next step, roughly as sketched below. (I was going to refine that by replacing the arbitrary 10-second delay with some kind of object listener, but I wasn't even able to get that far.) This approach creates two errors.
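A rough illustration of that chaining; the function, field and button names are hypothetical:

function fillPage1() {
    document.getElementById("field1").value = "some value";   // hypothetical id
    document.getElementById("nextButton").click();            // go to the next page
    // wait for the next page to render before filling it in
    setTimeout(fillPage2, 10000);
}

function fillPage2() {
    document.getElementById("field2").value = "another value";
    // ...and so on for the remaining pages
}

fillPage1();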
1) The script seems to be checking whether the elements are in the DOM before the setTimeout() expires, so it still gives me an error. If nextfunct is defined as
function nextfunct() {
    document.getElementById("doesntexistyet").value = "Fill Me";
    console.log("nextfunct ran");
}
I will get the error message stating there is no element with the id "doesntexistyet" immediately, not after a delay of 10 seconds. The element on the next page will not update.
2) The DOM update interrupts my script. In the above code, the console output never appears in my console. If I comment out the line referencing the missing element, so the function only logs a message, it still does not appear in my console. However, if I comment out that code and switch the setTimeout to 1 ms, "nextfunct ran" does appear in my console until the page updates, at which point the console is cleared.
Are there ways around this which I can implement using only vanilla JS and a browser? I'm sure there's a keyword I can search for where someone has discussed this before, but it seems like the vast majority of JS autofilling discussions are oriented towards people designing code to be integrated into a website.
Thanks
I have a Chrome extension that modifies the DOM based on keywords. The problem is, for websites like Twitter that have infinite scroll, I need a way for my function to keep firing as the user scrolls through the page.
Is .livequery() the only way to do this or is there a better way?
Right now all of the logic is plain JavaScript/jQuery, but I'm open to using a framework like Angular if that's the best way to do it.
I have several functions that interact:
1) a hide() function that adds a class to divs containing words I want hidden
2) a walk() function that walks the DOM and identifies divs to call hide() on
3) a walkWithFilter() function that gets the words to filter from localStorage and calls the walk() function
The last function, walkWithFilter(), is called from a window.onload event; a rough skeleton is sketched below.
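A rough skeleton of that setup (the CSS class name and localStorage key are hypothetical):

function hide(div) {
    div.classList.add('keyword-hidden');   // class that hides the div
}

function walk(filterWords) {
    var divs = document.getElementsByTagName('div');
    for (var i = 0; i < divs.length; i++) {
        for (var j = 0; j < filterWords.length; j++) {
            if (divs[i].textContent.indexOf(filterWords[j]) !== -1) {
                hide(divs[i]);
                break;
            }
        }
    }
}

function walkWithFilter() {
    var words = JSON.parse(localStorage.getItem('filterWords') || '[]');
    walk(words);
}

window.onload = walkWithFilter;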
It seems like the onScroll event would be a natural match for this. The trick would be that you'd need to keep track of what's already been processed to avoid reprocessing old content. If you're assuming that the user is always exposing new content below the existing content, that could be as simple as keeping a pointer to the last processed item and restarting the walkWithFilter method from there. That doesn't seem like an entirely safe assumption to me, though.
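A sketch of that simple pointer approach, assuming new divs are only ever appended after the existing ones (throttling and the keyword check itself are left out for brevity):

var lastProcessedCount = 0;

function walkNewContent(filterWords) {
    var divs = document.getElementsByTagName('div');   // live collection
    // only look at divs added since the last pass
    for (var i = lastProcessedCount; i < divs.length; i++) {
        // run the same keyword check / hide() logic as in walk()
    }
    lastProcessedCount = divs.length;
}

window.addEventListener('scroll', function () {
    walkNewContent(/* words from localStorage */);
});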
If you want to be more robust in that regard, you could try a virtual DOM approach: you maintain a copy of the DOM as you last saw it, compare it to the DOM as it currently exists, and take a diff. I know there are a bunch of premade libraries for this kind of thing, but I haven't used any and can't recommend a specific one (the link just goes to the first example that showed up in Google). It also doesn't appear to be overly burdensome to roll your own, if you're so inclined.
I'm working on a large site, and what I've been doing is using one main .js file to hold all the bound JS code that I want to attach to elements (onclick, onchange, etc.); these bindings are all held within the one onDomReady method.
Now I'm wondering: is it such a good idea to have every page go over all of these and "search" for each element to see whether it has anything to bind?
Or should I perhaps use more specificity to prevent this, such as scoping by a main page ID like #page1, #page2, etc.? Or should I put the bindings in each specific page's header (I don't really want to do that, as I prefer to keep it all in one place)?
Just trying to optimize things and get rid of unnecessary overhead! :)
If I understand correctly, you have one js file with all your event handlers.
This file is included in many pages.
So for example, if there are 100 event handlers in the file, each page may be using only 10 of these.
If that's the case, then it's not efficient, because you have lots of document.getElementBy... calls that don't find their elements (they belong to a different page), or worse, that find elements matching the same selector on multiple pages which should not be bound to handlers meant for a specific page.
Also, you are adding script to pages that don't need it.
It's best to give each page only what it needs, either in an external JS file or, if there is very little script, in the document head.
JS that you share across pages should be code that you intend to reuse often.
EDIT:
In response to comment:
Regarding reducing HTTP requests: you mean the one file will be cached for other pages to use? Fair enough, that counts as a benefit. There are trade-offs, though, such as increased memory usage from JavaScript the page doesn't need.
Using a more specific selector will reduce the risk of attaching an event handler to the wrong element on a page you did not mean to target, but there is a safer option:
If you insist on sharing one event-handler file across pages, group the handlers by wrapping them in a function, one for each page, and call that function from the page (see the sketch below).
This way, you don't have to execute a bunch of code you don't need, and you don't risk adding the wrong event handlers to similar elements across pages.
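A minimal sketch of that grouping (the function names and element IDs are hypothetical):

function initPage1() {
    document.getElementById('saveButton').onclick = function () { /* ... */ };
    document.getElementById('nameInput').onchange = function () { /* ... */ };
}

function initPage2() {
    document.getElementById('searchButton').onclick = function () { /* ... */ };
}

// Each page calls only its own initializer from its DOM-ready handler,
// e.g. page 1 calls initPage1() and nothing else.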
I'm developing a single-page application that uses a lot of widgets (mainly grids and tabs) from the jqWidgets library, all of which are loaded on page load. It's getting quite large, and I've started to notice that after using the site for a couple of minutes the UI becomes quite slow and sometimes non-responsive. (I emphasize using because it doesn't start to lag after simply being open for any amount of time, but specifically after opening and closing a bunch of tabs on my page, each tab containing multiple grids loaded through AJAX with multiple event listeners tied to each.) When the page is refreshed, everything works smoothly again for a few minutes, then it goes back to being laggy. I'm still testing on localhost. My initial reaction was that the DOM has too many elements (each grid creates hundreds of divs! And I have a lot of them), so event listeners that are tied to IDs have to search through too many elements and become slow. If this is the case it won't be too hard to fix, but is my assumption likely to be the culprit, or do I have worse things to fear?
UPDATE: here are captures of the memory timeline and heap snapshot. On the memory timeline there was no interaction with the site; the two large increases are page refreshes, and the middle sawtooth section is just the site idling.
Without seeing any code examples it doesn't sound too bad.
If you have a LOT of jQuery selectors, try to make them as specific as possible, especially if you're selecting a lot of items a lot of the time.
For example, if you have a bunch of elements with class "abc", try to specify where to look for them first: are they only found within table cells? Are they only found within paragraph tags? The more specific you make your selector, the better. If you specify the selector like this:
$('.class')
then it will search the entire DOM for anything that matches .class. However, if you specify it as $('p .class'), it will only search paragraph tags for the class.
Another performance killer is wiring up events and then never removing them. If you have any code that removes elements with event handlers attached to them, then best practice is to remove the event handlers when the element is removed; otherwise you will start piling up orphaned handlers.
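A hedged jQuery sketch of that cleanup (the selector and event are hypothetical):

$('#oldPanel').off('click');   // unbind the handlers attached earlier
$('#oldPanel').remove();       // jQuery's .remove() also drops jQuery-bound
                               // handlers and data on the removed elements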
If you are doing a large single page application look to a library like backbone (http://backbonejs.org/) or angular (http://angularjs.org/) to see if this can help you - they alleviate a lot of these issues that people who use plain jQuery will run in to.
Finally, this post (http://coding.smashingmagazine.com/2012/11/05/writing-fast-memory-efficient-javascript/) is seriously good at outlining how you can write fast, memory-efficient JavaScript and how to avoid the common performance pitfalls.
Hope this helps.
It does sound like you have a memory leak somewhere. Are you using recursion that's not properly controlled, or do you have loops that could end early but that you fail to break out of when you find what you're looking for before the loop naturally ends? Are you using something like this:
document.getElementById(POS.CurrentTableName + '-Menus').getElementsByTagName('td');
where the NodeList returned is huge and you only end up using a tiny bit of it? Those calls are expensive.
It could also be your choice of architecture. Hundreds of divs per grid doesn't sound logically manageable by a human brain. Do you address each div specifically by ID, or are they just an artifact of the library you're using and are cluttering up the DOM? Have you checked the DOM itself as you use the app, to see whether you're adding elements in the hinterland by mistake and cluttering it up with junk you don't use, causing the DOM to grow continuously? Are you adding event handlers to the elements numerous times instead of just once?
For comparison, I too have a single-page app (a Google Chrome App, a multi-currency restaurant point of sale) with anywhere from 1,500 to 20,000 event handlers registered, making calls to an SQLite back end on a Node.js server. I used mostly pure JS, and all but 50 lines of the HTML is written in JS. I tie all the event handlers directly to the lowest-level element responsible for the event. Some elements have multiple handlers (click, change, keydown, blur, etc.).
The app operates at eye-blink speed and stays that fast no matter how long it's up. The DOM is fairly large and I regularly destroy and recreate huge portions of it (a restaurant table is cleared and recreated for the next sitting), including adding up to 1,500 event handlers per table. Hitting the CLEAR button and having it refresh the screen with the new table is almost imperceptible, admittedly on a high-end processor. My development environment is Fedora 19 Linux.
Without being able to see your code, it's a little difficult to say exactly.
If the UI takes a little while before it starts getting laggy, then it sounds likely that you have a memory leak somewhere in your JavaScript. This happens quickly when using a lot of closures, as well as nested function and variable references, without cleaning them up when you're done with them.
Also, binding events to many elements can be a huge drain on browser resources. If possible, try to use event delegation to lower the number of elements listening for events. For example:
$('table').on('click','td', myEventHandler);
Be careful to make sure that event bindings only occur once, so as to avoid actions being fired unintentionally many times.
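One way to guard against double-binding is to unbind a namespaced handler before re-binding it (the namespace here is hypothetical):

$('table')
    .off('click.cart', 'td')
    .on('click.cart', 'td', myEventHandler);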
Good luck!
I am preparing to implement Twitter-like infinite scrolling on my product pages; that is, loading additional page portions using AJAX when I cross certain scroll thresholds. But I am unsure how the topics in the title are affected by such loading. My questions are the following:
For each new batch of elements being loaded with AJAX, will the DOM be updated for these new elements OR totally renewed? What happens with the old DOM?
Will I be able to use JavaScript and jQuery on these new DOM elements exactly like I have on the DOM I start off with for the page? I guess this relates to the first question.
For each load, I will load, say, 9 new products. Each product has an FB Like button which uses the FB Open Graph API. Will the new products' Like elements go through the same asynchronous modification that happens to the DOM elements I start off with, so that a proper Like submission is possible?
Let's go through these one by one.
1) The DOM, for your purposes, is only updated, not renewed. There is no "old DOM", since all you do is insert new elements into it.
2) Yes, you'll be able to do that. Be careful with event listeners, though: if you set them up the wrong way, you'll have to attach new event listeners to the new nodes again. For example:
$('body').on('click','a.addToCart',function(){}) // Will match present and future nodes
$('a.addToCart').on('click',function(){}); // Will only match present nodes
3) Yes, you'll need to go through the same process for each new Like button again.
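For example, assuming the standard Facebook JS SDK is already loaded and the new products are appended into a container (the container ID is hypothetical), re-parsing just that container renders the new Like buttons:

// after inserting the new batch of products into #productList
FB.XFBML.parse(document.getElementById('productList'));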
Bonus tip: If you care about mobile environments, you should keep your DOM as clean as you can by deleting nodes you won't need.
Just use appendChild() on the container element to add the new elements to the DOM (http://www.w3schools.com/jsref/met_node_appendchild.asp).
Rebuilding the whole page would be a waste of time ;)
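A small sketch of appending one new product node instead of rebuilding the list (the container ID and markup are hypothetical):

var list = document.getElementById('productList');
var product = document.createElement('div');
product.className = 'product';
product.innerHTML = '<h3>New product</h3>';
list.appendChild(product);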
There should be no problem; I've done it many times without any issues. If you are using jQuery Mobile, you may have to refresh the new elements. (Look at the methods of the jQuery Mobile widget you are using to see whether it needs refreshing.)
I don't have much experience with the Facebook API, but I would say "yes" =)
Edit: I had a link to a German site in my answer; I forgot that this is an English site ;)