I have the following JavaScript code, which loops through a number of pages on my server (used for digital signage).
This code is used on all pages and (at the moment) moves to the next page every 3 seconds (see the timeout). But the memory usage of the browser goes up, slowly but steadily. After 2 hours it went from 192 MB in use to 436 MB in use. Since this runs on a Raspberry Pi with only 512 MB of memory dedicated to the CPU, that's not very practical.
Are there any obvious memory leaks in this code? I'm not an expert myself, but since these devices will probably be running 8-12 hours a day, I'm talking about 20 reloads/min, so roughly 9600-14400 reloads a day. More if they don't get shut down.
$(document).ready(function() {
    versionPage = parseInt(document.getElementById("version").innerHTML);
    versionServer = 0;
    urls = 0;

    getVersion();
    currentPage = getPage();
    getContent();
    main();

    function getPage() {
        page = window.location.href.split("/");
        return page[page.length - 1];
    }

    function getVersion() {
        $.ajax({
            url: "http://localhost/getVersion",
            type: "GET",
            dataType: "json",
            success: function(json) {
                console.log("json" + json);
                versionServer = json;
                if (versionServer != versionPage) {
                    console.log("Difference!");
                }
                else {
                    console.log("Same!");
                }
            },
        });
    }

    //saves how many urls there are
    function getContent() {
        $.ajax({
            url: "http://localhost/getContent",
            type: "GET",
            dataType: "json",
            success: function(json) {
                console.log(json);
                urls = json;
            },
        });
    }

    //main function loop
    function main() {
        //check the version every interval
        window.setInterval(function() {
            getVersion();
            if (versionServer != versionPage) {
                window.location.href = "http://localhost:5000/1";
            }
            if (urls != 1) {
                nextPage = (parseInt(currentPage) % urls) + 1;
                window.location.href = "http://localhost:5000/" + nextPage;
            }
        }, 3000);
    }
});
I would have asked this in a comment, but it requires 50 reputation to comment.
Have you tried putting your code in an external JavaScript file, something like "signagelooper.js", and looping through your pages sequentially? That way your looper functions always have exactly one instance running. Correct me if this is what you do not want to do.
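As a sketch of that idea (assuming the /getVersion and /getContent endpoints and the localhost:5000 page URLs from the question; `nextPageNumber`, `currentPageNumber`, and `startLooper` are my own names), the loop can be driven by one setTimeout per page load instead of a setInterval, so nothing accumulates across reloads:

```javascript
// signagelooper.js - hypothetical external looper (endpoint URLs taken from
// the question; the restructuring itself is only a suggestion)

// Pure helper: next page number in a 1-based cycle of `total` pages.
function nextPageNumber(current, total) {
    return (current % total) + 1;
}

// Read the current page number from the URL's last path segment.
function currentPageNumber() {
    var parts = window.location.href.split("/");
    return parseInt(parts[parts.length - 1], 10);
}

function startLooper() {
    var versionPage = parseInt(document.getElementById("version").innerHTML, 10);

    // One timeout per page load: navigating away replaces the whole page,
    // so no intervals or handlers can pile up across reloads.
    setTimeout(function () {
        $.getJSON("http://localhost/getVersion", function (versionServer) {
            if (versionServer !== versionPage) {
                window.location.href = "http://localhost:5000/1";
                return;
            }
            $.getJSON("http://localhost/getContent", function (urls) {
                if (urls !== 1) {
                    window.location.href = "http://localhost:5000/" +
                        nextPageNumber(currentPageNumber(), urls);
                }
            });
        });
    }, 3000);
}
```

Each page would include signagelooper.js and call startLooper() from $(document).ready(); since every navigation tears the page down, at most one timer ever exists.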
Related
The situation:
I have a page which has to display an undetermined number of images, loaded through AJAX (using base64 encoding on the server side) one by one.
var position = 'front';

while (GLOB_PROCEED_FETCH)
{
    getImageRequest(position);
}

function getImageRequest(position)
{
    GLOB_IMG_CURR++;
    $.ajax({
        url: urlAJAX + 'scan=' + position,
        method: 'GET',
        async: false,
        success: function(data) {
            if ((data.status == 'empty') || (GLOB_IMG_CURR > GLOB_IMG_MAX))
            {
                GLOB_PROCEED_FETCH = false;
                return true;
            }
            else if (data.status == 'success')
            {
                renderImageData(data);
            }
        }
    });
}
The problem is that the images (constructed with the renderImageData() function) are appended to the target DIV all together, only after every image has been fetched. In other words, no DOM manipulation is possible until the loop is over.
I need to load and display the images one by one because of the potentially huge number of images, so I can't stack them up until they have all been fetched.
Your best bet would be to restructure your code to use async AJAX calls, launching the next call when the previous one completes, and so on. This allows the page to redraw between image fetches.
It also gives the browser a chance to breathe and take care of its other housekeeping, rather than looking like it is locked up or hung.
Also, using async: false is a bad idea. I see no reason why properly structured code couldn't use asynchronous AJAX calls here and avoid hanging the browser while you fetch this data.
You could do it with asynchronous ajax like this:
function getAllImages(position, maxImages) {
    var imgCount = 0;

    function getNextImage() {
        $.ajax({
            url: urlAJAX + 'scan=' + position,
            method: 'GET',
            async: true,
            success: function(data) {
                if (data.status == "success" && imgCount <= maxImages) {
                    ++imgCount;
                    renderImageData(data);
                    getNextImage();
                }
            }
        });
    }

    getNextImage();
}
// no while loop is needed
// just call getAllImages() and pass it the
// position and the maxImages you want to retrieve
getAllImages('front', 20);
Also, while this may look like recursion, it isn't really recursion because of the async nature of the AJAX call. getNextImage() has actually returned before the next call is made from the success handler, so it isn't technically recursion and the stack does not grow.
Wrong and wrong. Don't use timers, and don't chain them. Look at jQuery's Deferred / $.when; it has everything you need.
var imgara = [];
for (var i = 0; i < imglist.length; i++) {
    imgara.push(/* the $.ajax() call for imglist[i] */);
}

$.when.apply($, imgara).done(function() {
    // do something
}).fail(function() {
    // do something else
});
Try using the setInterval() function instead of while().
var fetch = setInterval(loadImage, 2000);

function loadImage() {
    // change the position variable here
    getImageRequest(position);
    if (!GLOB_PROCEED_FETCH) {
        clearInterval(fetch);
    }
}
I am making an AJAX call to my web server that fetches a lot of data. I show a loading image that spins while the AJAX call executes and then fades away.
The thing I have noticed is that, on this particular call, every browser becomes unresponsive for about 7 seconds. Consequently, the loading image does NOT spin during the fetch as I had planned.
I did not know if this is just something that happens, or if there is a way around it; in a sense, to cause a fork() so that the loading icon keeps spinning while the work is done.
Thoughts? Ideas?
Below is the code, as someone asked to see it:
$("div.loadingImage").fadeIn(500); //.show();

setTimeout(function() {
    $.ajax({
        type: "POST",
        url: WEBSERVICE_URL + "/getChildrenFromTelTree",
        dataType: "json",
        async: true,
        contentType: "application/json",
        data: JSON.stringify({
            "pText": parentText,
            "pValue": parentValue,
            "pr_id": LOGGED_IN_PR_ID,
            "query_input": $("#queryInput").val()
        }),
        success: function (result, textStatus, jqXHR) {
            //alert("winning");
            //var childNodes = eval(result["getChildrenFromTelTreeResult"]);
            if (result.getChildrenFromTelTreeResult == "") {
                alert("No Children");
            } else {
                var childNodes = JSON.parse(result.getChildrenFromTelTreeResult);
                var newChild;
                //alert('pText: '+parentText+"\npValue: "+parentValue+"\nPorofileID: "+ LOGGED_IN_PR_ID+"\n\nFilter Input; "+$("#queryInput").val() );
                //alert(childNodes.length);
                for (var i = 0; i < childNodes.length; i++) {
                    TV.trackChanges();
                    newChild = new Telerik.Web.UI.RadTreeNode();
                    newChild.set_text(childNodes[i].pText);
                    newChild.set_value(childNodes[i].pValue);
                    //confirmed that newChild is set to ServerSide through debug and get_expandMode();
                    parentNode.get_nodes().add(newChild);
                    TV.commitChanges();
                    var parts = childNodes[i].pValue.split(",");
                    if (parts[0] != "{fe_id}" && parts[0] != "{un_fe_id}") {
                        newChild.set_expandMode(Telerik.Web.UI.TreeNodeExpandMode.ServerSide);
                    }
                }
            }
            //TV.expand();
            //recurseStart(TV);
        },
        error: function (xhr, status, message) {
            alert("errrrrror");
        }
    }).always(function () {
        $("div.loadingImage").fadeOut();
    });
}, 500);
A coworker of mine noticed this issue and suggested I add the setTimeout(function(){...}, 500); wrapper, but it does not fix the issue at hand, so it will most likely be removed.
Since JavaScript is single-threaded, a long stretch of synchronous processing will hold up the event queue and prevent other code from executing. In your case, it's the for-loop that's locking up the browser while it executes.
What you can try is spreading the iterations out across the event queue.
for (var i = 0; i < childNodes.length; i = i + 1) {
    (function(i) {
        setTimeout(function() {
            // code here, using the captured value of i
        }, 0);
    })(i);
}
This should space out the processing so the browser is not forced to finish it all at once. The self-executing function is there to create a closure that holds on to the value of the loop counter i.
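A variation on the same idea (my own sketch, not from the answer above; `processInChunks` and its parameters are hypothetical names) is to process the nodes in chunks rather than scheduling one timeout per item, which keeps timer overhead down while still yielding to the browser between chunks:

```javascript
// Hypothetical helper: process `items` in chunks of `chunkSize`, yielding to
// the event loop between chunks so the UI can repaint. `handleItem` does the
// per-item work; `done` (optional) runs after the last item.
function processInChunks(items, chunkSize, handleItem, done) {
    var i = 0;
    function nextChunk() {
        var end = Math.min(i + chunkSize, items.length);
        for (; i < end; i++) {
            handleItem(items[i], i);
        }
        if (i < items.length) {
            setTimeout(nextChunk, 0); // let the browser breathe, then continue
        } else if (done) {
            done();
        }
    }
    nextChunk();
}
```

In the tree-building case above, `handleItem` would hold the body of the for-loop (creating and adding one RadTreeNode), with a chunk size tuned so each chunk stays well under a frame's worth of work.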
I've got a strange one. I have to make several consecutive AJAX calls, and when each call is complete I update a progress bar. This works perfectly in FF, but in the rest of the browsers the screen freezes until all the calls are complete.
I am not executing the calls in a loop, but through a sort of recursion, because there's a lot of checking to be done and a loop is not convenient.
When I tried the same thing using a loop, the outcome was more or less the same. Chrome and IE did not update the screen until all the AJAX requests were done.
What I noticed is that it works fine in FF and Opera, but Chrome (and Safari too, I suppose) and IE9 behave strangely. Also, in Chrome, during these requests the response body of the previous request is empty and remains like that until all the requests are done.
Any ideas?
The code is extensive, but here goes. There is a wrapper around $.ajax; $(db).bind is a callback for success, db.records is the JSON result, and model is an object holding several controller functions.
$(db).bind('tokenComplete', function() {
    var x = db.records;
    if (!x.success) { model.callRollBack(); return false; }
    var next = parseInt(x.get.num) + 1;
    if (typeof x.post.tokens[next] != 'undefined') {
        model.executeToken(next, x.post);
    }
    else {
        model.progressCurrent.find('div.report').html('all done!!');
    }
});

model = {
    drawProgressBarsTotal : function(el, i, v) {
        var p = Math.floor(100 * i / versions.total);
        el.find('span').html(p);
        el.find('div.report').html('updating to : ' + v.version);
        el.find('.changeLog').html(v.changeLog);
        el.find('.changeLog').parents('div').show();
        el.find('img').css({'background-position': 100 - p + '% 100%'});
    },
    executeToken : function(i, x) {
        if (this.fail == true) { return; }
        this.drawProgressBarsCurrent(this.progressCurrent, i + 1, x);
        db.trigger = 'tokenComplete';
        db.data = x;
        db.url = dbDefaults.url + '?num=' + i + '&action=' + x.tokens[i]; //bring the first
        $(db).loadStore(db);
    }
}
loadStore:
$.dataStore = function( ds ) {
    $.fn.loadStore = function(ds) {
        $.ajax({
            type: ds.method,
            url: ds.url,
            data: ds.data,
            dataType: ds.dataType,
            cache: false,
            async: true,
            timeout: ds.timeout ? ds.timeout : 10000,
            queue: "autocomplete",
            contentType: 'application/x-www-form-urlencoded;charset=utf-8',
            accepts: {
                xml: "application/xml, text/xml",
                json: "application/json, text/json",
                _default: "*/*"
            },
            beforeSend: function() {
                loadStatus = true;
            },
            success: function(data) {
                loadStatus = false;
                if (data) { ds.records = data; }
                $(ds).trigger(ds.trigger);
            },
            error: function() {
                loadStatus = false;
                $(ds).trigger('loadError');
            }
        }); //END AJAX
    }; //END LOADSTORE

    try {
        return ds;
    } finally {
        ds = null;
    }
};
I haven't followed your entire code, but it sounds like your problem may be related to continuous code execution. Typically the UI will not update during a long stretch of continuous code execution. To fix this, break the work up with setTimeout() calls (AJAX callbacks have the same effect) so the browser has time to update the UI. Basically, you must stop the code briefly, then start it again.
function updateUI () {
    // change ui state
    setTimeout( updateUI, 1 );
}
If I am off base here, let me know.
I have some code that looks like this (I left out the parts that are unimportant for the question):
$.ajax({
    type: "POST",
    url: "prepXML.php",
    data: "method=getartists&user=" + userName + "&amount=" + amount,
    dataType: "xml",
    success: function(xml) {
        $("artist", xml).each(function() {
            // calculate and output some stuff and then call getTracks()
            getTracks(artistName, artistNamePOST, artistPlaycount, divId, artistId);
        });
    }
});

function getTracks(artistName, artistNamePOST, artistPlaycount, divId, artistId) {
    $.ajax({
        type: "POST",
        url: "prepXML.php",
        data: "method=gettracks&user=" + userName + "&artist=" + artistNamePOST + "&playcount=" + artistPlaycount,
        dataType: "xml",
        success: function(xml) {
            // calculate and output some stuff
        }
    });
}
When this runs, it calls getTracks() fifty times and puts quite a lot of load on the server's CPU in a very short time until it is all done. What I would like to do is group the getTracks() AJAX queries, for example five at a time: wait until those five are done, then call the next five, wait until those are done, call the next five, and so on. The purpose is to make fewer queries at essentially the same time (a smaller, more evenly spread CPU load).
I am unsure how, or even if, this can be done, since it partially defeats the point of AJAX, but I would still like to make it work if possible. Could someone please point me in the right direction? Thank you.
To better understand what I need this for and what the app does, here is a link to the app. It can be tried out with any last.fm nick (if you don't have one, you can use mine, "pootzko"). I hope I am allowed to put the link in the post(?); if not, feel free to remove it.
I'd consider only retrieving track info for artists that a user has clicked on. Or possibly retrieving all of the data via a single request (perhaps then after it is retrieved process it in batches via setTimeout()).
But something along the lines of the following might work to do only five requests at a time:
$.ajax({
    type: "POST",
    url: "prepXML.php",
    data: "method=getartists&user=" + userName + "&amount=" + amount,
    dataType: "xml",
    success: function(xml) {
        var artists = $("artist", xml),
            i = 0,
            c = 0;

        function getTracksComplete() {
            if (--c === 0)
                nextBatch();
        }

        function nextBatch() {
            for (; c < 5 && i < artists.length; i++, c++) {
                // artists[i] is the current artist record
                // calculate and output some stuff and then call getTracks()
                getTracks(artistName, artistNamePOST, artistPlaycount, divId, artistId,
                    getTracksComplete);
            }
        }

        // Optional - if you need to calculate statistics on all the artists,
        // then do that here before starting the batches of getTracks calls
        artists.each(function() { /* do something */ });

        // Kick off the first batch of 5
        nextBatch();
    }
});

function getTracks(artistName, artistNamePOST, artistPlaycount, divId, artistId,
                   callback) {
    $.ajax({
        type: "POST",
        url: "prepXML.php",
        data: "method=gettracks&user=" + userName + "&artist=" + artistNamePOST + "&playcount=" + artistPlaycount,
        dataType: "xml",
        success: function(xml) {
            // calculate and output some stuff
        },
        complete: function() {
            if (callback) callback();
        }
    });
}
The above is just off the top of my head (so I didn't have time to build it to scale or to paint it), but the idea is that instead of using .each() to loop through all of the artists at once, we cache the jQuery object and process a few at a time. Calling the function nextBatch() (which is local to the success handler and thus has access to its local variables) runs a loop that calls getTracks() only five times, starting from where the previous batch left off. Meanwhile, getTracks() has been updated slightly to accept a callback function, so when its AJAX call completes (note that we use complete rather than success, in case of errors) it can let the main process know that it has finished. Within the callback we keep track of how many have completed, and when they all have, we call nextBatch() again.
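The counting logic can also be written as a standalone scheduler. The sketch below is my own (`runLimited` and its parameters are hypothetical names) and is a slight variant: it refills the pool as each task finishes rather than waiting for a whole batch of five to drain. Each task is a function that invokes a done callback, which an AJAX task would call from its complete handler:

```javascript
// Hypothetical scheduler: run async tasks with at most `limit` in flight.
// Each task is a function(done) that calls done() exactly once when finished.
function runLimited(tasks, limit, onAllDone) {
    var next = 0, active = 0, finished = 0;
    function launch() {
        // Start tasks until we hit the concurrency limit or run out of tasks.
        while (active < limit && next < tasks.length) {
            active++;
            tasks[next++](function () {
                active--;
                finished++;
                if (finished === tasks.length) {
                    if (onAllDone) onAllDone();
                } else {
                    launch(); // a slot freed up; start the next task
                }
            });
        }
    }
    launch();
}
```

Wrapping each getTracks call as `function (done) { getTracks(..., done); }` and passing the resulting array to `runLimited(tasks, 5, allDone)` would give the same five-at-a-time ceiling with slightly better throughput, since a slow request no longer stalls the other four.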
Either include all the track information in the data returned by getartists, OR only call getTracks when someone wants to see the tracks for a specific artist.
e.g.
Display all the artists with a "View Tracks" option. Only once this is clicked do you fetch the tracks.
To me this is the best option, because if you load up ALL the tracks for ALL the artists, you are loading a lot of data that probably isn't needed. No one is going to look through every artist's tracks (unless they specifically want to).
My solution is to collect all the artist data into an array and then execute the .ajax() requests in batches of 5. When those 5 requests are done, the next batch is executed, and so on.
HERE is a working demonstration.
// just the success handler changed
$.ajax({
    type: "POST",
    url: "prepXML.php",
    data: "method=getartists&user=" + userName + "&amount=" + amount,
    dataType: "xml",
    success: function(xml) {
        var data = [];
        // prepare the data
        $("artist", xml).each(function() {
            data.push({ name: artistName /** rest of data */ });
        });
        // start processing the data
        process(data, 0);
    }
});

// process the data by sending requests, starting from index
function process(data, index) {
    var xhrs = [];
    for (var i = index; i < index + 5 && i < data.length; i++) {
        (function(item) {
            xhrs.push($.ajax({
                type: "POST",
                url: "prepXML.php",
                data: "method=gettracks&user=" + item.name /** rest of the data */,
                dataType: "xml",
                success: function(xml) {
                    // calculate and output some stuff
                }
            }));
        })(data[i]);
    }
    // when the current xhrs are finished, start the next batch
    $.when.apply(this, xhrs).then(function() {
        index += 5;
        if (index < data.length) {
            process(data, index);
        }
    });
}
It's difficult to fully understand the logic of your application, but I think a better way would be to collect all the data you need to send beforehand (50 entries in a JSON object) and then call the getTracks function only once, passing the single JSON object.
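As a sketch of that suggestion (the batch method name, the payload shape, and the required server-side support are all my assumptions, not part of the original app):

```javascript
// Hypothetical: build one payload holding every artist entry, so a single
// request can replace the fifty individual gettracks calls.
function buildBatchPayload(artists) {
    return {
        method: "gettracksbatch",          // assumed server-side batch method
        artists: JSON.stringify(artists)   // e.g. [{ name: ..., playcount: ... }, ...]
    };
}

// Hypothetical single request using that payload (server support assumed).
function sendBatchRequest(artists) {
    return $.ajax({
        type: "POST",
        url: "prepXML.php",
        data: buildBatchPayload(artists),
        dataType: "xml"
    });
}
```

The server would then iterate over the decoded artists array and return all the track data in one response, trading fifty small requests for one larger one.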
I'm trying to work with two XML files. I use the second highlighted answer in this thread [1] as a base script.
This is what I got:
jQuery.extend({
    getValues: function(url) {
        var result = null;
        $.ajax({
            url: url,
            type: 'get',
            dataType: 'xml',
            async: false,
            success: function(data) {
                result = data;
            }
        });
        return result;
    }
});
var party1 = $.getValues('http://data.riksdagen.se/voteringlista/?rm=2010%2F11&bet=&punkt=&parti=M&valkrets=&rost=&iid=&sz=500&utformat=xml&gruppering=bet');
var party2 = $.getValues('http://data.riksdagen.se/voteringlista/?rm=2010%2F11&bet=&punkt=&parti=S&valkrets=&rost=&iid=&sz=500&utformat=xml&gruppering=bet');

$(party1).find('votering').each(function() {
    var id = $(this).find("forslagspunkt").text();
    partyTwo(id);
    //-------------------------------------
    // HERE I RUN A FEW SIMPLE IF STATEMENTS
    //-------------------------------------
});
function partyTwo(id) {
    $(party2).find('votering').filter(function() {
        return $(this).find("forslagspunkt").text() == id;
    }).each(function () {
        //-------------------------------------
        // AGAIN, A FEW SIMPLE IF STATEMENTS
        //-------------------------------------
        return vote;
    });
}
This leaves me with two problems:
1) partyTwo(id) returns 'undefined', but works fine if I manually insert an id outside the loop.
2) The whole script runs very slowly (+5 sec to load).
Any thoughts?
[1] JQuery - Storing ajax response into global variable