I am using the DHTMLX scheduler. Basically what I am doing is an AJAX GET request to the server to get my data, and then loading the events using their addEvent() method. I have quite a bit of data to load on the scheduler and I understand that this can take time: I can have from 20 to 2500 events to add, and I use a personalized query to my server to optimize the request for each view. The GET/AJAX request takes no time, but loading the events into the calendar takes forever, and not only does it take a long time, it freezes the browser. At first I thought the events were loading but not showing because it was just slow, so I created a progress bar. But I then realized that the browser hangs while doing the loop, so I don't even see the spinner I implemented. The only way to see the events actually being loaded, and to see the spinner, is to add breakpoints, like you can see here:
Can anyone help me with this? Is there a way to make my code better, or at least to make the spinner show while the events are loading, so the user knows what is happening? When I add a console.log in the $.each I can see it incrementing in the console, and it does it pretty fast. Considering that there's a lot of data, it can take between 1 and 35 seconds or so, and I'm okay with that; I just wish it didn't hang the browser.
Here's my code:
$.each(data, function (key, event) {
    var eventObj = scheduler.getEvent(event.Activity_Id_int);
    var type = typeof (me.scheduler.getEvent(event.Activity_Id_int));
    if (typeof (me.scheduler.getEvent(event.Activity_Id_int)) === 'undefined') {
        var text;
        if (event.Titre != null)
            text = event.Titre + " " + event.Ressource + '-' + event.Employe;
        else
            text = event.Ressource + ' - ' + event.Employe;
        me.scheduler.addEvent({
            id: event.Activity_Id_int,
            start_date: Global.formatDateTime(event.Local_Start_DateTime),
            end_date: Global.formatDateTime(event.Local_End_DateTime),
            text: text,
            color: Global.RandGandB_To_SchedulerRGB(event.Color_R, event.Color_G, event.Color_B),
            desc_act: event.Desc_Act,
            priorite: event.Priorite,
            ressource_id: event.Resource_Id,
            ressource_name: event.Ressource,
            textColor: "black"
        });
    }
    n++;
    progress.update(n / data.length * 100);
    console.log("Loading these events y'all");
});
Also, instead of clearing the events completely when I change views, I just check whether the events from the request are already loaded, which improves performance immensely, but it still hangs even if I don't add any event, i.e. if I come back to a view where I have already loaded all the events.
I would consider using a plain for loop instead of $.each for faster performance.
You can check this link to see the difference:
https://jsperf.com/browser-diet-jquery-each-vs-for-loop
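As a rough illustration (not a drop-in fix), the same loop rewritten with a plain for loop might look like the sketch below. It reuses the variables from your snippet (me, Global, n, progress) and also calls getEvent() only once per item instead of three times:
// Same logic as the $.each version, but with a plain for loop
// and a single getEvent() lookup per item.
for (var i = 0; i < data.length; i++) {
    var event = data[i];
    if (typeof me.scheduler.getEvent(event.Activity_Id_int) === 'undefined') {
        var text;
        if (event.Titre != null)
            text = event.Titre + " " + event.Ressource + '-' + event.Employe;
        else
            text = event.Ressource + ' - ' + event.Employe;
        me.scheduler.addEvent({
            id: event.Activity_Id_int,
            start_date: Global.formatDateTime(event.Local_Start_DateTime),
            end_date: Global.formatDateTime(event.Local_End_DateTime),
            text: text,
            color: Global.RandGandB_To_SchedulerRGB(event.Color_R, event.Color_G, event.Color_B),
            desc_act: event.Desc_Act,
            priorite: event.Priorite,
            ressource_id: event.Resource_Id,
            ressource_name: event.Ressource,
            textColor: "black"
        });
    }
    n++;
    progress.update(n / data.length * 100);
}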
So lately I have been learning JS and trying to interact with webpages, scraping at first but now also doing interactions on a specific webpage.
For instance, I have a webpage that contains a button. I want to press this button roughly every 30 seconds, after which it refreshes (and the countdown starts again). I wrote the following script to do this:
var klikCount = 0;

function getPlayElement() {
    var playElement = document.querySelector('.button_red');
    return playElement;
}

function doKlik() {
    var playElement = getPlayElement();
    klikCount++;
    console.log('Watched ' + klikCount);
    playElement.click();
    setTimeout(doKlik, 30000);
}

doKlik();
But now I need to step up my game, and every time I click the button a new window pops up and I need to perform an action in there too, then close it and go back to the 'main' script.
Is this possible through JS? Please keep in mind I am a total javascript noob and not aware of a lot of basic functionality.
Thank you,
Alex
DOM events have an isTrusted property that is true only when the event has been generated by the user rather than synthetically, as in the el.click() case.
The popup is one of the numerous Web mechanisms that work only if the click, or a similar action, has been performed by the user, not by the code itself.
Giving a page the ability to open an infinite number of popups has never been a great idea, so a long time ago browsers restricted this feature in various ways.
You could, in your own tab/window, create iframes and perform actions within these frames through postMessage, but I'm not sure that's good enough for you.
Regardless, the code that would work if the click were generated by the user is something like the following:
document.body.addEventListener(
'click',
event => {
const outer = open(
'about:blank',
'blanka',
'menubar=no,location=yes,resizable=no,scrollbars=no,status=yes'
);
outer.document.open();
outer.document.write('This is a pretty big popup!');
// post a message to the opener (aka current window)
outer.document.write(
'<script>opener.postMessage("O hi Mark!", "*");</script>'
);
// set a timer to close the popup
outer.document.write(
'<script>setTimeout(close, 1000)</script>'
);
outer.document.close();
// you could also outer.close()
// instead of waiting the timeout
}
);
// will receive the message and log
// "O hi Mark!"
addEventListener('message', event => {
console.log(event.data);
});
Every popup has an opener, and different windows can communicate via postMessage.
You can read more about window.open on MDN.
I am trying to circumvent some difficult query result pagination by hijacking a form element that controls the number of query results shown on a page. When I test my JavaScript modification in the Firebug console against the live site, it works like a champ, but when the same JavaScript is injected into the DOM via the casper.evaluate method, I get inconsistent results.
My code is as follows:
var s = document.getElementById("requisitionListInterface.dropListSize");
s.options[4].value = 1000;
s.options[4].selected = true;
var e = document.createEvent("HTMLEvents");
e.initEvent("change", false, true );
setTimeout( function(s, e){ s.dispatchEvent(e); }, 2000, s, e );
I've had to create the event 'e' and dispatch it on element 's' in order to replicate what was taking place on the form (submit the page when the select's change event occurred).
Again, the above code functions as expected in Firefox every time.
If the select box is firing an Ajax request, you may not be giving the browser enough time to get the results. I would put some wait statements in there to give things enough time to finish and execute.
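For example, a rough sketch of what that wait could look like in a CasperJS script; the results-container selector and the timeout are illustrative assumptions, not part of your page:
casper.then(function () {
    // Trigger the change on the drop-down inside the page context.
    this.evaluate(function () {
        var s = document.getElementById("requisitionListInterface.dropListSize");
        s.options[4].value = 1000;
        s.options[4].selected = true;
        var e = document.createEvent("HTMLEvents");
        e.initEvent("change", false, true);
        s.dispatchEvent(e);
    });
});

// Give the Ajax-driven refresh time to finish before asserting anything.
// '#someResultsContainer' is a hypothetical selector for the refreshed results.
casper.waitForSelector('#someResultsContainer', function () {
    this.echo('Results refreshed');
}, function onTimeout() {
    this.echo('Results did not refresh in time');
}, 10000);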
I have a small chat implementation, which uses a Message model underneath. In the index action, I am showing all the messages in a "chat-area" form. The thing is, I would like to start a background task which will poll the server for new messages every X seconds.
How can I do that while keeping my JS unobtrusive? I don't want inline JS in my index.html.erb, and I don't want the polling code evaluated on every page I am on.
This would be easiest using a library like MooTools or jQuery. On domready/document.ready, you should check for a given class, like "chat-area-container". If it is found, you can build a new <script> tag and inject it into the DOM in order to include the JavaScript specific to the chat area. That way, it isn't loaded on every page. The "chat-area-container" can be structured so that it is hidden or shows a loading message, which the script can remove once it is initialized.
On the dynamically created <script> tag, you add an onLoad event. When the script has finished loading, you can call any initialization functions from within the onLoad event.
Using this method, you can progressively enhance your page: users without JavaScript will either see a non-functioning chat area with a loading message (since it won't work without JS anyway), or if you hide it initially, they'll be none the wiser that there is a chat area at all. Also, by using a dynamic script tag, the onLoad event "pseudo-threads" the initialization off the main JavaScript procedural stack.
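A minimal sketch of that idea with jQuery; the container class, the script URL, and the initChatArea function are assumptions for illustration, not names from your app:
$(function () {
    var container = $('.chat-area-container');
    if (container.length === 0) {
        return; // Not a chat page, so don't load the chat script at all.
    }

    // Build a <script> tag dynamically so the chat code is only
    // fetched on pages that actually contain the chat area.
    var script = document.createElement('script');
    script.src = '/javascripts/chat.js';          // assumed path
    script.onload = function () {
        // Runs once the chat script has finished loading.
        initChatArea(container);                  // assumed init function defined in chat.js
        container.find('.loading-message').remove();
    };
    document.getElementsByTagName('head')[0].appendChild(script);
});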
To set up a poll on a fixed interval use setInterval or setTimeout. Here is an example, using jQuery and making some guesses about what your server's ajax interface might look like:
$(function() {
    // Look for the chat area by its element id.
    var chat = $('#chat-area');
    var lastPoll = 0;

    function poll() {
        $.getJSON('/messages', { since: lastPoll }, function(data) {
            // Imagining that data is a list of message objects...
            $.each(data, function(i, message) {
                // Create a paragraph element to display each message.
                var m = $('<p/>', {
                    'class': 'chat-message',
                    text: message.author + ': ' + message.text
                });
                chat.append(m);
            });
        });
        // Schedules the function to run again in 1000 milliseconds.
        setTimeout(poll, 1000);
        lastPoll = (new Date()).getTime();
    }

    // Starts the polling process if the chat area exists.
    if (chat.length > 0) {
        poll();
    }
});
I'm trying to add functionality to a Firefox extension to time how long it takes a webpage to perform a DNS lookup. Looking at Firebug, I figured it's possible to do so by adding a web progress listener to the browser object and listening for events.
First I register an event listener when a page is loaded:
window.addEventListener("load", function(e) { myObj.onLoad(e); }, false);
Inside myObj.onLoad() I register my web progress listener as such:
gBrowser.addProgressListener(this, Components.interfaces.nsIWebProgress.NOTIFY_ALL);
Finally I implement 'onStatusChange' inside myObj, along with QueryInterface and others:
onStatusChange: function(aWebProgress, aRequest, aStatus, aMessage) {
this.logInfo("onStatusChange: " + aMessage + ". Time = " + (new Date().getTime()));
}
However, when onStatusChange is called, aStatus is always 0 even though aMessage displays the correct event. I've spent hours trying to figure this out. Any ideas why?
Also, it seems that onStatusChange with a status of 'Ci.nsISocketTransport.STATUS_RESOLVING' is only called for some components and not for others, even though they may have a different domain name that needs to be resolved and the DNS result has not been cached. Need help, please!
If you attach a progress listener to the tabbed browser you only get a filtered view of progress events. You might find that you need to attach your progress listener to the inner browser instead. (If you want to watch multiple tabs/windows you might find it better to attach your progress listener to the window itself, or, for a service, to the root document loader.)
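A hedged sketch of what attaching to the inner browser might look like with the legacy XUL/XPCOM API, reusing the NOTIFY_ALL flag from your code (this assumes your object also implements nsISupportsWeakReference, since nsIWebProgress holds its listeners weakly):
// Instead of gBrowser.addProgressListener(...), attach to the inner
// <browser> element of the tab you care about (here: the selected tab).
var innerBrowser = gBrowser.selectedBrowser;
innerBrowser.webProgress.addProgressListener(
    this, // assumed to implement nsIWebProgressListener + nsISupportsWeakReference
    Components.interfaces.nsIWebProgress.NOTIFY_ALL
);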
I'm writing WatiN tests to test an Ajax web application and have come across a timing issue with Ajax requests.
After an Ajax request is triggered by an action on the page, I'd like WatiN to wait until the request is complete before validating that the page was updated correctly.
I have a feeling that the solution will involve eval-ing JavaScript to register handlers for $.ajaxStart and $.ajaxComplete to track whether requests are in progress. I'll dig into that shortly, but wanted to see if anybody else has already solved this. Seems like it would be a common problem with Ajax testing.
I've created a few WatiN Browser extension methods to solve this problem, but am still interested in other solutions.
The InjectAjaxMonitor method creates a javascript global variable that attaches to the ajaxStart and ajaxComplete events to track the number of requests in progress.
Whenever you need to wait for AJAX requests to complete before moving on, you can then call browserInstance.WaitForAjaxRequest().
public static class BrowserExtensions
{
    public static void WaitForAjaxRequest( this Browser browser )
    {
        int timeWaitedInMilliseconds = 0;
        var maxWaitTimeInMilliseconds = Settings.WaitForCompleteTimeOut * 1000;

        while ( browser.IsAjaxRequestInProgress()
                && timeWaitedInMilliseconds < maxWaitTimeInMilliseconds )
        {
            Thread.Sleep( Settings.SleepTime );
            timeWaitedInMilliseconds += Settings.SleepTime;
        }
    }

    public static bool IsAjaxRequestInProgress( this Browser browser )
    {
        var evalResult = browser.Eval( "watinAjaxMonitor.isRequestInProgress()" );
        return evalResult == "true";
    }

    public static void InjectAjaxMonitor( this Browser browser )
    {
        const string monitorScript =
            @"function AjaxMonitor(){"
            + "var ajaxRequestCount = 0;"
            + "$(document).ajaxSend(function(){"
            + "    ajaxRequestCount++;"
            + "});"
            + "$(document).ajaxComplete(function(){"
            + "    ajaxRequestCount--;"
            + "});"
            + "this.isRequestInProgress = function(){"
            + "    return (ajaxRequestCount > 0);"
            + "};"
            + "}"
            + "var watinAjaxMonitor = new AjaxMonitor();";

        browser.Eval( monitorScript );
    }
}
This solution doesn't work very well, because .ajaxStart is called only for the first Ajax request (when no other request is already in progress), while .ajaxComplete is called each time an Ajax request finishes. If you run this simple code in your console:
$.ajax({url:"/"}); $.ajax({url:"/"})
and add some logging in the .ajaxStart and .ajaxComplete handlers, you will see that the .ajaxStart handler is called only once while the .ajaxComplete handler is called twice. So ajaxRequestCount will go negative and the whole design is broken.
I suggest that you use .ajaxSend instead of .ajaxStart if you want to keep your design.
Another solution would be to use .ajaxStop instead of .ajaxComplete, but by doing so you don't need the ajaxRequestCount at all; you only need a boolean that says whether there are Ajax requests running behind the scenes, as in the sketch below.
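A minimal sketch of that variant, keeping the same monitor shape and names as the answer above, just with a boolean flag instead of a counter:
function AjaxMonitor() {
    var requestInProgress = false;
    // ajaxStart fires when a request begins and no other request is in progress...
    $(document).ajaxStart(function () {
        requestInProgress = true;
    });
    // ...and ajaxStop fires once every outstanding request has finished.
    $(document).ajaxStop(function () {
        requestInProgress = false;
    });
    this.isRequestInProgress = function () {
        return requestInProgress;
    };
}
var watinAjaxMonitor = new AjaxMonitor();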
Very useful information can be found at http://api.jquery.com/category/ajax/global-ajax-event-handlers/
Hope this helps.
I just ran into this issue myself while working on some tests using WatiN. I found that in version 1.1.0.4000 of WatiN (released on May 2nd, 2007; the latest version is 2.0 RC2 from December 20th, 2009), it is claimed that better support for handling Ajax in tests was added:
To better support testing of AJAX enabled websites, this release adds some more options to your toolbox.
A new method is added that will wait until some attribute has a certain value. This might be handy in situations where you need to wait until a value of an element gets updated.
Example:
// Wait until some textfield is enabled
textfield.WaitUntil("disabled", false.ToString(), 10);
// Wait until some textfield is visible and enabled
textfield.WaitUntil(new Attribute("visible", new BoolComparer(true)) && new Attribute("disabled", new BoolComparer(false)));
See the link to the release notes for more information.
I haven't looked into it in detail yet, so I cannot tell in which cases it might be useful or not. But thought it could be worth mentioning in case anybody else comes across this question.