These days many web pages run custom JavaScript on page load, either to modify page content or to load external widgets.
My extension tries to read the DOM and insert some data into the pages. However, on pages that have their own JavaScript, my extension sometimes executes before the page's JavaScript.
As a result, the page's JavaScript may overwrite my insertions or insert data that my extension cannot read. How can I delay my extension until after the page's JavaScript functions have loaded and executed?
Maybe this will help, but it has some undesired side-effects:
$(function () {
    setTimeout(function () {
        alert("At last!");
    }, 1000); // 1000 ms after the page loads; most onload handlers will have run by then
});
The main side-effect is that your code runs 1000 ms late, so for that interval the user sees the unprocessed page before your script manipulates it. Under some circumstances this can get ugly and be a detriment to the user experience.
EDIT:
You may also try this. At the end of the body (inside it, but last) add this script tag:
<script>
$(function () { alert("LAST!"); });
</script>
By the rules of script execution, and because jQuery honors the order in which onload handlers are added through the $(function () { ... }); idiom, you can be fairly sure that your code runs last. The problem is that any asynchronous execution, such as an AJAX callback handler, runs out of order (that is, asynchronously with respect to the window.onload handlers, which form a chain of responsibility). Coping with that requires a coordination pattern, but that's probably overkill.
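One plausible shape for such a coordination pattern (the helper names below are my own, not a jQuery or standard API): count pending asynchronous operations and fire a final callback only when the count returns to zero.

```javascript
// Minimal coordination sketch (helper names are assumptions, not a
// standard API): count pending async operations and fire a callback
// once the count returns to zero.
function createBarrier(onAllDone) {
    var pending = 0;
    var fired = false;
    return {
        add: function () { pending++; },          // call before starting each async operation
        done: function () {                        // call from each operation's callback
            pending--;
            if (pending === 0 && !fired) {
                fired = true;
                onAllDone();
            }
        }
    };
}
```

Each AJAX call would invoke add() before starting and done() in its callback. Note this simple version assumes all add() calls happen before the count first drops back to zero.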
Related
I thought I understood how the setTimeout method worked, but this is confusing me.
test.html (I'm purposefully loading the test.js file before the jQuery file for demonstration. Let's say the jQuery file is hosted locally).
<body>
    <!-- ...code -->
    <div id="area"></div>
    <!-- ...code -->
    <script src="test.js"></script>
    <script src="jquery.js"></script>
</body>
test.js
$('#area').text('hello');
I understand that in this case "hello" won't get printed in the browser because jQuery is loaded after the test.js file. Switching the order of the two files solves the problem. But if I leave the order alone and alter the test.js file, adding a setTimeout makes it work:
function wait() {
    if (window.jQuery) {
        $('#area').text("hello");
    } else {
        setTimeout(wait, 10);
    }
}
wait();
In this case the "hello" text does get printed in the browser. But I'm scratching my head, because somehow the jQuery file does get loaded. How? Why doesn't test.js get caught in an infinite loop, forever checking whether jQuery has loaded? I'd be grateful for some insight into the mechanics of what's going on.
There would be an infinite loop if jQuery never loaded. But in the normal case:
1. The first time through, jQuery isn't loaded, so we call setTimeout().
2. Other things happen in the meantime, including the loading of resources like jQuery.
3. 10 ms later, the timeout fires and we check again: is jQuery loaded now? If not, set another timeout and go back to step 2.
4. After some number of retries, jQuery has loaded, and we're off.
The better way to do all of this, of course, would be to:
1. Load jQuery first.
2. Run your code in a ready() handler so it doesn't run until it's needed.
<script src="jquery.js"></script>
<script src="test.js"></script>
// test.js
$(document).ready(function () {
    $('#area').text("hello");
});
Why doesn't the test.js file get caught in an infinite loop forever checking to see if jQuery has loaded?
setTimeout works asynchronously. It does not pause the browser; it simply asks it to execute a given function after (at least) the specified number of milliseconds.
jquery.js gets loaded and executed in between wait() invocations.
Without that setTimeout() code, when the contents of "test.js" are evaluated the browser will immediately run into the problem of $ (jQuery) not being defined. With the setTimeout(), however, the code does not attempt to use the global jQuery symbols until it verifies that the symbols are defined.
Without the setTimeout the code fails with a runtime error. The code in the other version explicitly tests for that failure possibility to avoid it.
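That guard generalizes: checking typeof before touching a global avoids the ReferenceError entirely. A minimal sketch, where someLibrary is a hypothetical stand-in name, not a real library:

```javascript
// Guard against an undefined global before using it. `someLibrary` is
// a hypothetical stand-in for jQuery or any late-loaded script.
function safeUse() {
    if (typeof someLibrary !== "undefined") {
        return someLibrary.doSomething();
    }
    return null; // library not loaded yet; the caller can retry later
}
```

Calling safeUse() before the library loads returns null instead of throwing, which is exactly what lets the polling version keep retrying safely.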
The setTimeout callback goes onto a separate queue (the asynchronous callback queue). When the interpreter reaches that line, the callback is moved to that queue and parsing continues (which then executes jquery.js). Once the current script has finished, the browser checks the queue for timeouts that have elapsed and then executes the function passed to setTimeout. By that time jquery.js is already loaded.
More on this
https://youtu.be/8aGhZQkoFbQ
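The queueing behavior is easy to observe: even a 0 ms timeout runs only after the currently executing script has finished.

```javascript
// Demonstrating the callback queue: the timeout callback cannot run
// until the currently executing script has finished, even at 0 ms.
var order = [];
order.push("script start");
setTimeout(function () {
    order.push("timeout callback"); // runs only once the call stack is empty
}, 0);
order.push("script end");
// Synchronously, `order` is ["script start", "script end"]; the
// "timeout callback" entry is appended afterwards by the event loop.
```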
JavaScript is not pre-compiled; it works "on the fly".
You can add code on the fly, whenever you want, and this includes loading whole libraries. Once the browser has loaded an external JS file it parses it, and it's ready to use.
So if you wait for jQuery, and do have the proper code to load it, it will eventually be loaded by the browser and work.
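The wait() idiom from the question generalizes into a reusable polling helper (the names here are illustrative, not a standard API):

```javascript
// Generic polling sketch of the wait() idiom: keep rescheduling until
// a readiness check passes, then run the callback.
function waitFor(isReady, onReady, intervalMs) {
    if (isReady()) {
        onReady();
    } else {
        setTimeout(function () {
            waitFor(isReady, onReady, intervalMs);
        }, intervalMs || 10);
    }
}
```

In the question's scenario you would call it as waitFor(function () { return window.jQuery; }, start);.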
I know it may sound very strange, but I need to know whether there is any active/running JavaScript on the page.
I am in a situation in which I have to run my JavaScript/jQuery code after everything on the page has rendered and all other scripts have finished.
Is it possible to detect this?
EDIT:
Thank you all for the answers. Unfortunately, I was not able to find a solution, because I don't have full control of what is going on in the page.
Even if I were able to put my JavaScript at the end of the page, I don't think it would be a solution. The reason is that while the page is rendering, a function is triggered, which calls other functions, and they call others, and so on. As a result some of the data is incorrect, and that's why I need to run my code to correct it.
I use setTimeout with 2 seconds to ensure that my code executes last, but this is ugly...
So, thank you all, but this is more a problem with the system than with the JS.
JavaScript on web browsers is single-threaded (barring the use of web workers), so if your JavaScript code is running, by definition no other JavaScript code is running.*
To try to ensure that your script occurs after all other JavaScript on the page has been downloaded and evaluated and after all rendering has occurred, some suggestions:
Put the script tag for your code at the very end of the file.
Use the defer and async attributes on the tag (browsers that don't support them will ignore them, but the goal is to make your script run as late as we can).
Hook the window load event via a DOM2-style hookup (e.g., addEventListener on browsers with standards support, or attachEvent on older IE versions).
In the load handler, schedule your code via setTimeout with a delay of 0 ms (it won't really be zero; it'll be slightly longer).
So, the script tag:
<script async defer src="yourfile.js"></script>
...and yourfile.js:
(function() {
    if (window.addEventListener) {
        window.addEventListener("load", loadHandler, false);
    }
    else if (window.attachEvent) {
        window.attachEvent("onload", loadHandler);
    }
    else {
        window.onload = loadHandler; // Or you may want to leave this off and just not support REALLY old browsers
    }

    function loadHandler() {
        setTimeout(doMyStuff, 0);
    }

    function doMyStuff() {
        // Your stuff here. All images in the original markup are guaranteed
        // to have been loaded (or failed) by the `load` event, and you know
        // that other handlers for the `load` event have now been fired since
        // we yielded back from our `load` handler
    }
})();
That doesn't mean that other code won't have scheduled itself to run later (via setTimeout, for instance, just like we did above but with a longer timeout), though.
So there are some things you can do to try to be last, but I don't believe there's any way to actually guarantee it without having full control of the page and the scripts running on it (I take it from the question that you don't).
(* There are some edge cases where the thread can be suspended in one place and then allow other code to run in another place [for instance, when an ajax call completes while an alert message is being shown, some browsers fire the ajax handler even though another function is waiting on the alert to be dismissed], but they're edge cases and there's still only one thing actively being done at a time.)
There is no definitive way to do this because you can't really know what the latest is that other scripts have scheduled themselves to run. You will have to decide what you want to target.
You can try to run your script after anything else that may be running when the DOM is loaded.
You can try to run your script after anything else that may be running when the page is fully loaded (including images).
There is no reliable, cross-browser way to know which of these events the scripts in the page are using.
In either case, you hook the appropriate event and then use a setTimeout() to try to run your script after anything else that is watching those events.
So, for example, if you decided to wait until the whole page (including images) was loaded and wanted to try to make your script run after anything else that was waiting for the same event, you would do something like this:
window.addEventListener("load", function() {
    setTimeout(function() {
        // put your code here
    }, 1);
}, false);
You would have to use attachEvent() for older versions of IE.
When using this method, you don't have to worry about where your scripts are loaded in the page relative to other scripts in the page since this schedules your script to run at a particular time after a particular event.
A way to know when multiple functions have all finished executing
This can be useful if you have to wait for multiple API calls or initialisation functions:
let processRemaining = 0;

async function f1() {
    processRemaining++;
    await myAsyncFunction();
    processFinished();
}

async function f2() {
    processRemaining++;
    await myAsyncFunction2();
    processFinished();
}

function processFinished() {
    processRemaining--;
    setTimeout(() => { // this is not needed if all the functions are async
        if (processRemaining === 0) {
            // Code to execute when all the functions have finished executing
        }
    }, 1);
}

f1();
f2();
I often couple it with a freezeClic function to prevent users from interacting with the page while a script is still waiting for an AJAX/async response (and optionally display a preloader icon or screen).
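The answer names freezeClic but doesn't show its body; one plausible minimal version (an assumption on my part, not the author's code) simply toggles pointer events on an element:

```javascript
// Hypothetical freezeClic sketch (the answer above names the function
// but not its implementation): block or restore pointer events on an
// element while async work is pending.
function freezeClic(el, frozen) {
    el.style.pointerEvents = frozen ? "none" : "";
}
```

Usage would be freezeClic(document.body, true) before kicking off the async work, then freezeClic(document.body, false) once processFinished() sees the count reach zero.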
I'm adding a dynamic script by creating a script tag, setting its source, and then adding the tag to the DOM. It works as expected: the script gets downloaded and executes. However, sometimes I'd like to cancel the script's execution before it has been downloaded, so I remove the script tag from the DOM.
In IE9, Chrome and Safari this works as expected: after the script tag is removed from the DOM, the script doesn't execute.
However, it doesn't work in Firefox: the script executes even if I remove the tag from the DOM or change its src to "" or anything else I tried. I cannot stop the execution of a script after it has been added to the DOM. Any suggestions?
Thanks!
How about some sort of callback arrangement? Rather than have the dynamically added script simply execute itself when it loads, have it call a function within your main script which will decide whether to go ahead. You could have the main script's function simply return true or false (execute / don't execute), or it could accept a callback function as a parameter so that it can decide exactly when to start the dynamic script - that way if you had several dynamic scripts the main script could wait until they're all loaded and then execute them in a specific order.
In your main script JS:
function dynamicScriptLoaded(scriptId, callback) {
    if (scriptId === something && someOtherCondition())
        callback();
    // or store the callback for later, put it on a timeout, do something
    // to sequence it with other callbacks from other dynamic scripts,
    // whatever...
}
In your dynamically added script:
function start() {
    doMyThing();
    doMyOtherThing();
}

if (window.dynamicScriptLoaded)
    dynamicScriptLoaded("myIdOrName", start);
else
    start();
The dynamic script checks to see if there is a dynamicScriptLoaded() function defined, expecting it to be in the main script (feel free to upgrade this to a more robust test, i.e., checking that dynamicScriptLoaded actually is a function). If it is defined it calls it, passing a callback function. If it isn't defined it assumes it is OK to go ahead and execute itself - or you can put whatever fallback functionality there that you like.
UPDATE: I changed the if test above since if(dynamicScriptLoaded) would give an error if the function didn't exist, whereas if(window.dynamicScriptLoaded) will work. Assuming the function is global - obviously this could be changed if using a namespacing scheme.
In the year since I originally posted this answer I've become aware that the yepnope.js loader allows you to load a script without executing it, so it should be able to handle the situation blankSlate mentioned in the comment below. yepnope.js is only 1.7kb.
Here's my issue - I need to dynamically download several scripts using jQuery.getScript() and execute certain JavaScript code after all the scripts were loaded, so my plan was to do something like this:
function GetScripts(scripts, callback)
{
    var len = scripts.length;
    for (var i = 0; i < scripts.length; i++)
    {
        jQuery.getScript(scripts[i], function ()
        {
            len--;
            // execute the callback if this is the last script to load
            if (len === 0)
                callback();
        });
    }
}
This will only work reliably if the script.onload events for each script fire and execute sequentially and synchronously, so that no two event handlers could both pass the (len === 0) check and execute the callback.
So my question - is that assumption correct and if not, what's the way to achieve what I am trying to do?
No, JavaScript is not multi-threaded. It is event driven and your assumption of the events firing sequentially (assuming they load sequentially) is what you will see. Your current implementation appears correct. I believe jQuery's .getScript() injects a new <script> tag, which should also force them to load in the correct order.
Currently JavaScript is not multithreaded, but things will change in the near future. HTML5 introduces a new feature called Workers, which let you do some work in the background.
However, Workers are not yet supported by all browsers.
The JavaScript (ECMAScript) specification does not define any threading or synchronization mechanisms.
Moreover, the JavaScript engines in our browsers are deliberately single-threaded, in part because allowing more than one UI thread to operate concurrently would open an enormous can of worms. So your assumption and implementation are correct.
As a side note, another commenter alluded to the fact that any JavaScript engine vendor could add threading and synchronization features, or could enable users to implement those features themselves, as described in this article: Multi-threaded JavaScript?
JavaScript is absolutely not multithreaded - you have a guarantee that any handler you use will not be interrupted by another event. Any other events, like mouse clicks, XMLHttpRequest returns, and timers will queue up while your code is executing, and run one after another.
No, all the browsers give you only one thread for JavaScript.
To be clear, the browser JS implementation is not multithreaded.
The language, JS, can be multi-threaded.
The question does not apply here however.
What applies here is that getScript() is asynchronous (it returns immediately and the request gets queued); however, the browser executes DOM-attached <script> content sequentially, so your dependent JS code will see the scripts loaded in order. This is a browser feature and does not depend on JS threading or on the getScript() call.
If getScript() retrieved scripts with XMLHttpRequest, setTimeout(), WebSockets or any other async call, your scripts would not be guaranteed to execute in order. However, your callback would still be called after all scripts execute, since the execution context of your len variable is in a closure, which persists its context across asynchronous invocations of your function.
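The len closure described here can be isolated into a miniature example:

```javascript
// The countdown closure in miniature: each invocation decrements a
// counter captured from the enclosing scope; the last invocation fires
// the callback, however the invocations are interleaved in time.
function makeCountdown(len, callback) {
    return function () {
        len--;
        if (len === 0) {
            callback();
        }
    };
}
```

GetScripts() effectively hands the same countdown function to every getScript() call; the shared len survives across the asynchronous callbacks because each of them closes over it.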
JS in general is single threaded. However HTML5 Web workers introduce multi-threading. Read more at http://www.html5rocks.com/en/tutorials/workers/basics/
Thought it might be interesting to try this out with a "forced", delayed script delivery...
- Added two available scripts from Google.
- Added delayjs.php as the 2nd array element; delayjs.php sleeps for 5 seconds before delivering an empty JS object.
- Added a callback that "verifies" the existence of the expected objects from the script files.
- Added a few JS commands that are executed on the line after the GetScripts() call, to "test" sequential JS commands.
The result with the script load is as expected; the callback is triggered only after the last script has loaded. What surprised me was that the JS commands that followed the GetScripts() call ran without waiting for the last script to load. I was under the impression that no JS commands would be executed while the browser was waiting on a script to load...
var scripts = [];
scripts.push('http://ajax.googleapis.com/ajax/libs/prototype/1.6.1.0/prototype.js');
scripts.push('http://localhost/delayjs.php');
scripts.push('http://ajax.googleapis.com/ajax/libs/scriptaculous/1.8.3/scriptaculous.js');
function logem() {
    console.log(typeof Prototype);
    console.log(typeof Scriptaculous);
    console.log(typeof delayedjs);
}
GetScripts( scripts, logem );
console.log('Try to do something before GetScripts finishes.\n');
$('#testdiv').text('test content');
<?php
sleep(5);
echo 'var delayedjs = {};';
You can probably get some kind of multithreadedness if you create a number of frames in an HTML document, and run a script in each of them, each calling a function in the main frame that should make sense of the results of those functions.
I'd like to measure how long it takes to run the whole $().ready() scope on each page.
For profiling specific functions I just set a new Date() variable at the beginning of the relevant part and then check how long it takes to reach the end of that part.
The problem with measuring the whole $().ready scope is that some of its code may run asynchronously, so I cannot simply wait for it all to finish and see how long it has taken.
Is there any event which is fired once the page has completely finished running all $().ready code?
EDIT: Using Firebug or other client-side debuggers is not an option, since I also need to collect this profiling information from website users to monitor and graph our site's page-load speeds.
Thanks!
No event will be fired, because it's virtually impossible for ready() to know when any asynchronous functions are done processing. You'll need to bake this functionality in yourself; you could use jQuery's custom events, or perhaps set a function to run on setInterval() that can introspect the environment and deduce whether or not everything else is done.
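A sketch of that setInterval() approach (the predicate stands for whatever environment introspection applies to your page; the names are mine):

```javascript
// Polling sketch: check a "done" predicate immediately, then keep
// checking on an interval until it passes, and run the callback once.
function whenDone(isDone, callback, intervalMs) {
    if (isDone()) {
        callback();
        return;
    }
    var timer = setInterval(function () {
        if (isDone()) {
            clearInterval(timer);
            callback();
        }
    }, intervalMs || 50);
}
```

The predicate might check pending AJAX counters, flags set by other scripts, or DOM state; what counts as "done" is application-specific.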
Swap out the jQuery ready function with a function that does your start and finish tracking, and calls the original method.
jQuery.ready = (function() {
    var original = jQuery.ready;
    return function() {
        alert('starting profiler');
        original();
        alert('ending profiler');
    };
})();

$(function() {
    alert('this message will appear between the profiler messages above...');
});
Have you tried using Profiler in Firebug?