jQuery initiates automatic calls to other JS files

I see some strange behavior when I check the Network tab in Chrome's dev tools. jQuery is initiating calls to the page's other JS files. These files are already loaded, but jQuery appends ?_=some_random_number to their URLs and requests them again.

jQuery does not load other JavaScript files by default. You must have another library that loads these resources. By the way, the ?_={timestamp} parameter is used to defeat browser caching (the URL changes every second, so every second a real request is made rather than a browser-cache lookup).
You can try to enforce the AJAX caching mechanism with:
$.ajaxSetup({ cache: true });
while initializing your code. The requests will then be cached by your browser. But be aware that this might interfere with other parts of your code. A better approach is to identify why these extra resources are being loaded, and stop loading them.
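A common culprit, if you want to hunt for it, is $.getScript() (or any $.ajax() call with dataType: "script"), which defaults to cache: false and therefore appends the _= parameter. A minimal sketch, with a hypothetical URL, of opting a single request back into caching instead of changing the global setting:

// Equivalent of $.getScript(), but with caching enabled for this one request
$.ajax({
  url: "/js/page2.js",   // hypothetical script URL
  dataType: "script",
  cache: true            // keep the URL stable so the browser cache can be used
});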

Related

How to use javascript without appending it to DOM

I am developing a Single Page Application (SPA) from scratch, using only HTML, CSS and vanilla JavaScript, without any external frameworks.
My application initially loads a full Web page, but upon navigating to some other page, say page2, it only loads the data and functions that page2 needs from page2.js; it does not reload the entire Web page.
To use that JavaScript I append it to the body. The problem is that when I navigate to the same page again, it appends the same JavaScript again: the more pages I visit, the more scripts are attached.
I have tried removing the existing script tag in favour of the upcoming script, and it works well, but is there a way that I don't have to append the script to the DOM in the first place?
So my question is: is there a way to parse (not just plainly read) or execute a JavaScript file without using any physical medium (the DOM)?
Although I am expecting pure JavaScript, libraries would also work; I just need a logical explanation.
So my question is: is there a way to parse (not just plainly read) or execute a JavaScript file without using any physical medium (the DOM)?
Yes, you can. How you do it depends on how cutting-edge the environment you're going to support is (either natively, or via tools that can emulate some things in older environments).
In a modern environment...
...you could solve this with dynamic import, which is new in ES2020 (but already supported by up-to-date browsers, and emulated by tools like Webpack and Rollup.js). With dynamic import, you'd do something like this:
async function loadPage(moduleUrl) {
  const mod = await import(moduleUrl);
  mod.main();
}
No matter how many times it's requested, within a realm a module is only loaded once. (Your SPA will be within a realm, so that works.) So the code above will dynamically load the module's code the first time, but just give you back a reference to the already-loaded module the second, third, etc. times. main would be a function you export from the module that tells it you've come (back) to the "page". Your modules might look like this:
// ...code here that only runs once...
// ...perhaps it loads the markup via ajax...
export function main() {
  // ...this function gets called every time the user goes (back) to our "page"
}
Live example on CodeSandbox.
In older environments...
...two answers for you:
You could use eval...
You can read your code from your server as text using ajax, then evaluate it with eval. You will hear that "eval is evil", and that's not a bad high-level understanding of it. :-) The arguments against it are:
1. It requires parsing code; some people claim firing up a code parser is "slow" (for some definition of "slow").
2. It parses and evaluates arbitrary code from strings.
You can see why #2 in particular could be problematic: You have to trust the string you're evaluating. So never use eval on user-supplied content, for instance, in another user's session (User A could be trying to do something malicious with code you run in User B's session).
But in your case, you want and need both of those things, and you trust the source of the string (your server), so it's fine.
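A minimal sketch of that approach, assuming a same-origin, trusted URL (fetch is used here for brevity; in a genuinely old environment you would use XMLHttpRequest instead):

// Fetch the script's text from your own server, then evaluate it.
// Only ever do this with sources you trust.
async function loadPageScript(url) {
  const response = await fetch(url);
  if (!response.ok) throw new Error("Failed to load " + url);
  const source = await response.text();
  // Indirect eval runs the code in global scope rather than in this
  // function's scope, which is usually what you want for "page" code.
  (0, eval)(source);
}

loadPageScript("/js/page2.js"); // hypothetical URL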
But you probably don't need to
I don't think you need that, though, even in older environments. Your code already knows which JavaScript file it needs to load for "page" X, right? So just check whether that code has already been loaded, and don't load it again if it has. For instance:
function loadPage(scriptUrl, markupUrl) {
  // ...
  if (!document.querySelector(`script[src="${scriptUrl}"]`)) {
    // ...not found, add a `script` tag for it...
  } else {
    // ...perhaps call a well-known function to run code that should run
    // when you return to the "page"
  }
  // ...
}
Or if you don't want to use the DOM for it, have an object or Map or Set that you use to keep track of what you've already loaded.
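For instance, a minimal sketch using a Set (the function shape and URLs are hypothetical):

// Tracks which page scripts have been loaded, with no DOM involvement
const loadedScripts = new Set();

function loadPageOnce(scriptUrl) {
  if (loadedScripts.has(scriptUrl)) {
    // ...already loaded: just call the page's well-known re-entry function...
    return;
  }
  loadedScripts.add(scriptUrl);
  // ...first visit: add the script tag (or fetch and evaluate the text)...
}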
Go back to old-school -- web 1.0, DOM level 1.0, has your back. Something like this would do the trick:
<html><head>
<script>
if (!document.getElementById('myScriptId')) {
  document.write('<script id="myScriptId" src="/path/to/myscript"></scri' + 'pt>');
}
</script>
This technique gets everybody upset, but it works great for avoiding the problems associated with dynamic loading via DOM script-tag injection. The key is that it causes the document parser to block until the script has loaded, so you don't need to worry about onload/onready events, etc.
One caveat: pull this trick near the start of your document, because you're going to cause the engine to do a partial DOM reparse and defeat speculative loading.

Page Specific JavaScript using Content Security Policy (CSP) [duplicate]

This question already has answers here:
Best way to execute js only on specific page
(5 answers)
Closed 5 years ago.
I want to use Content Security Policy (CSP) across my entire site. This requires all JavaScript to be in separate files. I have shared JavaScript used by all pages but there is also page specific JavaScript that I only want to run for a specific page. What is the best way to handle page specific JavaScript for best performance?
Two ways I can think of to work around this problem are page-specific JavaScript bundles, or a single JavaScript bundle with a switch statement to execute page-specific content.
There are lots of ways to execute page-specific JavaScript.
Option 1 (check via class)
Set a class on the body tag:
<body class="PageClass">
and then check via jQuery
$(function(){
  if($('body').hasClass('PageClass')){
    //your code
  }
});
Option 2 (check via switch case)
var windowLoc = $(location).attr('pathname'); //jQuery way to get window.location.pathname
switch (windowLoc) {
  case "/info.php":
    //code here
    break;
  case "/alert.php":
    //code here
    break;
}
Option 3 (check via function)
Put all the page-specific script in a function:
function homepage() {
  alert('homepage code executed');
}
and then call the function on the specific page:
homepage();
Sorry, I know this ended up being a long read, but it'll be worth it, as you'll be able to make the choice that's right for your site. For a tl;dr, read the first sentence of each paragraph.
First of all, no matter which route you choose, you should put all of the JS common to each page in the same file to take maximum advantage of caching. That's just common sense. Also, in all cases I assume you're using a competent minifier, since that will make a bigger difference than anything else. Packagers also exist -- Google is your friend if you need either of these.
For the page specific JS, you should decide whether it's most important to have your first page load (the user's first contact with your site) be 'fast', or if it's most important to have the following page loads (the user's first contact with any given page) be 'fast'. Modern browser caching is quite good now, so you can rely on the browser loading from cache whenever it can. In general, if it's most important for the first page load to be fast, then create separate JS files (this way, the user isn't stuck downloading 10 MB of data before they even get to your site). If not, then put all the JS in the same file, keeping in mind that if one page has significantly more JS than others, it will adversely affect the load time of every page on your site. Note that this extra load time can be mitigated with the use of async or defer tags, more on that later.
Consider the case where page A has 5 KB of JS and page B has 5 MB of JS. If you put both scripts in the same file, page A will load more slowly (since it needs to load ~5 MB of JS) but page B will load much faster due to the JS file being cached already. If you keep them separate, page A will load much faster than page B, but there will be an average speed decrease compared to the first case. If one page doesn't have significantly more JS than another, use separate files. You'll encounter much better average load time since the "savings" of loading the big file ahead of time will be greatly diminished (you'll also avoid the issue mentioned below).
Another consideration is whether one of the JS files will change often, as this will invalidate the cached version and require the browser to redownload it. If you put all your JS together and only one of the files is volatile (especially if it's a page not often visited, such as a registration page), the end user will face a higher average load time than if you keep them separate. Stack Overflow themselves took an interesting approach to this. It appears they have a function to invalidate the cache of JS unrelated to the page and load it (if necessary) when the JS on the page loads from the cache to save loading time later.
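One common way to get that per-file invalidation, sketched here with made-up file names, is to put a content hash (or version number) in each bundle's URL, so changing one bundle leaves the others cached:

<!-- Hypothetical hashed URLs produced at build time: bumping a hash
     invalidates only the bundle that actually changed -->
<script src="/js/common.a1b2c3.js"></script>
<script src="/js/register.9f8e7d.js"></script>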
One more thing! Beyond all this, you should also decide whether or not you should use async or defer in your script tags since you're migrating to fully "external" JS.
async allows the page to load and display to the user before the JS has finished downloading. This is a great way to hide the download of a big JS file if you decide to go the "one file to rule them all" route. However, you might also find the JS needs to be downloaded and executed in order for the page to display properly (as is the case when not using async or defer).
As a result, it might be a good idea to use a hybrid of the two suggestions: split the JS that each page needs in order to display correctly into individual files (one per page), and put all the JS that doesn't into a script loaded through an async or defer tag (this being the "one big file"). defer lets the browser download it in the background and run it once the page has been parsed and displayed to the user.
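For instance, a minimal sketch of that hybrid (all file names here are placeholders):

<!-- Per-page bundle: blocks parsing, so the page renders correctly -->
<script src="/js/page-home.js"></script>
<!-- Shared bundle: downloads in parallel, runs after the document is parsed -->
<script defer src="/js/shared-bundle.js"></script>
<!-- Order-independent extras can go fully async -->
<script async src="/js/analytics.js"></script>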
Ultimately, only you can make the decisions that are right for your app. There's no one magic option that will work in all cases, but that's the reality of software design/engineering. I hope I've made the process clearer for you so you can arrive at the right choice more easily, though.

JS on ajax-loading pages getting out of sync with server-side code and HTML

I have a website that loads mostly via AJAX calls. The JavaScript and CSS files are loaded only once, when the page first loads.
My issue is that the JavaScript/CSS can get out of sync with the HTML and server-side code. The page can be using old versions of the JavaScript file (from when the page first loaded) while the server-side code and AJAX-loaded HTML always use the latest code and files.
What are some strategies for dealing with this?
I have considered polling the server at set intervals and asking if there is a newer version of the JS. Then, if there is, reloading the page. But, it seems that this can get ugly, with the page suddenly reloading at awkward moments instead of, for example, as the result of a user-initiated call.
Also, there are some changes to the javascript that do not necessarily require that a page be reloaded. For example, the changes might affect a different page/module than the one that the user is on.
Re-loading the JavaScript with every AJAX call is not viable.
I can imagine ugly solutions to this, but thought I'd ask first.
EDIT (in response to comments and suggested answers)
The only way to get the JS back into sync is to reload the page, which then loads the new JS. Adding new JS to an old page won't work as it doesn't get rid of any old functions, listeners, etc. I'm not asking how to reload a page or how to load javascript. I'm asking for a strategy of knowing WHEN to do it, especially in a way that does not seem awkward to the user. Do people incorporate polling to ask if there is a new JS version? Do they then suddenly (from the user's point of view) reload the page? Do they poll even when the tab is hidden? Is this a problem for the server? Where do they keep track of the latest required JS version? Or, do they ask with every AJAX request - hey, should I reload? Did they write a special function for that? Do they keep all new html/server code backwards compatible with the js?
Someone who has dealt with this, how do you do it?
Two possible solutions include:
calling $.getScript() to retrieve updated variables from the server side, so the document's variables match the server's, before calling $.ajax(/* settings */);
alternatively, using web workers to update the original document's variables to match the server-side variables at the beforeSend stage of $.ajax(/* settings */).
If the first step of either approach fails, abort the $.ajax() call, call the error handlers, notify the user, and send a message to the server about the error.
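For the "when do I reload?" part, here is a minimal sketch of the per-request check the question hints at. The X-App-Version header name is made up; the server would need to stamp it onto every response:

var BOOT_VERSION = "42"; // hypothetical: rendered into the page when it was served

// Runs after every jQuery AJAX call completes
$(document).ajaxComplete(function (event, xhr) {
  var serverVersion = xhr.getResponseHeader("X-App-Version");
  if (serverVersion && serverVersion !== BOOT_VERSION) {
    // Don't yank the page away mid-task: flag it and reload on the
    // next user-initiated navigation instead of immediately.
    window.mustReloadOnNextNavigation = true;
  }
});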
var head = document.getElementsByTagName("head")[0],
    scripts = {};

function load_script(name) {
  var myscript = document.createElement('script');
  myscript.setAttribute("type", "text/javascript");
  myscript.setAttribute("src", name);
  // Replace the old tag if this script was loaded before; otherwise append it
  if (scripts[name]) head.replaceChild(myscript, scripts[name]);
  else head.appendChild(myscript);
  scripts[name] = myscript;
}
// the first call to load the script code
// and then call if you decide to upgrade a newer version
load_script('js1.js');
load_script('js2.js');

YepNope - Waiting until dependencies loaded

In an ASP.NET MasterPage I am using YepNope to unconditionally and asynchronously load jQuery (from the Google CDN, with a local fallback) and some scripts used on all pages of the site. In the MasterPage I have created a ContentPlaceHolder before the closing body tag (and below the YepNope script that loads the site-wide scripts) which holds scripts used on individual pages. As jQuery should be available on every page of the site, it should not be loaded individually on those pages with specific scripts that use it.
The problem I have is that I can't use the callback or complete functions of the YepNope script that loads jQuery, as that lives on the MasterPage, while these are individual page scripts only used or added on one page. Yet I need to delay execution of the individual page scripts until YepNope (which appears above the page scripts) has finished loading any dependencies (such as jQuery) they use.
I can think of two options-
1- Make the script used on the page an external file and load that using the syntax -
yepnope('/url/to/your/script.js');
or
yepnope({ load: '/url/to/your/script.js' });
I'm not sure I like this idea, as it introduces an extra HTTP request for a few lines of JavaScript that won't be used on any other page.
2- Load jQuery again in another yepnope test object block, with the complete function wrapping the page scripts (calling complete without a test seems to execute the function immediately, before the previous scripts are loaded), relying on the following-
I am requesting a file twice and it's only loading once? By popular demand, in yepnope 1.5+ we added the feature that scripts that have already been requested not be re-executed when they are requested a second time. This can be helpful when you are dealing with a less complex serverside templating system and all you really care about is that all of your dependencies are available.
In the page I could presumably load the same version of jQuery from the Google CDN, which based on the above would not actually be loaded twice, and then load the page scripts in an anonymous function called from the complete function of the yepnope test object.
On the plus side, this would mean the page no longer depends on jQuery being loaded from the MasterPage. A negative is that (even assuming YepNope does not load the script twice) we would load multiple versions of jQuery should the version on the MasterPage be changed in future without the same change being made on the page. From a maintenance point of view I don't feel this is a good idea, especially on the assumption (which I feel you should always make) that another developer will be the one making the changes.
It also does not seem especially elegant.
On balance I will almost certainly use the first option but I would like to know if there is a way to delay or defer scripts on a page until asynchronous loading is completed, and this cannot be done as part of the YepNope test object loading the resources.
How do other developers approach this problem?
I have come up with this as a solution I rather like.
In the MasterPage's YepNope test object, add the code-
complete: function () {
  if (window.pageFunctions !== null && typeof (window.pageFunctions) === "function") {
    window.pageFunctions();
  }
}
If I want to add any JavaScript code or functions that rely on the dependencies loaded in the MasterPage I just wrap them in a function named "pageFunctions" like so-
<script type="text/javascript">
  function pageFunctions() {
    $(document).ready(function () {
      ...
    });
  }
</script>
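Putting the two pieces together, the MasterPage's YepNope call might look something like this sketch (the site-wide script URL is a placeholder):

yepnope([{
  load: ['//ajax.googleapis.com/ajax/libs/jquery/1.6.2/jquery.min.js',
         '/scripts/site-wide.js'],  // placeholder for the scripts used on all pages
  complete: function () {
    // Run the current page's script block, if the page defined one
    if (typeof window.pageFunctions === "function") {
      window.pageFunctions();
    }
  }
}]);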
I'm still interested in other (possibly better) solutions so I'm going to leave the question open for a couple of days.
I'd also appreciate comments on this as a solution.

Modernizr - Which scripts get loaded asynchronously?

I have the following:
Modernizr.load([
  {
    load : '//ajax.googleapis.com/ajax/libs/jquery/1.6.2/jquery.min.js',
    complete : function () {
      if ( !window.jQuery ){
        Modernizr.load('/js/jquery-1.6.2.min.js');
      }
    }
  },
  {
    load : ["/js/someplugin.js", "/js/anotherplugin.js"],
    complete : function () {
      // do some stuff
    }
  },
  {
    load: ('https:' == location.protocol ? '//ssl' : '//www') + '.google-analytics.com/ga.js'
  }
]);
I read that Modernizr loads scripts asynchronously. But in the above example, which ones are loaded async?
Do all of the following get loaded asynchronously?
jquery.min.js
someplugin.js
anotherplugin.js
ga.js
Or is it a combination of async and ordered loading like this:
jquery.min.js is loaded first
Then someplugin.js and anotherplugin.js are loaded async
Finally, ga.js is loaded
I'm having a hard time testing which case it is.
You've selected a fairly complex example to dissect, so let's do it in steps.
The three parameter sets {...},{...},{...} will execute sequentially.
Inside the first parameter set you load jQuery from Google's CDN, and on complete you test whether jQuery actually loaded. If not (perhaps you are developing offline and the Google CDN is not reachable), you load a local copy of jQuery. So this one is "sequential", but really only one of the two will load.
In the second parameter set you load someplugin.js and anotherplugin.js simultaneously and asynchronously, so they will be downloaded in parallel. It's great when you can parallelize two items at a time, as that is the "weakest link" for browsers today (and that's only IE; every other browser will parallelize 6-8 files at a time).
In the third parameter set you load the google analytics script.
Remember modernizr is a collection of tools. The included loader is actually just a repackaged yepnope. So you can google for more about yepnope.
The idea behind the sequential loads is to ensure that dependencies load in order (for example, your jQuery plugins must load after the jQuery framework). The purpose of the parallel-download syntax in the second parameter set is to speed up performance for multiple files that are not co-dependent (for example, you could load multiple jQuery plugins in parallel once jQuery is loaded, as long as they don't depend on each other).
So to answer your question, your hypothesis #2 is correct. If you'd like to explore more using Firebug, add some console.log statements in each parameter set's complete function; you should see the three groups complete sequentially every time. Then switch to Firebug's "Net" tab to watch the file requests go out. The files someplugin.js and anotherplugin.js won't necessarily finish loading in the same order every time: the requests go out in order, but their timing bars should overlap (showing them as parallel). Testing this locally will be hard because it'll be so fast, so put your two test files on a slow server somewhere, or bias them the opposite of what you are expecting (make the first file 1 MB and the second under 1 KB) so you can see the overlapping downloads in the network monitor tab (this is just an artificial scenario for testing purposes; you could fill a file with repeated JS comments just to make it artificially slow).
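For example, a stripped-down test harness (same file names as above) might look like this; the three logs should appear in order on every run, even when the downloads overlap in the Net tab:

Modernizr.load([
  { load : '//ajax.googleapis.com/ajax/libs/jquery/1.6.2/jquery.min.js',
    complete : function () { console.log('1: jQuery group complete'); } },
  { load : ['/js/someplugin.js', '/js/anotherplugin.js'],
    complete : function () { console.log('2: plugin group complete'); } },
  { load : '//www.google-analytics.com/ga.js',
    complete : function () { console.log('3: analytics group complete'); } }
]);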
EDIT: To clarify a little more accurately, I'd like to add a quote from the yepnopejs.com homepage. yepnope.js is the resource loader that is included (and aliased) inside Modernizr:
In short, whatever order you put them in, that's the order that we execute them in. The 'load' and 'both' sets of files are executed after your 'yep' or 'nope' sets, but the order that you specify within those sets is also preserved. This doesn't mean that the files always load in this order, but we guarantee that they execute in this order. Since this is an asynchronous loader, we load everything all at the same time, and we just delay running it (or injecting it) until the time is just right.
So yes, you could put jQuery followed by some plugins in the same Modernizr.load statement, and they will be downloaded in parallel and executed in the order specified in the arguments. The only thing you give up is the ability to test whether jQuery loaded successfully and grab a backup non-CDN version of jQuery if necessary. So it's your choice how critical fallback support is for you. (I don't have any sources, but I seem to recall the Google CDN has only gone down once in its entire lifetime.)
