I'm working for a web customer who read on a blog that Google's PageSpeed Insights score matters. So now they want a high PageSpeed Insights rating "because it's good for Google". While I think this is absurd, I have no choice but to comply.
The problem is that this customer uses an external JavaScript file provided by PureChat. The script is not served compressed in any way, which leads PageSpeed Insights to complain that it is not gzipped or deflated.
Is there a way to force PureChat to be served deflated?
The problem with doing that is that PureChat most likely updates their scripts quite often. If you were to use things like far-future Expires headers, the customer's cache could keep serving an older version, which could contain harmful or exploitable bugs. So, as you said, I really would NOT recommend doing this.
Another problem I find a number of people complain about is Google Analytics affecting their page speed score - ironically, since it comes from Google itself. That can be fixed with a cron job that keeps a local copy of the script, and you could do something similar here by loading the JavaScript locally.
See the article here: http://diywpblog.com/leverage-browser-cache-optimize-google-analytics/ It describes how to do this for Analytics, and I imagine you could apply the same technique to PureChat.
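For a rough sketch of that cron approach: the script below (assuming Node.js, and using a placeholder URL for the PureChat widget script) fetches the remote file and writes it to your own web root, where your server's gzip and cache settings apply. Schedule it via cron so the local copy doesn't fall behind upstream updates.

// fetch-purechat.js - keep a local, compressible copy of the third-party script.
// SOURCE_URL is a placeholder; substitute the actual PureChat embed URL.
var https = require('https');
var fs = require('fs');

var SOURCE_URL = 'https://example.com/purechat-widget.js'; // placeholder
var LOCAL_PATH = '/var/www/static/js/purechat.js';

https.get(SOURCE_URL, function (res) {
  if (res.statusCode !== 200) {
    console.error('Fetch failed with status ' + res.statusCode);
    return;
  }
  // Stream the response straight to disk; your web server
  // can now serve this file gzipped with proper cache headers.
  res.pipe(fs.createWriteStream(LOCAL_PATH));
}).on('error', function (err) {
  console.error('Fetch failed: ' + err.message);
});

The caveat from above still applies: refresh the copy often enough that you don't keep serving a stale, possibly buggy version.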
Related
We are developing a master dynamic web template for our users. There are certain pages that may require calls to jquery-1.11.3.min.js for some functionality. The question came up about including it in the master page so it is available on every page if needed. I am curious if there is a performance or security penalty for doing this.
It's considered bad practice to include files that are unused. Even if they are cached, they still take up memory; if they are not cached, extra connections and round trips are made over the wire and slow down your page.
I would recommend using RequireJS for all of your JavaScript dependency resolution.
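As a minimal sketch of what that looks like (the paths here are illustrative), you declare jQuery once in the RequireJS config, and only pages whose modules actually ask for it will load it:

// main.js - RequireJS entry point
require.config({
  baseUrl: '/scripts',
  paths: {
    // illustrative path; point this at your local or CDN copy
    jquery: 'lib/jquery-1.11.3.min'
  }
});

// jQuery is downloaded only because this module lists it as a dependency.
require(['jquery'], function ($) {
  $(function () {
    $('.widget').show();
  });
});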
The first time it is requested, there may be a hit.
However, because jQuery is used so widely, the chances are that it will be cached at some point along the request chain, and quite possibly in the user's browser anyway. It's also minified, so the file size is quite small.
That said, even if it is not cached anywhere yet, once it has been requested it will (almost certainly) be cached in the user's browser.
Here are some interesting questions / articles for further reading / info:
Yahoo is well known for contributing to general knowledge on speeding up websites - read about this here:
http://developer.yahoo.com/performance/rules.html
Google's equivalent:
https://developers.google.com/speed/docs/insights/rules
The best answer here is really an opinion: your opinion.
This kind of thing is not necessarily bad practice; however, it can slow your website down if the viewer has not already cached the particular version of jQuery that you are using.
The answer to your question is another question: are you willing to make that sacrifice in order to be lazy and not include the link in every page that requires jQuery?
We are currently developing an ASP.NET MVC application which will be deployed on a corporate intranet, with a slightly-modified customer facing version available on the public internet.
We're making use of a number of external javascript libraries (e.g. jQuery) and a discussion has come up regarding referencing the libraries - should we reference them from an external source (e.g. via the Google load jQuery method) or keep our own version locally and reference from there?
The project manager is a little concerned about having a 'dependency' on Google (or whoever) if we reference from there, and thinks that having our own copy of the library makes us more independent. On the other hand, I have heard there are a number of advantages to letting someone else host the library - for example, they handle versioning for us, Google aren't going anywhere anytime soon...
(for the purpose of the discussion assume the intranet we're hosting on has external access - obviously if it turns out it doesn't the decision is very much made for us!)
So. Does this matter? And if so, what should we do and why?
(I appreciate this is subjective - but it would be very useful to get advice from anyone with experience or thoughts on the matter. Not sure if this is a candidate for community wiki or not, let me know if I should have put it there and I'll know for future!)
Thanks :)
Is the application critical to your internal users? If the 'internet is down' will your business suffer because internal users cannot access the applications? It's really more about risk than anything else.
Besides, if there is a change - breaking or otherwise - would you not want to manage the 'upgrade' yourself?
Usually the reason for using a hosted library is performance. A browser will only download a limited number of files per host, so using a hosted library loads the files from a different host and therefore in parallel with your other files.
The second reason is that those files are usually compressed and served with correct cache headers. They are also usually stored on a CDN, which means your users will download the file from the host closest to them.
But all of those reasons are not so important in an intranet environment.
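That said, if you do self-host (on an intranet or otherwise), you can replicate the compression and cache headers yourself. A hedged sketch, assuming a Node server using the express and compression npm packages:

// server.js - serve local library copies with gzip and far-future caching,
// mimicking what a public CDN would do for you.
var express = require('express');
var compression = require('compression');

var app = express();
app.use(compression()); // gzip responses when the client supports it
app.use('/js', express.static(__dirname + '/public/js', {
  maxAge: '365d' // far-future caching; use versioned filenames to bust it
}));

app.listen(3000);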
You can do something like this:
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
<script type="text/javascript"/>
if( window.$ ) {
// jquery script goes here
} else {
var elmScript = document.createElement('script');
elmScript .src = 'jQuery.js'; // path to jquery
elmScript .type = 'text/javascript';
document.getElementsByTagName('head')[0].appendChild( elmScript );
}
</script>
Notwithstanding the complexity issues that arise from having your code scattered over multiple sites, some of which you don't control, there is also a slight performance penalty for pulling content from multiple domains (e.g. an extra DNS lookup). For a public web site, where a lot of users will be first-time visitors, there may be some benefit to the code already being cached, but otherwise I don't really see the point.
The main (only?) reason to do this would be performance, as @Kau-Boy has already mentioned. jQuery is a relatively small library (for what it gives you), and it will be cached by your external users' browsers/proxies/ISPs. So it doesn't really give you anything except a small latency gain.
If you find yourself adding many external Javascript files then you should perhaps reconsider your overall design strategy. Choose a good general purpose library (I personally consider jQuery to be unbeatable - this is the subjective part) and learn it inside-out. You shouldn't have to add many more libraries apart from the odd plugin, which should be included on a page-by-page basis anyway.
Some really good answers from the others on here by the way.
During a recent PCI audit the auditor said that we had major security risks because:
1) It was possible to download static resources from our website, such as images, CSS, and JavaScript, without prior authentication.
2) Our JavaScript had comments in it.
Personally I think that this is not a security risk at all. The images, CSS, and JavaScript were not dynamically created, and they revealed nothing about our backend, our customer details, or our internal mechanisms.
The comments within the JavaScript simply explained what the methods in the file did, which anyone who reads JS could have worked out anyway.
How does that show "information leakage"?
Are comments within javascript really a security risk?
Depending on how strict the audit, downloading images etc without authentication COULD be seen as a security risk (think diagrams, charts, graphs...).
Removing comments from the JavaScript is like obfuscating the code: it makes it a bit harder, but still not impossible, to understand what's going on. JavaScript should be seen as enhancement-only anyway; all your security should be (duplicated) on the server side. Having anyone understand what the JS does should not be considered a risk.
It depends on the content of the commentary. Because there is no way, without human intervention, to examine the content of comments to determine whether they are risky, the most efficient way to audit this is to declare all comments in client-facing source code to be risky.
The following are examples of potentially risky comments.
// doesn't really authenticate, placeholder for when we implement it.
myServer.authenticate(user,pass);
or
// don't forget to include the length,
//the server complains if it gets NaN or undefined.
function send_stuff(stuff, length) {
...
}
or
function doSomething() {
querystring = ""
//querystring = "?TRACING_MODE=true&"
...
//print_server_trace();
}
Another example might be if you include a source code history header, someone might be able to find some security weakness by examining the kinds of bugs that have been fixed. At least, a cracker might be able to better target his attacks, if he knows which attack vectors have already been closed.
Now, all of these examples are bad practices anyway (both the comments and the code), and the best way to prevent it is by having code reviews and good programmers. The first example is particularly bad, but innocent warnings to your team mates, like the second example, or commented-out debugging code, like the third, are the kinds of security holes that could slip through the net.
Without getting into whether they are a security risk or not: minify your JS in your production environment. This will prevent the "information leakage" and help (at least in some small way) to secure the information on your website.
Regarding the security risk, I don't think JS comments are a risk at all; all static website content can be downloaded without authentication (unless defined otherwise).
Not if they only reveal how the code works. Any sufficiently determined person could find that out anyway.
That said, it is probably a good idea to minify the JavaScript; not because of security, but because it will reduce download times and therefore make your site a bit more responsive.
JavaScript comments can be, depending on your logic; certainly, as the code is publicly available, comments give extra visibility into the workings of it.
There are other reasons for removing them as well, such as file size and, as a result, download size.
Tools such as JSMin can help you remove the comments and perform a crude obfuscation of the code.
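To illustrate what a minifier does to comments, here is a sketch of a function before and after; the minified line is representative output, not the exact result of any particular tool:

// Before minification: the comment documents a business rule.
function calculateDiscount(total) {
  // Orders over 100 get 10% off - internal pricing rule
  if (total > 100) {
    return total * 0.9;
  }
  return total;
}

// After minification, comments are gone and names are shortened,
// leaving something like:
// function c(t){if(t>100){return t*.9}return t}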
It's well known that Google and Microsoft host several common javascript libraries on their CDNs (content distribution networks). Unfortunately neither seems to host JSON2.js.
I'm aware that I could upload a copy of JSON2.js to my server and serve it myself, but there are a number of advantages that CDNs offer which I would like to take advantage of.
So with that in mind, are there any publicly available CDNs that host JSON2? If not, any idea why? Is there some sort of copyright reason?
Check out cdnjs.com:
http://cdnjs.com/libraries/json2/
It might also be worth investigating JSON3:
http://cdnjs.com/libraries/json3/
UPDATE: Some of the information was out of date; I have changed it to better links.
json2.js can be found on Yandex CDN servers.
Full version: http://yandex.st/json2/2011-10-19/json2.js
Minified: http://yandex.st/json2/2011-10-19/json2.min.js
HTTPS also works.
I think it's probably too early to expect the big CDNs to start doing this. When enough sites use a library, the benefits become clear: greater availability, more frequent use, fewer client requests, and better performance for the end user. If only a few sites use it, the chance of a client already having a copy in its cache is low, and all the performance boosts are lost. All that would be left is MS and Google offsetting your bandwidth charges, which is not their intention. So the solution is to get more developers to use the library.
Plus, the library is tiny: the code is still only 3.5KB using conservative minification. For comparison, jQuery is 24KB and ext-core is 29KB. I'd personally recommend folding the library into your own site's base JS and getting your performance boost there, at least until there's wider acceptance.
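Folding it in can be a one-line concatenation step in your build. A minimal sketch, assuming Node.js and illustrative file names:

// build.js - prepend json2.js to the site's base bundle so it rides
// along with JavaScript the page already loads anyway.
var fs = require('fs');

var bundle = [
  fs.readFileSync('vendor/json2.js', 'utf8'),
  fs.readFileSync('src/base.js', 'utf8')
].join(';\n'); // the semicolon guards against files that omit their last one

fs.writeFileSync('dist/base.bundle.js', bundle);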
Also, funnily enough, I'd have expected the JSON library to be hosted at Yahoo as well, but I can't find it. I mean, Crockford works there.
Thomas from cdnjs.com here, with two quick reasons why there is no minified version.
1) The script might not function as the author intended under the method of minification we choose.
2) As a security step, we ensure that all file checksums match the original author's hosted files, so that community-submitted updates cannot contain malformed minified code.
So for now that leaves us hosting Crockford's un-minified version:
https://github.com/douglascrockford/JSON-js/raw/master/json2.js
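The checksum comparison described above is easy to reproduce on your side too. A sketch using Node's crypto module (the expected digest below is a placeholder, not the real value):

// checksum.js - verify a downloaded json2.js against a known-good digest.
var crypto = require('crypto');
var fs = require('fs');

var EXPECTED = '<published-sha256-digest-here>'; // placeholder
var actual = crypto.createHash('sha256')
  .update(fs.readFileSync('json2.js'))
  .digest('hex');

console.log(actual === EXPECTED ? 'checksum OK' : 'checksum MISMATCH');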
There is now.
Douglas Crockford recently put JSON2 on GitHub; this URL will always link to the most recent version.
Edit:
It's not a good idea to use this method; see my comment below.
There are some tutorials which suggest using the jQuery path hosted by Google, e.g.:
<script type="text/javascript"
src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
Is that safe to use in our projects?
Aren't we making ourselves dependent, since we can't be sure it will still be there in a year or beyond?
The reason I have asked this question is that there are some people who argue in favor of it.
From the documentation:
Google works directly with the key stakeholders for each library effort and accepts the latest stable versions as they are released. Once we host a release of a given library, we are committed to hosting that release indefinitely.
It seems pretty low-risk to me. The file is more likely to be in the user's cache already, and it is served with the proper gzip and caching headers. It also won't use up an HTTP request to your domain on browsers that only allow two concurrent downloads per domain (e.g. IE6 and IE7).
I have an article for you that explains the pros and cons of using this method: Here
I really doubt that Google will put this up for people to use and then all of a sudden take it down and cause issues for thousands or more websites. It's not as though they will lose their domain or run out of bandwidth. The only issue I think you should worry about is whether the end users of your sites can access Google. Personally, I just host the file on my own server anyway.
The short answer is yes, and I agree that if that include doesn't work, it is probably a sign of a much bigger problem. My general rule of thumb is that for all public-facing apps I use that include, whereas for internal apps (which could theoretically be used without a connection to the outside world) I include a local copy instead.
There will always be a chance that it will not be there after a year - the same goes for Gmail, Google Docs, google.com...
For jQuery alone, I do not see a reason to use Google's copy, as the file is small and the impact on your server and bandwidth will not be too great. But jQuery UI may be worth serving from Google's source.
It's pretty 'safe', as the other guys mentioned. You probably take a bit of load off your own server too. Even SO itself uses it.
But to be safe, always have a fallback plan and keep a local copy, just in case.
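A common way to implement that fallback (popularised by HTML5 Boilerplate) is to test for the CDN copy and document.write a local one if it is missing; the local path here is illustrative:

<script src="//ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
<script>
// If the CDN copy failed to load (offline, blocked, etc.),
// window.jQuery is undefined, so fall back to our own host.
window.jQuery || document.write('<script src="/js/jquery-1.3.2.min.js"><\/script>');
</script>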
There's not really much risk involved if you think about it. Suppose Google ceased to exist in a year (chuckle); it wouldn't take you more than a couple of minutes to replace the google.load command in your common file with a reference to your own local jQuery copy.
The worst-case scenario is that, in the improbable event of Google's demise, your hover effects stop working for 5 minutes :)
A similar question: Where do you include the jQuery library from? Google JSAPI? CDN?
Because of the answers from that question, I have started using:
<script type="text/javascript" src="//ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
I have it running on quite a number of sites. The only issue I've had is that some firewalls start blocking a site if there are too many requests (or at least this is my guess), which happens on higher-traffic sites that are all used from one location.