What are advantages of using google.load('jQuery', ...) vs direct inclusion of hosted script URL? - javascript

Google hosts some popular JavaScript libraries at:
http://code.google.com/apis/ajaxlibs/
According to Google:
The most powerful way to load the libraries is by using google.load() ...
What are the real advantages of using
google.load("jquery", "1.2.6")
vs.
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js"></script>
?

Aside from the benefit of Google being able to bundle multiple files together on the request, there is no perk to using google.load. In fact, if you know all libraries that you want to use (say just jQuery 1.2.6), you're possibly making the user's browser perform one unneeded HTTP connection. Since the whole point of using Google's hosting is to reduce bandwidth consumption and response time, the best decision - if you're just using 1 library - is to call that library directly.
Also, if your site will be using any SSL certificates, you want to plan for this by calling the script via Google's HTTPS connection. There's no downside to calling an https script from an http page, but calling an http script from an https page will cause more obscure debugging problems than you would want to think about.
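For example, if any page might be served over SSL, reference the same file through Google's HTTPS endpoint:
<script type="text/javascript" src="https://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js"></script>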

It allows you to dynamically load the libraries in your code, wherever you want.
Because it lets you switch directly to a new version of the library in the JavaScript, without forcing you to rebuild/change templates all across your site.
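As a rough sketch of that, bumping the version later is a one-line change in the load call, and the loader lets you hang a callback off it (google.setOnLoadCallback is part of the same loader API):
<script type="text/javascript" src="http://www.google.com/jsapi"></script>
<script type="text/javascript">
  // Changing versions later only means editing this string
  google.load("jquery", "1.2.6");
  google.setOnLoadCallback(function() {
    // jQuery is available from here on
    $("p").css("color", "green");
  });
</script>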

It would let Google change the URL behind the scenes (though in practice they can't, since the direct-URL method is already established).
In theory, if you do several google.load()s, Google could bundle them into one file, but I don't think that is implemented.

I find it very useful for testing different libraries and different methods, particularly if you're not used to them and want to see their differences side by side, without having to download them. It appears that one of the primary reasons to use it is that the load is asynchronous, versus the synchronous script call. You also get some neat stuff that is included directly in the Google loader, like client location: you can get the visitor's latitude and longitude from it. Not necessarily useful, but it may be helpful if you're planning to have targeted advertising or something of the like.
Not to mention that dynamic loading is always useful, particularly for smoothing out the initial site load. Keeping the initial site load time as low as possible is an uphill battle every web designer is fighting.

You might want to load a library only under special conditions.
Additionally, the google.load method can speed up the initial page display; if you include plain script tags in your HTML instead, page rendering will freeze until the file has been loaded.

Personally, I'm interested in whether there's a caching benefit for browsers that will already have loaded that library as well. Seems like if someone browses to google and loads the right jQuery lib and then browses to my site and loads the right jQuery lib... ...both might well use the same cached jQuery. That's just a speculative possibility, though.
Edit: Yep, at the very least when using direct script tags to that location, the JavaScript library will be cached if someone has already requested it from Google (e.g. if it were included by another site somewhere).

If you were to write a boatload of JavaScript that only used the library when a particular event happens, you could wait until the event happens to download the library, which avoids unnecessary HTTP requests for those who don't actually end up triggering the event. However, in the case of libraries like Prototype + Scriptaculous, which together weigh in at over 300 KB of JavaScript, this isn't practical.
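A minimal sketch of that deferred pattern, assuming a click on a hypothetical #report element is the event that first needs the library:
function loadScript(url, callback) {
  // Inject a script tag only when the library is first needed
  var s = document.createElement("script");
  s.src = url;
  s.onload = callback; // older IE would need onreadystatechange as well
  document.getElementsByTagName("head")[0].appendChild(s);
}
document.getElementById("report").onclick = function() {
  loadScript("http://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js", function() {
    // The library is available from here on
    $("#report-table").show(); // #report-table is also made up
  });
};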

Related

How to execute external JS file blocked by users' adblockers

We use an external service (Monetate) to serve JS to our site so that we can perform ad-hoc presentation-layer site updates without going through the process of a site re-deploy, which in our case is a time-consuming, monolithic process we can only afford to do about once per month.
However, users who use adblockers in the browser do not see some of these presentation-layer updates. This can negatively affect their experience of the site as we sometimes include time-sensitive promotions that those users may not be aware of.
To work around this, I was thinking of duplicating the JavaScript file that Monetate is serving and hosting it on infrastructure separate from the site. That way, if we needed to make updates to it, we could do so as needed without doing a full site re-deploy.
However, I'm wondering if there is some way to work around the blocking of the Monetate JS file and somehow execute the remote Monetate JS file from our own JS code in such a way that adblockers would not be able to block it? This would avoid the need to duplicate the file.
If that file is blocked by adblockers, chances are that it is used to serve ads. In fact, your description of time-sensitive promotions sounds an awful lot like ads, just not for an external provider, but for your own site.
Since adblockers usually match the URL, the easiest solution would indeed be to rehost this file, if possible under a different name. Instead of hosting a static copy, you can also implement a simple proxy with the equivalent of <?php readfile('http://monetdate.com/file.js'); ?> or Apache's mod_rewrite. While this will increase load times and can fail if the remote host goes down, it means the client will always get the newest version of the file.
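If PHP isn't an option, the same proxy idea in Node.js might look roughly like this (the local path, upstream URL and port are all placeholders):
var http = require("http");
var https = require("https");
http.createServer(function (req, res) {
  if (req.url === "/assets/site-extras.js") { // innocuous local name
    https.get("https://example.com/monetate-tag.js", function (upstream) {
      res.writeHead(upstream.statusCode, { "Content-Type": "application/javascript" });
      upstream.pipe(res); // the client always gets the newest upstream version
    }).on("error", function () {
      res.writeHead(502);
      res.end("// upstream unavailable");
    });
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8080);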
Apart from using a different URL, there is no client-side solution - adblockers are included in the browser (or an extension thereof), and you cannot modify that code for good reasons.
Beware that adblockers may decide to block your URL too, if the script is indeed used to serve ads.
Monetate is probably blacklisted in Adblock, so there isn't much you can do about that.
I think that self-hosting the Monetate script would require keeping it updated by checking for new versions from time to time (maintaining it could become a pain in the ass).
A good solution in my opinion is to inform your users about that limitation with a clear message.
Or, you can get in touch with Monetate and ask for a solution.

How can I introduce modules to a legacy javascript project?

I'm a developer on a very large, many-page web app. We're making a push to improve the sanity of our javascript, so I'd like to introduce a module loader. Currently everything is done the old-fashioned way of slapping a bunch of script tags on a page and hoping for the best. Some restrictions are:
Our html pages are templated, inherited, and composed, such that the final page sent to the client brings together pieces from many different source html files. Any of these given files may depend on javascript resources introduced higher up the chain.
This must be achievable piecemeal. The code base is far too large to convert everything at once, so I'm looking for a good solution for new pages, and for migrating over existing pages as needed.
This solution needs to coexist on the same page as existing, non-module javascript. Some things (like menus and analytics) exist on every page, and I can't remove global jquery, for instance, as it's used all over the place.
Basically I'd like a way to carve out safe spaces that use modules and modern dependency management. Most tutorials and articles I've read all target new projects.
Requirejs looks like a decent option, and I've played with it a bit. Its fully async nature is a hindrance in some cases though - I'd like to use requirejs to package up global resources but I can't do that if I can't control the execution order. Also, it sucks to have the main function of a page (say, rendering a table) happen after secondary things like analytics calls.
Any suggestions?
That's a lot to ask for. :-) So here are my thoughts on this:
Usually, when you load a web page, everything that was being displayed is wiped so that the new incoming information does not interfere with what was already there. So you really have only a couple of options (that I know of) for keeping everything in memory in the browser while still loading new pages as needed.
The first (and easiest) way is to use FRAMES. These are not used much anymore from what I've seen, but each FRAME allows you to display a different web page. So what you do is make one frame use 100% and the second one use "*" so it isn't seen. You can then use the unseen frame to control what is going on in the 100% frame.
The second way is to use JavaScript to control everything. In this scenario you create a namespace area and then use jQuery's getScript() function to load in each page as you need it. I did this once by generating the HTML, converting it to hex via bin2hex(), sending it back as a JavaScript function, and then unhexing it in the browser and applying it to the web page. If it was meant to completely replace the web page, you did that; if it was an update, you just inserted it into the HTML already on the page. Any new JavaScript function always attaches itself to the namespace. If a function is no longer needed, it can be removed from the namespace to free up memory.
Why convert the HTML to hex and then send it? Because then you can use whatever characters you want (like single and double quotes) and it doesn't affect the JavaScript at all. There are fairly fast hex routines available on GitHub (see my toHex and fromHex routines).
One of the extra benefits of using getScript() is that it can sometimes make it almost impossible for anyone to see your code, a documented quirk of how getScript() works.
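A stripped-down sketch of the namespace plus getScript() idea (MyApp, the /js/ path and the render() convention are all made up for illustration):
window.MyApp = window.MyApp || {}; // single global namespace
function loadSection(name) {
  // Each section's script attaches its functions to MyApp[name]
  jQuery.getScript("/js/" + name + ".js", function () {
    MyApp[name].render(); // assumes each module exposes a render() entry point
  });
}
// When a section is no longer needed, free it for garbage collection:
// delete MyApp.someSection;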

Js Library into browser

Web browsers want to make the web faster.
I know Google has its hosted libraries, but why not integrate them into the browser directly?
The problem nowadays is that if you navigate from one page that uses jQuery to another page that uses jQuery, and the script URL is different, the same JS gets cached separately for each URL. So loading takes longer while navigating between pages that use the same libraries.
Couldn't they make something that saves the most-known libraries in the browser, so that when you load jQuery or jQuery-min it searches the browser for it first?
Pros
- Faster navigation on the web.
- One less HTTP request if the browser finds the library locally.
Cons
Some problems that could occur with this are versioning issues. Since most files have names like jquery.min.js, the browser can't simply match them by name; on the other hand, some URLs include the version, like /1.11.0/jquery.min.js, so the browser could try to work out the version from the URL. If the browser couldn't determine the version, it would simply load the file as usual.
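For illustration, the version sniffing described above could be as simple as this (the regex is only a rough guess at what a browser might do):
function guessLibraryVersion(url) {
  // matches e.g. ".../1.11.0/jquery.min.js" or ".../jquery-1.11.0.min.js"
  var m = url.match(/(\d+\.\d+(?:\.\d+)?)/);
  return m ? m[1] : null; // null -> just fetch the file as usual
}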
What do you think? Any suggestions on how this could work? Any other cons?
Edit1: I'm aware of CDNs. I'm only suggesting something slightly faster than a CDN, saving one HTTP request in the process.
This problem can be avoided by using commonly used CDNs, as you mentioned.
http://cdnjs.com/
However, I think integrating them into the browser could introduce a real versioning problem. Just think of how long it was between versions of IE. If you had to wait that long to download and cache new versions of libraries, it would be a disaster.
Also you would have to download a large variety of libraries to have your bases covered.
Downloading libraries is typically not very slow; it's the time to parse and execute them that takes longer, particularly on mobile.
Here is a great post about this topic
http://flippinawesome.org/2014/03/10/is-jquery-too-big-for-mobile/

Recording web page events / ajax calls/results and so on

I'm mostly looking for directions here.
I'm looking to record events that happen within a web page. Somewhat similar to your average "macro recorder", with the difference that I couldn't care less about exact cursor movement or keyboard input. The kinds of events I would like to record are modification of input fields, hovers, following links, submitting forms, scripts that are launched, AJAX calls, AJAX results and so on.
I've been thinking of using jQuery to build a little app for this, and inserting it on whichever pages I would like to test it on (or more likely, loading the pages into an iframe or something). However, I cannot modify the scripts on these pages to work with this, so it has to work regardless of the content.
So I guess my first question is: Can this be done? Especially in regards to ajax calls and various script execution.
If it can, how would I go about the ajax/script part of it? If it can't, what language should I look into for this task?
Also: maybe there's something out there that can already do what I'm looking for?
Thanks in advance
Two ways I can think of are:
Use an add-on (Firefox) or an extension (Chrome) to inject script tags that load jQuery and your jQuery app.
Set up a proxy (you can use Node.js or some other proxy server) and have the proxy inject the script tags; be sure to adjust the Content-Length header (tricky with HTTPS sites).
A much simpler and faster option, where you don't need to capture onload, is to write a JavaScript snippet that loads jQuery and your app by injecting script tags, turn that into a bookmarklet, and hit the bookmarklet after the page loads.
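A bookmarklet along those lines might look like this (the recorder URL is a placeholder):
javascript:(function () {
  function add(src) {
    var s = document.createElement("script");
    s.src = src;
    document.getElementsByTagName("head")[0].appendChild(s);
  }
  add("https://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js");
  add("https://example.com/my-recorder.js"); // your jQuery app
})();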
Came across this post when looking for a proxy for tag injection.
Yes, it's quite possible to trap (nearly) all the function and method calls made by the browser via code in a JavaScript file loaded in the page - but usually a JavaScript debugger (Firebug?) or HTTP debugger (TamperData / Fiddler) will give you most of what you require with a lot less effort.
OTOH, if you really want to do this with bulk data / arbitrary sites, then (based on what I've seen so far) you could use a Squid proxy with an ICAP server/eCAP module (not trivial - it will involve a significant amount of programming) or implement the JavaScript via Greasemonkey as a browser extension.
Just to clarify: so far I've worked out how to catch function and method calls (including constructor calls) and proxy them within my own code, but not yet how to deal with processing triggered by directly setting a property (e.g. img.src='http://hackers-r-us.com'), nor how to handle ActiveX neatly.
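As a concrete example of that kind of call proxying, here is a minimal sketch that wraps XMLHttpRequest so every AJAX call and its result get logged (the _logInfo property name is made up):
(function () {
  var origOpen = XMLHttpRequest.prototype.open;
  var origSend = XMLHttpRequest.prototype.send;
  XMLHttpRequest.prototype.open = function (method, url) {
    this._logInfo = { method: method, url: url }; // stash for send()
    return origOpen.apply(this, arguments);
  };
  XMLHttpRequest.prototype.send = function (body) {
    var info = this._logInfo || {};
    this.addEventListener("load", function () {
      console.log("AJAX", info.method, info.url, "->", this.status);
    });
    return origSend.apply(this, arguments);
  };
})();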

How to protect a site from API outages?

Watching the effects of today's Google outage across the web got me thinking about how to prevent this in the future. This may be a stupid question, but is there a good way to include external APIs (e.g., Google's AJAX libraries) so that if the API is unavailable, the including page can still soldier on without it? Is it a bad idea to use libraries hosted on an external server in general?
It's unavoidable to use a script tag to load cross-domain JavaScript files (this will cause a timeout if the host goes down). However, in your code, check that the API object actually exists before using it, to avoid errors:
E.g. instead of:
<script type="text/javascript">
  google.load("maps", "2");
  // Use Maps API
</script>
use:
<script type="text/javascript">
  // If the loader script never ran, the global "google" doesn't exist at all,
  // so test with typeof (or window.google) rather than comparing against null.
  if (typeof google !== "undefined")
  {
    google.load("maps", "2");
    // Use Maps API
  }
  else
  {
    // Fallback
  }
</script>
I don't think rare outages are worth rejecting an external API wholesale.
You will want to design your application to degrade gracefully (as others have stated), and there is actually a design pattern that can be useful in doing so. The Proxy Pattern, when implemented correctly, can be used as a gatekeeper to check whether a service is available (among many other uses) and return to the application either the correct data, cached data, or a notice that the service is not available.
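A very rough sketch of that gatekeeper idea, with a hypothetical externalWeatherApi standing in for the real service:
var WeatherProxy = {
  cached: null,
  current: function (callback) {
    if (typeof externalWeatherApi !== "undefined") { // did the external script load?
      var self = this;
      externalWeatherApi.fetch(function (data) { // hypothetical external call
        self.cached = data;
        callback(data);
      });
    } else {
      callback(this.cached); // degrade to cached data (may be null)
    }
  }
};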
The best general answer I can give is: degrade beautifully and gracefully, and avoid sending errors. If the service can become unavailable, expect that and do the best job you can.
But I don't think this is a question that can be answered generically. It depends what your site does, what external libraries/APIs you are using, etc.
You could do some sort of caching to still serve up pages with older data. If allowed, you could run the API engine on your own server. Or you could just throw up status messages to users.
It's not a bad idea to rely on external APIs, but one of the major drawbacks is that you have little control over it. If it goes away? Welcome to a big problem. Outages? Not much you can do but wait.
I think it's a great idea to use external libraries since it saves bandwidth for me (read: $$). But it's pretty easy to protect against this kind of API outage: keep a copy on your server and, in your JavaScript, check whether the API has been successfully loaded. If not, load the one on your server.
I was thinking about jQuery and YUI here. The other guys are right about the problems when using actual services like mapping.
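The usual way to do that check for a library like jQuery is a one-liner right after the CDN script tag (the local path is a placeholder):
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js"></script>
<script type="text/javascript">
  // If the CDN copy failed to load, fall back to the copy on our own server
  window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');
</script>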
One possibility for mitigating the problem (will only work if your site is dynamically generated):
Set up a cronjob that runs every 10 minutes / hour / whatever, depending on how much you care. Have it attempt to download the external file(s) that you are including, one attempt for each external host that you depend on. Have it set a flag in the database that represents whether each individual external host is currently available.
When your pages are being generated, check the external-host flags, and print the source attribute either pointing to the external host if it's up, or a local copy if it's down.
For bonus points, have the successfully downloaded file from the cronjob become the local copy. Then when one does go down, your local copy represents the most-current version from the external host anyway.
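For what it's worth, the availability check itself could be a small script run from cron, something like this Node.js sketch (the flag path, saved-copy path and host list are placeholders, using files instead of a database):
var https = require("https");
var fs = require("fs");
var hosts = {
  googlecdn: "https://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js"
};
Object.keys(hosts).forEach(function (name) {
  https.get(hosts[name], function (res) {
    var up = (res.statusCode === 200);
    fs.writeFileSync("/tmp/cdn-" + name + ".flag", up ? "up" : "down");
    if (up) {
      // keep the successful download as the most-current local fallback copy
      res.pipe(fs.createWriteStream("/var/www/js/" + name + "-local.js"));
    } else {
      res.resume();
    }
  }).on("error", function () {
    fs.writeFileSync("/tmp/cdn-" + name + ".flag", "down");
  });
});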
A lot of the time you need to access third party libraries over the web. The question you need to ask yourself is how much do you need this and can you cache any of it.
If your uptime needs to be as close to 100% as possible then maybe you should look at how much you rely on these third parties.
If all you are obtaining is the weather once an hour, then you can probably cache that so that things carry on regardless. If you are asking a third party for data that is only valid for that millisecond, then you probably need to look at error handling to cover the fact that it may not be there.
The answer to the question is entirely based upon the specifics of your situation.
It'd certainly be fairly easy to include a switch in your application to toggle between Google and your local web server for serving your YUI, jQuery or similar library, so that you can switch providers.
