I came across a problem where, due to internet connection issues, some of the required JavaScript files do not load. The body onload event still fires, but the classes required for the page's logic are not present.
One more thing: the problem I want to fix is not in a website, it is in a web application which does not have any image or CSS files. Just imagine JavaScript code running in an iframe. Thus, I have problems only with scripts.
Here are my ideas for how to fix this; please comment/correct me if I'm wrong:
Obfuscate/minify and combine the files into one when pushing to live, so the overall size is reduced and the task comes down to reliably loading a single file
Enable gzip compression on the server, so the resulting file size is smaller still
Set proper cache headers for that file, so once loaded it is cached by the browser/proxy server
Finally, even with all of this, there could be a case where the file is not loaded. In that case I plan to load it dynamically from JavaScript once the page has loaded, with "retry failed load" logic and a maximum of, say, 5 attempts (a rough sketch follows)
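For example, a very rough sketch of that retry logic, assuming the combined file defines a single marker global (the names App and /js/combined.js are made up):

```javascript
// After onload, check for a marker global that the combined script defines,
// and re-inject the script a limited number of times if it never appears.
var MAX_ATTEMPTS = 5;

function ensureAppLoaded(attempt) {
  if (window.App) { return; }              // the combined file arrived; nothing to do
  if (attempt > MAX_ATTEMPTS) { return; }  // give up and fall back to a degraded page

  var script = document.createElement('script');
  script.src = '/js/combined.js';          // the minified, gzipped, cacheable file
  document.getElementsByTagName('head')[0].appendChild(script);

  // Check again after a pause; if App still isn't defined, try once more.
  setTimeout(function () { ensureAppLoaded(attempt + 1); }, 3000);
}

window.onload = function () { ensureAppLoaded(1); };
```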
Any other ideas?
If the "retry" script fails to grab the required dependencies, redirect to a "no script" version of the site, if one is available. Or try to have a gracefully degrading page, so even if all steps fail, the site is still usable.
1 - Correct, but double-check that JavaScript functions from different files don't clash with each other.
2 - Correct - this should always be on.
3 - Correct, but the browser may still revalidate and get an HTTP 304 Not Modified response from the server.
4 - Correct; consider falling back to a noscript version of the website after 1 or 2 failed attempts (5 is too many).
I don't personally think it's worth it to try to redo the logic that the browser already has. What if the images in your page don't load? What if the main page doesn't load? If the user has internet connection problems, they need to fix those internet connection problems. Lots of things will not work reliably until they do.
So, are you going to check that every image your page displays loads properly and if it didn't load, are you going to manually try to reload those too?
In my opinion, it might be worth putting some inline JS in the page to detect whether an external JS file failed to load (all you have to do is check for the existence of one global variable or top-level function from that file), and then simply tell the user that they are having internet connection problems, should fix them, and should then reload the site.
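For example, something along these lines right after the external script tag (the global name App and the element id are assumptions):

```html
<script src="/js/app.js"></script>
<script>
  // If the external file failed to download, its top-level object never exists.
  if (typeof App === 'undefined') {
    // Reveal a pre-rendered "please check your connection and reload" message.
    document.getElementById('load-error').style.display = 'block';
  }
</script>
```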
Your points are valid for script loading, but you must also consider how the website is used.
If the scripts are not loading for whatever reason, the site must still be completely usable and navigable. The user experience comes before everything else.
The scripts should be loaded after the website interface has been loaded and rendered by the browser, and should contain code that enhances the user experience, not something you must absolutely rely on.
This way, even when the connection is really slow, users can still read the content and choose to change page or go somewhere else, instead of being left with a blank page or a page with only the header displayed.
This to me is the most important point.
Also, are you sure about a retry approach? It causes more requests to the server. If the connection is slow or laggy then it may be best not to run the script at all, especially considering users may spend little time on the page and only need to skim the content. Also, if the connection is slow, how long would you set the timeout? What if the script is still downloading when your timeout fires and you retry again? How can you effectively determine that amount of time, and the "slowness" of the connection?
EDIT
Have you tried out head.js? It is a library aimed at the fastest possible script loading; maybe it will help.
Related
I'm currently building a portfolio website for an architect that has a whole lot of images on its pages.
Navigation is done with history.js (=AJAX). In order to save loading time and make the whole thing more "snappy", I wrote a script that crawls the page body for links to other pages and fetches these automatically in the background. So far, it works like a charm.
It basically keeps a queue array that holds all the links. A setTimeout()-driven function works through them and fetches each page using jQuery's $.ajax(). The resulting HTML is stored in a JavaScript object.
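Condensed down, the prefetcher does something like this (the selector, timings and variable names here are my own placeholders, not the actual code):

```javascript
var queue = [];       // links still to fetch
var pageCache = {};   // fetched HTML, keyed by URL

// Collect the internal links on the current page.
$('#content a[href]').each(function () {
  queue.push(this.href);
});

function fetchNext() {
  if (queue.length === 0) { return; }
  var url = queue.shift();
  $.ajax({
    url: url,
    dataType: 'html',
    success: function (html) { pageCache[url] = html; },
    complete: function () { setTimeout(fetchNext, 250); }  // small pause between requests
  });
}

setTimeout(fetchNext, 1000);  // start once the current page has settled
```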
Now, here's my question:
What are possible problems that might occur when using this on different machines/browsers/operating systems?
I'm thinking about:
max. JavaScript object/variable size (the fetched HTML is stored in a JavaScript object)
possible performance problems
max. number of asynchronous requests?
… anything you can think of?
Thanks a lot in advance,
a hobby programmer
Although it might be a good idea to cache the whole website on the client side, there are a lot of things that can cause issues:
Memory
Unnecessary load on the webserver
Loading unneeded pages into memory
Some users have a data cap on their internet connection, so loading the entire website is not smart in those cases
Once the user navigates away or refreshes, the entire "cache" is gone
What I would do is first try to optimize the server side.
Add caching mechanisms at every level, from the database to the user; the "Expires" header can really help you.
And if that doesn't help, I would then think about caching some pages (which ones to cache is up to you) in the offline cache; see HTML 5 Offline Features.
That way you are safe even on a page reload, keep memory usage to a minimum, and only load what you need.
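For what it's worth, a minimal sketch of the HTML5 application cache route (file names are made up, and the manifest is referenced from the page as <html manifest="portfolio.appcache">):

```
CACHE MANIFEST
# v3 - bump this comment to force clients to re-download the cached files

CACHE:
/css/site.css
/js/site.js
/projects/most-visited-page.html

NETWORK:
*
```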
PS: Don't try to reinvent stuff that the browser already has :P
You should queue the async requests, and only launch one at a time.
And since you're storing everything in variables, at some point you (the browser) may consume too much memory, and the whole thing can become very slow. I suggest you limit the size of your cache to a certain number of pages.
You can also try not to store the fetched content at all - just fetch it and throw it away. The browser will still cache fetched pages and images in its internal storage, so subsequent loads will be much faster (provided the Ajax library does not forcibly disable the cache, e.g. by using POST).
We recently moved to jQuery 1.6 and ran into the attr() versus prop() back-compat issue. During the first few hours after the change was deployed everything was fine, then it started breaking for people. We identified the problem pretty quickly and updated the offending JS, which was inline.
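For anyone hitting the same thing, the breakage was essentially along these lines (the selector here is made up):

```javascript
// Before jQuery 1.6, attr('checked') returned the boolean DOM property.
// From 1.6 on it reflects the HTML attribute ("checked" or undefined),
// so code comparing it to true quietly stopped working.
var $box = $('#subscribe');

if ($box.attr('checked') === true) { /* never runs under 1.6 */ }

// The fix: use prop() for the live property.
if ($box.prop('checked')) { /* works in 1.6+ */ }
```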
Now we have a situation where some folks are still having issues. In every case thus far, I could get the user up and running again by telling them to load the page in question and then manually refresh it in the browser. So something must still be cached somewhere.
But there are basically only two potential culprits: First, the jQuery library itself, but this is loaded with the version number in the query string so I think browsers will be refreshing it in their cache. Second, the inline javascript. Is it possible that this is being cached in the browser?
We are using APC, apc.stat=1 so it should be detecting that the PHP files have changed. Just to be on the safe side I nuked the opcode cache anyway.
To summarize, I have two questions:
Could some browsers be ignoring the query string when jQuery is loaded?
Could some browsers be caching an older version of the inline javascript?
Any other ideas very welcome too.
UPDATE: In the course of checking that there wasn't any unexpected caching going on using Firebug, I discovered a case where the old jQuery library would load. That doesn't explain why we had trouble after deploying the site and before we updated the inline code, but if it solves the problem I'll take it.
The answer to both your questions is no, unless the whole page is being cached. A browser can't cache part of a file, since it would have to download the file to know which parts it had cached, and by that time it has downloaded them all anyway. It makes no sense :)
You could try sending some headers along with your page that force the browser not to use its cached copy.
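For example, response headers along these lines tell the browser to always revalidate the page instead of trusting its cached copy (adjust to taste):

```
Cache-Control: no-cache, must-revalidate
Pragma: no-cache
Expires: Thu, 01 Jan 1970 00:00:00 GMT
```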
I'd say the answers to your questions are 1) No and 2) Yes.
jQuery versions are different URLs, so there are no caching problems there unless you somehow edit a jQuery file directly without changing the version string.
Browser pages (including inline javascript) will get cached according to both the page settings and the browser settings (that's what browsers do). The inline javascript isn't cached separately, but if the web page is cached, then the inline javascript is cached with it. What kind of permissible caching have you set for your web pages (either in meta tags or via http headers)?
Lots to read on the subject of web page cache control here and here if needed.
It is very important to plan/design an upgrade strategy when you want to roll out upgrades that work properly with cached files in the browser. Doing this wrong can result in your users either staying on old content/code until caches expire or, even worse, ending up with a mix of old/new content/code that doesn't work.
The safest thing to do is to change source URLs when you have new content. Then there is zero possibility that an old cached page will ever get the new content so you avoid the mixing possibility. For example, on the Smugmug photo sharing web site, whenever any site owner updates an image to a new version of the image, a version number in the image URL is changed. Then, when the source page that shows that image is served from the web server, it includes the new image URL so, no matter whether the old version of the image is in the browser cache or not, the new version image is shown to the user.
It is obviously not always practical to change the URL of all pages (especially top level pages) so those pages often have to be set with short cache settings so the browsers won't cache them for long and they will regularly pull fresh content.
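In markup terms this just means baking a version into every asset URL and changing it whenever the content changes; a made-up illustration:

```html
<script src="/js/site.js?v=20110614-3"></script>
<link rel="stylesheet" href="/css/site.css?v=20110614-3">
<img src="/photos/12345-v7.jpg" alt="...">
```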
Sites like Facebook use "lazy" loading of js.
Take into consideration that I have one server with heavy traffic.
I'm interested - which one is better?
If I make more HTTP requests at once - slower page loading (due to the limit of 2 requests at once per host).
If I make one HTTP request with all the code - traffic (GB) goes up and the Apache processes get a bit more rest, but we'll still have slower loading of the page.
Which is faster in the end?
Fewer requests! That's the reason we combine JS files and CSS files, use image sprites, etc. You see, the problem on the web is not the speed of processing by the server or the browser; the biggest bottleneck is latency! You should look up Steve Souders' talks.
It really depends on the situation, device, audience, and internet connection.
Mobile devices, for example, need as few HTTP requests as possible, as they are on slower connections and every round trip takes longer. You can go as far as inlining (base64) images inside the CSS files.
Generally, I compress the main platform and the JS libs + CSS into one file each, which are cached on a CDN. JavaScript or CSS functionality that is only used on one page I'll either inline or include in its own file. JS functionality that isn't important right away I'll move to the bottom of the page. For all files, I set a far-future HTTP Expires header so it stays in the browser cache forever (or until I update it or it gets bumped out when the cache fills).
Additionally, to get around per-host download limits you can have CNAMEs like images.yourcdn.com and scripts.yourcdn.com so that the browser can download more files in parallel. Even if you don't use a CDN, you should host your static media on a separate hostname (it can point to the same box) so that the browser isn't sending cookies when it doesn't need to. This sounds like overkill, but cookies can easily add an extra 4-8 KB to every request.
In a development environment you should be working with uncompressed, individual files; there is no need to move every plugin into one script, for example - that's hard to maintain when there are updates. You should have a script to merge the files before testing and deployment. This sounds like a lot of work, but it's something you do for one project and can reuse for all future projects.
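A merge step can be as small as this Node.js sketch (file names are made up); run a minifier such as UglifyJS or Closure Compiler over the output afterwards:

```javascript
var fs = require('fs');

var sources = [
  'js/lib/jquery.plugin-a.js',
  'js/lib/jquery.plugin-b.js',
  'js/app.js'
];

// Join with ';' in case a file omits its trailing semicolon.
var combined = sources.map(function (file) {
  return fs.readFileSync(file, 'utf8');
}).join(';\n');

fs.writeFileSync('build/site.js', combined);
```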
TL;DR: It depends, but generally a mixture of both is appropriate. 'Cept for mobile, less HTTP is better.
The problem is a bit more nuanced than that.
If you put your script tags anywhere but at the bottom of the page, you are going to slow down page rendering, since the browser isn't able to do much when it hits a script tag other than download it and execute it. So if the script tag is in the header, that will happen before anything else, which leads to users sitting there staring at a white screen until everything downloads.
The "right" way is to put everything at the bottom. That way, the page renders as assets are downloaded, and the last step is to apply behavior.
But what happens if you have a ton of JavaScript (in Facebook's example, about a megabyte)? What you get is that the page renders, but is completely unusable until the JS comes down.
At that point, you need to look at what you have and start splitting it between vital and non-vital JS. That way you can take a multi-stage approach, quickly bringing in the stuff that is necessary for the page to function at a bare minimum level, and then loading the less essential stuff afterwards, or even on demand.
Generally, you will know when you get there; at that point you need to look at more advanced techniques like script loaders. Before that, the answer is always "fewer HTTP requests".
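A bare-bones example of that vital/non-vital split (the element id, file name and Gallery entry point are all assumptions):

```javascript
function loadScript(src, callback) {
  var s = document.createElement('script');
  s.src = src;
  s.onload = callback;  // fine for a sketch; older IE needs onreadystatechange
  document.getElementsByTagName('head')[0].appendChild(s);
}

// The heavy gallery code is only fetched the first time someone opens the gallery.
document.getElementById('gallery-link').onclick = function () {
  loadScript('/js/gallery.js', function () {
    Gallery.open();
  });
  return false;
};
```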
Following on from Steve (YSlow) Souders' evangelism, my site (LibraryThing.com) splits requests across domains to facilitate parallel loading. We do CSS, JS and images; you can also do Flash, etc. We also use Google's version of Prototype, which is cross-domain, not just cross-subdomain.
This is all great for speed, but for a small percent of users, it's going wrong. I think the problem is overzealous security settings, probably in IE, but perhaps in other browsers and/or upstream systems as well. I'm amazed Souders and others don't discuss this, as we get it a lot.
The question is: What is the best way to handle this?
Right now, when it hits the bottom of the page we're checking to see if some JS variable, declared in a script that should have loaded, is set. If it isn't set, it gets it from the main domain and sets a cookie so next time it won't load it from the subdomain. But we're only reloading the JS at the bottom, so if the CSS also failed, you're looking at junk.
Does anyone have a better or more generalized solution? I'm thinking that there could be a general "onload" or "onerror" script that sets the cookie AND loads the content?
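Roughly, the bottom-of-page check looks like this (the marker global, cookie name and file path are placeholders, not our real names):

```html
<script>
  // If the script from the static subdomain never defined its marker global,
  // remember the failure in a cookie and pull the file from the main domain instead.
  if (typeof LT_SCRIPTS_OK === 'undefined') {
    document.cookie = 'nosplit=1; path=/; max-age=' + (60 * 60 * 24 * 30);
    var s = document.createElement('script');
    s.src = '/js/combined.js';   // same file, served from the main domain
    document.getElementsByTagName('head')[0].appendChild(s);
  }
</script>
```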
If this behavior always affects JS files at least, one option would be to keep a cookie indicating whether the user's browser has been tested for this behavior yet. If they've not been tested, insert (as the first script element in the <head>) a reference to a cross-domain script that simply sets this cookie to "success". Then immediately afterwards have some inline JS that checks for this cookie and, if it is not set, sets it to "failed" and reloads the page.
Then on the server-side just check for the same cookie, and ensure cross-site requests aren't sent to anyone with a "failed" result.
This approach should ensure that users with browsers that do support cross-site requests don't see any odd behavior, but should immediately fix the problem for other users at the cost of an automatic refresh the first time they visit.
Do you have a specific list of user agents that present this behaviour?
Maybe Apache conf could solve this problem? (or create a new problem for you to solve :-) ).
Watch out for the cookie frenzy - the more cookies you add (especially on the main domain), the more your clients will have to send along with their requests.
Souders talked about this too, but it's always good to check the sent/received ratio of your clients' requests.
I'm going to take some wild guesses about your problem.
Cache. Did you make these changes in the script files? These problem users could have older versions. IE 6 is extremely bad about overzealous caching.
I notice your scripts don't have a build number in the URL; XYZ.js?version=3 will force the browser not to use older cached scripts like XYZ.js?version=2. (This applies to images/CSS as well.)
You also have inline javascript mixed in with your HTML which would also get cached.
3 domains is likely overkill unless your site has a TON of content on it (huge pages)
DNS lookups can be expensive and have very long timeout values.
I wouldn't be comfortable putting my site's JavaScript on a separate domain because of the possible security conflicts. You have to keep your JavaScript/Ajax calls in sync with domains. Seems like more of a hassle than it's worth.
I've been using i.domain.com and domain.com for 5+ years with no issues.
I bet putting the JS back on the main domain will fix your problems. It will certainly make it less complex and easier to deal with.
But done properly, your 3 domains should work. Unfortunately I don't have enough info in this question to find the issue.
Yahoo's best practices state that putting JavaScript files at the bottom might make your pages load faster. What is the experience with this? What are the side effects, if any?
It has a couple of advantages:
Rendering begins sooner. The browser cannot lay out elements it hasn't received yet. This avoids the "blank white screen" problem.
Defers connection limits. Usually your browser won't try to make more than a couple of connections to the same server at a time. If you have a whole queue of scripts waiting to be downloaded from a slow server, it can really wreck the user experience as your page appears to grind to a halt.
If you get a copy of Microsoft's Visual Round Trip Analyzer, you can profile exactly what is happening.
What I have seen more often than not is that most browsers STOP PIPELINING requests when they encounter a JavaScript file, and dedicate their entire connection to downloading the single file, rather than downloading in parallel.
The reason the pipelining is stopped, is to allow immediate inclusion of the script into the page. Historically, a lot of sites used to use document.write to add content, and downloading the script immediately allowed for a slightly more seamless experience.
This is most certainly the behavior that Yahoo is optimizing against. (I have seen the very same recommendation in MSDN magazine with the above explanation.)
It is important to note that IE 7+ and FF 3+ are less likely to exhibit the bad behavior. (Mostly since using document.write has fallen out of practice, and there are now better methods to pre-render content.)
As far as I can tell, it simply lets the browser begin rendering sooner.
One side effect would be that some scripts don't work at all if you put them at the end of the page. If there is a script running while the page is rendered (quite common for ad scripts, for example) that relies on a function in another script, that other script has to be loaded first.
Also, saying that the page loads faster is not exactly true. What it really does is load the visual elements of the page earlier, so that it seems like your page is loading faster. The total time to load all components of the page is still the same.
Putting them at the bottom is a close equivalent to using the "defer" attribute (even more info here). This is similar to how a browser cannot continue with page layout unless IMG tags have width and height information -- if the included javascript generates content, then the browser can't continue with layout until it knows what is there, and how big everything is.
So long as your javascript doesn't need to run before the onload event happens, you should be able to either place the script tags at the end, or use the defer attribute.
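In markup terms the two options look like this (the file name is made up):

```html
<head>
  <!-- Option 1: defer - downloads in parallel, executes after the document is parsed -->
  <script src="/js/site.js" defer></script>
</head>
<body>
  <!-- ...page content renders without waiting on the script... -->

  <!-- Option 2: the same script placed at the very end of the body instead -->
  <script src="/js/site.js"></script>
</body>
```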
Your page should actually load faster. Browsers will open more than one connection to download three images in parallel, for example. On the other hand, a <script> tag in most browsers causes the browser to block on that script executing. If it's a <script> tag with a src attribute, the browser will block to both download and execute. If you put your <script> tags at the end, you avoid this problem.
At the same time, this means that those pages don't have any JS functionality until they're done loading. A good exercise in accessibility is to ensure your site runs well enough to be usable until the JS loads. This ensures that the page will (a) work well for people with slow connections (b) work well for people who are impaired or use text-only browsers.
If you are developing for Firefox/Safari, you can always check with Firebug/the developer console, as they show the loading sequence of files.
A few points:
The page loads faster because the JavaScript, whether internal or external, is at the bottom.
If you have not used window's onload handler, a script starts executing as soon as it is parsed. Putting the script at the bottom ensures that it executes after the page content has loaded.
If the script is external (a separate file), the HTML, images and other visual objects that form the page view will render before it is fetched.
If you are using Firefox, there is a plugin to check the performance. Please check the Firefox add-ons site for it.