I've been taking a look at their code and I can't quite figure it out. Are they loading entire new pages, or are they using jQuery to change the browser URL and keep most of the page static?
I'm viewing source of this page.
I don't think there is a "silver bullet" explanation; their site is likely fast because they've performance-tuned it and removed any significant bottlenecks.
They are in fact reloading the page with each selected image; this is clear with an HTTP debugger such as Fiddler2.
The perceived speed is partly explained by their use of content delivery networks (CDNs) and gzip compression, both of which speed up delivery of content. Their HTML structure likely factors in as well, and there is a chance they are streaming the response to allow the browser to begin rendering as early as possible.
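To illustrate those last two points, here is a minimal sketch of gzip compression and response streaming in a Node.js/Express app. This is purely an assumed stack for demonstration; their actual server technology is unknown.

    // Hypothetical Node.js/Express sketch; the real site's stack is unknown.
    const express = require('express');
    const compression = require('compression'); // gzip middleware from npm

    const app = express();
    app.use(compression()); // gzip every compressible response

    app.get('/', (req, res) => {
      res.type('html');
      // Stream the response: flush the <head> early so the browser can
      // start fetching CSS/JS while the rest of the body is still built.
      res.write('<html><head><link rel="stylesheet" href="/app.css"></head>');
      res.write('<body>...rest of the page...</body></html>');
      res.end();
    });

    app.listen(3000);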
My website, http://jgyboatparty.com/, is loading horrifically slowly and I can't figure out why. I have compressed the images as much as possible and made the video file available as a WebM, and yet on a fresh reload the page takes minutes to load!
Am I missing something here? For some reason, when I look at when elements are loaded, some of the images don't even start loading until 30 seconds in. This all means that my loader runs for a long time. (I have a wait-until-images-are-loaded function.)
Any advice would be much appreciated :)
The web is full of useful tools that will aid you with this:
Google PageSpeed for one, and GTmetrix for another. Make use of these tools to analyse what could be causing your site to be slow.
Also, ensure all of your images are properly optimised. Another tool that may help is TinyPNG.
Google Chrome's DevTools can also be very useful for diagnosing bottlenecks and so forth. Here's another link that may help you:
https://developers.google.com/web/tools/chrome-devtools/network-performance/
Also, I see you are using various libraries such as Owl Carousel, Parallax, and FitVids, to name three. I would look to at least use CDN versions: a CDN is a way to deliver content from your website or mobile application to people more quickly and efficiently, based on their geographic location.
Also, look into lazy loading of your images.
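As a minimal sketch of lazy loading: the images start out with a data-src attribute and only get their real src when they approach the viewport (the class name and attribute here are illustrative):

    // Markup assumed: <img data-src="photo.jpg" class="lazy" alt="...">
    const observer = new IntersectionObserver((entries, obs) => {
      entries.forEach((entry) => {
        if (entry.isIntersecting) {
          const img = entry.target;
          img.src = img.dataset.src; // triggers the actual download
          obs.unobserve(img);
        }
      });
    });

    document.querySelectorAll('img.lazy').forEach((img) => observer.observe(img));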
Do you have enough juice on your server? Pinging your server takes 71 ms, while pinging your government's server takes approximately 31 ms.
When I check the network tab for your website, an image of about 155 KB takes about 1.2 seconds to load.
One step to improve your speed would be to minify all your scripts.
Do not load all of your content at once.
Open your browser's dev tools and find out.
Browsers only allow a small number of concurrent downloads (historically around 4) from a single domain. So while the browser is downloading those, it blocks the others.
Move your scripts/images to a subdomain or a CDN, or just do something listed in the post linked below.
Get number of concurrent requests by browser
I am creating a website professionally for a client as my first project, and I am using a lot of libraries, for instance velocity.js, jQuery, jQuery UI, animate.css, and also an image slider plugin for jQuery. Right now I am using the min versions, and all of the files are stored locally on my machine. When I put the site live, will this severely affect the loading time of the website, or will it be normal?
You can put it up and test it with an online speed-test tool. But the best way is to put it up and test the ping.
Yes, it will severely affect the loading of the page. Chrome comes with developer tools out of the box, and Firebug for Firefox is only a couple of clicks away. That, combined with the RTT and bandwidth to the site, gives you enough information to calculate exactly how slow the first-hit page load will be.
Assuming that caching is configured correctly, subsequent page transitions should be faster, but even loading all this JavaScript from cache and parsing it will have a big impact on the load time. Note that in the case of DokuWiki, the CMS already handles merging of JavaScript into a single file.
Using pjax or similar will help with subsequent transitions. Using online repositories for standard libraries does not in itself help with performance. Minifying the JavaScript does not give very big wins (since you are already compressing the files on the fly, aren't you?).
There are big wins to be had from deferring JavaScript loading and stripping/optimizing/merging CSS files.
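As a quick sketch of deferred loading: scripts that are not needed for the first paint can be injected after the load event instead of blocking the initial render (the file names here are placeholders):

    // Load non-critical scripts only after the page has rendered.
    function loadScript(src) {
      const s = document.createElement('script');
      s.src = src;
      s.async = true;
      document.head.appendChild(s);
    }

    window.addEventListener('load', () => {
      loadScript('/js/analytics.js'); // placeholder file names
      loadScript('/js/widgets.js');
    });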
There are a lot of other things you can do to make your page render faster - but space here is too limited.
I'm currently building a portfolio-website for an architect that has a whole lot of images on its pages.
Navigation is done with history.js (i.e. AJAX). To save loading time and make the whole thing more "snappy", I wrote a script that crawls the page body for links to other pages and fetches those pages automatically in the background. So far, it works like a charm.
It basically keeps a queue array that holds all the links. A setTimeout() function works through the queue and fetches each page using jQuery's $.ajax(). The resulting HTML is stored in a JavaScript object.
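For reference, a minimal reconstruction of such a prefetcher might look like this (the names are illustrative, not the asker's actual code):

    var queue = [];  // URLs still to fetch
    var cache = {};  // url -> HTML string

    // Collect internal links from the page body.
    $('body a[href]').each(function () {
      var url = $(this).attr('href');
      if (!(url in cache) && queue.indexOf(url) === -1) {
        queue.push(url);
      }
    });

    // Work through the queue one request at a time.
    function fetchNext() {
      if (queue.length === 0) return;
      var url = queue.shift();
      $.ajax({ url: url, dataType: 'html' })
        .done(function (html) { cache[url] = html; })
        .always(function () { setTimeout(fetchNext, 100); });
    }

    fetchNext();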
Now, here's my question:
What are possible problems that might occur when using this on different machines/browsers/operation systems?
I'm thinking about:
max. JavaScript object/variable size (the fetched HTML is stored in a JavaScript object)
possible performance problems
max. number of asynchronous requests?
… anything you can think of?
Thanks a lot in advance,
a hobby programmer
Although it might be a good idea to cache the whole website on the client side, there are a lot of things that can cause issues:
Memory
Unnecessary load on the webserver
Loading unneeded pages into memory
Some users have a data cap on their connection, so loading the entire website is not smart in those cases
Once the user navigates away or refreshes, the entire "cache" is gone
What I would do is first try to optimize the server side.
Add a bunch of caching mechanisms, from the database all the way to the user; the "Expires" header can really help you.
And if that doesn't help, I would then think about caching some pages (which ones is for you to decide) in the offline cache; see HTML5 Offline Features.
That way you are safe even on a page reload, keep memory usage to a minimum, and only load what you need.
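To illustrate the "Expires"/caching suggestion, a sketch in Express follows; this assumes a Node.js back end, which may not match your actual stack:

    // Hypothetical Express sketch: serve static assets with far-future
    // cache headers so repeat visits can skip the network entirely.
    const express = require('express');
    const app = express();

    app.use('/static', express.static('public', {
      maxAge: '30d' // sets Cache-Control: max-age (the modern Expires)
    }));

    app.listen(3000);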
PS: Don't try to reinvent stuff that the browser already has :P
You should queue the async requests and only launch one at a time.
And since you're storing everything in variables, at some point you (the browser) may consume too much memory, and the whole thing can become very slow. I suggest you limit the size of your cache to a certain number of pages, as in the sketch below.
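A minimal sketch of such a cap, evicting the oldest entry once the limit is reached (the constant and names are made up for illustration):

    var MAX_PAGES = 20;  // illustrative limit; tune to taste
    var cacheOrder = []; // URLs in insertion order
    var cache = {};

    function storePage(url, html) {
      if (!(url in cache)) {
        cacheOrder.push(url);
        if (cacheOrder.length > MAX_PAGES) {
          delete cache[cacheOrder.shift()]; // drop the oldest entry
        }
      }
      cache[url] = html;
    }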
You can also try not storing the fetched content at all: just fetch it and throw it away. The browser will still cache fetched pages and images in its internal storage, so subsequent loads will be much faster (provided, of course, that the Ajax library does not forcibly disable the cache, e.g. by using POST).
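In jQuery terms that means a plain GET with the cache option left on (it defaults to true for HTML content), for example:

    // Let the browser's own HTTP cache do the work.
    $.ajax({
      url: '/some/page.html', // placeholder URL
      type: 'GET',
      dataType: 'html',
      cache: true // explicit for clarity; a POST would bypass the cache
    }).done(function (html) {
      // Use the HTML and throw it away; the browser keeps a cached copy.
    });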
"The jQuery Mobile "page" structure is optimized to support either single pages, or local internal linked "pages" within a page." jQuery docs
What gives better performance for a jQuery Mobile application - which runs on PhoneGap?
all "pages" in a single HTML file, loaded internally
or separate single-page HTML files linked externally
Any other aspects to consider?
I use jQuery mobile, and all the sites that I've made have been one page sites. The only external pages that I create are those that have embedded Google maps, just so the iframe loading doesn't happen if the user doesn't need it.
I think it boils down to this: one page with lots of content may slow initial loading but will be snappier once loaded, whereas a tiny home page will be quick from the start, but each linked page will trigger an Ajax request. When designing for mobile, my rule of thumb is to minimize HTTP requests as much as possible. Though many users are on 3G or better networks, it can still be a wait depending on connectivity. Also, connectivity can change in an instant, and if the user has been navigating through the site successfully and all of a sudden things slow down to a crawl, this may create a bit of frustration. Therefore, I think from a user-experience POV, users are willing to wait a few extra ticks on the initial load if everything else is quick once it's loaded.
Designing all in one page is also good for development with jQM, IMO, because I just create a cache manifest that includes only one page (plus the CSS and JS files). Then my site is cached and runs even if the user has no connectivity. If you've worked with applicationCache, you quickly realize that the more files you have, the more difficult it is to maintain the cache manifest, and updates are slower.
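For example, a one-page manifest stays tiny; the file names here are placeholders, and the page opts in with <html manifest="app.appcache">:

    CACHE MANIFEST
    # v1 - bump this comment to force clients to re-download

    CACHE:
    index.html
    css/style.css
    js/app.js

    NETWORK:
    *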
I can't say much about browser performance, but you should consider load times.
Multiple pages in one document are loaded along with the document, so if there are many of them, DOMready will fire only after some time, which can give an unpleasant look. Separate pages are fetched when you need them, so if there is no particular reason to use a multipage document, I'd recommend sticking with multiple HTML files for an online app.
Also, a multipage document can't be used much if you want to stick with progressive enhancement, which is jQM's development philosophy.
Any other aspects to consider?
Yes... As far as I know, there still might be some problems (e.g. with dialogs) in multipage documents. jQM alpha 3 didn't want to display dialogs for me if there was more than one in a multipage document.
This depends on your app's size. Personally, I've found that a one-page app is more responsive if you have only a few pages; in my case it was only 3 screens loading external data, which was more responsive than 3 separate pages.
Hope that helps
I am getting extremely slow load times, and before any action is taken I was wondering if there is an easy way to test whether the cause is the ISP or the coding of the site.
We currently have a T1 going to two mirrored servers, so I don't think the ISP is the issue; we only have a few users on at a time.
The website is: http://www.designfacilitator.com/v20/Default.aspx?login=true&ReturnUrl=%2fv20%2fDefault.aspx
Is there a definitive test to determine where the problem lies? Any advice would be great.
Thanks!
Do you notice high load times if you run the web app on an intranet?
If it's the coding, it'll be slow in a local load-testing deployment as well. But to know for sure, you want to turn on ASP.NET tracing and look at the load times and latencies through the trace viewer (or directly in the pages). The figures will jump out at you!
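Tracing is usually switched on in web.config; a sketch of the typical entry (adjust the attributes to your setup):

    <!-- Enable ASP.NET tracing; trace.axd then shows per-request timings.
         localOnly keeps the trace viewer hidden from outside visitors. -->
    <configuration>
      <system.web>
        <trace enabled="true" requestLimit="40" localOnly="true" pageOutput="false" />
      </system.web>
    </configuration>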
The definitive test you're looking for would be to access the website from somewhere else with a different ISP (if it's still slow --> there you go), but this is a fairly obvious suggestion so I am probably missing some element here.
Anyway, Experience-wise, it's almost always the Coding :)
I loaded the site in the Firebug Net panel and the initial HTML loads in less than a second, so it doesn't look like a really limited server or bandwidth situation. Still there is a lot you can do to speed up the page.
First get Firefox (if you don't have it), then install Firebug (getfirebug.com), then install YSlow (from the Firefox plugin site), which will analyze your page and give you recommendations. There is also a plugin there from Google called Page Speed that does some of the work for you: it'll optimize your images and combine the JS into a single file.
Firebug's 'Net' tab shows you at what point each file included in your page is loaded and how long it takes. This can help spot problems. YSlow will also give you specific recommendations.
From the quick look I had at your source, you need to move your JS files to the bottom of the page, and I think you could combine them into fewer files for even more speed.
Remember, the trick is to only load the smallest amount of code required to make your page work. Then, once the page is loaded, there are a number of ways to load additional code as you need it.
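With jQuery already on the page, one simple approach is $.getScript, fetching a feature's code only when the user first asks for it (the selector, path, and init function here are hypothetical):

    // Fetch the gallery code the first time the button is clicked.
    $('#open-gallery').one('click', function () {
      $.getScript('/js/gallery.js').done(function () {
        initGallery(); // assumed to be defined by the loaded file
      });
    });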
Keep an eye on Steve Souders' blog (http://www.stevesouders.com/); he's pretty much the guru of front-end performance.