Why is my page loading so slowly? - javascript

My website: http://jgyboatparty.com/
is loading horrifically slowly and I can't figure out why. I have compressed the images as much as possible and made the video available as a WebM, and yet on a fresh reload the page takes minutes to load!
Am I missing something here? For some reason, when I look at when elements are loaded, some of the images don't even start loading until 30 seconds in. This means my loader runs for a long time. (I have a wait-until-images-are-loaded function.)
Any advice would be much appreciated :)

The web is full of useful tools that will help you here:
Google PageSpeed for one, and GTmetrix for another. Use these tools to analyse what could be making your site slow.
Also, ensure all of your images are properly optimised. Another tool that may help is TinyPNG.
Google Chrome's DevTools can also be very useful for diagnosing bottlenecks and so forth. Here's another link that may help you:
https://developers.google.com/web/tools/chrome-devtools/network-performance/
Also, I see you are using various libraries such as Owl Carousel, Parallax, and FitVids, to name three. I would at least try to use CDN versions: a CDN delivers content from your website or mobile application to people more quickly and efficiently, based on their geographic location.
Also, look into lazy loading your images.
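For example, here is a minimal lazy-loading sketch using IntersectionObserver; it assumes your markup uses a placeholder data-src attribute (that attribute convention is an assumption, not something from your site):

    // Swap data-src in for src only when an image nears the viewport.
    // The data-src convention is an assumption about your markup.
    const lazyImages = document.querySelectorAll('img[data-src]');

    const observer = new IntersectionObserver((entries, obs) => {
      entries.forEach((entry) => {
        if (entry.isIntersecting) {
          const img = entry.target;
          img.src = img.dataset.src;       // start the real download now
          img.removeAttribute('data-src');
          obs.unobserve(img);              // each image only needs this once
        }
      });
    });

    lazyImages.forEach((img) => observer.observe(img));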

Do you have enough juice on your server? Pinging your server takes 71 ms, while pinging your government's server takes approximately 31 ms.
Checking the network tab for your website, an image of about 155 KB takes about 1.2 seconds to load.
Steps that might improve your speed: minify all your scripts.
Do not load all of your content at once.

Open your browser's dev tools and find out.
Browsers only allow a handful of concurrent downloads from one domain (historically around 4-6), so while the browser is downloading those, it blocks the others.
Move your scripts/images to a subdomain or a CDN, or just do something listed in the post below:
Get number of concurrent requests by browser

Related

NodeJS express page takes too long to load

I have been working on a project for a while now, using NodeJS and Express to make a website. It is hosted on Heroku right now.
When I was testing it during development I did not have any issues with load time. However, when I tested it on a different Wi-Fi network than usual (which did not differ much in download speed from the usual one), some pages suddenly take 40-60 seconds to load, as seen below.
What I don't understand is the big gap where nothing(?) happens.
I am still studying at the moment, so I am very inexperienced. Any help is greatly appreciated.
I would also be thankful for any links to best practices on how to go about this as I couldn't really find anything that helped me.
And please let me know if there is any more information needed to diagnose this, thank you.
It's not "nothing" happening during the big gap; you just missed what is happening. Look at the top of the graph: you will see the long green bar that's downloading. That's what's happening. It is downloading the main HTML file (I think the URL is /).
It takes 38 seconds (38,233 ms) to load the HTML the first time and 52 seconds (52,444 ms) the second time. This is because your HTML file is 7.5 MB, which is around the size of two MP3 files.
The download times are what I would expect from trying to download two MP3 files: around a minute.
Find out why your HTML is 7.5 MB. That's what is slowing the page load.
Instead of worrying about the "nothing" gap, start by worrying about the images you have: 300 KB+, 280 KB+, and so on. All those pictures are what make your HTML file weigh 7.45 MB, so the huge gap is your browser downloading all of them. On top of that, consider that you are on Heroku's free plan; they are not going to give you their best hardware for free.
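If those images are ending up inside the HTML itself (for example as base64 data URIs), moving them out to static files lets the browser download and cache them separately, keeping the document small. A rough Express sketch; the 'public' directory and file names here are assumptions:

    const express = require('express');
    const app = express();

    // Serve images as separate, cacheable files rather than inlining them
    // into the HTML. 'public' is a hypothetical assets directory.
    app.use(express.static('public', { maxAge: '30d' }));

    app.get('/', (req, res) => {
      // The document now only carries a reference, so it stays small.
      res.send('<html><body><img src="/images/photo1.jpg"></body></html>');
    });

    app.listen(process.env.PORT || 3000);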

How to determine the loading time of a website

I am creating a website professionally for a client as my first project, and I am using a lot of libraries, for instance velocity.js, jQuery, jQuery UI, animate.css, and an image slider plugin for jQuery. Right now I am using the min versions, and all of the files are stored locally on my machine. When I put the site live, will it severely affect the loading time of the website, or will it be normal?
You can put it up and run it through an online speed test. But the best way is to put it up and test the ping.
Yes, it will severely affect the loading of the page. Chrome comes with developer tools out of the box, and Firebug for Firefox is only a couple of clicks away. That, combined with the RTT and bandwidth to the site, gives you enough information to estimate how slow the first-hit page load will be.
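As a rough model (every number below is invented for illustration): each uncached file costs at least one round trip plus its transfer time, so many small library files add up quickly:

    // Back-of-envelope first-hit estimate. Every figure here is an
    // illustrative assumption, not a measurement.
    const requests = 12;                // separate JS/CSS files
    const rttSeconds = 0.1;             // 100 ms round trip per request
    const totalBytes = 800 * 1024;      // ~800 KB of libraries in total
    const bytesPerSecond = 1024 * 1024; // ~1 MB/s effective throughput

    // Ignores parallel connections and keep-alive, so treat it as a
    // pessimistic upper bound.
    const seconds = requests * rttSeconds + totalBytes / bytesPerSecond;
    console.log(`~${seconds.toFixed(1)} s first-hit load time`); // ~2.0 s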
Assuming that caching is configured correctly, subsequent page transitions should be faster, but even loading all this JavaScript from cache and parsing it will have a big impact on load time. Note that in the case of DokuWiki, the CMS already handles merging of JavaScript into a single file.
Using PJAX or similar will help with subsequent transitions. Using online repositories for standard libraries does not help with performance. Minifying the JavaScript does not give very big wins (since you are already compressing the files on the fly, aren't you?).
There are big wins to be had from deferring JavaScript loading and stripping/optimizing/merging CSS files.
There are a lot of other things you can do to make your page render faster - but space here is too limited.

How to reliably load required JavaScript files?

I came across a problem where, due to internet connection problems, some of the required JavaScript files do not load. The body onload event fires, but classes required for the logic of the page are not present.
One more thing: the problem I want to fix is not in a website; it is in a web application that does not have any image or CSS files. Just imagine JavaScript code running in an iframe. So I have problems only with scripts.
Here are my ideas for fixing this; please comment/correct me if I'm wrong:
Obfuscate and combine the files into one when pushing to live, so the overall size decreases and the task comes down to reliably loading one file
Enable gzip compression on the server, so again the resulting file size will be much smaller
Put proper cache headers on that file, so once loaded it will be cached by the browser/proxy server
Finally, even with all this, there could be a case where the file does not load. In that case I plan to load the file dynamically from JavaScript once the page is loaded, with "retry failed load" logic and a maximum of 5 attempts, for example (a rough sketch follows below)
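A rough sketch of that retry logic (the function names and the 5-attempt limit are just examples):

    // Retry a dynamically injected script up to maxAttempts times.
    // loadScriptWithRetry, initApp and showOfflineNotice are hypothetical names.
    function loadScriptWithRetry(url, maxAttempts, onSuccess, onGiveUp) {
      let attempts = 0;

      function tryLoad() {
        attempts += 1;
        const script = document.createElement('script');
        script.src = url;
        script.onload = onSuccess;
        script.onerror = () => {
          script.remove();
          if (attempts < maxAttempts) {
            tryLoad();            // retry the same file
          } else {
            onGiveUp();           // e.g. show a "connection problems" message
          }
        };
        document.head.appendChild(script);
      }

      tryLoad();
    }

    // Usage: retry the combined, compressed file up to 5 times.
    loadScriptWithRetry('/js/app.min.js', 5, initApp, showOfflineNotice);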
Any other ideas?
If the "retry" script fails to grab the required dependencies, redirect to a "no script" version of the site, if one is available. Or try to have a gracefully degrading page, so even if all steps fail, the site is still usable.
1 - Correct, but double-check that JavaScript functions from different files don't overlap each other.
2 - Correct; this should always be on.
3 - Correct, but the browser will still try to get an HTTP 304 Not Modified response from the server.
4 - Correct; consider falling back to a noscript version of the website after 1 or 2 failed attempts (5 is too many).
I don't personally think it's worth trying to redo the logic that the browser already has. What if the images in your page don't load? What if the main page doesn't load? If the user has internet connection problems, they need to fix those internet connection problems. Lots of things will not work reliably until they do.
So, are you going to check that every image your page displays loads properly, and if one didn't load, manually try to reload it too?
In my opinion, it might be worth putting some inline JS in the page to detect whether an external JS file failed to load (all you have to do is check for the existence of one global variable or top-level function from the external JS file) and then tell the user that they are having internet connection problems, and that they should fix those problems and reload the site.
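For instance, inline in the page right after the external script tag (window.jQuery is just an example global; check whatever your own file actually defines):

    // Inline fallback check. window.jQuery is an example global; use
    // whatever your external file actually exposes.
    if (typeof window.jQuery === 'undefined') {
      alert('Some scripts failed to load. Please check your internet ' +
            'connection and reload the page.');
    }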
Your points are valid for script loading, but you must also consider how the website is used.
If the scripts are not loading, for whatever reason, the site must still be completely usable and navigable. The user experience comes first, before everything else.
The scripts should be loaded after the website interface has been loaded and rendered by the browser, and should contain code that enhances the user experience, not something you must absolutely rely on.
This way, even when the connection is really slow, I will still be able to read the content and choose to change page or go somewhere else, instead of facing a blank page or a page with only the header displayed.
This to me is the most important point.
Also, are you sure about the retry approach? It causes more requests to the server. If the connection is slow or laggy, it may be best not to run the script at all, especially considering users may spend little time on the page and only need to skim the content. Also, if the connection is slow, how long would you set the timeout? What if the script is still downloading when your timeout fires and you retry? How can you effectively determine that amount of time, and the "slowness" of the connection?
EDIT
Have you tried head.js? It's a library aimed at the fastest possible script loading; maybe it will help.

How do Sports Illustrated's Gallery Pages Load So Fast?

I've been taking a look at their code and I can't quite figure it out. Are they loading entire new pages, or are they using jQuery to change the browser URL and keep most of the page static?
I'm viewing the source of this page.
I don't think there is a "silver bullet" explanation--their site is likely fast because they've performance-tuned it and removed any significant bottlenecks.
They are in fact reloading the page with each selected image--this is clear with an HTTP debugger such as Fiddler2.
The perceived speed is partly explained by their use of content delivery networks and gzip compression--both speed up delivery of content. Their HTML structure likely factors in as well, and there is a chance they are streaming the response to allow the browser to begin rendering as early as possible.
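Streaming just means flushing the top of the document before the rest is ready, so the browser can start fetching stylesheets and rendering immediately. A minimal Node sketch; the markup and buildGalleryHtml are invented for illustration:

    // Flush the <head> early, then finish the body when it's ready.
    // buildGalleryHtml is a hypothetical async page builder.
    const http = require('http');

    http.createServer((req, res) => {
      res.writeHead(200, { 'Content-Type': 'text/html' });

      // The browser can fetch /site.css while the body is still being built.
      res.write('<html><head><link rel="stylesheet" href="/site.css"></head><body>');

      buildGalleryHtml(req).then((bodyHtml) => {
        res.end(bodyHtml + '</body></html>');
      });
    }).listen(3000);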

Slow load times: ISP or Coding

I am getting extremely slow load times, and before any action is taken, I was wondering if there is an easy way to test whether it is because of the ISP or the coding of the site.
We currently have a T1 line going to two mirrored servers, so I don't think the ISP is the issue; we only have a few users on at a time.
The website is: http://www.designfacilitator.com/v20/Default.aspx?login=true&ReturnUrl=%2fv20%2fDefault.aspx
Is there a definitive test to determine where the problem lies? Any advice would be great.
Thanks!
Do you notice high load times if you run the web app on an intranet?
If it's the coding, it'll be slow in a local deployment load test as well. To know for sure, turn on ASP.NET tracing and look at the load times and latencies through the trace viewer (or directly in the pages). The figures will jump out at you!
The definitive test you're looking for would be to access the website from somewhere else with a different ISP (if it's still slow, there you go), but this is a fairly obvious suggestion, so I am probably missing some element here.
Anyway, experience-wise, it's almost always the coding :)
I loaded the site in the Firebug Net panel and the initial HTML loads in less than a second, so it doesn't look like a really limited server or bandwidth situation. Still, there is a lot you can do to speed up the page.
First get Firefox (if you don't have it), then install Firebug (getfirebug.com), then install YSlow (from the Firefox add-ons site), which will analyze your page and give you recommendations. There is also a plugin from Google called Page Speed that does some of the work for you: it'll optimize your images and combine the JS into a single file.
Firebug's 'Net' tab shows you at what point each file included in your page is loaded and how long it takes. This can help spot problems. YSlow will also give you specific recommendations.
From a quick look at your source, you need to move your JS files to the bottom of the page, and I think you could combine them into fewer files for even more speed.
Remember, the trick is to load only the smallest amount of code required to make your page work. Then, once the page is loaded, there are a number of ways to load additional code as you need it.
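For instance, code that is only needed after a user action can be injected on demand (the path, element ID, and initGallery here are hypothetical):

    // Generic on-demand script loader.
    function loadScript(url) {
      return new Promise((resolve, reject) => {
        const script = document.createElement('script');
        script.src = url;
        script.onload = resolve;
        script.onerror = reject;
        document.head.appendChild(script);
      });
    }

    // Fetch the gallery code only the first time it's actually needed.
    // '/js/gallery.js', '#open-gallery' and initGallery are hypothetical.
    document.querySelector('#open-gallery').addEventListener('click', () => {
      loadScript('/js/gallery.js').then(() => initGallery());
    });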
Keep an eye on Steve Souders' blog (http://www.stevesouders.com/); he's pretty much the guru of front-end performance.
