NodeJS express page takes too long to load - javascript

I have been working on a project for a while now using NodeJS and express to make a website. It is hosted on heroku right now.
When I was testing it during development, I did not have any issues with load time. However, when I tested it on a different Wi-Fi network than usual (which did not differ much in download speed), some pages suddenly take 40-60 seconds to load, as seen below.
What I don't understand is the big gap where nothing(?) happens.
I am still studying atm so I am still very inexperienced. Any help is greatly appreciated.
I would also be thankful for any links to best practices on how to go about this as I couldn't really find anything that helped me.
And please let me know if there is any more information needed to diagnose this, thank you.

It's not "nothing" happening during the big gap; you just missed what is happening. Look at the top of the graph: you will see the long green bar. That's the download of the main HTML file (I think the URL is /).
It takes 38 seconds (38233 milliseconds) to load the HTML the first time and 52 seconds (52444 milliseconds) the second time. This is because your HTML file is 7.5 MB, roughly the size of two MP3 files.
The download times are what I would expect from downloading two MP3 files: around 1 minute.
Find out why your HTML is 7.5 MB. That's what is slowing the page load.

Instead of worrying about the "nothing" gap, start by worrying about the images you have: 300 KB+, 280 KB+, and so on. All those images are what make your HTML page weigh 7.45 MB, so the huge gap is your browser downloading them. On top of that, consider that you are on Heroku's free plan; they are not going to give you their best hardware for free.

Related

Why is my page loading so slowly?

My website: http://jgyboatparty.com/
is loading horrifically slowly and I can't figure out why. I have compressed the images as much as possible and made the video file available as a WebM, and yet with a fresh reload the page takes minutes to load!
Am I missing something here? For some reason, when I look at when elements are loaded, some of the images don't even start loading until 30 seconds in. This all means that my loader runs for a while. (I have a wait-until-images-loaded function.)
Any advice would be much appreciated :)
The web is full of useful tools that can aid you here:
Google PageSpeed for one, and GTmetrix for another. Make use of these tools to analyse what could be causing your site to be slow.
Also, ensure all of your images are properly optimised. Another tool that may help here is TinyPNG.
Google's Inspector can also be very useful for diagnosing bottlenecks and so forth. Here's another link that may help you:
https://developers.google.com/web/tools/chrome-devtools/network-performance/
Also, I see you are using various libraries such as Owl Carousel, Parallax, and FitVids, to name three. I would look to at least use CDN versions of these: a CDN delivers content from your website or mobile application to people more quickly and efficiently, based on their geographic location.
Also, look into lazy loading of your images.
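The idea behind lazy loading can be sketched as pure logic: only request images whose position falls inside (or just below) the current viewport. Everything here - the data shape, the margin, the function name - is illustrative; in a browser you would drive this from element offsets and a scroll or IntersectionObserver callback, or simply use the built-in loading="lazy" attribute on img tags:

```javascript
// Decide which images to fetch, given scroll position and viewport height.
// The `margin` preloads images slightly before they scroll into view.
function imagesToLoad(images, scrollTop, viewportHeight, margin = 200) {
  return images
    .filter(img => img.top < scrollTop + viewportHeight + margin)
    .map(img => img.src);
}

const images = [
  { src: "hero.jpg", top: 0 },
  { src: "gallery1.jpg", top: 600 },
  { src: "footer.jpg", top: 4000 },
];
console.log(imagesToLoad(images, 0, 800)); // only hero.jpg and gallery1.jpg
```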
Do you have enough juice on your server? When pinging your server it takes 71 ms, while pinging your government's server takes approximately 31 ms.
When I check the network tab for your website, an image about 155 KB in size takes about 1.2 seconds to load.
One step to improve your speed would be to minify all your scripts.
Do not load all of your content at once.
Open your browser's dev tools and find out.
Browsers only allow a small number of concurrent downloads (historically around 4-6) from one domain. So while the browser is downloading those, it blocks the others.
Move your scripts/images to a subdomain or a CDN, or just do something listed in the post below.
Get number of concurrent requests by browser
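A back-of-the-envelope model of that limit (the numbers are illustrative, and real browsers interleave downloads unevenly):

```javascript
// n equal-cost downloads through c parallel connections finish in
// roughly ceil(n / c) "waves" - an intuition, not a measurement.
function totalTime(nFiles, perFileSeconds, maxConcurrent) {
  return Math.ceil(nFiles / maxConcurrent) * perFileSeconds;
}

console.log(totalTime(20, 1.5, 4)); // 20 files, 4 at a time -> 7.5 s
console.log(totalTime(20, 1.5, 8)); // same files across two domains -> 4.5 s
```

Doubling the effective connection count (e.g. by sharding across two domains) roughly halves the wall-clock time in this model, which is why the subdomain/CDN advice above helps.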

Structuring huge application assets

We are about to completely rebuild a clients website, it currently has over 1000 pages.
There will be a cull; however, my idea is to dynamically load assets based on what's on each page, and I wanted to get feedback.
Let's say I have 100 global components (carousels, buttons, videos, etc.). Over time we've just put all the JavaScript for all components into one bundle.js file, and the same with the CSS. But if a page only uses 3 of those 100 components, it seems redundant to include everything.
So I guess my question is: is it wrong to dynamically request only the components used, at runtime, rather than loading all assets every time?
The big downside I can see is that almost every page will request new files, so caching will be harder, and more HTTP requests would have to be made.
But if someone has a better idea please let me know
Firstly, I suggest an evidence-based approach. Don't do anything without data to back up the decision.
My thoughts on an overall approach. I'm thinking about React as I write this, but nothing is React-specific.
Server-render your content. It will then display to your users without needing your JavaScript bundle.
Get a good CDN and/or something like varnish and cache each route/page response. You'll get fast response times no matter how big the site.
Now, when users visit a page they'll get it quickly, and then you asynchronously download the JavaScript file that breathes life into the page.
Because the user is already reading your page, you can take your time loading the JS - up to a second or two. If you think most of your users will have decent internet (e.g. they're all in South Korea) then I'd go as big as a 2 MB JS bundle before bothering with chunking. Opinions will vary; it's up to you. If your users have bad internet (e.g. they're all in North Korea) then every KB counts, and you should aim to make the smallest chunks needed for each page - both for speed and to respect the users' download quota.
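For the per-page loading the question asks about, one simple mental model is a manifest mapping each component to its chunk; a page then fetches only the chunks it uses that aren't already cached. A toy sketch (component names and URLs are made up; in practice bundlers such as webpack automate this via dynamic import()):

```javascript
// Which chunk URLs does this page still need to download?
function chunksToFetch(pageComponents, manifest, cached = new Set()) {
  const urls = new Set();
  for (const name of pageComponents) {
    const chunk = manifest[name]; // e.g. "carousel" -> "/js/carousel.abc123.js"
    if (chunk && !cached.has(chunk)) urls.add(chunk);
  }
  return [...urls];
}

const manifest = {
  carousel: "/js/carousel.abc123.js",
  buttons: "/js/buttons.def456.js",
  video: "/js/video.789abc.js",
};
// First page view, empty cache: fetch only the two chunks this page uses.
console.log(chunksToFetch(["carousel", "buttons"], manifest));
```

Content-hashed filenames (the abc123 part) keep caching sane: a chunk's URL only changes when its contents do, so the extra per-page requests the question worries about are mostly a first-visit cost.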

Audio Player on site fragile if code is tampered with too much

I am using codes on my sites that I have never used before just for the layout and music player.
The page/audio player works perfectly on desktop and Android devices if I use this page: http://chancetrahan.com/1abcd.html
The player only works on desktop versions if I use the index file at http://chancetrahan.com
I have followed the site speed-up suggestions on PageSpeed Insights and have gotten it to work on desktop, but I just realized it's not working right, and not showing up right, when I view it from Android browsers.
I'm not sure whether it broke because I compressed the .js and .css files when I transferred them to my site, but that is what I think might have happened.
I noticed that when I remove the FB ROOT from the code, it breaks the music player. I have no idea why the music player uses FB ROOT, but it does. I'm not sure what rollups are, but it also says that common.js is running twice.
I have stripped the code down to the bare bones, trying to replicate this layout/template/theme with minimal code and a speedy response. You might not be able to see some of the code because I am using Cloudflare, but I would be happy to use TeamViewer to go over this and get some insight from someone who understands this code.
If you could help me figure this out I would be really appreciative for all the help I can get. Thank you for your help and advice!
It's very common to break JavaScript files when trying to improve your PageSpeed scores. If you're using Cloudflare, test the site in development mode and see if it works; often Rocket Loader is what causes the problem.
I hope you have a backup of the JavaScript files from before you minified them. If you tried to minify code that was already compressed, there's a very high chance that you broke it.
Further, pagespeed is not as big of a deal as you'd think when it comes to SEO. I put a lot of development hours into making a Wordpress theme from scratch that can get a 100/100 Pagespeed score. It helps, but not a crazy difference in terms of search engine rankings. It's more about just being nice for your users.
Another thing to mention: if you're concatenating your JavaScript files, first check whether your server supports HTTP/2, as concatenation can actually make your site slower if it does. HTTP/2 is the successor to the SPDY protocol; it requires HTTPS in browsers and can quickly and efficiently deliver many resources at once without the need for concatenation.

What is a reasonable size of JavaScript?

What is the maximum size of JavaScript that would be reasonable for a web page? I have a JavaScript program with a data segment of size about 130,000 bytes. There is virtually no whitespace, comments, or variables in this file which could be minified. The file looks something like this:
"a":[0],
"b":[0,5],
"c":[3,4,24],
"d":[0,1,3],
going on for several thousand lines.
Google Analytics gives the following info on the connection speed of the current users:
Rank Type Visitors
1. DSL 428
2. Unknown 398
3. Cable 374
4. T1 225
5. Dialup 29
6. ISDN 1
Is the file size too much?
The alternative is using a server-side program with Ajax.
The smaller the size, the better the load time. If you are concerned about the file size, try gzipping it. You can also minify the JS file.
Minifying JS and CSS files is one of the performance rules Yahoo suggests. For more detailed reading, check this out:
Best Practices for Speeding Up Your Web Site
Edit
Check this one
How To Optimize Your Site With GZIP Compression
It depends on your users and what sort of connection speeds they have. With a 1 Mb/s connection or faster it probably wouldn't be too noticeable, but with an older modem it would be very irritating to wait 10 seconds or more.
You could try Minify to compress your script: http://code.google.com/p/minify/
You can also load your scripts in the background using AJAX: http://betterexplained.com/articles/speed-up-your-javascript-load-time/
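If the data is fetched on demand rather than inlined, the server side of that Ajax call only needs to return the keys a page actually asks for. A minimal sketch (the function name and sample data are made up for illustration):

```javascript
// Return only the requested entries from the full data table.
function pickKeys(table, keys) {
  const out = {};
  for (const k of keys) {
    if (k in table) out[k] = table[k];
  }
  return out;
}

const table = { a: [0], b: [0, 5], c: [3, 4, 24], d: [0, 1, 3] };
console.log(pickKeys(table, ["a", "c"])); // { a: [ 0 ], c: [ 3, 4, 24 ] }
```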
Whatever your users will tolerate given their connection speed: how long can they wait versus the benefit they gain?
A download calculator might help you.
130k would take about 25-35 seconds to download on dialup.
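The arithmetic behind that estimate: download time is bytes × 8 divided by the effective link speed in bits per second (a 56k modem rarely sustains more than about 40 kbit/s):

```javascript
// Estimated transfer time for a payload over a given link speed.
function downloadSeconds(bytes, bitsPerSecond) {
  return (bytes * 8) / bitsPerSecond;
}

console.log(downloadSeconds(130000, 40000));   // 26 s on realistic dialup
console.log(downloadSeconds(130000, 1000000)); // ~1 s on a 1 Mbit/s line
```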
As someone who is forced to use dialup two or three times a year, I'll tell you - if you're programming a web application that I wanted to use, I might stick around to use it. If it's just a site that I'm surfing to randomly, I'd be outta there :)
You should definitely look into minifying the script. It looks like others have found the links before I did, so definitely check them out.
It is very important for page load time to keep your JavaScript files small. Some points:
Use external JavaScript file.
Put all your JavaScript below body end tag.
Try to minimize file size using tools mentioned above.
There are many more tips regarding this here

Slow load times: ISP or Coding

I am getting extremely slow load times, and before any action is taken I was wondering if there is an easy way to test whether it is because of the ISP or the coding of the site.
We currently have a T1 going to two mirrored servers, so I don't think the ISP is the issue; we only have a few users on at a time.
The website is: http://www.designfacilitator.com/v20/Default.aspx?login=true&ReturnUrl=%2fv20%2fDefault.aspx
Is there a definitive test to determine where the problem lies? Any advice would be great.
Thanks!
Do you notice high load times if you run the web app on an intranet?
If it's the coding, it will be slow under local load-testing as well. To know for sure, turn on ASP.NET tracing and look at the load times and latencies through the trace viewer (or directly in the pages). The figures will jump out at you!
The definitive test you're looking for would be to access the website from somewhere else with a different ISP (if it's still slow --> there you go), but this is a fairly obvious suggestion so I am probably missing some element here.
Anyway, experience-wise, it's almost always the coding :)
I loaded the site in the Firebug Net panel and the initial HTML loads in less than a second, so it doesn't look like a really limited server or bandwidth situation. Still there is a lot you can do to speed up the page.
First get Firefox (if you don't have it), then install Firebug (getfirebug.com), then install YSlow (from the Firefox plugin site), which will analyze your page and give you recommendations. There is also a plugin there from Google called Page Speed that does some of the work for you: it will optimize your images and combine the JS into a single file.
There is a 'Net' tab that shows you at what point each file included in your page is loaded and how long it takes, which can help spot problems. YSlow will also give you specific recommendations.
From the quick look I took at your source, you should move your JS files to the bottom of the page, and I think you could combine them into fewer files for even more speed.
Remember, the trick is to only load the smallest amount of code required to make your page work. Then, once the page is loaded there are a number of ways to load additional code as you need it.
Keep an eye on Steve Souder's blog (http://www.stevesouders.com/), he's pretty much the guru of front-end performance.
