I have run Google Page Speed Insights and its main suggestion is to minify a JS file, with over 70% savings (from 423 KB to 312 KB to be exact), which is insane.
But the file is minified! What am I missing?
Resources:
JS file.
Web Page - https://www.eldorado.gg
P.S.
I have gone through other similar questions on SO and none of them matched my problem.
Google Page Speed Insights should be taken with a pinch of salt
It can give you a very high-level view of massive issues, but once you start to optimise things, it falls short. It believes your JS file is large and that minifying it will help. It can't actually tell that you have already minified it.
I once had it tell me to use gzip compression for a single, cached 2k SVG file. Another time, it asked me to shrink the 200 or so tiny 12k jpegs that a site was using. It's not very clever - in the first case it was wasting my time, and in the second case there was a better answer (use sprite sheets), but it's unable to look that deeply.
I'd recommend skimming your minified file and looking for human-readable JS tokens.
From what I can see, the test mainly looks at JS token length.
https://github.com/GoogleChrome/lighthouse/blob/master/lighthouse-core/audits/byte-efficiency/unminified-javascript.js
/**
* @fileoverview Estimates minification savings by determining the ratio of parseable JS tokens to the
* length of the entire string. Though simple, this method is quite accurate at identifying whether
* a script was already minified and offers a relatively conservative minification estimate (our two
* primary goals).
*
* This audit only examines scripts that were independent network requests and not inlined or eval'd.
*
* See https://github.com/GoogleChrome/lighthouse/pull/3950#issue-277887798 for stats on accuracy.
*/
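To get an intuition for what that audit measures, here is a rough sketch of the token-length ratio idea - not Lighthouse's actual code - using the Acorn tokenizer (any JS tokenizer would do):

// Hedged sketch only: a file whose tokens account for nearly all of its bytes is
// treated as already minified; whitespace and comments drag the ratio down and
// raise the estimated savings.
const acorn = require("acorn");

function minifiedRatio(source) {
  let tokenLength = 0;
  for (const token of acorn.tokenizer(source, { ecmaVersion: 2020 })) {
    tokenLength += token.end - token.start;
  }
  return tokenLength / source.length; // close to 1 => little to gain from minifying
}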
If you see human readable tokens in your minified file, try looking at your minifier settings to see if there is an option you can toggle that gets rid of these.
If there is no setting that gets rid of them, try identifying where these are coming from and how they differ from other tokens that are minified successfully. The culprit might be a coding pattern, framework, or transpiler.
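For example, with a minifier like Terser (the settings below are illustrative; check your own tool's docs), the options controlling this look roughly like:

// Hedged example: sourceCode is assumed to be your unminified bundle as a string.
const { minify } = require("terser");

minify(sourceCode, {
  compress: true,               // drop dead code, collapse expressions
  mangle: true,                 // rename local variables/functions to short names
  format: { comments: false }   // strip any remaining comments
}).then(result => {
  console.log(result.code.length, "bytes after minification");
});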
Note: The source code is linked from their help page (https://web.dev/unminified-javascript/)
Related
A bit of a specific question: I'm currently running a server-side application that uses the echarts JS library, and I've noticed that the uglification of the echarts file is taking very long - on the order of multiple minutes rather than milliseconds or seconds. Echarts is quite a large file, but it is still disproportionately slow, so my educated guess is that it may be the presence of Chinese characters in the library.
This uglification process is taking place inside of a gulp streaming task.
Does anybody know of a way to reduce the time for the uglification of Chinese characters?
[fixed]
This issue turned out to be because the source path for my uglification task pointed to a file that was already uglified.
... oops :)
Reduced my build time by a factor of 5
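For anyone hitting the same thing, here is a hedged gulp sketch (paths are illustrative) that keeps already-minified files out of the uglify step:

const gulp = require("gulp");
const uglify = require("gulp-uglify");

gulp.task("scripts", () =>
  gulp.src(["src/**/*.js", "!src/**/*.min.js"]) // skip pre-minified bundles such as echarts.min.js
    .pipe(uglify())
    .pipe(gulp.dest("dist"))
);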
I'm using the excellent requirejs optimizer to compress the code of a web application.
The application uses a lot of third-party libs. I have several options :
Let the user download all the third party libs separately from my server
Let the user download all the third party libs from a CDN, if available
Use requirejs to produce a 'compressed' version of all those libs, in a single file
Now, I know that caching and/or a CDN would help with how long it takes to fetch each individual library; however, if I have 15 libs, I'm still going to end up with 15 HTTP requests, which is all the more annoying if the actual code for my application ends up being served in one or two relatively small files.
So what are the pros and cons of each method? Also, I suppose I would actually be 'redistributing' (in the sense of common FOSS licenses) the libraries if I were to bundle them inside my app (rather than pointing to a CDN)?
Any experience / ideas welcome.
Thanks.
You could take a look at the question Why should I use Google's CDN for jQuery? for reasons why a CDN is a better solution:
It increases the parallelism available. (Most browsers will only download 3 or 4 files at a time from any given site.)
It increases the chance that there will be a cache-hit. (As more sites follow this practice, more users already have the file ready.)
It ensures that the payload will be as small as possible. (Google can pre-compress the file in a wide array of formats (like GZIP or DEFLATE). This makes the time-to-download very small, because it is super compressed and it isn't compressed on the fly.)
It reduces the amount of bandwidth used by your server. (Google is basically offering free bandwidth.)
It ensures that the user will get a geographically close response. (Google has servers all over the world, further decreasing the latency.)
(Optional) They will automatically keep your scripts up to date. (If you like to "fly by the seat of your pants," you can always use the latest version of any script that they offer. These could fix security holes, but generally just break your stuff.)
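If you go the CDN route with requirejs, here is a hedged sketch (URLs and module names are illustrative) of pointing a library at a CDN with a local fallback:

require.config({
  paths: {
    // requirejs tries each entry in order, so the local copy acts as a
    // fallback if the CDN is unreachable
    jquery: [
      "https://ajax.googleapis.com/ajax/libs/jquery/3.6.0/jquery.min",
      "lib/jquery"
    ]
  }
});

In the r.js build config you can then map such modules to "empty:" so the CDN-hosted libraries stay out of your optimized bundle.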
Let's say I'm building a static 10-page website for a client, and there's only a few lines of JavaScript for the whole site (less than 1KB). In this situation I'd guess that it's best (for performance) to put the <1KB of JavaScript code inline between script tags on every page, rather than in an external .js file. The extra bandwidth consumption (when moving between pages) is probably worth it for removing a whole HTTP request.
At the other end of the spectrum, if I had 200KB of JavaScript on the same website, then I'd definitely put this in a separate file to reduce bandwidth when moving between pages on the site.
But I've no idea where the 'cut-off point' should be. If I have 5KB of JS, should I just put this inline in my HTML? What about 10KB? 20KB?
Obviously the 'cut-off point' is going to depend on the situation, e.g. it might be different for mobile sites. But if anyone has any general pointers that would help guide this decision, then I'd like to hear them.
(NB: I'm interested only in performance here, not maintainability etc. I can keep my code in separate files but use some kind of build process or serverside middleware to automatically inline it, so maintainability will not be an issue.)
(Bonus points: please let me know if all the considerations are exactly the same with inline vs. external CSS.)
I'm strictly talking about performance here, more as a thought experiment (and please excuse the lack of experimental rigor).
If you're inlining the JavaScript, yes, you do save time because everything is done in one HTTP request. However, you should factor in the time it takes for the server to dynamically generate the web page (SSL also adds time to this).
Best case
If you can create a build that generates minified JavaScript and injects it into an HTML page, combined with gzip compression, you should end up with much lower bandwidth usage. The more text you put through the compression, the bigger your payoff compared to gzipping each request individually. Ultimately, if you can serve a single static HTML page with all the JavaScript/CSS inline, it will definitely be faster.
Normal case (with dynamic html and shared js libraries)
A small sample of some stackoverflow posts and how they affect my own bandwidth:
304ms 9.67KB
204ms 11.28KB
344ms 17.93KB
290ms 17.19KB
210ms 16.79KB
591ms 37.20KB
229ms 30.55KB
Judging from this, it seems like the overhead incurred (disregarding filesize) is around 150ms for each HTTP connection - perhaps in the worst case. Now, the question is: how big does the text (JavaScript in your case) have to be before its download costs you that 150ms? According to this article (http://www.telecompetitor.com/akamai-average-u-s-broadband-connection-speed-is-now-5-3-mbps/), for broadband users we are working with an average of 5.3 mbps (which is .6625 MB/s, or 678.4 KB/s, or .6784 KB/ms). This means that for the average broadband user, roughly 100KB of gzipped+minified JavaScript takes about as long to download as one extra HTTP request costs.
Please adjust the parameters yourself to see what this may mean for your audience. This number, whatever you calculate it to be, is the point at which you may be better off serving it inline (through server-generated response) versus fetching it/caching it externally.
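A back-of-the-envelope version of that calculation (plug in your own numbers):

const requestOverheadMs = 150;    // rough per-request overhead estimated above
const bandwidthKBperMs = 0.6784;  // ~5.3 mbps average broadband, as above

// File size at which downloading the extra inline bytes costs as much time
// as the one HTTP request you saved:
const breakEvenKB = requestOverheadMs * bandwidthKBperMs;
console.log(breakEvenKB.toFixed(1) + " KB"); // roughly 100 KB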
All in all, I don't think this is a performance bottleneck at all. The size of compressed + minified JavaScript is usually negligible, and it would have to grow to unmaintainable orders of magnitude before it mattered.
The cutoff point is 0KB -- the answer to "When is a JavaScript small enough to be worth inlining" is "never".
Aside from the maintainability issues, which you say you don't want to consider, it's better for performance anyway. Yes, you have one extra HTTP request the first time you reference the .js file, but after that it's cached so your overall traffic/bandwidth decreases over time.
The same is true for CSS files.
Making the page heavier with inline JS or CSS is more costly than the extra request for the initial fetches of the files (unless you're serving completely static .html where the whole page is cached for a long time).
Short answer: it depends!
Assuming:
You would not inline the script by actually copy-and-pasting it into every HTML page
A separate file would be cached properly
Do the numbers. Come up with a rough estimate for how much extra traffic would be generated by the extra HTML, in proportion to the rest of the traffic, and then decide if you think that's an acceptable compromise.
If it's 1% it's probably not worth worrying about. But if it's 50% then you can halve your traffic by putting the script in a separate file.
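As a hedged sketch of "doing the numbers" (every figure below is an assumption to be replaced with your own):

const pagesPerVisit = 10;  // assumed pages a visitor views
const scriptKB = 5;        // the script under consideration
const avgPageKB = 30;      // assumed HTML weight without the script

// Inline: the script is re-sent with every page view.
const inlinedKB = pagesPerVisit * (avgPageKB + scriptKB);
// External: one extra request, then served from cache.
const externalKB = pagesPerVisit * avgPageKB + scriptKB;

const extra = (inlinedKB - externalKB) / externalKB * 100;
console.log(extra.toFixed(1) + "% extra traffic from inlining"); // ~14.8% with these numbers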
What is the maximum size of JavaScript that would be reasonable for a web page? I have a JavaScript program with a data segment of size about 130,000 bytes. There is virtually no whitespace, comments, or variables in this file which could be minified. The file looks something like this:
"a":[0],
"b":[0,5],
"c":[3,4,24],
"d":[0,1,3],
going on for several thousand lines.
Google Analytics gives the following info on the connection speed of the current users:
Rank Type Visitors
1. DSL 428
2. Unknown 398
3. Cable 374
4. T1 225
5. Dialup 29
6. ISDN 1
Is the file size too much?
The alternative is using a server-side program with Ajax.
The smaller the size, the better the load time. If you are concerned about the file size, try gzipping it. You can also minify the JS file.
Minifying JS and CSS files is one of the performance rules that Yahoo suggests. For more detailed reading, check this out:
Best Practices for Speeding Up Your Web Site
Edit
Check this one
How To Optimize Your Site With GZIP Compression
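Minification and gzip stack, so it is worth checking both. A hedged Node sketch (file name assumed) to preview how much a given file shrinks when gzipped; in production the web server normally does the gzipping on the fly:

const fs = require("fs");
const zlib = require("zlib");

const js = fs.readFileSync("data.js");  // assumed path to your JS file
const gz = zlib.gzipSync(js);
console.log(js.length + " bytes -> " + gz.length + " bytes gzipped");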
It depends on your users and what sort of connection speeds they have. With a 1 Mb/s connection or faster it probably wouldn't be too noticeable, but with an older modem it would be very irritating having to wait 10 seconds or more.
You could try Minify to compress your script: http://code.google.com/p/minify/
You can also load your scripts in the background using AJAX: http://betterexplained.com/articles/speed-up-your-javascript-load-time/
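The background-loading technique that article describes boils down to injecting a script tag; a hedged sketch (URL is illustrative):

function loadScript(url, onLoad) {
  const s = document.createElement("script");
  s.src = url;
  s.async = true;
  s.onload = onLoad;
  document.head.appendChild(s);
}

loadScript("/js/data-segment.js", function () {
  // the large data segment is available here without blocking the initial render
});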
Whatever your users will tolerate given their connection speed: how long can they wait versus the benefit they gain from waiting?
a download calculator might help ya
130k would take about 25-35 seconds to download on dialup.
As someone who is forced to use dialup two or three times a year, I'll tell you - if you're programming a web application that I wanted to use, I might stick around to use it. If it's just a site that I'm surfing to randomly, I'd be outta there :)
You should definitely look into minifying the script. Looks like others have found the links before I did, so definitely check them out.
It is very important for web page load time to keep your JavaScript files small. Here are some points:
Use external JavaScript files.
Put all your JavaScript just before the closing body tag.
Try to minimize file size using the tools mentioned above.
There are many more tips regarding this here
I'm making an AIR application (so the download time doesn't have a huge impact). Does combining and minifing all the JavaScript files affect the performance? How would obfuscating affect the performance?
Minifying improves performance for your page overall by decreasing the load time (even if only slightly).
Neither minifying nor obfuscating alters the execution time by any perceivable amount for the vast majority of JavaScript code out there.
I do recommend minifying for those reasons and more. Minifying multiple scripts together (like jQuery and its plugins) can yield even greater savings.
As pointed out, on constrained devices and/or with very large codebases minifying could yield a noticeable result.
Minification
Minification does improve performance for two reasons:
Reduced file size (because it removes comments and unnecessary whitespace), so your script loads faster, even if it is embedded into the <head>.
It is parsed faster, since comments and whitespace don't have to be skipped over (they're simply not there).
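A toy illustration (not real minifier output) of both effects:

// Before minification: comments and whitespace all have to be downloaded
// and skipped over by the parser.
function addTax(price, rate) {
  // rate is a fraction, e.g. 0.2 for 20%
  return price + price * rate;
}

// After minification (same behaviour, a fraction of the bytes):
// function addTax(n,r){return n+n*r}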
Combining
I've written quite a few HTML/JavaScript AIR applications, and from personal experience, combining files won't make a difference. In fact, it's good practice to separate the script based on certain criteria (classes, global functions, SQL functions, etc.). It helps keep them organised when the project becomes too big.
Obfuscation
Obfuscation is usually a combination of minification and variable renaming. Some obfuscators also use eval to unpack the code again at runtime. This reduces performance for obvious reasons, but how much depends on the size of your code.
I'd suggest running tests to understand this best for your specific situation.
Everyone here already talked about minifying, but nobody talked about the second part of your question - combining. This will definitely improve performance, probably even more than minifying.
Multiple files require multiple HTTP requests, so when you put them all into one file, only one request is needed. This is important for two reasons:
each individual HTTP request may take longer to load for various routing reasons, and a single slow file can potentially delay your whole application.
browsers and other clients have a limit on how many files they will download concurrently from a single domain. Depending on the number of files in your application, this may mean the client queues them up, making the load take even longer.
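As a hedged sketch of combining (the tooling and paths are illustrative; any bundler works the same way), concatenate first and then minify, so the client makes one request instead of fifteen:

const gulp = require("gulp");
const concat = require("gulp-concat");
const uglify = require("gulp-uglify");

gulp.task("bundle", () =>
  gulp.src(["js/vendor/*.js", "js/app/*.js"]) // vendor libs first, then application code
    .pipe(concat("bundle.js"))                // one file, one HTTP request
    .pipe(uglify())
    .pipe(gulp.dest("dist"))
);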
Also, besides minifying and combining, you have to absolutely make sure you have some sort of server-side compression enabled. This can save you 90% or even more in the amount of bytes transferred, depending on the files.
You can read more about compression (gzip, deflate) in How to make your site lightning fast by compressing (deflate/gzip) your HTML, JavaScript, CSS, XML, etc. in Apache.
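That article covers Apache; if your server happens to be Node/Express instead, the equivalent is a hedged one-liner with the compression middleware:

const express = require("express");
const compression = require("compression"); // gzips/deflates responses on the fly

const app = express();
app.use(compression());
app.use(express.static("dist"));  // serve the combined, minified bundle
app.listen(3000);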
Minification does not improve the speed at which JavaScript executes at runtime, which I believe is what you're really interested in. In fact, if you use Packer's Base64 compression, it's actually slower on initial load.
Minification will reduce the size of the JavaScript though, making your application's download size smaller.
Minifying strips out all comments and superfluous white space, and shortens variable names. It thus reduces download time for your JavaScript files as they are (usually) a lot smaller in filesize. So, yes, it does improve performance.
The obfuscation shouldn't adversely affect performance.
The article Best Practices for Speeding Up Your Web Site talks about minifying.
I'd like to post this as a separate answer as it somewhat contrasts the accepted one:
Yes, it does make a performance difference as it reduces parsing time - and that's often the critical thing. For me it was roughly linear in the size: I could get the parse time from 12 seconds down to 4 seconds by minifying from 3 MB to 1 MB.
It's not a big application either. It just has a couple of reasonable dependencies.
So the moral of the story here is: Yes, minifying is important for performance - and not because of bandwidth, but because of parsing.
According to this page:
Minification in JavaScript is the process of removing all characters that are not necessary from the JavaScript source code. That is why it is called “minification” – because all of the data that is not necessary to the functioning of the JavaScript is removed from the source code, and therefore the JavaScript is “minimized”. Even though these characters are removed from the JavaScript source code, the functionality of the JavaScript code does not change at all.
So, your JavaScript code will behave exactly the same even after it goes through the minification process. Code that has gone through the minification process is also known as "minified" code.
What are the benefits and advantages of JavaScript minification
The main purpose of JavaScript minification is to speed up the downloading or transfer of the JavaScript code from the server hosting the website’s JavaScript. The reason that minification makes downloads go faster is because it reduces the amount of data (in the minified JavaScript file) that needs to be downloaded. Less data means that the user’s browser spends less time processing that data, which is why time is saved. So, we can say that minification is performed on JavaScript source code because it is essentially a performance enhancement – and it allows websites that use minified JavaScript to load faster.