What is the maximum size of JavaScript that would be reasonable for a web page? I have a JavaScript program with a data segment of about 130,000 bytes. The file contains virtually no whitespace, comments, or variable names that could be minified. It looks something like this:
"a":[0],
"b":[0,5],
"c":[3,4,24],
"d":[0,1,3],
and so on for several thousand lines.
Google Analytics gives the following info on the connection speed of the current users:
Rank Type Visitors
1. DSL 428
2. Unknown 398
3. Cable 374
4. T1 225
5. Dialup 29
6. ISDN 1
Is the file size too much?
The alternative is using a server-side program with Ajax.
The smaller the file, the better the load time. If you are concerned about the file size, try gzipping it. You can also minify the JS file.
Minifying JS and CSS files is one of the performance rules Yahoo suggests. For more detailed reading, check this out:
Best Practices for Speeding Up Your Web Site
Edit: Also check this one:
How To Optimize Your Site With GZIP Compression
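If your stack happens to be Node.js (an assumption; the question doesn't say what the server runs), a minimal sketch of serving the file gzipped, using the express and compression npm packages:

const express = require('express');       // npm install express compression
const compression = require('compression');

const app = express();
app.use(compression());                    // gzip responses when the client sends Accept-Encoding: gzip
app.use(express.static('public'));         // e.g. serves public/data.js compressed
app.listen(3000);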
It depends on your users and what sort of connection speeds they have. With a 1 Mb/s connection or faster it probably wouldn't be too noticeable, but with an older modem it would be very irritating to wait 10 seconds or more.
You could try Minify to compress your script: http://code.google.com/p/minify/
You can also load your scripts in the background using AJAX: http://betterexplained.com/articles/speed-up-your-javascript-load-time/
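For the background-loading idea, a minimal sketch that injects a script element so the download doesn't block rendering (the file name data.js is hypothetical):

function loadScript(src, onLoad) {
  var s = document.createElement('script');
  s.src = src;
  s.async = true;                 // don't block parsing while downloading
  s.onload = onLoad;
  document.getElementsByTagName('head')[0].appendChild(s);
}

loadScript('data.js', function () {
  // the 130,000-byte data segment is available here
});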
Whatever your users will tolerate given their connection speed: how long can they wait versus the benefit they gain? A download calculator might help you estimate this.
130k would take about 25-35 seconds to download on dialup.
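For the curious, the arithmetic behind that range: 130,000 bytes × 8 = 1,040,000 bits, and a 56 kbit/s modem that sustains roughly 33-40 kbit/s in practice gives 1,040,000 / 40,000 ≈ 26 seconds up to 1,040,000 / 33,600 ≈ 31 seconds, before any request overhead.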
As someone who is forced to use dialup two or three times a year, I'll tell you - if you're programming a web application that I wanted to use, I might stick around to use it. If it's just a site that I'm surfing to randomly, I'd be outta there :)
You should definitely look into minimizing the script. Looks like others have found the links before I did, so definitely check them out.
It is very important for page load time to keep your JavaScript files small. Here are some points:
Use an external JavaScript file.
Put all your JavaScript at the bottom of the page, just before the closing body tag.
Try to minimize the file size using the tools mentioned above.
There are many more tips regarding this here.
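As a minimal sketch of the first and second points (the file name app.min.js is hypothetical), an external, minified script placed just before the closing body tag:

<html>
<head>
  <title>Example</title>
</head>
<body>
  <p>Page content renders before the script downloads.</p>
  <script type="text/javascript" src="app.min.js"></script>
</body>
</html>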
I have run Google PageSpeed Insights, and its main suggestion is to minify the JS file, with over 70% of savings (from 423 KB to 312 KB, to be exact), which is insane.
But the file is minified! What am I missing?
Resources:
JS file.
Web Page - https://www.eldorado.gg
P.S.
I have gone through other related questions on SO and none of them matched my problem.
Google PageSpeed Insights should be taken with a pinch of salt.
It can give you a very high-level view of massive issues, but once you start to optimise things, it falls short. It believes your JS file is large and that minimising it will help; it can't actually tell that you have minimised it already.
I once had it tell me to use gzip compression for a single, cached 2 KB SVG file. Another time, it asked me to shrink the 200 or so tiny 12 KB JPEGs a site was using. It's not very clever: in the first case it was wasting my time, and in the second there was a better answer (use sprite sheets), but it's unable to look that deeply.
I'd recommend skimming your minified file and looking for human-readable JS tokens.
From what I can see, the test mainly looks at JS token length.
https://github.com/GoogleChrome/lighthouse/blob/master/lighthouse-core/audits/byte-efficiency/unminified-javascript.js
/**
* @fileoverview Estimates minification savings by determining the ratio of parseable JS tokens to the
* length of the entire string. Though simple, this method is quite accurate at identifying whether
* a script was already minified and offers a relatively conservative minification estimate (our two
* primary goals).
*
* This audit only examines scripts that were independent network requests and not inlined or eval'd.
*
* See https://github.com/GoogleChrome/lighthouse/pull/3950#issue-277887798 for stats on accuracy.
*/
If you see human readable tokens in your minified file, try looking at your minifier settings to see if there is an option you can toggle that gets rid of these.
If there is no setting that gets rid of them, try identifying where they come from and how they differ from the tokens that are minified successfully. The culprit might be a coding pattern, framework, or transpiler.
Note: The source code is linked from their help page (https://web.dev/unminified-javascript/)
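To make the heuristic concrete, here is a rough sketch of the token-ratio idea; this is not Lighthouse's actual implementation, just an illustration of the principle described in the comment above:

function estimateMinifiedRatio(source) {
  // Strip comments, then measure how much of the file is non-whitespace.
  // Crude: it will miscount things like '//' inside string literals.
  var stripped = source
    .replace(/\/\*[\s\S]*?\*\//g, '')   // block comments
    .replace(/\/\/[^\n]*/g, '');        // line comments
  var tokenLength = stripped.replace(/\s+/g, '').length;
  return tokenLength / source.length;   // close to 1.0 => probably minified
}

// A ratio well below 1.0 means whitespace/comments still dominate the file.
console.log(estimateMinifiedRatio('var x = 1; // demo'));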
I have a website that uploads images.
I do a CROP on the client side before upload; then the server side produces a new, optimized version.
On mobile devices, when there is no free RAM, this fails.
How can I get the RAM/memory usage from JavaScript, so I can skip the CROP if there is not enough memory?
I am looking for a JavaScript-only solution!
To be clear: I do not have a memory leak!
If the user opens many apps and not much RAM is left, my strategy does not work.
I need JavaScript code to get the free RAM; if it is below some amount, I skip the CROP.
------------ OK, defining the failure: ------------
From mobile devices, people take a photo and upload it.
In JavaScript I perform the CROP:
1. an image of around 2 MB goes down to about 300 KB
2. I upload only the 300 KB; then on the server side 300 KB --> 30 KB, which is what I store
If there is no RAM, this FAILS.
I do not want to tell the user "try again";
I would like to skip the CROP instead.
Thank you very much for the comments.
I handle the errors, but I would like to avoid making the client wait 40-60 seconds only to see an error message.
Skipping the CROP works, but it costs nearly 1.7 MB of bandwidth per image... greedy :-)
window.performance looks good, I will use it, thanks.
I will also research what I can learn from a server-side round trip, and whether that works for mobile devices.
In Development
Use Chrome's DevTools for pretty comprehensive diagnostics. You can get JavaScript run-time diagnostics, request information, and basically anything you might need.
Client-side Testing
As far as testing how much RAM is available in your code itself, there isn't really a "correct" or "recommended" strategy (source). Obviously the best solution would be to optimize your code; varying your site's display/interactions based on how many other apps a client is running could be confusing for the user (e.g., they expect some tool to be displayed and it never is; they think the page isn't loading properly, so they leave; etc.).
Some highlights from the source above:
"Counting DOM elements or document size might be a good estimation, but it could be quite inaccurate since it wouldn't include event binding, data(), plugins, and other in-memory data structures."
You'll need to monitor both the DOM and the memory you're using for your code for accurate results.
You could try using window.performance (source), but this isn't well-supported across different browsers.
Note: As Danny mentions in a comment below, showing an advisory message when a feature is disabled could clear up some confusion, but then why use the resources on a feature that is so optional that you could just not use it? Just my two cents... :)
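If you do go the window.performance route despite the support caveat, a minimal sketch; note that performance.memory is non-standard, Chrome-only, and reports the JS heap rather than device RAM, so treat it as a rough signal at best:

function shouldSkipCrop() {
  if (window.performance && performance.memory) {
    var m = performance.memory;
    // Skip the client-side crop when the JS heap is near its limit.
    return m.usedJSHeapSize / m.jsHeapSizeLimit > 0.9;
  }
  return false; // no data available: attempt the crop and rely on error handling
}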
PageSpeed and YSlow suggest combining JavaScript files to reduce the number of HTTP requests. But this is (I think) because pre-IE8 browsers allowed no more than 2 connections per host.
Nowadays browsers allow 6 connections per host, which means they can download scripts in parallel. So, say we have 1 MB of JavaScript: should we break it into 6 files of similar size to obtain maximum download speed? Please let me know.
Micahel.S
No, because each HTTP request involves overhead (less if pipelining is used)
The answer to your question is no. However, assuming you are able to serve your content in a completely isolated environment where only IE8 is used (like a company intranet), then the answer to your question becomes: yes.
Since you aren't designing for IE6-7, I assume you are in an isolated environment (otherwise you are making a poor design decision). In this environment, yes, you might see small benefits from breaking down JavaScript files, but I recommend against it.
Why? Since you are optimizing for speed, I assume you are putting JavaScript at the bottom of the body tag in your HTML document, in order to prevent the JS from blocking the download of the rest of the page. This is a fundamental practice for making the page appear to load faster. However, by placing the content at the bottom of the body, your question becomes moot. Since the DOM is no longer being blocked by the script tags, whatever speed benefit you could achieve through parallel downloading would be lost on the user, because they see the page load before the browser even requests the JavaScript files.
tl;dr: There is no practical speed advantage to breaking JS into multiple files for parallel downloading.
Splitting the files won't make much of a difference, really. If you want performance gains in download times for your production environment, what I always do is use YUI Compressor (http://developer.yahoo.com/yui/compressor/) to get my JS file size down as small as possible, then serve a gzipped version of the JS to browsers that support it.
In a development environment you shouldn't be too worried about it, though. Just split the files logically based on their purpose so they're easier to work on, then bring them all together into one file once you're ready and optimize it for production.
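For reference, a typical YUI Compressor invocation looks like this (the jar version number here is only illustrative):

java -jar yuicompressor-2.4.8.jar myscript.js -o myscript.min.js

You can then let the web server gzip myscript.min.js on the fly, or pre-compress it yourself for servers configured to serve .gz files directly.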
Most browsers will cache your JavaScript files anyway, so after the first page load, it won't matter.
But, you should split your JavaScript logically and in a way that would most help you in development, since browsers vary in the number of simultaneous connections they allow.
For speed, you can obfuscate your code (via a minimization routine) and serve it in a way no human would have the patience to read.
So I have been playing around with a home project that includes a lot of JS. I have been using Script# to write my own library, etc. Personally, I wouldn't write a lot of JS if I didn't have a tool like Script# or GWT to help maintain it.
So far it includes these external libraries:
– ASP.NET AJAX
– ExtJS
– Google Maps
– Google Visualizations
– My own library to wrap the above libraries and add extra functionality...
So that works out to be a heap of JS. It runs fine on my PC. However, I have little faith in JS/browsers, and I am concerned that loading too much JS will cause the browser to die or perform poorly.
Is this a valid concern?
Does anyone have experience with loading a lot of JS into the browser that resulted in performance issues? I know there are a lot of variables here, for example browser type (I assume IE is worse than others), the client PC's RAM, etc., but it would be good to hear other people's experiences. I would hate to invest a lot of time in JS only to find that I am painting myself into a corner.
The more I use Script#, the more client classes I have as I move more processing onto the client. At what point does this start becoming an issue? I'm sure the browser could easily handle 100 MS Ajax classes, but what would be too far for a browser?
NOTE: I am not concerned about the actual js file sizes but more the runtime environment that gets loaded.
There is nothing wrong with having a large number of JS files or big JS files. The project I am currently working on has more than 60 core framework libraries, and each of its 30 modules has an average of 5 to 6 JS files.
So the only concern is designing your website to make use of JS best practices and optimization techniques, like:
Minify the JS using YUI Compressor or another compression library to address the download size issue.
Enable proper caching in your web server to reduce repeat downloads.
Put your JavaScript at the bottom of the page, or make it a separate file.
Make your AJAX responses cacheable.
And finally, design your page to handle on-demand script loading; a sketch follows below.
- Microsoft Doloto is a good example of this one; download it here.
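As a minimal sketch of on-demand loading (the element ID and file name are hypothetical), fetching a module only when the user first asks for it:

document.getElementById('mapTab').onclick = function () {
  if (window.mapsModuleLoaded) return;    // only fetch once
  var s = document.createElement('script');
  s.src = 'maps-module.js';
  s.onload = function () { window.mapsModuleLoaded = true; };
  document.getElementsByTagName('head')[0].appendChild(s);
};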
And check out High Performance Web Sites and the newer Even Faster Web Sites by Steve Souders. They are must-reads for web developers; these books address the common problems web developers face today.
With modern browsers routinely occupying 250 MB of RAM or more, script caching, and optimized JavaScript engines, keeping the script library resident would probably add negligible load in most reasonable scenarios.
The biggest bottleneck would probably be the initial load time of the scripts: downloading and parsing them. But once that's done, the scripts are cached and the per-page initialization isn't very noticeable.
I highly doubt a browser would ever crash running your JS scripts, but it may become really slow and not perform the way you want. Most people are more concerned with how fast it runs, not whether it will run!
I agree with jspcal: you should be able to load quite a lot of JavaScript with no problems. The JavaScript engines in all the modern browsers are a lot faster than they were a few years ago. The initial load will be the biggest issue. If possible, I'd suggest lazy-loading scripts that aren't needed for the page to render.
Also, Steve Souders has a lot of great material about improving page load times, such as this article, which gives several techniques for loading scripts without blocking.
http://www.stevesouders.com/blog/2009/04/27/loading-scripts-without-blocking/
If you're really concerned about performance, then I would take a look at your target audience. If you think you'll have a relatively high number of IE6 users, test it out in IE6, on an older machine if possible. IE Tester is great for this.
I was wondering: if I have, let's say, 6 JavaScript includes on a page and 4-5 CSS includes as well, would the page load faster if I created one file (or perhaps two) and appended them all together, instead of having a bunch of separate ones?
Yes. You will get better performance with fewer files.
There are a few reasons for this, and I'm sure others will chime in, as I won't list them all.
There is overhead in each request in addition to the size of the file: the request itself, the headers, cookies (if sent), and so on. Even in many caching scenarios the browser will send a request to check whether the file has been modified. Of course, proper headers/server configuration can help with this.
Browsers by default open only a limited number of simultaneous connections to a given domain. I believe IE allows 2 and Firefox 4 (I could be mistaken on the numbers). Anyway, the point is: if you have 10 images, 5 JS files, and 2 CSS files, that's 17 items that need to be downloaded, and only a few will be fetched at the same time; the rest are just queued.
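To put rough numbers on it (hypothetical figures, just to show the shape of the cost): if each request needs at least one 100 ms round trip and the browser opens 2 connections to the host, those 17 items cost at least 9 × 100 ms = 900 ms of latency alone, before counting a single byte of file content.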
I know these are vague and simplistic explanations, but I hope it gets you on the right track.
One of your goals is to reduce HTTP requests, so yes. The tool YSlow can grade your application and help you see what you can do to give users a better experience.
http://developer.yahoo.com/yslow/
Even when the browser makes several requests, it tries to open the smallest number of TCP connections (see the Keep-Alive HTTP header documentation). Page load speed can also be improved by setting up compression (DEFLATE or GZIP) on the server side.
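On Apache, for example, server-side compression is a one-line directive once mod_deflate is enabled (a common setup, though your server may differ):

AddOutputFilterByType DEFLATE text/html text/css application/javascript

Keep-Alive is typically on by default in Apache 2.x and can be tuned with the KeepAlive, MaxKeepAliveRequests, and KeepAliveTimeout directives.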
Each include is a separate HTTP request the user's browser has to make, and with an HTTP request comes overhead (on both the server and the connection). Combining multiple CSS and JavaScript files will make things easier on you and your users.
This can be done with images as well, via a technique called CSS sprites.
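A minimal sprite sketch (the file name and offsets are hypothetical): many icons share one downloaded image, and background-position selects the right region of it:

.icon        { background: url(sprites.png) no-repeat; width: 16px; height: 16px; }
.icon-home   { background-position: 0 0; }
.icon-search { background-position: -16px 0; }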
Yes. You are making fewer HTTP requests that way.
The best possible solution would be to put all the code in one file so it can be fetched in one GET request by the browser. If you link to multiple files, the browser has to request each of these external files every time the page is loaded.
This may not cause a problem if pipelining is enabled in the browser and the site is not generating much traffic.
Google has streamlined its code into one file. I can't even imagine how many requests that has saved, and how much it has lightened the load on their servers, given that amount of traffic.
There's no longer any reason to feel torn between wanting to partition js & css files for the sake of organisation on the one hand and to have few files for the sake of efficiency on the other. There are tools that allow you to achieve both.
For instance, you can incorporate Blender into your build process to aggregate (and compress) CSS and JavaScript assets. Java developers should take a look at JAWR, which is state of the art.
I'm not really very versed in the factors which affect server load, but I think the best thing to do would be to find a balance between having one big chunk and having your scripts organized into meaningful separate files. I don't think that having five or so different files should influence performance too much.
A more influential factor would be compression of the scripts: there are various online utilities which get rid of whitespace and use more efficient variable names. I think these will yield much more dramatic improvements than combining the files.
As others have said, yes, the fewer files you can include, the better.
I highly recommend Blender for minifying and consolidating multiple CSS/JS files. If you're like me and often end up with 10-15 stylesheets (reset, screen, print, home, about, products, search, etc...), this tool is a great help.