I've been facing PageSpeed problems using this tool from Google.
My goal is to reach ~95 on PageSpeed Insights.
I use create-react-app, and after I've:
Enabled gzip in nginx for all project files (see the nginx sketch after this list),
Compressed all images as suggested by Google,
Used the loadCSS polyfill to avoid render-blocking CSS (I placed the links for sanitize.min.css and the slick-carousel styles I use in public/index.html, as suggested),
Loaded fonts using fontfaceobserver, as suggested here,
Reviewed my CSS files and reduced the amount of inline CSS in my JS files to a minimum.
My current PageSpeed scores are Medium (72) and Good (85). PageSpeed Insights suggests eliminating render-blocking JS (this is my main.js from React), optimizing CSS delivery (my main.css), and leveraging browser caching (which I think does not apply here).
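For reference, a gzip setup in nginx typically looks something like this (a sketch; the gzip_types list is an assumption and should match your actual assets):

gzip on;
gzip_comp_level 5;
gzip_min_length 1024;
gzip_types text/css application/javascript application/json image/svg+xml;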
I've seen similar questions, and I tried loading pages as chunks using code splitting (from this tutorial), but it only reduced my score from what I have now to 40-70 for mobile and desktop respectively, which makes no sense to me: why would smaller chunks of code do this? I also tried different approaches to react-router code splitting, but they didn't improve PageSpeed at all, and some even made it worse.
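For reference, route-level code splitting with React.lazy and Suspense (available since React 16.6) looks roughly like this; Home and About are placeholder components:

import React, { lazy, Suspense } from 'react';

// Each lazy() call becomes a separate chunk, fetched on first render
const Home = lazy(() => import('./Home'));
const About = lazy(() => import('./About'));

function App() {
  return (
    <Suspense fallback={<div>Loading...</div>}>
      {/* render Home or About depending on the current route */}
    </Suspense>
  );
}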
I thought the issue might be that I
@import "material-components-web/material-components-web";
in my index.scss, which is pretty heavy itself. I tried importing the separate MDC packages instead, but it didn't affect PageSpeed at all. What can be done to reach the desired 90+ PageSpeed in my case? Thank you!
Well, create-react-app is a broad starter project, meaning it is made with many different use cases in mind, and with that come ~50-100 dependencies, most of which you probably don't need or use. Such a simple tool comes with downsides like this. If you are a beginner it is an awesome tool to use; if you are advanced, you'll find you have to eject to make more advanced config changes.
If you really need to tweak it for your every need, you need to eject, then start by cutting out the dependencies you don't need. But if you are a beginner, I'd say stick with it.
I know this isn't a precise answer to your question, but broad questions receive broad answers :)
I have an Angular 7 app that was passed on to me. First Contentful Paint and Time to Interactive are about 6 seconds! The page seems to stall (pending) for about 4 seconds on calls to https://www.google-analytics.com/collect and https://fonts.gstatic.com/s/opensans, which I found is related to Google Fonts, though this might not be the cause of the stall. I tried lazy-loading modules to get the bundle smaller. Currently we are at: styles.css 465 B, runtime.js 1.1 kB, polyfills.js 36 kB, styles.js 10 kB, vendor.js 583 kB, main.js 142 kB. Bundle size doesn't seem to be the issue, but the browser (Chrome) still stalls for 6 seconds before the user sees the home page. Does anyone have any advice?
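For reference, module lazy loading in Angular routes looks roughly like this (a sketch; AdminModule and its path are placeholder names):

// app-routing.module.ts: the admin chunk is only fetched when /admin is visited
const routes = [
  // Angular 7 string syntax; on Angular 8+ use:
  // loadChildren: () => import('./admin/admin.module').then(m => m.AdminModule)
  { path: 'admin', loadChildren: './admin/admin.module#AdminModule' }
];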
I also looked at SSR, but it seems very complicated to set up considering we are serving from AWS S3 using the content in dist. I am building with:
ng build --outputHashing=all --prod
Please help. I want to make sure there is nothing else I can improve before trying SSR. Thank you!
First of all, you can try updating the Angular version to the latest, as it brings performance upgrades and bug fixes. That is what I would do before SSR. Nevertheless, SSR is a must if you want a better user experience. Considering your numbers, your bundle size is not that big; my app is triple the size of yours and it loads faster. As I said, I render on the server (SSR) and am using v9 so far. But it is very hard to answer the question without any code samples.
Cheers!
You can try to optimize your code a bit and use the AOT compilation mode at build time.
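For example (note that on recent Angular CLI versions --prod already implies AOT):

ng build --aot          # AOT-compile a development build
ng build --prod         # production build; enables AOT by default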
I found a good article; you can explore it to understand exactly how you can improve your code and reduce loading time.
https://www.dotnettricks.com/learn/angular/tips-to-optimize-your-angular-application
I am developing a pretty big SPA (final bundle ~30 MB) and unfortunately one of the requirements is that the app has to be released as one big HTML file. I use webpack to connect all the pieces together.
Currently I am facing a performance problem (some of the libraries are quite big). They eat a lot of RAM and hurt loading time because of code evaluation in the browser. I would like to postpone that evaluation and evaluate only the modules that are necessary for the app's main screen.
My idea is to use the same mechanism webpack uses for source maps:
https://webpack.js.org/configuration/devtool/ (eval-source-map)
Webpack simply wraps each module's code in eval("code of module"), which prevents automatic evaluation by the JavaScript engine. Of course this code can't be minified, and a source map is attached as base64 at the end. I would like to do the same without source maps and with uglification included. Moreover, I have an idea to reduce the size of the application by compressing the sources, so eventually it would be eval(gz.decompress("code of module")).
This would be a huge change to the application, so before I reinvent the wheel I would like to ask you:
Does this approach make sense for the problem?
Do you know any existing solutions?
Do you suggest using any existing webpack components, like:
https://webpack.github.io/docs/code-splitting.html
or should I write my own solution from scratch (a loader/plugin)?
Don't do what you want to do!
If you do want a weird trick to get what you want, try including your big JS file dynamically (see here, or google "jquery getscript"). No additional webpack actions required.
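A minimal sketch of that dynamic inclusion without jQuery (big-bundle.js is a placeholder path):

// Injecting a script tag defers both download and evaluation until you choose
var script = document.createElement('script');
script.src = '/big-bundle.js';
script.onload = function () { console.log('bundle evaluated'); };
document.body.appendChild(script);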
If not, please continue reading.
You're dealing with the problem from the wrong perspective.
First, make sure you are doing all the obvious HTML/HTTP stuff:
You're serving the gzip-ed version of the file (if not, google "http script gzip").
You're including the <script> tag at the end of the body, so the browser downloads and parses the JS only after the HTML has been rendered.
Then, most importantly, try to figure out where the 30 MB is coming from. It's unlikely to be a fair sum of many ordinary dependencies; usually it's one particular bloated library (or two). For example, prefer got over request, because the latter is bloated. Find alternatives for the oversized dependencies.
No SPA in the world should have a 30 MB JS bundle. I'm assuming your project isn't actually that large, because otherwise it would be business critical and you would invest in a decent back-end strategy (e.g. code splitting, dead code elimination, etc.).
1) A similar problem can be solved with webpack's code splitting functionality.
The idea is that you don't load route-specific code and libraries until the user accesses that specific page.
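A minimal sketch of on-demand loading with a dynamic import (webpack 2+ syntax; chart-page and renderCharts are placeholder names):

// webpack splits this module into its own chunk, fetched on first use
document.getElementById('charts-link').addEventListener('click', function () {
  import('./chart-page').then(function (mod) {
    mod.renderCharts();
  });
});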
2) Take a look at script-ext-html-webpack-plugin; it looks very promising for these kinds of things. For example, the defer option is for modules or scripts whose execution you want to delay, while async is for scripts you want to execute as soon as they arrive, while the HTML is still being parsed. Be careful, though, about race conditions.
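A sketch of the plugin's config, assuming it is used alongside html-webpack-plugin (the name patterns are placeholders; check the plugin's README for the exact option shapes):

// webpack.config.js (excerpt)
plugins: [
  new HtmlWebpackPlugin(),
  new ScriptExtHtmlWebpackPlugin({
    async: ['analytics'],  // scripts whose names match get the async attribute
    defer: /heavy/         // scripts whose names match get the defer attribute
  })
]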
3) You mentioned that the libraries you use are big, so make sure you use webpack with tree shaking. If you use the old webpack (version 1.x), which does not have tree shaking, you should optimize imports manually. For example, instead of import _ from 'lodash', use import map from 'lodash/map'.
4) You also mentioned that RAM is the problem, so how would compression help with RAM? Compression only helps the browser retrieve the files faster; the decompressed code still has to be parsed and held in memory.
5) The other idea, sketched below, would be:
Load only the scripts you need for the home page,
execute them (at this point, the user sees a functioning page),
then, behind the scenes, slowly load the other scripts without the user noticing,
and evaluate the loaded code as the user comes to need it.
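A sketch of that background loading using requestIdleCallback (extras.js is a placeholder; requestIdleCallback isn't available in every browser, hence the setTimeout fallback):

// After the home page is interactive, quietly load the rest
function loadExtras() {
  var s = document.createElement('script');
  s.src = '/extras.js';
  document.body.appendChild(s);
}
if ('requestIdleCallback' in window) {
  requestIdleCallback(loadExtras);
} else {
  setTimeout(loadExtras, 2000); // crude fallback
}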
I am working on a website based on AngularJS with Rails on the backend.
The site is currently live in production.
The issue I am having is that after the assets have been precompiled with rake assets:precompile, the overall JS file size goes above 1 MB, so the site takes a long time to load.
This is a major issue, and since the site is fully AJAX-based, I cannot implement page caching.
I have also tried gzip on my nginx server, but it is not helping.
This is hampering the performance of the site, and I would welcome any help or suggestions.
Thanks
I don't know about RoR or the rake assets you mentioned, but here are a few leads and how I proceed (lately I've been starting to use Grunt; a Gruntfile sketch follows this list):
Concat your JS files into one file. It's easier to process one request than many little ones.
Minify your JS files, and make sure to use the minified versions of libraries.
Adopt a smart approach to loading your libraries and your own files. For instance, if you only need graphics in your admin dashboard, make sure not to load d3.js on your front page. I know the jQuery ecosystem is full of useful plugins, but I've seen way too many developers take shortcuts and claim they need jQuery when viable alternatives exist.
Serving files with gzip is a good idea. This should reduce the size of your files significantly.
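A minimal Gruntfile covering the first two points (paths and file names are placeholders):

// Gruntfile.js: concatenate, then minify
module.exports = function (grunt) {
  grunt.initConfig({
    concat: {
      dist: { src: ['src/js/**/*.js'], dest: 'dist/app.js' }
    },
    uglify: {
      dist: { files: { 'dist/app.min.js': ['dist/app.js'] } }
    }
  });
  grunt.loadNpmTasks('grunt-contrib-concat');
  grunt.loadNpmTasks('grunt-contrib-uglify');
  grunt.registerTask('default', ['concat', 'uglify']);
};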
Also, could you provide a link to your website?
I am not even sure if something like what I want is possible, so I am asking you guys to let me know if anyone has done this before. My goal is that when I click "Publish" in VS2010, all JavaScript files get compressed into one, the same happens with CSS, and the references in my layout file are then changed from all the different JS and CSS files to just those two merged ones. Is that doable? Or maybe it's doable but in a more manual way?
Of course, the goal here is to have only two calls to external files on the website, but when I develop I need to see all the files so that I can actually work with them. I guess I could do it manually before each push, but I'd rather have it done automatically by some script or other. I haven't tried anything yet, and I am not looking for a ready-made solution; I am just looking to understand the problem better and maybe pick up some tips.
Thanks a lot!
This is built into ASP.NET 4.5. But in the meantime, you should look at the following projects:
YUI Compressor
The objective of this project is to compress any Javascript and Cascading Style Sheets to an efficient level that works exactly as the original source, before it was minified.
Cassette
Cassette automatically sorts, concatenates, minifies, caches and versions all your JavaScript, CoffeeScript, CSS, LESS and HTML templates.
RequestReduce
Super Simple Auto Spriting, Minification and Bundling solution
No need to tell RequestReduce where your resources are
Your CSS and Javascript can be anywhere - even on an external host
RequestReduce finds them at runtime automatically
SquishIt
SquishIt lets you squish some JavaScript and CSS. And also some LESS and CoffeeScript.
Combres
.NET library which enables minification, compression, combination, and caching of JavaScript and CSS resources for ASP.NET and ASP.NET MVC web applications. Simply put, it helps your applications rank better with YSlow and PageSpeed.
Chirpy
Mashes, minifies, and validates your JavaScript, stylesheet, and dotless files. Chirpy can also auto-update T4MVC and other T4 templates.
Scott Hanselman wrote a good overview blog post about this topic a while back.
I voted up the answer that mentioned Cassette, but I'll detail that particular choice a little more. Cassette is pretty configurable, but under the most common options it allows you to reference CSS and JavaScript resources with syntax like this:
Bundles.Reference("Scripts/aFolderOfScriptsThatNeedsToLoadFirst", "first");
Bundles.Reference("Scripts/aFolderOfScripts");
Bundles.Reference("Styles/aFolderOfStyles");
You would then render these in your master or layout pages like this:
@Bundles.RenderStylesheets()
@Bundles.RenderScripts("first")
@Bundles.RenderScripts()
During development, your scripts and styles are included as individual files, and Cassette helps you out by detecting changes and making the browser reload those files. This approach is great for debugging into libraries like Knockout when they're doing something you don't expect. And, best of all, when you launch the site you just change web.config and Cassette will minify and bundle all your files into as few bundles as possible.
You can find more detail in their documentation (which is pretty good though sometimes lags behind development): http://getcassette.net/documentation/getting-started
Have a look at YUI Compressor on codeplex.com; it could be really helpful.
What I have done before is set up a post-build event that runs a simple batch file to minify your source files. Then, if you're in release mode (not debug mode), you reference the minified source files. http://www.west-wind.com/weblog/posts/2007/Jan/19/Detecting-ASPNET-Debug-mode
I haven't heard of publish-time minification. I think you should choose between dynamic minification, like SquishIt, and compile-time minification, like YUI Compressor or Ajax Minifier.
I prefer compile time; I don't think having files change at compile time is much of a problem. If you have huge amounts of CSS/JS code, you can enable this step only for release compilation and publish these files only in the build configurations that need them.
I don't know if there is any way to hook into the functionality behind that 'Publish' button, but it's certainly possible to have that kind of static build process.
Personally, I'm using Apache Ant to script exactly what you've described. You develop on your uncompressed js/html/css files, and when you're done you call something like ant build, which then minifies, compresses, strips, and publishes your whole web application.
Example script: https://github.com/jAndreas/typeof-NaN-2.0/blob/master/build/build.xml
Multiple sites recommend combining JavaScript and CSS files to improve web page performance, including examples of using Ant build scripts to concatenate the files prior to deployment.
I've searched and haven't found any information on how to automate updating the references to those files in HTML and other documents. I am looking to avoid hacking together something error-prone, and I want to learn from others who have automated their builds in the past.
Are there automated tools in the wild that complete this task that I'm not seeing? Are there recommended processes for updating the script and link tags in HTML? Can these solutions be integrated with Ant or similar build tools?
There sure are, and it's a smart thing to do.
I found a PHP solution; I don't know if that's okay for you, but even if it isn't, you can still read its source (it's not difficult) and learn a lot. The solution works like this:
Rewrite your requests like this: from css/main.css and css/skin.css to css/main.css,skin.css (of course you can put many more).
Use Apache's mod_rewrite to redirect this request to a script (in our case combine.php) that will combine all the files into a single one (a rule sketch follows at the end of this answer).
The script combines all the files and sends the combined file, then saves it to a cache folder.
Next time around, it checks whether there is an up-to-date version in the cache and serves that one. If the latest file modification time has changed, it discards the cache.
The solution works great, and it even makes use of HTTP cache headers and emits ETags, which you should do anyway.
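A sketch of the rewrite rule for step 2 (the URL pattern and combine.php are placeholders based on the description above):

# .htaccess: send comma-separated CSS requests to the combiner script
RewriteEngine On
RewriteRule ^css/([A-Za-z0-9,._-]+\.css)$ combine.php?type=css&files=$1 [L]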
You are correct, this is a great way to speed up page loading. It will even work in conjunction with a CDN, which another poster recommended.
Here is a small tool that will pack multiple files into one for deployment. It supports both JS and CSS, and will even minify them by removing whitespace, etc. Just hook it into your build and deploy scripts.
juicer: http://cjohansen.no/en/ruby/juicer_a_css_and_javascript_packaging_tool
Even better, it will follow JS and CSS import statements, so you only need to point your HTML files at the loader file and it will work in both development and production (assuming you replace the loader file with the combined file on deployment).
There are others, including some run-time solutions. But it sounds like you have a build process in place anyway.
As far as HTML updating goes, if you still need it: automated deployments are very popular in the Ruby world, so you may find some standalone utilities that help even with non-Ruby projects (as above). Methinks this would be best handled by your own project's template language, though (with a static resource revision ID, or some such).
Good luck, and let us know what you find.
I think what you really want is a CDN, a Content Delivery Network.
Read about it here:
http://developer.yahoo.com/performance/rules.html
http://en.wikipedia.org/wiki/Content_delivery_network