I'm running an AngularJS app and can't figure out why my page is loading slowly. The CSS and JS files all load quickly, but there is a long delay between that and when the HTML loads, during which the app just seems to sit suspended, doing nothing.
headertemplate.html, footertemplate.html, and notelist.html are the partials being loaded to make up the view.
Angular uses AJAX to retrieve templates. You should be able to look at the network tab in the developer tools to see the AJAX request for each template. That might give you a clue as to what the hold-up is.
I'm not sure why it's taking so long to receive them, but one way to speed that up is to pre-cache the templates with a tool like HTML2JS. That way, your templates become just another JS file that you include along with your program code, and the templates are available as soon as your program code is loaded. It will increase your initial download size, but it will greatly improve the speed of fetching templates.
You can cache all templates in $templateCache, which means the Angular app never makes an XHR request for the partial HTML files; see the sketch after the links below.
https://docs.angularjs.org/api/ng/service/$templateCache
There are various gulp and grunt modules to automate that process.
https://www.npmjs.com/package/gulp-angular-templatecache
https://github.com/ericclemmons/grunt-angular-templates
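To make that concrete, here is a hand-written sketch of what such a generated file boils down to; the template path and markup are invented for illustration:

angular.module('app').run(function ($templateCache) {
  // Pre-populate the cache; tools like the ones above generate a file
  // full of put() calls like this from your .html partials
  $templateCache.put('partials/notelist.html',
    '<ul><li ng-repeat="note in notes">{{note.title}}</li></ul>');
});

Once an entry is in the cache, any templateUrl or ng-include pointing at 'partials/notelist.html' is served from memory instead of the network.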
By the way, are you making any server calls inside angular.forEach? For some reason those don't show up in the network tab (I have no idea why).
Or maybe you are using resolve in ui-router's state config, which takes time before the state is activated and the HTML partial is requested.
I have been using Protractor (with the Jasmine framework) to automate an AngularJS application, and now I want to automate a React JS application.
I tried Protractor (Jasmine framework) against the React JS application, but I faced a lot of issues and had to use a lot of explicit waits for the React web elements to become visible. Is there any option to make it wait until all the React web elements are visible,
or
please suggest the best way to automate a React JS application.
Not possible in Protractor, at least.
But I always use explicit waits even for Angular applications, because Protractor's built-in waiting is often not stable.
If you understand the correlation between API requests and the loading of the application, you'll have no problems. Basically, React is a library for building single-page applications: the app is loaded when you open the URL, and once it has loaded and rendered in the browser there is no more loading unless an API request is sent. When that happens, developers normally put up a loading animation. With that said, in 95% of cases you need to handle waiting in two situations: when the app is opening and when a loading animation is up. If you create methods for those and put them in your page object library, your problem is solved with a few extra lines of code here and there.
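For what it's worth, here is a minimal sketch of such a wait helper in Protractor; the selector, timeout, and helper name are assumptions, not part of any framework:

// For a non-Angular (React) app, turn off Protractor's Angular synchronization
browser.waitForAngularEnabled(false); // on older Protractor: browser.ignoreSynchronization = true

var EC = protractor.ExpectedConditions;

// Page-object helper: wait until an element is visible, then hand it back
function waitForVisible(el, timeoutMs) {
  return browser.wait(EC.visibilityOf(el), timeoutMs || 10000,
      'Element never became visible: ' + el.locator()).then(function () {
    return el;
  });
}

// Usage in a spec (the selector is illustrative)
var orderButton = element(by.css('.order-button'));
waitForVisible(orderButton).then(function (btn) {
  btn.click();
});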
By default, Angular fetches the HTML templates from the server when the user navigates to a route. With that in mind, imagine this scenario:
User loads the Angular app. The main view has a subpage called "Order".
While the user is studying the main view, a new version of the app is rolled out to production. The new version has a complete rewrite of the Order page, with new JavaScript and HTML.
The user navigates to the Order page. The JavaScript was already loaded by the browser in step 1, so the user stays on the old version until the app is reloaded. But the new template gets fetched from the server on navigation. So now the JavaScript and the template are out of sync!
Is my assumption correct that the JavaScript/HTML get out of sync?
If so, are there any best practices related to this issue?
I guess one solution is to make Angular fetch all the templates on app initialization, but that could be a performance penalty if the app has hundreds of HTML views.
I'd never wondered about that issue myself, but one possible idea would be to reuse the pattern known as asset versioning, where upon each new release you rename all your assets.
For instance, instead of login.html you'd use login-xyz.html as the name of a template, where xyz is a random value or a checksum of the file. A checksum is probably the slightly better option: if the new release is small (say you fixed just a small bug in one file), then as long as the user loads any page but the fixed one, they won't be bothered with a reload, since all other files keep the same checksums and keep working without interruption.
This way, when an outdated Angular app tries to fetch a template, it gets an HTTP 404 error. In addition, you could write a simple $http interceptor that detects a 404 response and reloads the page automatically (or offers the user the option of doing so).
There are modules capable of renaming assets, such as gulp-rev, but I've never heard of using one for Angular templates. You might implement something like that on your own, though.
Of course, you might want to keep both the new and the old versions of the files so users can keep working without being interrupted by a refresh. It depends on what your requirements are; I assume you're trying to avoid that, though.
Sample 404 interceptor (CoffeeScript, as I have it handy now):
m.factory 'notFoundInterceptor', ($q) ->
  return {
    responseError: (response) ->
      if response?.status == 404
        # Reload the page, or warn the user, here
        return $q.defer().promise  # a promise that never resolves, so the error stops here
      # Not a 404, so let it be handled elsewhere
      $q.reject response
  }

m.config ($httpProvider) ->
  $httpProvider.interceptors.push 'notFoundInterceptor'
Thanks for the good answers.
It turned out that this problem solved itself for us: every time we roll out a new release, all user sessions get deleted and users are sent to the login page. That triggers a page load, and fresh JavaScript/HTML gets loaded.
I read about this issue a long time ago; one option is to version the changed pages and the application.js file.
For example, in version 1 of your application you can use something like this in your HTML file:
<script src="js/angular_app_v1.js"></script>
Inside your routes, also version the templateUrl:
templateUrl: 'templates/view_product_v1.html'
So when you roll out a new version you won't be overwriting templates; users already working will keep the old version until they reload the browser, but they won't have version inconsistencies.
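Putting those two snippets together, a versioned route definition might look like this sketch (ngRoute and all names here are assumptions):

angular.module('app').config(function ($routeProvider) {
  $routeProvider.when('/product', {
    // Bump the suffix to _v2 on the next release, together with angular_app_v2.js
    templateUrl: 'templates/view_product_v1.html',
    controller: 'ProductCtrl'
  });
});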
Versioning the assets via file names would become unmaintainable for even a medium-sized app.
Although it is a heavyweight approach for web assets, you could look into content negotiation. This is where the response for a resource, generally from a REST API, carries the version of the resource, e.g. Content-Type: application/vnd.contentful.delivery.v1+json. On the client you check that the version matches what the client expects. So if the client only knows how to handle v1.1 and the resource responds with v1.2, the UI knows it cannot process that and should reload the page.
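As a rough illustration, that client-side check could live in an $http response interceptor; the vendor media type and the supported version string below are assumptions:

angular.module('app').factory('versionCheckInterceptor', function ($q, $window) {
  var SUPPORTED = 'v1'; // the media-type version this build understands
  return {
    response: function (response) {
      var type = response.headers('Content-Type') || '';
      // Only inspect versioned vendor media types, e.g. application/vnd.foo.v1+json
      if (type.indexOf('vnd.') !== -1 && type.indexOf('.' + SUPPORTED + '+') === -1) {
        $window.location.reload(); // this client cannot process the newer version
      }
      return response;
    }
  };
});

angular.module('app').config(function ($httpProvider) {
  $httpProvider.interceptors.push('versionCheckInterceptor');
});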
Another possibility is to load all templates up front in the UI. There are build processes you can run in Grunt, such as https://github.com/ericclemmons/grunt-angular-templates, that will combine all of your templates into a single file for delivery and then load them into $templateCache, so no template requests ever reach the server.
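A minimal Gruntfile entry for that plugin could look like this; the module name and paths are assumptions:

module.exports = function (grunt) {
  grunt.loadNpmTasks('grunt-angular-templates');
  grunt.initConfig({
    ngtemplates: {
      app: {                          // target name doubles as the Angular module name
        src: 'templates/**/*.html',   // all partials
        dest: 'dist/templates.js'     // one file that primes $templateCache
      }
    }
  });
};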
If you have some sort of server-side language you can build a filter in (.NET, Rails, Java, or whatever), you can pass a version number along with your template requests. If the version requested by the client is older than what's deployed, you send an error to the client. The client watches for that error (via an $http interceptor) and forces a page refresh to pull down the newer JavaScript code (maybe showing an alert to the user first so they know what's going on).
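A sketch of the client half of that idea; the header name, version value, and the 409 status are all illustrative assumptions, not a standard:

angular.module('app').factory('appVersionInterceptor', function ($q, $window) {
  var APP_VERSION = '1.4.2'; // stamped into the bundle at build time
  return {
    request: function (config) {
      if (/\.html$/.test(config.url)) {
        config.headers['X-App-Version'] = APP_VERSION; // let the server compare
      }
      return config;
    },
    responseError: function (response) {
      if (response.status === 409) { // server rejected an outdated client
        $window.location.reload();   // pull down the new JS and templates
      }
      return $q.reject(response);
    }
  };
});

angular.module('app').config(function ($httpProvider) {
  $httpProvider.interceptors.push('appVersionInterceptor');
});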
You can preload all your templates into $templateCache and serve them as one templates.js file. There is a gulp task for this: https://www.npmjs.com/package/gulp-angular-templatecache. Your application will then load all templates in a single request, together with the application scripts, on start-up, so they will be in sync. See http://www.johnpapa.net/angular-and-gulp/ for more info.
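The gulp task itself is short; a sketch following the package's readme (module name and paths assumed):

var gulp = require('gulp');
var templateCache = require('gulp-angular-templatecache');

gulp.task('templates', function () {
  return gulp.src('app/**/*.html')
    .pipe(templateCache('templates.js', { module: 'app' })) // primes $templateCache
    .pipe(gulp.dest('dist'));
});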
It always makes sense to have a version number and to use it when syncing resources. It's good practice not only for the use case you described but also for other situations, such as rolling back to a specific version or having two versions live and usable (for example, to let some users preview the next version).
My question is just conceptual.
When you use ui-router (docs) in your application, you have a system built of multiple views. Right?
I was thinking about this because, looking at Chrome's network log, on every change to another view the client side makes a request to the server to get the HTML file. So is it still really a single-page application?
Well, the definition of SPA may vary, but basically it has nothing to do with whether requests are made to a backend. What matters is whether the browser needs to do a full page reload to make that request.
Traditionally
Fill out a form, submit it, and the browser is redirected to a new URL. A full page request.
SPA
The application makes an AJAX request in the background. When the response arrives, it changes the URL manually and then changes the state of the application. There is no full page request, just JavaScript running in the background.
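In AngularJS terms, that background flow looks roughly like this sketch (the endpoint and names are invented):

angular.module('app').controller('NotesCtrl', function ($scope, $http, $location) {
  $scope.openNotes = function () {
    $http.get('/api/notes').then(function (response) {
      $scope.notes = response.data; // change application state in place
      $location.path('/notes');     // change the URL; no full page reload happens
    });
  };
});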
But you are right to be suspicious when there are lots of HTTP requests for templates. You do not want your application making all those requests to the server. To avoid that, you can precompile your templates using a tool like Grunt or gulp, for example grunt-angular-templates. This inlines the templates into your JavaScript code, and Angular grabs them from there instead of making a backend request. During development, though, the extra requests are no problem.
With modern frameworks you need a tool to "compile" your project for production: concatenate all JavaScript files into one, bring in the precompiled templates, and uglify the code. This is not something the frameworks give you; you have to set it up yourself using tools like Grunt or gulp.
Hope this was of help :-)
I am creating a one page web app with ExtJS.
Isn't the best way to decrease the load time of a web app to inline the JS, CSS, and HTML in the initial HTML file sent to the browser, instead of just including script and CSS tags that load the files from the server one at a time? That would reduce multiple HTTP requests to only one.
You may like the concept of httpcombiner.ashx:
http://archive.msdn.microsoft.com/HttpCombiner
This tool can also compress and cache your JS and CSS.
If you want to cut down on initial load time, one of the best ways is to take advantage of the browser cache. I suggest you look at using a hosted ExtJS library, such as the one from the Google Ajax APIs. There is a good chance a prospective visitor will already have it cached.
This is just one tip of many.
This webpage outlines some best practices for lowering perceived page loading time:
http://developer.yahoo.com/performance/rules.html
In addition to using the combiners Pavan suggested, you can use Google's Closure Compiler to minify JavaScript files.
http://closure-compiler.appspot.com/home
Well, there is a big difference between load time and observed load time. One of the best ways to reduce load time is to use server-side compression. However, progressive loading appears faster to the user.
Therefore the initial response should contain only a minimal set of style sheets (so the browser can render later-arriving content already styled) and the layout. Then you can use an onload callback to an AJAX loader that pulls in the additional components.
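A bare-bones sketch of that deferred loading in plain JavaScript (the bundle name is an assumption):

// Load non-critical components only after the initial page has rendered
window.onload = function () {
  var script = document.createElement('script');
  script.src = 'js/extra-components.js'; // hypothetical secondary bundle
  script.onload = function () {
    // initialise the late-loaded components here
  };
  document.body.appendChild(script);
};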
Most importantly, do not forget to size your image containers. One of the most annoying things is mis-clicking a link just because an image finished loading and changed the layout.
I am looking for the best way to speed up the load time of my js.
The problem is that I am working with a very large site that uses the jQuery framework. Because the site also loads Facebook Connect, AddThis sharing, Google Analytics, and other tracking code, jQuery is delayed by a few seconds, certain elements like the calendar just pop in, and my users are complaining that things take too long.
I did a test in Google Chrome and the average load time is 4 s, which is too much.
I am already doing minification, and jQuery UI / jQuery are being loaded from Google. What's the best way to approach this?
Make fewer HTTP calls by combining images, scripts, and CSS; using a Content Delivery Network for your static images and CSS might also help!
You are not likely to be able to do much about the load time of the external scripts, so what you can do is change the order in which things happen in the page, so that the external scripts are loaded after you have initialised the page.
Scripts are loaded and executed in a serial manner, so if you change their order in the source code, you also change the order they are loaded.
Instead of using jQuery's ready event, you can put your initialising code inline in the page, after all the content but before the closing body tag. That way the elements you want to access are loaded by the time the script runs, and you can put the external scripts below the initialising code so they load afterwards.
Small technical changes (such as serving the JS from Google, minifying, etc.) will only get you so far.
It seems you simply have lots of dynamic stuff going on in your page. Have you thought of building your content asynchronously? One option is to put placeholders in place of the external content and load it asynchronously, so that when all the scripts are loaded and ready, all you need to do is drop the markup into the placeholders.
This creates a better user experience: instead of the user waiting 10 seconds for the entire page, it starts loading incrementally after 2 seconds (and still fully loads after 10).
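A sketch of that placeholder pattern with jQuery (the fragment URL and element id are invented):

$(function () {
  // The page ships with an empty <div id="calendar-placeholder"> in the markup;
  // fetch the heavy fragment in the background and swap it in when ready
  $.get('/fragments/calendar', function (html) {
    $('#calendar-placeholder').html(html);
  });
});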
In addition to Yuval's answer, some options that may or may not bring you a speed gain:
The load time of external libraries is beyond your control. Try to include them as late as possible, or better still, dynamically after the page has loaded. That way your page won't stall if Google Analytics or Facebook has another hiccup.
It is not necessarily faster to load jQuery from Google. Consider putting jQuery, jQuery UI, and as much of your own JS as reasonable into a single file, minify and gzip it, and let the server serve the gzipped version where possible. Note that the gain in speed here depends largely on what your users have cached and where. If they already have jQuery from Google in their cache, this technique might actually make the page load slower.
The bottom line is that after some optimization you're down to experimenting. You must find out what your average user has in their cache, whether the page is accessed directly via deep links, and whether you can smuggle some JS or CSS (or even images) into their cache via a previous "landing page".
Make sure you deliver your content gzip/deflate compressed. Combine multiple JavaScript files into one file to reduce the number of HTTP requests.
P.S. Here's a test tool to check whether compression is configured:
http://www.gidnetwork.com/tools/gzip-test.php