J2EE web application performance tuning - JavaScript

I am trying to improve the performance of my web application.
It is a Java-based web application deployed on an Amazon cloud server with JBoss and Apache.
One page in the application takes 13-14 seconds to open. It carries so much functionality that 100+ HTTP requests are executed at page load time, and the JS and CSS files take too long to load.
I have already moved all of the JavaScript code from my JSP page into new JS files and minified the JS and CSS files, but that made little difference to the page load time.
There is also Dojo data on this page, which takes time to load.
Is there any other approach I should try to tune this page?
Can something be done at the JBoss or Apache level?

Apply caching for images and other static assets (see the Apache sketch at the end of this answer).
Use a CDN for any external libraries you use (like jQuery).
Use a loader such as RequireJS to optimize your CSS and JS files. You can concatenate your code (merge multiple JS files into one), which reduces the number of requests the browser makes when it sees a JS or CSS dependency. (As @Ken Franqueiro mentions in the comment section, Dojo already has a build mechanism for this.) A build-profile sketch follows this list.
Optimize your images and use appropriate dimensions: do not serve a full-size image if you only intend to show it in a 10x10 container. Use sprites if possible.
Show a message/loader to give the user some sense of progress. This minimizes user restlessness.
If some data takes too long to load, show the page first and load the data afterwards. This, too, gives the user a sense of progress.
If the response is very big, compress your response data. Be careful, though, that the browsers your application supports can handle the compressed data by default, or add a custom mechanism to decompress it. (The Apache sketch below shows the usual server-side setup.)
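For illustration, here is a minimal sketch of an r.js build profile for the RequireJS optimizer; all the module and file names are placeholders for your own layout, and Dojo's own build system achieves the same thing with "layers":

    // build.js -- a minimal RequireJS (r.js) build profile.
    // Run with: node r.js -o build.js
    // All paths/names below are placeholders for your own project layout.
    ({
        baseUrl: "js",            // directory containing your AMD modules
        name: "main",             // entry module whose dependency tree is traced
        out: "dist/main.min.js",  // single concatenated, minified output file
        optimize: "uglify2"       // minify the combined bundle
    })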
Use profiling tools like the Chrome Developer Tools or Firebug for Firefox.
Take a snapshot of your network traffic and check where the bottleneck is.
In Chrome, press F12 and select the Network tab.
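As for doing something at the Apache level: caching for static assets and response compression are both usually configured there. A hedged sketch, assuming mod_expires and mod_deflate are enabled; the MIME types and lifetimes below are only examples, not recommendations for your exact setup:

    # httpd.conf or .htaccess -- example lifetimes, tune for your app
    <IfModule mod_expires.c>
        ExpiresActive On
        ExpiresByType image/png               "access plus 1 month"
        ExpiresByType text/css                "access plus 1 week"
        ExpiresByType application/javascript  "access plus 1 week"
    </IfModule>

    # Compress text responses before sending them to the browser
    <IfModule mod_deflate.c>
        AddOutputFilterByType DEFLATE text/html text/css application/javascript
    </IfModule>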

Related

Accessing local files for a client-side-only website, to be distributed as an HTML download the user opens with a browser

Due to the safety rules of the same-origin policy (SOP), I am unable to load certain local files when opening an index HTML file directly with a browser. Using a "live-server" plugin works fine, since all the files in that case are "on the same server". But I need to distribute the website as a client-side-only app: a folder and an HTML file to be opened with your browser. Solutions to this problem always seem to require setting up a server. Is there any way to avoid that and keep everything on the client?
I am making a mathematics e-book that I want to distribute as a website people can download. I want it to be client-only and downloadable because, if it were to become popular, I would not be able to afford running a server (as I would be studying at that time). I have chosen HTML and JavaScript over EPUB because they are much more powerful and allow for tons of interactivity (and much more efficient development).
So far I have a browser.html file that loads individual pages with jQuery's .load(). This browser.html file contains both HTML and JavaScript; the CSS is in an external file. The individual pages have many pictures, which are also stored locally. As the pages are contained in subfolders, the picture URLs reach up into their parent folder and into the assets folder, like: ../../../Assets/Chapter1/Talopgaver og intuition/Misc\F\solsystem.png. I use custom elements (shadow DOM) to handle various complex aspects such as questions and answers, along with certain other things. Besides jQuery, I also use MathJax and a polymer library that helps with cross-browser support for custom elements. All the pages in a chapter are loaded at the start and put into an array (this makes it fast to scroll through pages, as you often do in books). Each page (as a string) is modified slightly to automate certain tedious parts of development.
I have tried to open the browser.html file in Chrome, Firefox, Internet Explorer, and Edge. They all load the HTML that browser.html itself contains (properly styled, even), but none of them load any external pages. Interestingly, one of the images used in the browser.html file still works (I would think that would be a local file too, no?). I have tried turning off the Ajax calls and the external CSS, but nothing changed. I have searched for other people with similar problems, but all the answers just recommended setting up a server.
When loading the page with a live-server plugin, the result looks something like this:
[Screenshot: browser.html opened with the "live-server" Visual Studio Code plugin by Ritwick Dey]
When opening the browser.html page directly in Chrome, it looks like this:
[Screenshot: browser.html opened directly in Chrome]
The error I get (after having removed an Ajax .get() call) isn't particularly descriptive: simply "Failed to load resource: net::ERR_FILE_NOT_FOUND" from "platform.js:1". Even if I turn off the call that starts loading pages, it gives me exactly the same error messages.
Looking at the network reports: with live-server everything looks very ordinary; without it, things are pretty weird. The report claims browser.html takes hours to load, even though that clearly isn't the case, and platform.js fails after 22 seconds of trying. The report looks a bit healthier when the call to load pages is turned off: it gives up on platform.js faster (8 seconds), yet still supposedly takes hours to load browser.html.
Though it shouldn't ultimately be necessary, I have linked the entire browser.html document below, along with an example of a page it might load (the example in the first picture above).
browser.html (too big for a Stack Exchange code-block embed)
Page in previous picture (page 37)
Any help is appreciated!
EDIT: The main problem seems to be the loading of pages using jQuery's .load(). Even on a simple test site, that operation is simply not possible without running a server.
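For reference, this is the shape of the call that fails (the element id and page path here are made up, but they match the description above). jQuery's .load() issues an XMLHttpRequest under the hood, and Chrome treats every file:// URL as a separate origin, so the request is blocked when browser.html is opened from disk:

    // Works when served over http://, fails under file:// because the
    // underlying XMLHttpRequest is blocked by the same-origin policy.
    $(function () {
        $("#page-container").load("Chapters/Chapter1/page1.html",
            function (responseText, textStatus, jqXHR) {
                if (textStatus === "error") {
                    // under file:// the status is typically 0, not a real HTTP code
                    console.log("Load failed: " + jqXHR.status);
                }
            });
    });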

Does linking to the jQuery CDN slow my site down?

A total newbie here. This seems like a basic question, but one that I can't find the answer to:
I have a few lines of jQuery to show and hide a responsive menu on half a dozen pages. I understand I need to "link" to the jQuery library:
<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
Will this slow my site down? Will a visitor have to download the whole jQuery library for my two lines of code to work? If that is the case, I might revert to plain JavaScript.
Loading a script file is never free, so you have to decide whether it's worth it for the functionality provided.
Once you've decided to do it (if you do), the question is whether to use a CDN or host it locally.
Pros of a CDN:
It may already be in the browser cache, because the same file is used by other sites.
CDNs use edge caching, highly optimized servers and server configurations, etc., to make delivery as fast as possible. While it may be possible for you to run a server that's just as fast, it's quite likely that you don't.
Cons of a CDN:
If the CDN is down, the link doesn't work even though your site is up and running.
It's not under your control.
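A common mitigation for the "CDN is down" case is to fall back to a local copy when the global jQuery object is missing after the CDN script tag has run. A minimal sketch; the local path is an assumption, point it at wherever you host your own copy:

    <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
    <script>
      // If the CDN failed, window.jQuery is undefined; load the local copy.
      window.jQuery || document.write('<script src="/js/jquery-3.3.1.min.js"><\/script>');
    </script>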
Generally, yes, a little. Certain conditions can affect the download speed, like content blocking, caching, and async loading.
We can estimate how long it will take to download a file if we know the internet connection speed and the file size.
That minified jquery CDN file is 84.8 KB.
And the average global internet connection speed is around 7.2 Mbps.
So...
84.8 KB is 84.8 × 8 ≈ 678 kilobits, and 678 kb ÷ 7,200 kbps ≈ 0.094 s, so downloading a file of that size takes roughly 94 ms on such a connection.
Whether that counts as a slowdown depends on your definition. Is a tenth of a second a huge slowdown?
You can speed up the process slightly by hosting the jQuery library on your own server (this removes the extra DNS lookup and connection to the Google CDN, at the cost of losing cross-site cache hits). Another way to speed things up is to use the slim, minified build of jQuery, which is smaller because it omits the ajax and effects modules.
Now, if you have only two lines of code, there might not be a pressing need to include jQuery at all, but think long-term: consider whether you will use jQuery for more things in the future.
In the end, including jQuery will not slow your response times down by much: on the order of tens of milliseconds, and usually only on the first, uncached visit.
Hope this helps!
The CDN will not normally slow down your webpage, since you are fetching the minified file, and even an offline copy still takes some time to load from storage; a web application has to be connected to the network anyway. If you are not showing any jQuery-driven content when the page first loads, you can let the library load in the background by adding the async attribute to the script tag.
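One caveat on the async suggestion: async scripts execute as soon as they arrive, in no guaranteed order, so code that depends on jQuery may run before jQuery itself has loaded. defer also downloads in parallel but preserves execution order, which is usually the safer choice here. A sketch (menu.js is a placeholder for your own script):

    <!-- both download without blocking parsing; defer keeps the order -->
    <script defer src="https://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
    <script defer src="js/menu.js"></script>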
You can check how much time this file takes to load by going to (in Chrome)
developer console -> Network.
It will show the list of requests and the time each takes to finish. Different filters are available; select All to see all network connections and the time to finish each request.

Is it better to load many small JavaScript files or one large JavaScript file?

I have noticed in Chrome that if I load an image as a base64 string and then scroll through that part of the page, it slows down.
I have also noticed that when I navigate out of a tab with my JavaScript in it and then move back to that tab, it is slow for a few seconds, as though V8 were recompiling the JS.
There are three options I can think of, but I don't know which is best:
load a tiny loading page first and handle subsequent loading elegantly
load one huge JS or CSS file with everything (jQuery + my code + etc.)
clump certain code together (use the jQuery CDN but group my own code together)
What is the best way to get your JS loaded as quickly and elegantly as possible?
Generally, loading more files incurs more HTTP overhead than combining them into fewer files. There are ways to combine files for all kinds of content:
For images, use CSS sprites.
For JavaScript, compile your client-side code and libraries into one file, and minify it to reduce size.
For CSS, you can do something similar to the above. hem compiles Stylus into one CSS file, for example, and this can help organizationally as well.
Additionally, when you concatenate JavaScript and CSS, your web server or reverse proxy can send them in compressed form for faster page loads. This is more efficient for larger files, as there is more to gain from compression. (A concatenation sketch follows.)
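To illustrate the concatenation step, here is a minimal Node.js build sketch using only the standard library; the file names are placeholders, and a real build would usually also run a minifier over the result:

    // concat.js -- combine several JS files into one bundle.
    // Run with: node concat.js
    const fs = require("fs");

    const inputs = ["js/jquery.js", "js/plugins.js", "js/app.js"]; // placeholders

    // Join with ";\n" so a file missing its trailing semicolon cannot
    // break the statement that follows it in the bundle.
    const bundle = inputs.map(f => fs.readFileSync(f, "utf8")).join(";\n");

    fs.writeFileSync("dist/bundle.js", bundle);
    console.log("Wrote dist/bundle.js, " + Buffer.byteLength(bundle) + " bytes");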
There are way too many maybes for this to have any guaranteed solutions, but here you go:
1) load CSS at the top -- load it all there, if you're doing a site with multiple pages.
If you're building a one-page application (where you're running galleries and twitter feeds and articles, etc on the same page, and you can open and close different sections), then you can consider loading widget-specific CSS, at the time you're loading your widget (if it's not needed at startup).
Do NOT use @import in your CSS if you want it to load quickly (you do).
2) load the vast majority of your JS at the bottom of the page.
There is practically nothing that can't be lazy-loaded, or at least initialized at the bottom of the page after the DOM is ready. If something really can't be, serve it as a separate file at the top of the page, and consider how you might rewrite the code to depend on it less.
3) Be careful with timers -- especially setInterval. Poorly managed timers can get your page's performance into a lot of trouble.
4) Be even more careful with event handlers on things like window scroll, resize, mouse-move or key-down. These fire many, many times a second, so if you've written fancy programs which depend on them, you need to rethink how you fire them (i.e., don't run the work every time the handler goes off); see the debounce sketch after this list.
5) Serving JS files is a trade-off:
Compiling JS takes a while. So if you're loading 40,000 lines in one file, your browser is going to pause for a little while as it compiles all of that.
If you serve 18 separate files instead, then you have to make 18 different server calls.
That's not cool, either.
So a good balance is to concatenate the files you KNOW you're going to need on that page, and then lazy-load anything which is optional (like a widget for adding a comment, or a lightbox widget, etc.).
Either lazy-load the optional pieces after all of the main features are up and running, OR load them at the last possible second (like when a user hits the "add comment" button); a loader sketch appears at the end of this answer.
If you need 40,000 lines loaded in your app as soon as it starts, then take the hit, or decide what order you can load them in and provide "loading" indicators for each widget until it's ready (you should do this on lazy-load always), loading the JS one piece at a time.
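Here is the debounce sketch mentioned in point 4: the wrapped handler only runs once events stop arriving for a short interval, instead of hundreds of times per second (the 100 ms wait is just an example value):

    // Generic debounce helper.
    function debounce(fn, wait) {
        var timer = null;
        return function () {
            var self = this, args = arguments;
            clearTimeout(timer);
            timer = setTimeout(function () { fn.apply(self, args); }, wait);
        };
    }

    // The expensive work now runs once, shortly after scrolling pauses.
    window.addEventListener("scroll", debounce(function () {
        // ...recalculate positions, lazy-load images, etc.
    }, 100));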
These are guidelines for getting around general performance issues.
Specifics are hard to give even when you have the site directly in front of you.
Use the Chrome dev console for profiling information, network performance, rendering performance, et cetera.
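And the lazy-loading sketch mentioned in point 5: inject a script tag only when the feature is first requested. The ids, paths and init function are all placeholders for your own widget:

    // Load a script on demand, then run a callback once it has executed.
    function loadScript(src, onReady) {
        var s = document.createElement("script");
        s.src = src;
        s.onload = onReady;
        document.head.appendChild(s);
    }

    // Fetch the comment widget only when the user actually asks for it.
    document.getElementById("add-comment").addEventListener("click", function () {
        loadScript("js/comment-widget.js", function () {
            initCommentWidget(); // assumed to be defined by the loaded file
        });
    });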
Well, there is a very popular concept called concatenation. The idea is to make as few HTTP requests to your server as possible, because each request means a new connection: a DNS lookup happens, then a handshake is negotiated, and after a few more protocol-based steps the server sends the requested file as the response.
You can check the HTTP Archive site for a list of performance best practices.
So yes, you should combine all JS files into one (there are certain exceptions, like JS needed in the head versus JS that can go in the footer).
This is the answer to your question title and points 2 & 3.
As for the other part, I am not clear on the scenario you are describing.
I recently had the same problem, so I developed and released a JS library (MIT licence) to do this. Basically, you can put all your assets (JS, images, CSS...) into a standard tar archive (which you can create server-side), and the library reads it and allows you to use the files easily.
You'll find it here : https://github.com/sebcap26/FileLoader.js
It works with all recent browsers and IE >= 10.
The number of files to load has an impact on the load speed of the whole site. I would recommend packing all the functionality required for the website to display properly into a single JavaScript file.

Decrease initial web app load time

I am creating a one page web app with ExtJS.
Isn't the best way to decrease the load time of a web app to inline the JS, CSS and HTML in the initial HTML file sent to the browser, instead of including script and CSS link tags that load the files from the server one at a time? That would collapse multiple HTTP requests into only one.
You may like the concept of httpcombiner.ashx.
http://archive.msdn.microsoft.com/HttpCombiner
This tool can also compress and cache your JS and CSS.
If you want to cut down on initial load time, one of the best ways is to take advantage of the browser cache. I suggest you look at using a hosted ExtJS library, such as the one on Google Ajax APIs. There is a great chance a prospective visitor will already have it cached.
This is just one tip of many.
This webpage outlines some best practices for lowering perceived webpage loading time:
http://developer.yahoo.com/performance/rules.html
In addition to using the combiner pavan suggested, you can use Google's Closure Compiler to minify JavaScript files.
http://closure-compiler.appspot.com/home
Well, there is a big difference between load time and observed load time. One of the best ways to reduce load time is to use server-side compression. However, progressive loading appears faster to the user.
Therefore, the initial response should contain only a minimal set of style sheets (this lets the browser render later-arriving content already styled) and the layout. Then you can have an onload callback to some AJAX loader which loads the additional components; a sketch follows.
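A sketch of that pattern: ship a minimal shell, then pull in the heavier pieces once the page has rendered. The endpoint and element id here are assumptions, not part of the original question:

    // After the shell has painted, fetch secondary content in the background.
    window.addEventListener("load", function () {
        var xhr = new XMLHttpRequest();
        xhr.open("GET", "/fragments/dashboard.html"); // placeholder endpoint
        xhr.onload = function () {
            document.getElementById("dashboard").innerHTML = xhr.responseText;
        };
        xhr.send();
    });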
Most importantly, do not forget to size your image containers. One of the most annoying things is when you mis-click a link just because an image finished loading and shifted the layout.

Caching external javascript for a QtWebkit widget in a PyQt app

I have a QWebView in my app which renders an HTML page stored in the app as a Qt resource. This page, however, requires meaty external JavaScript libraries such as MathJax, which I would rather not include as a resource due to their size.
My problem is that QtWebKit does not seem to cache these files the way a regular browser would, and every time I refresh the widget it downloads MathJax afresh.
So my question is: is there any way to cache these libraries after the first time they are downloaded, without resorting to shipping them with the app as a resource?
You should first test whether a simple QtNetwork-based download honors the cache settings or not. Also, check that the QWebSettings are set properly.
In any case, you should be able to inject a custom QNetworkAccessManager that handles the caching of your JS library. See http://ariya.blogspot.com/2010/05/qnetworkaccessmanager-tracenet-speed.html and http://ariya.blogspot.com/2010/06/proxy-server-with-filtering-feature.html as examples, and follow it up from there.
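A minimal sketch of attaching a disk cache, assuming PyQt4 with QtWebKit; the cache directory is a placeholder, and the server still has to send cacheable headers for the cache to be honored:

    import sys
    from PyQt4.QtGui import QApplication
    from PyQt4.QtNetwork import QNetworkDiskCache
    from PyQt4.QtWebKit import QWebView

    app = QApplication(sys.argv)

    view = QWebView()
    cache = QNetworkDiskCache()
    cache.setCacheDirectory("/path/to/cache")  # placeholder directory
    # The page's network access manager takes ownership of the cache.
    view.page().networkAccessManager().setCache(cache)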
Could you post some source code? Once downloaded, that data will stay in the /tmp/ folder for some time, so you could likely reuse the data from the temp folder. My guess is that you are not enforcing that policy.
