I am looking for a tool that lets you monitor/log page rendering time on client machines. I am not looking for Firebug/YSlow because I want to know the following types of things:
How fast do my pages load when the user is in Russia?
How long does it take for JavaScript to run on some pages for everyone who accesses those pages?
So, I actually care what my site feels like to the people who use it. Do there exist tools that already do this?
I should add that my website is a software as a service website, not accessible publicly.
I've never heard of any way to do this. One solution, which may be terrible, might be to log the time yourself. At the top of your page, have an inline script tag with a global variable called start that stores a new Date. Then, have an onload listener that calls a function once the page is finished loading. In this function, get the difference between the start time and the current time and send that back to your server. This is by no means perfectly accurate, but it might give you some idea. You could also log their IP address for geolocation when you send back the data.
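A minimal sketch of that idea, assuming a hypothetical /log-timing endpoint on your server and using an image beacon to report the measurement:

```html
<!-- as early as possible in <head> -->
<script>
  var pageStart = new Date().getTime(); // global start time
</script>

<!-- just before </body>, or in an external script -->
<script>
  window.onload = function () {
    var loadMillis = new Date().getTime() - pageStart;
    // /log-timing is a made-up endpoint; report the timing however suits your stack
    var beacon = new Image();
    beacon.src = '/log-timing?ms=' + loadMillis +
                 '&page=' + encodeURIComponent(location.pathname);
  };
</script>
```

The server can then combine the reported time with the request's IP address for a rough per-country breakdown.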
I recommend https://www.atatus.com/. Atatus helps you to visualise page load time across pages, browsers and countries. It also has AJAX monitoring and Transaction monitoring.
There is not really a super easy way to do this effectively, but you can definitely fake the geo-location aspect by using a proxy (which will roughly double the measured time) and get a pretty good idea of what it's like to browse your site.
As for JavaScript, you can profile it with the profiler in Firebug; this will give you an idea of which functions you should refactor and whatnot.
In my opinion I'd determine what most of your users are using, or what the general demographic makeup is. Are they 75-year-old guys? If that is the case, maybe they aren't up on the newer, faster browsers, or for that matter don't care. If they are cool hipster designers in San Francisco, then it's Safari 4.0... Anyway, this is just a way to determine the bulk of your users. I think the best way is to just grab an older laptop with Windows XP on it and browse your site; you can use Firebug Lite on browsers besides Firefox.
I like to run Dynatrace AJAX edition from UI automation tests. This easily allows you to monitor performance deterioration and improvement over time. There's an article on how to do this on the Dynatrace website.
I've been reading some very interesting articles about the whole client vs. server rendering lately.
http://www.onebigfluke.com/2015/01/experimentally-verified-why-client-side.html
http://www.quirksmode.org/blog/archives/2015/01/angular_and_tem.html
http://tomdale.net/2015/02/youre-missing-the-point-of-server-side-rendered-javascript-apps/
Now I've been a bit of a fanboy when it comes to the client side, but after I read these articles some points started to show up in favor of server-side rendering, to my surprise... The main points were:
1) You can upgrade your server, but not your users' devices - This means, well, yes... you are in control of the server, so if it's underperforming you may opt to upgrade/scale. You can't force users to upgrade their devices.
2) First paint vs. last paint - The "experimentally verified..." link above shows when users first see the page (first paint) and when they can use the page 100% (last paint). From what I can think of, when the user sees the page it takes their brain some time to process the signals from the visual cortex to the frontal cortex and then to the premotor cortex, where the user actually starts clicking their finger - that is, of course, if the HTML is rendered first so the brain has something to process while loading happens in the background (JS files, binding etc.).
What really got me was the bit where Twitter reported people having up to 10 seconds of loading time for client-side rendering; no one should ever experience that! It's kind of like saying, "Well, if you don't have a good enough device, sorry, you'll just have to wait."
I've been thinking: isn't there a good way of using both client-side and server-side templating, where both client and server use the same template engine and code? In that case the only question is whether it's more beneficial to supply the client with the rendered page or let the client render it itself.
In any case, share your thoughts on these points and the articles if you want. I'm all ears!
UPDATE: do it only if you really need it
(4 years and 2 isomorphic enterprise apps later)
If you're required to do SSR, fine. If you can go with a simple SPA - go with it.
Why so? SPAs are easier to develop, easier to debug and cheaper and easier to run.
The development and debugging complications are evident. What do I mean by "cheaper and easier to run", though? Well, guess what: if 10K users try to open your static HTML website (i.e. a built SPA) at the same time, you won't even feel it. If you're running an isomorphic webapp, though, the TTFB will go up, RAM usage will go up, and eventually you'll have to run a cluster of those.
So, unless you are required to show some super-low TTFB times (which will likely come through aggressive caching), don't overcomplicate your life.
Original answer from 2015:
Basically you're looking for an isomorphic web app that shares the same code for frontend and backend.
Isomorphic JavaScript
JavaScript applications which run both client-side and server-side. Isomorphic JavaScript frameworks are the next step in the evolution of JavaScript frameworks. These new libraries and frameworks are solving the problems associated with traditional JavaScript frameworks.
I bet this guy explains it much better than me.
So, when a user comes to the page, the server renders the full page with its contents. It loads faster and requires no extra AJAX requests to load data, etc. Then, when the user navigates to another page, the usual single-page-application techniques are used.
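A minimal sketch of how the shared-rendering idea can look in Node (Express, the file layout and the /users route here are illustrative assumptions, not a particular framework's API):

```js
// render.js -- the one template function, shared by server and browser
function renderUserList(users) {
  return '<ul>' + users.map(function (u) {
    return '<li>' + u.name + '</li>';
  }).join('') + '</ul>';
}
if (typeof module !== 'undefined') { module.exports = renderUserList; }

// server.js -- the first request gets fully rendered HTML (assumes Express is installed)
var express = require('express');
var renderUserList = require('./render');
var app = express();

app.get('/users', function (req, res) {
  var users = [{ name: 'Ada' }, { name: 'Grace' }]; // would normally come from a database
  res.send('<!doctype html><html><body>' +
           renderUserList(users) +
           '<script src="/render.js"></script>' + // the same code re-renders on navigation
           '</body></html>');
});

app.listen(3000);
```

After the first load, the client can call renderUserList itself when navigating, which is the usual single-page-application flow described above.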
So, WHY WOULD I CARE?
Old browsers / Weak devices / Disabled Javascript
SEO
Some page load improvements
Old browsers / Weak devices / Disabled Javascript
For example, IE9 does not support the History API. So, in old browsers (and if the user disables JavaScript), they would just navigate through pages like they did in the good old days.
SEO
Google says it supports SPAs, but SPAs aren't likely to appear in the top results of a Google search, are they?
Page speed
As it was stated, the first page loads with one HTTP request, and that's all.
OK, so
There are lots of articles on that:
http://nerds.airbnb.com/isomorphic-javascript-future-web-apps/
http://www.sitepoint.com/isomorphic-javascript-applications/
https://www.lullabot.com/articles/what-is-an-isomorphic-application
But SHOULD I CARE?
It's up to you, of course.
Yeah, that's cool, but it takes a lot of work to rewrite/adapt the existing app. And if your backend is in PHP/Ruby/Python/Java/whatever, I've got bad news for you (it's not necessarily impossible, but close to it).
It depends on the website: you can try to collect some stats, and if the percentage of users with old devices is small, it's not worth the trouble, so why not...
LET THEM SUFFER
If your only concern is users with old devices, then c'mon, it's 2015, and it's your user's problem if he's using IE8 or browsing websites with an iPod Touch 2. For example, Angular dropped IE8 support in 1.3 approximately a year ago, so why not just alert users that they need to upgrade ;)
Cheers!
All of the conversations on this topic miss one point: bytes sent to the client. Pages rendered as HTML on the server are a lot smaller. Fewer bytes transmitted is better for everyone, both server and client. I've seen the bandwidth costs on cloud sites, and even a 10% reduction can be a huge saving. Client-side JS pages are always fat.
I am trying to figure out how to load test my JavaScript widget. It will be running on around 1000 websites, each with about 2k visitors a day, so I really need to find a way to test this scenario before letting my users install the widget.
Any ideas what approach I should take?
It sounds like the widget itself (an instance of your javascript code running in a browser) will never be used by more than one user (one browser) at a time. Your server-side APIs, OTOH, will. So you need to test your server by simulating the level of HTTP traffic to your server that you expect the use of your widget to generate. For this, you want web load testing tools.
Depending on how simple the APIs are, you may be fine with something like JMeter. If the API usage is more complex, you might want a more sophisticated tool.
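If you just want a rough sanity check before reaching for a full tool, a throwaway script that fires a burst of concurrent requests at your API can be enough; the endpoint and concurrency below are placeholders:

```js
// load-test.js -- quick concurrency check with Node 18+ (global fetch)
const TARGET = 'https://example.com/api/widget-data'; // placeholder endpoint
const CONCURRENT = 200;                               // tune to the traffic you expect

async function run() {
  const started = Date.now();
  const results = await Promise.allSettled(
    Array.from({ length: CONCURRENT }, () => fetch(TARGET))
  );
  const ok = results.filter(r => r.status === 'fulfilled' && r.value.ok).length;
  console.log(`${ok}/${CONCURRENT} requests succeeded in ${Date.now() - started} ms`);
}

run();
```

A dedicated tool like JMeter will still give you better ramp-up control and reporting; this only tells you whether the server falls over under a burst.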
I have a web application with a great deal of both client-side and server-side logic. It is considered business-critical that this application feel responsive to the end user, for some definition of "feels responsive." ;)
Most website monitoring discussions revolve around keeping an eye on server-side metrics (response time, I/O queue depth, latency, CPU load, etc.), i.e. we tend to treat server performance and responsiveness as though it's a viable "proxy" for what the user is experiencing.
Unfortunately, as we move more and more logic to client side Javascript, the correlation decreases and our server metrics become less useful.
I didn't find any good matching SO questions on this. Googling gives a range of commercial products that might be related, but they're generally from the manufacturers' websites, full of unhelpful marketspeak and "please call us for details," so it's hard to know.
Are there any commonly-used tools for this sort of thing, other than rolling your own? Both free and commercial are welcome, although free is obviously better all else being equal.
EDIT: To clarify, I primarily need to gather bulk data on the user experience, including both responsiveness and breakage/script errors. Automatic analysis is a very-nice-to-have, although I'd expect to have to occasionally dig into the data myself regardless of the solution.
There are some freely available tools for performance monitoring. Yahoo open-sourced a script they used called Boomerang which can measure page load times and other performance metrics for end-users. Full documentation here. Google analytics also offers a basic page load time report.
For error monitoring, you'll want to listen for the window.onerror event. I don't know of any scripts that will automatically log it for you, or mine the logs on the server side. If you implement your own, you'll want to be very careful about not pinging the server too often--imagine how many requests it would generate if there was a JS error in your JS error handling code!
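A rough sketch of that idea, with a simple cap so a broken page can't flood your server (the /js-error endpoint is hypothetical):

```js
(function () {
  var reported = 0;
  var MAX_REPORTS_PER_PAGE = 10; // guard against error loops

  window.onerror = function (message, source, line, column) {
    if (reported >= MAX_REPORTS_PER_PAGE) { return; }
    reported++;
    // an image beacon avoids depending on the rest of your (possibly broken) JS
    var beacon = new Image();
    beacon.src = '/js-error?msg=' + encodeURIComponent(message) +
                 '&src=' + encodeURIComponent(source || '') +
                 '&line=' + line + '&col=' + (column || 0);
  };
})();
```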
Bucky Client and Bucky Server can perform that task:
http://github.hubspot.com/bucky/
From their website:
Open-source tool to measure the performance of your web app directly
from your users' browsers.
To analyse the data, they suggest Graphite or OpenTSDB.
You can try Atatus, which provides Real User Monitoring (RUM) and advanced error tracking for websites and web apps.
https://www.atatus.com/
http://www.whitefrost.com/documents/html/technical/dhtml/funmon.html#part1 tests the performance of javascript functions.
You can utilize Dynatrace AJAX Edition for measuring and profiling the performance of JavaScript in IE and Firefox. Chrome has built-in tools for this; take a look at:
http://blog.chromium.org/2011/05/chrome-developer-tools-put-javascript.html
For monitoring the performance of the overall application/site I would recommend synthetic monitoring utilizing real browsers, also known as web performance monitoring. These are services that have robotic agents sitting on Backbone ISPs performing the same activity as end users.
We utilize Catchpoint, which supports Selenium scripting. But there are others like Gomez and Keynote out there that have been providing such solutions for years.
You can also check out New Relic - now it has "real user monitoring" integrated - which measures the performance across all browser types. There is a 14 day trial period so you can set it up for free and see if you like it. You'll get visibility into browser rendering speed, DOM processing, the time it spends on the network, all the way back to your app performance on the server.
When even mobile browsers have JavaScript, is it really necessary to consider potential script-free users?
Yes. Your web pages aren't just consumed by people: they're consumed by search engines, and crawlers, and screenscrapers. Most of those automatic tools don't support Javascript, and essentially none are going to generate UI events or look at deeply nested AJAX data. You want to have a simple static HTML fallback, if nothing else then so that your web pages are well indexed by the search engines.
Forget the crazies who disable Javascript; think of the robots!
Yes.
People can (and do) browse with javascript disabled. If your site will work without users having to explicitly enable javascript for you, that makes them happy.
Exactly how relevant depends on your target audience, of course.
I would argue that you shouldn't go significantly out of your way to accommodate non-JS users, for the following reasons:
All Modern Browsers Support JS
This is a snapshot of browser usage today: http://www.w3schools.com/browsers/browsers_stats.asp
Even the oldest common browser, IE6, supports basic JavaScript and AJAX. If you decide not to integrate certain features because of a JS dependence, this proves that you are essentially doing it for people who started with JavaScript enabled and explicitly chose to disable it. I think these people should expect some features, and perhaps even entire sites, not to work as a consequence.
Few People Willingly Disable JS
Building on my point above, average web users don't know or don't care that JS can be disabled in browsers. It's largely a tech-savvy crowd who knows how to do this (myself included), and as tech-savvy users we should know when to turn it back on as well.
Cost of Support
In light of the above, consider that choosing to accommodate users who have, for the most part, willingly disabled JS comes with a very real cost. If you are managing a large project with heavy UI requirements, you can easily burn a lot of developer hours accommodating what is a very small user preference. Check your budget. If it is going to take two devs working 40 extra hours each on the project to accomplish this feat, you are easily going to burn a few thousand dollars on what is essentially a non-issue for the vast majority of your users. How about using that time and investment to further buff up your core competency?
Precedent
I may very well be wrong on this, but I think it would be difficult to find major media or social sites that don't rely on JavaScript for some portion of their functionality. If major businesses that rely on the operation and accessibility of their site to stay in business aren't doing it, there's a good chance it's because it isn't needed.
CAVEATS:
Know your market. Continue to build XHTML/CSS that is semantic (preferably by using the RDFa W3C recommendation). Still strive to make your sites accessible to the visually impaired. Don't believe everything you read. ;)
DISCLAIMER:
My argument above is largely dependent on how you define "graceful degradation." If you mean all the links still work, that's one thing, but if you mean all the links still work and so does the wombats game, that's another. I'm not trying to argue for making your site so JS dependent that non-JS users can't access any portion of it. I am trying to make an argument for the acceptability of certain features, even some core features, being reliant on JS.
It is relevant, and it will be relevant even after 10-20 years, when JavaScript might be supported everywhere. Making things work without JavaScript is an important development technique because it forces you to keep things simple and declarative. Ideally, JavaScript should be used only to enhance the experience, but your website shouldn't depend on it.
There is a clear advantage, from a maintenance point of view, in having most of the code in a declarative format (HTML+CSS) and as little as possible in an imperative one (JavaScript).
My position:
I browse with NoScript, so if I come on your site it will be without benefit of Javascript. I don't expect the full user experience.
What I want, before turning on JS, is to be assured that you're reasonably competent and not malicious, and that I actually want what you're using JS for.
This means that, if you actually want me to use your site, you should allow me to look around, using links. (If I see a site that's totally useless without Javascript, I generally think the designers were incompetent.) You should let me know what sort of functionality I'll get from enabling Javascript, and you should present the site in a legitimate-seeming way.
I don't think that's too much to ask.
Graceful degradation / progressive enhancement / unobtrusive JavaScript is absolutely relevant!
As with all accessibility issues: just imagine for one second what it's like to be the one on the outside who can't use the page.
Imagine you're travelling around the world; you're in some hotel or internet café with really old computers, old software, old browsers; you want to look up your flight and you realize you can't, because of some JavaScript incompatibility in the old browser you're using.
(try 'old mobile phone' or 'stuck behind a corporate firewall' for different scenarios)
Imagine what a world of possibilities opened up to blind people with screen readers and the web, and imagine what it's like to find these possibilities closed again because of JavaScript.
So much for appealing to your better nature.
You might also want to do it to keep your site accessible to search engines.
Yes, it's relevant. Mobile browsers in use today do not all have Javascript enabled. It's available on new phones, sure. But there are millions and millions of people like me, who have phones running older browsers, and for all of us, a JS-required browsing experience is just plain broken.
I don't even bother visiting sites that didn't have progressive enhancement in mind when they coded. I'm not technically behind the times. My phone is a year old. But I'm not going to re-up my contract and buy a new phone because of a crippled web experience.
It depends on who your target audience is. I have JavaScript turned off by default and turn it on when I know what the site's intent is.
It's generally much faster to browse with Javascript disabled (digg.com is lightning without JS), which is why it's popular.
In Opera it's really easy: you simply press F12 and untick the javascript option. I always browse without Flash, Java (not javascript), animated images and sound. I enable Flash on a per-site basis, eg YouTube. Sometimes I turn off JS temporarily if my system is slowing down.
And don't forget about:
Screen readers (I think they mostly have JS disabled)
Text browsers or other very old systems
Ad blockers (if your filename happens to land under their radar)
Any old browser that either doesn't support JS at all or the JS breaks (e.g. IE6 doesn't support some modern JS stuff).
The solution is to use progressive enhancement rather than graceful degradation, i.e. start with the basic HTML and add CSS. Then add Javascript and/or AJAX to parts of the site.
For example, if you had a site like Stack Overflow, voting up an answer could submit a form normally. If JS is enabled, it would do an AJAX request, update the vote count and cancel the form submission, without leaving the page. SO doesn't do this though...
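A sketch of that pattern; the /vote URL, the markup and the JSON response shape are invented for illustration:

```html
<form class="vote-form" method="post" action="/vote">
  <input type="hidden" name="answerId" value="42">
  <button type="submit">Vote up</button>
  <span class="vote-count">10</span>
</form>

<script>
  // Without JS the form posts normally and the server responds with a full page.
  // With JS we intercept the submit, send it via AJAX and update the count in place.
  document.querySelectorAll('.vote-form').forEach(function (form) {
    form.addEventListener('submit', function (event) {
      event.preventDefault();
      fetch(form.action, { method: 'POST', body: new FormData(form) })
        .then(function (res) { return res.json(); })
        .then(function (data) {
          form.querySelector('.vote-count').textContent = data.votes; // assumed response field
        });
    });
  });
</script>
```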
I for one always have NoScript turned on unless I trust the site for a number of reasons including cross-site-scripting, click jacking, and HTML injection. It's not me being paranoid, it's because I know a lot of developers, and know most of them have no idea what web security is, never mind how to avoid vulnerabilities.
So until I trust a site there's no chance I'd let it do anything fancy.
For the unfamiliar, there are some interesting blog entries on the subject:
Protecting Your Cookies: HttpOnly
Cross-Site Request Forgeries and You
Sins of Software Security
Top 25 Most Dangerous Programming Mistakes
I'm going to have to make a case for the other side here. People's reasons for designing sites without JavaScript are largely idealistic. Given enough time and money the goal is achievable and will certainly open your website to the largest possible number of people. However, in reality, doing this will slow your development, increase the number of test cases you have to deal with, and ultimately affect the quality of your application for those users who do use JavaScript.
In my opinion it is perfectly reasonable to choose to make your site compatible only with JS-enabled browsers and tell those users who don't have it that they are missing out. This allows you to concentrate on creating rich content that the majority of users will be able to view.
There are of course exceptions to this rule, but if you are looking to create a good website for the majority of users, or have a client who is after a flashy website with limited time or money, then deciding to support JS-enabled browsers only is a reasonable thing to do.
The real question is not whether it is relevant, but whether to use Graceful Degradation, or Progressive Enhancement as your scripting strategy.
I'm actually in an interesting position when it comes to graceful degradation of JS. I'm working on a web application that bots and crawlers have absolutely no business looking into. There's nothing they can glean that should be indexed.
The informational site accompanying the web application, however, should be indexed, and therefore JS degrades gracefully there.
In the web application, if you don't have JavaScript enabled, you're probably not supposed to be there. It's intended to be a rich interactive experience. The web application actually requires JS to be enabled, and for you not to be sitting behind a corporate firewall.
We're not serving anything malicious, its just our intent and purpose for the web application that's different. The goals of our web application and those of our informational site are completely different.
I use JavaScript. I always keep my browser up-to-date. But sometimes, my Internet connection is so bad that scripts just don't load.
There are also cases when:
Some scripts load, but others fail, in which case parts of a website stop functioning
Scripts are loading, but I want to hit "submit" without waiting for that fancy frilly menu
A script malfunctions because it was partially loaded and then cached at that half-stage
I'm in such a hurry that I just decide to use Lynx.
Now, I'm not saying my Internet is bad all the time, or even most of the time, but it does happen. With the Internet expanding rapidly to many rural areas across the world, I'm sure I'm not the only one. So apart from bots as Nelson mentioned above, it's another thing to keep in mind. (Hint: check your demographics).
If you don't want the page to work when JavaScript is off, then just make that the message in the HTML; if JavaScript is on, use unobtrusive JavaScript to remove that message and make the rest of the application visible.
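A minimal version of that approach (the element ids are just placeholders):

```html
<div id="no-js-message">This application requires JavaScript. Please enable it to continue.</div>
<div id="app" hidden>
  <!-- the real application markup -->
</div>

<script>
  // Only runs when JavaScript is on: hide the warning and reveal the application.
  document.getElementById('no-js-message').hidden = true;
  document.getElementById('app').hidden = false;
</script>
```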
Depending on what JavaScript version you write for, you may need to degrade if the user's browser doesn't have the latest version, so handling that gracefully is also important.
I want to use Google Gears, not for functionality but for optimization.
I.e. if Gears is detected on the browser side, then we will silently use it to do some optimizations and caching on the client side.
If it isn't installed, we silently work against the server only.
Somewhere in the FAQ or help pages of our website we will tell the users that our site recommends Gears for best performance - but it is not a must; we will not use "offline features".
Do you think this is a valid google-gears usage scenario? Do you recommend for it / against it?
Sounds like a valid usage scenario to me... just realise that you won't be able to do it 100% transparently, as the user will have to tell Gears to allow your site to use it. With that in mind, you should definitely make it an opt-in thing (as described in the Gears FAQ) rather than just attempting to use it if you find it installed; otherwise you'll annoy your users more than the slightly worse performance ever would have.
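A hedged sketch of that opt-in flow, based on the documented gears_init.js pattern (treat the exact calls as approximate, and note that Gears itself has long since been discontinued):

```html
<script src="gears_init.js"></script>
<script>
  var useGears = false;
  if (window.google && google.gears) {
    // getPermission() shows the browser's opt-in dialog the first time;
    // only proceed if the user agrees.
    if (google.gears.factory.getPermission('My SaaS app')) {
      var localServer = google.gears.factory.create('beta.localserver');
      useGears = true; // e.g. set up a ResourceStore here for client-side caching
    }
  }
  // If useGears stays false, fall back to talking to the server only.
</script>
```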
I think it's a good possibility, but why try to hide it? As mentioned in other responses, the process can never be truly transparent, since some initial setup will be required at some point. Instead of hiding the feature, (depending on your project) you could use it to generate some positive press regarding your application.
Maybe by default you serve content from the main server, but along the bottom or top, include a highlighted link (think of "What's New" in Gmail). It could say something like "Improve performance by 50%", etc. The next page might contain a quick summary of what's going on, and why a user would commit to Gears. Similarly, when this launches, you could use the opportunity to show off "what you're spending development time doing."
I understand why you might want Gears to be unobtrusive, but don't make it hard for the user to get the best experience on your site.
Wordpress also promotes use of Gears in part to speed up actions - and Gmail in offline mode (or at least with active offline support) is also (IMHO) faster to use.
I say go for it - and if you can add offline support - and it makes sense - do that as well!