Non-invasive JavaScript performance agent?

I am seeking to (legitimately) plant a bug in my web pages to collect and report information about website performance.
Preference is for something internally hosted. While I expect there are commercial offerings out there (e.g. Google Analytics), I'm keen to find something we can run entirely in-house (it's not a public website and may contain sensitive data).
Also, I'm looking for something that can report back to an independent URL - i.e. not relying on adding a reverse proxy or recording results within the existing webserver logs. Indeed, I'd prefer something which does not require access to the webserver logs at all (other than those for the URL the bug reports back to).
I need to be able to monitor bulk traffic - so tools like PageSpeed and TamperData are not appropriate.
I've tried googling, but I just seem to get lots of noise about the performance of JavaScript and web pages rather than how to actually measure it.
TIA

You could use the open-source analytics software Piwik and write a plugin for it that accepts the performance data your pages send.
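If it helps, here is a minimal sketch of the page-side half of that idea, using the Navigation Timing API to beacon load metrics to a self-hosted endpoint. The /perf-collect.php URL is hypothetical; it stands in for whatever collection URL your Piwik plugin would expose:

// Collect Navigation Timing data after load and beacon it to an
// in-house endpoint. '/perf-collect.php' is a placeholder URL.
window.addEventListener('load', function () {
  var t = window.performance && performance.timing;
  if (!t) { return; } // older browsers lack the Navigation Timing API
  // Defer one tick so loadEventEnd has been populated.
  setTimeout(function () {
    var data = {
      dns:  t.domainLookupEnd - t.domainLookupStart,
      tcp:  t.connectEnd - t.connectStart,
      ttfb: t.responseStart - t.requestStart,
      dom:  t.domContentLoadedEventEnd - t.navigationStart,
      load: t.loadEventEnd - t.navigationStart,
      url:  location.pathname
    };
    // A 1x1 image beacon keeps the reporting URL fully independent
    // of the webserver that served the page.
    new Image().src = '/perf-collect.php?' + Object.keys(data).map(function (k) {
      return k + '=' + encodeURIComponent(data[k]);
    }).join('&');
  }, 0);
});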

Thanks chiborg. I'd kind of forgotten about this; it was so long ago that I asked. Yes, I was aware of Piwik, but I haven't been very impressed with either its implementation or the quality of its documentation.
I'm currently working on a solution using Boomerang.
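For anyone landing here later, the basic Boomerang wiring is short. A sketch, assuming you self-host the script and a beacon endpoint (both paths are placeholders):

<script src="/js/boomerang.js"></script>
<script>
  // beacon_url is boomerang's documented option naming the URL that
  // receives the measurement beacons; point it at your own collector.
  BOOMR.init({
    beacon_url: '/boomerang-beacon'
  });
</script>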


Aren't JavaScript analytics scripts susceptible to easy data hacks?

In production environments, JavaScript-based analytics scripts (Google Analytics, Facebook Pixel, etc.) are injected into most web applications, along with the unique ID/pixel ID, in plain JavaScript.
For example, Airbnb uses Google Analytics. I can open up my dev console and run
setInterval(function() {ga('send', 'pageview');}, 1000);
which will cause the analytics pixel to be requested every 1 second, forever. That is 3600 requests an hour from my machine alone.
Now, this can easily be done in a distributed fashion, causing millions of requests per second and completely skewing the Google Analytics data for the pageview event. I understand that the huge amounts of data collected would correct this skewing to a certain extent, but that can easily be countered by hiking up the number of requests.
My question is this: are there any safeguards to prevent competitors or malicious individuals from destroying the data integrity of applications in this manner? Does GA or Facebook provide such options?
Yes, but the unsafe part doesn't come from the JavaScript. For example, you can use the Measurement Protocol to flood data into an account. Here you can see a lot of people in the same community having trouble with this (and it's quite simple to solve):
https://stackoverflow.com/search?q=spam+google+analytics
All these measurement systems use HTTP calls to fill the data in your "database". If you are able to build the correct call, you can spam anyone, anywhere (but don't do it; don't be evil).
https://developers.google.com/analytics/devguides/collection/protocol/v1/?hl=es-419
This Google Analytics page explains the Measurement Protocol; the JavaScript only works as a framework to build and send the hit.
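To illustrate the point, a Measurement Protocol hit is nothing more than an HTTP request with form parameters, which anyone can construct without the JavaScript framework. A sketch (the property ID UA-XXXXX-Y and the client ID are placeholders):

// Sends one pageview hit via the Measurement Protocol. Note there is
// no authentication beyond knowing the property ID.
fetch('https://www.google-analytics.com/collect', {
  method: 'POST',
  body: new URLSearchParams({
    v: '1',            // protocol version
    tid: 'UA-XXXXX-Y', // tracking/property ID (placeholder)
    cid: '555',        // anonymous client ID (placeholder)
    t: 'pageview',     // hit type
    dp: '/home'        // document path
  })
});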
But not everything is lost.
For example, if you try to do that in your browser with that code, the Google Analytics framework limits you to 1 call per second and 150 per session (or cookie value). Yes, it's not complicated to get past that barrier, but after that other barriers will come.
So if you use the JavaScript framework you are reasonably safe. Now imagine you do the same with Python, sending HTTP requests straight to the Google Analytics servers. It's possible, but there are two important things to say.
Google Analytics has a proactive "firewall" to detect spammers and ban them (how and when they do this is not public), but in my case I see far fewer spammers than a few years ago.
Also, there are a couple of good practices to avoid this. For example, admit only domains on a whitelist by creating a filter that allows only traffic from your own domain:
https://support.google.com/analytics/answer/1033162?hl=en
It is also very good practice to protect your ecommerce data by using a filter to include only data from a certain store or with a certain parameter, for example "brand == my brand" or "CustomDimension == true", and to exclude transactions with products over $1,000 (check your limits and apply proactive filters). All these barriers make it complex to break.
If you do this you will protect your account a lot (because it's much more complicated to know a valid combination of UA property + domain when building a robot), but as you know, any system can be broken. In my experience I have only seen 2 or 3 cases of damage coming from spammers or people who wanted to do harm, and all of those cases could have been prevented with a proactive filter. Usually spammers only spam ads into your account; they almost never want to hurt you. With Facebook, Piwik and other tools it's more or less the same.

What are some good ways to measure the performance of a single page web app?

We have a single-page web app and we want to measure its performance. There's a lot of JavaScript and AJAX in it, and we want to see how long it takes to fully load. What are some good tools to use?
The Chrome profilers are pretty amazing: http://www.youtube.com/watch?v=OxW1dCjOstE
You can find a whole bunch of tutorials on YouTube showing how to inspect heap usage.
You can always use firebug (in Firefox), HttpWatch (in IE/Firefox), or use the Network tab (press F12) in Chrome.
You can use YSlow, a useful plugin developed by Yahoo. It details the loading of all the components of your website (JS, CSS, images, favicon, ...), then gives your website an overall performance grade, where the worst is 0 and the best is 100.
Source plugin: http://yslow.org/
Here is a video where they explain how to use it (it lasts more than an hour): https://www.youtube.com/watch?v=hxW74v3h1xY
The website they use to determine all the things you can do to improve the performance of your website: http://webdevchecklist.com/
Consider looking at boomerang (https://github.com/soasta/boomerang), an open-source JavaScript library that measures performance while real users use your site. It can automatically measure XHR requests, has support for frameworks like Angular, Ember and Backbone, and can be extended with more plugins.
Using boomerang, you will need to build your own back-end infrastructure to collect and analyse the data, or you can use commercial services like SOASTA's mPulse (the boomerang developers work on this) that collect, filter, analyse and report on the data. mPulse is available at http://www.soasta.com/mpulse/

End user experience monitoring tools

I have a web application with a great deal of both client-side and server-side logic. It is considered business-critical that this application feel responsive to the end user, for some definition of "feels responsive." ;)
Most website monitoring discussions revolve around keeping an eye on server-side metrics (response time, I/O queue depth, latency, CPU load, etc.), i.e. we tend to treat server performance and responsiveness as though it's a viable "proxy" for what the user is experiencing.
Unfortunately, as we move more and more logic to client-side JavaScript, the correlation decreases and our server metrics become less useful.
I didn't find any good matching SO questions on this. Googling gives a range of commercial products that might be related, but they're generally from the manufacturers' websites, full of unhelpful marketspeak and "please call us for details," so it's hard to know.
Are there any commonly-used tools for this sort of thing, other than rolling your own? Both free and commercial are welcome, although free is obviously better all else being equal.
EDIT: To clarify, I primarily need to gather bulk data on the user experience, including both responsiveness and breakage/script errors. Automatic analysis is a very-nice-to-have, although I'd expect to have to occasionally dig into the data myself regardless of the solution.
There are some freely available tools for performance monitoring. Yahoo open-sourced a script called Boomerang which can measure page load times and other performance metrics for end users. Full documentation here. Google Analytics also offers a basic page load time report.
For error monitoring, you'll want to listen for the window.onerror event. I don't know of any scripts that will automatically log it for you, or mine the logs on the server side. If you implement your own, you'll want to be very careful about not pinging the server too often; imagine how many requests it would generate if there were a JS error in your JS error-handling code!
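A minimal hand-rolled sketch of that idea, with a cap to avoid the flood scenario just described; the /log-error endpoint is hypothetical:

(function () {
  var sent = 0, MAX_ERRORS = 10; // hard cap per page load
  window.onerror = function (message, file, line) {
    if (sent >= MAX_ERRORS) { return; }
    sent++;
    // An image beacon is used so a failing XHR cannot itself
    // trigger window.onerror and recurse.
    new Image().src = '/log-error?msg=' + encodeURIComponent(message) +
      '&file=' + encodeURIComponent(file || '') +
      '&line=' + encodeURIComponent(line || '');
  };
})();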
Bucky Client and Bucky Server can perform that task:
http://github.hubspot.com/bucky/
From their website: "Open-source tool to measure the performance of your web app directly from your users' browsers."
To analyse the data, they suggest Graphite or OpenTSDB.
You can try Atatus, which provides Real User Monitoring (RUM) and advanced error tracking for websites and web apps.
https://www.atatus.com/
http://www.whitefrost.com/documents/html/technical/dhtml/funmon.html#part1 tests the performance of JavaScript functions.
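If all you need is rough numbers for individual functions, you can also wrap them yourself with the High Resolution Time API; a small sketch:

// Wraps a function so each call logs how long it took.
function timed(name, fn) {
  return function () {
    var start = performance.now();
    var result = fn.apply(this, arguments);
    console.log(name + ' took ' + (performance.now() - start).toFixed(1) + ' ms');
    return result;
  };
}

// Usage: renderChart = timed('renderChart', renderChart);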
You can use Dynatrace AJAX Edition for measuring and profiling the performance of JavaScript in IE and Firefox. For Chrome, there are built-in tools; take a look at:
http://blog.chromium.org/2011/05/chrome-developer-tools-put-javascript.html
For monitoring the performance of the overall application/site, I would recommend synthetic monitoring using real browsers, also known as web performance monitoring. These are services that have robotic agents sitting on backbone ISPs, performing the same activities as end users.
We use Catchpoint, which supports Selenium scripting, but there are others like Gomez and Keynote that have been providing such solutions for years.
You can also check out New Relic, which now has "real user monitoring" integrated and measures performance across all browser types. There is a 14-day trial period, so you can set it up for free and see if you like it. You'll get visibility into browser rendering speed, DOM processing, the time spent on the network, all the way back to your app's performance on the server.

Does Google Analytics slow down my website? [closed]

I am at the final stages of my website, and currently I need to find a suitable statistics application/tool.
I have looked into webalizer, but it seems outdated.
Also, I have looked into Google Analytics, but I am afraid that if I implement it my website will slow down. It is already pretty heavy with dynamically displayed database content.
I have read that I can put the GA JS code at the bottom of the page so that the page loads first, but I still don't want a slowdown.
You are all much more experienced in statistics than I am, so I believe you can give me some good advice.
I have my own private server (Linux) and I have root access as well (of course).
Do you think I should have a statistics app on the server, without interfering with my website, or should I go the Google way and use Analytics?
Please give me the names of good applications which you have tested.
Thanks
Any additional calls to scripts will slow down your site. However, Google Analytics instructs you to place its snippet in a specific spot so that it isn't loaded until the page has loaded. (It used to go just before the </body> tag, but I believe it's now supposed to be the last <script> in the <head> tag.) Don't worry about it too much; the benefits of analytics will far outweigh the extra call to a remote file.
Focus on other optimizations (database queries, CSS sprites, fewer HTTP requests). Analytics is indispensable in today's market; IMO it is not an option to forgo it.
As far as having your own "statistics app," I assume you're talking about building your own proprietary statistics codebase? I would discourage that, because it takes a lot of time and effort and in the end you will not have the same optimizations that Google has employed an entire project's worth of software engineers to make. Remember that while it's always great to create your own product, you don't have to reinvent the wheel, especially when it comes to things like this that have many sensible drop-in solutions that are widely available for free.
With respect to non-Google analytics solutions, one other of note is Clicky. I'm not as experienced with it as I am with GA, but I've heard many reviews that it is more precise and more informative than GA. However, just as an end-user browsing the web I've noticed a lot of times that its calls to Clicky's website do tend to slow down pages, and noticeably so; I cannot really say that I have seen the same effect with GA.
One last thing I would caution against is this: Do not employ more than one analytics solution unless you are trying to find the best one to suit your needs. It's just overkill to run two remotely-hosted analytics solutions on every single one of your pages, so what I would encourage you to do is try out a few for the first few weeks or so of your site (yes, pages will slow down during this trial phase) and then simply stick with the one that you like best. That will also give you the added benefit of being able to see first-hand what the speed implications are on your unique hosting environment for each script.
Here's some other analytics solutions that you might check out:
Piwik
Webtrends
GoingUp!
Yahoo! Web Analytics
Straight from Google's Analytics sign-up page (https://www.google.com/analytics/provision/):
"The appearance of your website will never be affected by your use of Google Analytics - we don't place any images or text on your pages. Likewise, the performance of your pages won't be impacted, with the possible exception of the very first page-load after you have added the tracking code. This first pageview calls the JavaScript on Google's servers, which may take slightly longer than a regular page load. Subsequent pageviews will use cached data and will not be affected."
Use the Asynchronous Snippet of Analytics:
http://code.google.com/apis/analytics/docs/tracking/asyncTracking.html
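For reference, the asynchronous ga.js snippet queues commands in _gaq and injects the script without blocking rendering (UA-XXXXX-X is a placeholder for your property ID):

var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-X']);
_gaq.push(['_trackPageview']);

(function() {
  // Inject ga.js asynchronously so it does not block page rendering.
  var ga = document.createElement('script');
  ga.type = 'text/javascript';
  ga.async = true;
  ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
  var s = document.getElementsByTagName('script')[0];
  s.parentNode.insertBefore(ga, s);
})();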
People focus too much on total load times, when what is important is render times and in particular progressive rendering. If you use Google Analytics properly, it will load after the page has been shown to the user. So yes, it will add a small overhead to every request, but because the user can already see the page they probably won't even notice. Just go for it.
Webalizer runs on the server side over the Apache logs, doesn't it? That's why it appears outdated: it can't collect as much info as JS can. But it doesn't slow the user down at all. You could run Webalizer and Google together for a bit and see which serves your needs best.
We decided to work around the possibility of Google's servers appearing to slow our site down. Instead of our users downloading the ga.js file from Google's servers, we store it locally. The only problem with that approach is that our local copy becomes outdated, so we wrote an application that periodically compares our local file to Google's and updates it accordingly (a rough sketch of the updater follows below).
Andrew
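A minimal Node.js sketch of the updater described above, run periodically (e.g. from cron); the local path is a placeholder:

var https = require('https');
var fs = require('fs');

https.get('https://ssl.google-analytics.com/ga.js', function (res) {
  var body = '';
  res.on('data', function (chunk) { body += chunk; });
  res.on('end', function () {
    var localPath = '/var/www/static/ga.js'; // placeholder path
    var current = fs.existsSync(localPath) ? fs.readFileSync(localPath, 'utf8') : '';
    if (body !== current) {
      fs.writeFileSync(localPath, body); // refresh the stale copy
      console.log('ga.js updated');
    }
  });
});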
Google Analytics is JavaScript-based and does not run on your server. All processing and storage is done on Google's servers, so it's ideal if you are worried about local resources.

UI performance monitoring tools

I am looking for a tool that lets you monitor/log page rendering time on client machines. I am not looking for Firebug/YSlow, because I want to know the following types of things:
How fast do my pages load when the user is in Russia?
How long does the JavaScript on certain pages take to run, for everyone who accesses those pages?
So, I actually care what my site feels like to the people who use it. Are there tools that already do this?
I should add that my website is a software as a service website, not accessible publicly.
I've never heard of a tool that does this out of the box. One solution, which may be terrible, might be to log the time yourself. At the top of your page, have an inline script tag with a global variable called start that creates a new Date. Then have an onload listener that calls a function once the page is finished loading. In this function, get the difference between the start time and the current time and send it back to your server. This is by no means perfectly accurate, but it might give you some idea. You could also log the client's IP address for geolocation when you send back the data.
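A minimal sketch of that approach; the /timing endpoint is hypothetical:

// As early as possible in the <head>:
var pageStart = new Date().getTime();

// An onload listener computes the elapsed time and beacons it home.
window.addEventListener('load', function () {
  var loadTime = new Date().getTime() - pageStart;
  // The server behind /timing can also record the client IP
  // for geolocation, as suggested above.
  new Image().src = '/timing?ms=' + loadTime +
    '&page=' + encodeURIComponent(location.pathname);
});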
I recommend https://www.atatus.com/. Atatus helps you visualise page load time across pages, browsers and countries. It also has AJAX monitoring and transaction monitoring.
There is not really a super-easy way to do this effectively, but you can definitely fake the geolocation aspect by using a proxy (which will roughly double the measured time) and get a pretty good idea of what it's like to browse your site.
As for JavaScript, you can profile it with the profiler in Firebug; this will give you an idea of what functions you should refactor and whatnot.
In my opinion, I'd determine what most of your users are using, or what their general demographic makeup is. Are they 75-year-old guys? If that is the case, maybe they aren't up on the newer, faster browsers, or for that matter don't care. If they are cool hipster designers in San Francisco, then it's Safari 4.0... Anyway, this is just a way to identify the bulk of your users. I think the best way is to just grab an older laptop with Windows XP on it and browse your site; you can use Firebug Lite on browsers besides Firefox.
I like to run Dynatrace AJAX edition from UI automation tests. This easily allows you to monitor performance deterioration and improvement over time. There's an article on how to do this on the Dynatrace website.
