I have a CRA (Create React App) project and I'm trying to work out the download speed of index.html. How can I see this? I can see the .chunk.js file took 11 seconds to download over slow 3G and it's only ~250 KB, is that normal?
I want to work out how long the HTML takes to load so I can then work out how long the JavaScript takes to kick in, and monitor this whole speed process. Does anyone have any good tips for doing this?
Currently I'm setting a sessionStorage value to Date.now(), but I need to know exactly when to fire this so I'm comparing two accurate values.
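For reference, this is roughly the kind of measurement I'm after (a minimal sketch assuming the Navigation Timing API is available; the sessionStorage key name is just a placeholder):

```js
// Minimal sketch: read the browser's own timing data for the HTML document itself
const [nav] = performance.getEntriesByType('navigation');
if (nav) {
  // Time spent requesting and downloading index.html
  console.log('index.html download:', nav.responseEnd - nav.requestStart, 'ms');
  // Time until the DOM was ready (roughly when the JS can "kick in")
  console.log('DOMContentLoaded at:', nav.domContentLoadedEventEnd, 'ms after navigation start');
  // The value I'm currently stashing for later comparison
  sessionStorage.setItem('htmlLoadedAt', String(Date.now()));
}
```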
If your end goal is simply performance optimization, I recommend using Lighthouse. It won't be able to pin down the exact time taken to load index.html, but it assesses the website from a user's perspective and tells you what optimizations you could make.
The screenshot below shows the assessment results and the time taken for certain items to load or become visible/interactive.
The screenshot shows Lighthouse running in Chrome DevTools, but there's also a CLI version (which I prefer 😛).
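If you'd rather run it from a script than from DevTools, something along these lines works (a rough sketch using the Node module; the exact API has shifted between versions, so check the current Lighthouse docs, and the URL is a placeholder):

```js
// Rough sketch: run a Lighthouse performance audit programmatically
const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');

(async () => {
  const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
  const options = { port: chrome.port, onlyCategories: ['performance'], output: 'json' };
  const result = await lighthouse('https://example.com', options); // placeholder URL
  console.log('Performance score:', result.lhr.categories.performance.score);
  await chrome.kill();
})();
```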
Related
My web application is slow and I don't know why, but I think the CSS and JS files are the reason. I use PHP with Firebase and it takes too long, as you can see in the image.
How can I increase the performance of my website? I tried using Inspect → Network to see how much time every file takes on my web application, and I can see that the problem is in the CSS and JS, but I don't know how to fix it.
You may be using an inefficient algorithm or poorly written code. You need to check how your JS files work and use modern JS instead of nested loops and nested if statements; any of those could be a reason why your website is slow.
I am creating a website professionally for a client as my first project, and I am using a lot of libraries, for instance velocity.js, jQuery, jQuery UI, animate.css, and also an image slider plugin for jQuery. Right now I am using the minified versions and all of the files are on my local machine, but when I put the site live, will it severely affect the loading time of the website, or will it be normal?
You can put it up and test it. But the best way is to put it up and test the ping.
Yes, it will severely affect the loading of the page. Chrome comes with developer tools out of the box, and Firebug for Firefox is only a couple of clicks away. That, combined with the RTT and bandwidth to the site, gives you enough information to calculate exactly how slow the first-hit page load will be.
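As a rough illustration of that arithmetic (the numbers below are made-up assumptions, not measurements of your site):

```js
// Back-of-the-envelope estimate of first-hit load time for the script payload alone
const payloadKB = 250;      // assumed combined size of the libraries, in KB
const downlinkKbps = 400;   // assumed "slow 3G"-ish bandwidth, in kbit/s
const rttMs = 400;          // assumed round-trip time, in ms

const transferSec = (payloadKB * 8) / downlinkKbps; // ~5 s just to move the bytes
const totalSec = transferSec + rttMs / 1000;        // plus at least one round trip per connection
console.log(`~${totalSec.toFixed(1)} s before the browser even starts parsing`);
```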
Assuming that caching is configured correctly, subsequent page transitions should be faster, but even loading all this JavaScript from cache and parsing it will have a big impact on the load time. Note that in the case of Dokuwiki, the CMS already handles merging the JavaScript into a single file.
Using PJax or similar will help with subsequent transitions. Using online repositories for standard libraries does not help with performance. Minifying the JavaScript does not give very big wins (since you are already compressing the files on the fly, aren't you?).
There are big wins to be had from deferring JavaScript loading and stripping/optimizing/merging CSS files.
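For example, a minimal sketch of deferring a non-critical library until after the page has rendered (the path is just a placeholder for one of the heavier files):

```js
// Defer a non-critical script until the page has finished loading
window.addEventListener('load', function () {
  var s = document.createElement('script');
  s.src = '/js/velocity.min.js'; // placeholder path
  s.async = true;
  document.body.appendChild(s);
});
```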
There are a lot of other things you can do to make your page render faster - but space here is too limited.
Currently I'm working on unit tests for a JS project. The only thing I can write automated tests for is functionality.
Question: is there any way to automate appearance testing? I mean that currently I won't know that some styles that work perfectly in Chrome are off in IE unless I open IE and take a look at my page.
I was thinking about taking screenshots of the page during all DOM events with the same window size/resolution, then comparing those images, and if there are any significant differences, flagging it as a failed test (a "bug found" sort of thing).
Is there anything like that out there? I don't really want to re-invent the wheel.
I'm NOT asking "how can I take a screenshot of the website?"; I'm asking "how can I analyze an already-taken screenshot?"
I don't know about automation, but you can use a cross-browser service to test your website, like:
http://www.browserstack.com/
There is also Browsershots:
http://browsershots.org/
But the best way to test is to just open the website in the different browsers and view it, to get the most accurate picture of functionality and display issues across different browser versions.
UPDATE
An automated thumbnail screenshot service:
http://www.shrinktheweb.com
For IE there is:
http://www.iescreenshot.com/
unless you do something on your server that captures screenshots at the different resolutions, and/or what #michaelt suggests below.
ANOTHER UPDATE
I see your example below. There is also this:
http://www.imagemagick.org/Usage/compare/
This type of image library just shows differences in pixel data. To make it automated you would have to mix JS and AJAX to talk to the server, then capture and compare the images. It would have to compare the number of colors in the images, whether those colors are the same in each image, in conjunction with the luminosity, and then run as a cron job on the server.
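As a rough sketch of what a pixel-level comparison could look like in the browser (assuming both screenshots are the same size and are already loaded as <img> elements; the threshold is an arbitrary placeholder):

```js
// Compare two same-sized images pixel by pixel and return the fraction that differ
function diffImages(imgA, imgB) {
  var w = imgA.naturalWidth, h = imgA.naturalHeight;
  var canvas = document.createElement('canvas');
  canvas.width = w;
  canvas.height = h;
  var ctx = canvas.getContext('2d');

  ctx.drawImage(imgA, 0, 0);
  var a = ctx.getImageData(0, 0, w, h).data;
  ctx.drawImage(imgB, 0, 0);
  var b = ctx.getImageData(0, 0, w, h).data;

  var differing = 0;
  for (var i = 0; i < a.length; i += 4) {
    // Sum of RGB channel differences; small values are likely anti-aliasing noise
    var delta = Math.abs(a[i] - b[i]) + Math.abs(a[i + 1] - b[i + 1]) + Math.abs(a[i + 2] - b[i + 2]);
    if (delta > 30) { // arbitrary threshold
      differing++;
    }
  }
  return differing / (w * h); // e.g. fail the test if this exceeds some tolerance
}
```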
Another way of comparing pixel data is described in this Stack Overflow question:
How does comparing images through md5 work?
I hope it helps.
I'd recommend trying out Selenium WebDriver. The WebDriver interface is really rich in functionality, taking screenshots for example, and there are so-called drivers for a large number of different browsers, including Internet Explorer. There are a few JavaScript implementations of WebDriver, and I've used wd and WebdriverJs with varying degrees of success.
If you choose to go with a JavaScript implementation, you can relatively easily start using WebDriver as part of any unit test written in Mocha or what not.
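As a rough sketch (using the selenium-webdriver npm package rather than either of the two libraries named above, and a placeholder URL), grabbing a screenshot from a Mocha test looks roughly like this:

```js
// Sketch: capture a screenshot with WebDriver from inside a Mocha test
const { Builder } = require('selenium-webdriver');
const fs = require('fs');

describe('visual check', function () {
  this.timeout(30000); // browser startup can be slow

  it('captures the home page', async function () {
    const driver = await new Builder().forBrowser('chrome').build();
    try {
      await driver.get('https://example.com'); // placeholder URL
      const png = await driver.takeScreenshot(); // base64-encoded PNG
      fs.writeFileSync('home.png', png, 'base64');
    } finally {
      await driver.quit();
    }
  });
});
```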
Edit:
It seems as if I've misunderstood your question. If you want to compare screenshots, you could use an image recognition tool such as Sikuli. I've only used its Java API, but it's available for Python as well. Sikuli can easily be part of any test written in, for example, TestNG or JUnit. Give it an image, and it will search a given region (or the whole screen) for anything that looks like it. If it finds a part of the screen that looks like the image, it will return it to you along with how certain it is that they're the same.
For your use case, you could keep images on a file server representing how parts of your page should look, and have the test make sure that they can all be found. If Sikuli can't find them, or finds them but with a very low degree of certainty, you fail the test.
I am looking for a tool that lets you monitor/log page rendering time on client machines. I am not looking for Firebug/YSlow because I want to know the following types of things:
How fast do my pages load when the user is in Russia?
How long does it take for JavaScript to run on some pages, for everyone who accesses those pages?
So, I actually care what my site feels like to the people who use it. Do tools that already do this exist?
I should add that my website is a software-as-a-service website, not accessible publicly.
I've never heard of any way to do this. One solution, which may be terrible, might be to log the time yourself. At the top of your page have an inline script tag with a global variable called start that creates a new date. Then, have an onload listener that calls a function once the page is finished loading. In this function, get the difference between the start time and current time and send that back to your server. This is by no means that accurate, but might give you some idea. You could also log their IP address for geolocation when you send back the data.
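A minimal sketch of that idea (the /log-timing endpoint and the image-beacon trick are just placeholders for however you want to report the data):

```js
// In an inline <script> at the very top of the page:
var start = Date.now();

// Registered anywhere before the page finishes loading:
window.addEventListener('load', function () {
  var loadMs = Date.now() - start;
  // Report the measurement back to the server; '/log-timing' is a hypothetical endpoint.
  // The server can also log the requester's IP address for geolocation.
  var beacon = new Image();
  beacon.src = '/log-timing?ms=' + loadMs;
});
```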
I recommend https://www.atatus.com/. Atatus helps you to visualise page load time across pages, browsers and countries. It also has AJAX monitoring and Transaction monitoring.
There is not really a super easy way to do this effectively, but you can definitely fake the geolocation thing by using a proxy (which would actually give you roughly N*2 the time) and get a pretty good idea of what it's like to browse your site.
As for JavaScript, you can profile it with the profiler in Firebug; this will give you an idea of what functions you should refactor and whatnot.
In my opinion, I'd determine what most of your users are using, or what their general demographic makeup is. Are they 75-year-old guys? If that is the case, maybe they aren't up on the newer, faster browsers, or for that matter don't care. If they are cool hipster designers in San Francisco, then it's Safari 4.0... Anyway, this is just a way to determine the bulk of your users. I think the best way is to just grab an older laptop with Windows XP on it and browse your site; you can use Firebug Lite on browsers besides Firefox.
I like to run Dynatrace AJAX edition from UI automation tests. This easily allows you to monitor performance deterioration and improvement over time. There's an article on how to do this on the Dynatrace website.
I am getting extremely slow load times and before any action is taken I was wondering if there was an easy way to test to see if it is because of the ISP or the coding of the site.
We currently have a T1 going to two mirrored servers, so I don't think the ISP is the issue; we only have a few users on at a time.
The website is: http://www.designfacilitator.com/v20/Default.aspx?login=true&ReturnUrl=%2fv20%2fDefault.aspx
Is there a definitive test to determine where the problem lies? or any advice would be great.
Thanks!
Do you notice high load times if you run the web app on an intranet?
If it's the coding, it'll be slow under local deployment load testing as well, but to know for sure you want to turn on ASP.NET tracing and have a look at load times and latencies through the trace viewer (or directly in the pages). The figures will jump out at you!
The definitive test you're looking for would be to access the website from somewhere else with a different ISP (if it's still slow --> there you go), but this is a fairly obvious suggestion so I am probably missing some element here.
Anyway, experience-wise, it's almost always the coding :)
I loaded the site in the Firebug Net panel and the initial HTML loads in less than a second, so it doesn't look like a really limited server or bandwidth situation. Still there is a lot you can do to speed up the page.
First get Firefox (if you don't have it), then install Firebug (getfirebug.com), then install YSlow (from the Firefox add-ons site), which will analyze your page and give you recommendations. There is also a plugin there from Google called Page Speed that does some of the work for you: it'll optimize your images and combine the JS into a single file.
Firebug also has a Net tab that shows you at what point each file included in your page is loaded and how long it takes. This can help spot problems. YSlow will also give you specific recommendations.
From the quick look I had at your source, you need to move your JS files to the bottom of the page, and I think you could combine them into fewer files for even more speed.
Remember, the trick is to only load the smallest amount of code required to make your page work. Then, once the page is loaded there are a number of ways to load additional code as you need it.
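For example, one small sketch of loading extra code only when it's first needed (the path, element ID, and init function are all placeholders):

```js
// Load a script on demand the first time a feature is used
var sliderLoaded = false;

function loadSlider(callback) {
  if (sliderLoaded) { callback(); return; }
  var s = document.createElement('script');
  s.src = '/js/slider.min.js'; // placeholder path
  s.onload = function () {
    sliderLoaded = true;
    callback();
  };
  document.body.appendChild(s);
}

document.getElementById('gallery').onclick = function () {
  loadSlider(function () {
    initSlider(); // placeholder for whatever the extra code exposes
  });
};
```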
Keep an eye on Steve Souders' blog (http://www.stevesouders.com/); he's pretty much the guru of front-end performance.