I am getting extremely slow load times, and before any action is taken I was wondering if there is an easy way to test whether the cause is the ISP or the coding of the site.
We currently have a T1 line going to two mirrored servers, so I don't think the ISP is the issue; we only have a few users on at a time.
The website is: http://www.designfacilitator.com/v20/Default.aspx?login=true&ReturnUrl=%2fv20%2fDefault.aspx
Is there a definitive test to determine where the problem lies? Any advice would be great.
Thanks!
Do you notice high load times if you run the web app on an intranet?
If it's the coding, it will be slow in a local load-testing deployment as well. To know for sure, turn on ASP.NET tracing and look at the load times and latencies in the trace viewer (or directly in the pages); the figures will jump out at you.
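For reference, page-level tracing is switched on in web.config; a minimal sketch (the requestLimit value is just an example):

```xml
<configuration>
  <system.web>
    <!-- Collect timing data for every request; view results at /trace.axd -->
    <trace enabled="true"
           requestLimit="40"
           pageOutput="false"
           localOnly="true" />
  </system.web>
</configuration>
```

With pageOutput="true", the timing table is appended to the bottom of each page instead of collected in the trace viewer.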
The definitive test you're looking for would be to access the website from somewhere else with a different ISP (if it's still slow, there's your answer), but this is a fairly obvious suggestion, so I am probably missing some element here.
Anyway, experience-wise, it's almost always the coding :)
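If you want a number rather than a feel, you can time the initial HTML fetch from any machine (your own, a friend's on another ISP, a cloud box) and compare. A minimal Python sketch, where the URL is whatever page you want to test:

```python
import time
import urllib.request

def time_request(url, timeout=30):
    """Fetch a URL once and return (elapsed seconds, body size in bytes)."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        body = resp.read()
    return time.perf_counter() - start, len(body)
```

Run it from two different networks; if the HTML itself comes back in well under a second both times, the bottleneck is in the page, not the pipe. Note this only times the raw HTML, not the images/scripts a browser would fetch afterwards.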
I loaded the site in the Firebug Net panel and the initial HTML loads in less than a second, so it doesn't look like a limited server or bandwidth situation. Still, there is a lot you can do to speed up the page.
First get Firefox (if you don't have it), then install Firebug (getfirebug.com), then install YSlow (from the Firefox add-ons site), which will analyze your page and give you recommendations. There is also a plugin from Google called Page Speed that does some of the work for you: it will optimize your images and combine the JS into a single file.
There is a 'Net' tab that shows you at what point each file included in your page is loaded and how long it takes, which can help spot problems. YSlow will also give you specific recommendations.
From a quick look at your source, you need to move your JS files to the bottom of the page, and I think you could combine them into fewer files for even more speed.
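Combining the files doesn't need fancy tooling; a sketch of a concatenation build step in Python (the file names in the usage are hypothetical):

```python
from pathlib import Path

def combine_js(sources, out_path):
    """Concatenate JS files, in order, into one bundle: one request instead of many."""
    parts = []
    for src in sources:
        text = Path(src).read_text(encoding="utf-8")
        # The trailing semicolon guards against files that omit their final one.
        parts.append(f"// --- {src} ---\n{text}\n;")
    Path(out_path).write_text("\n".join(parts), encoding="utf-8")
    return out_path
```

Order matters: libraries like jQuery have to come before the plugins that depend on them.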
Remember, the trick is to only load the smallest amount of code required to make your page work. Then, once the page is loaded there are a number of ways to load additional code as you need it.
Keep an eye on Steve Souder's blog (http://www.stevesouders.com/), he's pretty much the guru of front-end performance.
Related
I have a CRA (Create React App) and I'm trying to work out the download speed of index.html. How can I see this? I can see the .chunk.js file took 11 seconds to download over slow 3G, and it's only ~250 KB; is this normal?
I want to work out how quickly the HTML loads so I can then work out how long the JavaScript takes to kick in, and monitor this whole loading process. Does anyone have any good tips for doing this?
Currently I'm setting sessionStorage to Date.now(), but I need to know exactly when to fire this so that I'm comparing two accurate values.
This solution won't be able to pin down the exact time taken to load index.html, but it assesses the website from a user's perspective and tells you what optimizations you can make. If your end goal is simply performance optimization, I recommend using Lighthouse.
The screenshot below shows the assessment results and time taken for certain items to load or be perceived as visible/interactable.
They're using the Chrome DevTools in this screenshot, but there's also a CLI version (which I prefer).
My website: http://jgyboatparty.com/
is loading horrifically slowly and I can't figure out why. I have compressed the images as much as possible and made the video file available as WebM, and yet with a fresh reload the page takes minutes to load!
Am I missing something here? For some reason, when I look at when elements are loaded, some of the images don't even start loading until 30 seconds in. This all means that my loader runs for a long time. (I have a wait-until-images-are-loaded function.)
Any advice would be much appreciated :)
The web is full of useful tools that can help you with this:
Google PageSpeed for one, and GTmetrix for another. Use these tools to analyse what could be causing your site to be slow.
Also, ensure all of your images are properly optimised. Another tool that may help here is TinyPNG.
Google's Inspector can also be very useful to help diagnose bottlenecks and so forth. Here's another link that may help you:
https://developers.google.com/web/tools/chrome-devtools/network-performance/
Also, I see you are using various libraries, such as Owl Carousel, Parallax, and FitVids, to name three. I would look to use at least the CDN versions: a CDN is a way to deliver content from your website or mobile application to people more quickly and efficiently, based on their geographic location.
Also, look into lazy loading of your images.
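Modern browsers make the simplest form of image lazy loading a one-attribute change (the file name here is a placeholder):

```html
<!-- Deferred until the image approaches the viewport -->
<img src="boat-photo.jpg" alt="Boat tour" loading="lazy" width="800" height="600">
```

Supplying width and height up front also stops the layout jumping around while images arrive.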
Do you have enough juice on your server? Pinging your server takes 71 ms, while pinging your government's server takes approximately 31 ms.
Checking the Network tab for your website, an image that is about 155 KB takes about 1.2 seconds to load.
One step to improve your speed would be to minify all your scripts.
Do not load all of your content at once.
Open your browser's dev tools and find out.
Browsers only allow a small number of concurrent downloads from one domain (historically 4, typically 6 in modern browsers). So while the browser is downloading those, it blocks the others.
Move your scripts/images to a subdomain or a CDN, or just do something listed in the post below.
Get number of concurrent requests by browser
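The effect of that per-domain cap is easy to see in a toy simulation; a Python sketch where each "download" just sleeps (all the numbers are made up):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_download(seconds=0.1):
    """Stand-in for one HTTP request."""
    time.sleep(seconds)

def total_time(num_resources, max_concurrent):
    """Wall-clock time to 'download' everything under a concurrency cap."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=max_concurrent) as pool:
        for _ in range(num_resources):
            pool.submit(fake_download)
    # The pool waits for all submitted tasks on exit.
    return time.perf_counter() - start
```

Twelve resources through four connections take roughly three rounds of requests; spread them over two hostnames (eight connections) and you are down to roughly two rounds.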
I am creating a website professionally for a client as my first project, and I am using many libraries, for instance velocity.js, jQuery, jQuery UI, animate.css, and an image-slider plugin for jQuery. Right now I am using the minified versions, and all of the files are hosted on my own machine. When I put the site live, will this severely affect the website's loading time, or will it be normal?
You can put it up and test it with an online speed-test tool, but the best way is to put it live and test the ping.
Yes, it will severely affect the loading of the page. Chrome comes with developer tools out of the box, and Firebug for Firefox is only a couple of clicks away. That, combined with the RTT and bandwidth to the site, gives you enough information to estimate how slow the first-hit page load will be.
Assuming that caching is configured correctly, subsequent page transitions should be faster, but even loading all this JavaScript from cache and parsing it will have a big impact on load time. Note that in the case of DokuWiki, the CMS already handles merging the JavaScript into a single file.
Using PJAX or similar will help with subsequent transitions. Using online repositories for standard libraries does not help with performance. Minifying the JavaScript does not give very big wins (since you are already compressing the files on the fly, aren't you?).
There are big wins to be had from deferring JavaScript loading and from stripping/optimizing/merging CSS files.
There are a lot of other things you can do to make your page render faster - but space here is too limited.
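Deferring the JavaScript is mostly a one-attribute markup change nowadays; a sketch (the file name is a placeholder):

```html
<!-- Downloaded in parallel with the page, executed only after parsing finishes -->
<script defer src="site.js"></script>
```

Unlike async, defer preserves execution order among deferred scripts, so library/plugin ordering still works.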
I am using codes on my sites that I have never used before just for the layout and music player.
The page/audio player works perfectly on desktop and Android devices if I use this page: http://chancetrahan.com/1abcd.html
The player only works on desktop versions if I use the index file at http://chancetrahan.com
I have followed the site speed-up suggestions on PageSpeed Insights and have gotten it to work on desktop, but I just realized it's not working right, and not showing up right, when I view it from Android browsers.
I'm not sure whether it broke because I compressed the .js and .css files when I transferred them to my site, but that is what I think might have happened.
I noticed that when I remove FB ROOT from the code, it breaks the music player; I have no idea why the music player depends on FB ROOT, but it does. I'm not sure what rollups are, but it also says that common.js is running twice.
I have stripped the code down to the bare bones, trying to replicate this layout/template/theme with minimal code and a speedy response. You might not be able to see some of the code because I am using Cloudflare, but I would be happy to use TeamViewer to go over this and get some insight from someone who understands this code.
If you could help me figure this out I would be really appreciative for all the help I can get. Thank you for your help and advice!
It's very common to break JavaScript files when trying to improve your PageSpeed scores. If you're using Cloudflare, test the site in Development Mode and see if it works; oftentimes Rocket Loader will cause a problem.
I hope you have a backup of the JavaScript files from before you minified them. If you tried to minify code that was already compressed, there's a very good chance you broke it.
Furthermore, PageSpeed is not as big a deal as you'd think when it comes to SEO. I put a lot of development hours into making a WordPress theme from scratch that gets a 100/100 PageSpeed score. It helps, but it doesn't make a crazy difference in search engine rankings; it's more about being nice to your users.
Another thing to mention: if you're concatenating your JavaScript files, first check whether your server supports HTTP/2, as concatenation can actually make your site slower in that case. HTTP/2 is the successor to the SPDY protocol, requires HTTPS in practice, and quickly and efficiently delivers many resources at once without the need for concatenation.
For completely non-nefarious purposes (machine learning, specifically), I'd like to download a huge dataset of CAPTCHA images. However, CAPTCHA is always implemented with obfuscated JavaScript that makes getting at the actual images without a browser a non-trivial task, at least to me, a JavaScript novice.
So, can anyone give me some pointers on how to download the image of the obscured word using a script, completely outside of a browser? And please don't point me to a dataset of already-collected obscured words; I need to collect the images from a specific website for this particular experiment.
Thanks!
Edit: Another way to ask this question is simpler. When you click "View Source" on a website with complicated JavaScript, you see the script references, but that's all you see. However, if you click "Save Page As..." (in Firefox) and then view the source of the saved page, the JavaScript has been resolved, and the new HTML and the images (at least in the case of ASIRRA and reCAPTCHA) are in the source. How can I mimic this "Save Page As..." behavior using a script? This is an important web-coding question in general, so please stop questioning my motives! This is knowledge I can use from now on in all web development involving scripting, and I'm sure other Stack Overflow visitors can as well.
While waiting for an answer here I kept digging, and eventually figured out a somewhat hacky way of getting what I wanted done.
First off, the reason this is a somewhat complicated problem (at least to a JavaScript novice like me) is that the ASIRRA images are loaded onto the page by JavaScript, which is a client-side technology. This is a problem when you download the page with something like wget or curl, because those tools don't actually run the JavaScript; they just download the source HTML. Therefore, you don't get the images.
However, I realized that Firefox's "Save Page As..." did exactly what I needed: it ran the JavaScript, which loaded the images, and then saved it all into the well-known directory structure on my hard drive. That's exactly what I wanted to automate. So I found a Firefox add-on called iMacros and wrote this macro:
VERSION BUILD=6240709 RECORDER=FX
TAB T=1
URL GOTO=http://www.asirra.com/examples/ExampleService.html
SAVEAS TYPE=CPL FOLDER=C:\Cat-Dog\Downloads FILE=*
Set to loop 10,000 times, it worked perfectly. In fact, since it was always saving to the same folder, duplicate images were overwritten (which is what I wanted).
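Once you have the rendered HTML on disk (saved by the browser as above, or by a headless browser), pulling the image URLs out of it can be scripted; a sketch in Python using only the standard library (the HTML in the test is made up):

```python
from html.parser import HTMLParser

class ImgSrcParser(HTMLParser):
    """Collect the src attribute of every <img> tag in a document."""
    def __init__(self):
        super().__init__()
        self.srcs = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            for name, value in attrs:
                if name == "src" and value:
                    self.srcs.append(value)

def extract_image_urls(html):
    """Return the src of every <img> in the given HTML, in document order."""
    parser = ImgSrcParser()
    parser.feed(html)
    return parser.srcs
```

From there, the listed URLs can be fetched individually, or matched against the files the browser already saved alongside the page.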
Why not just set up a CAPTCHA yourself and generate the images? reCAPTCHA is free, too.
http://www.captcha.net/
Update: I see you want images from a specific site, but if you set up your own CAPTCHA you can tweak it to produce the same kind of images as the site you're targeting.
Get in contact with the people who run the site and ask for the dataset. If you try to download many images in any suspicious way, you'll end up on their block list rather quickly, which means you won't get anything from them anymore.
CAPTCHAs are meant to protect people against abuse and what you do will look like abuse from their point of view.