When first visiting my site, it stays on the splash screen for about a minute. I was wondering if this has to do with the file size of my pictures or with my HTML and CSS code. You can check the HTML by viewing the page source; I will add the CSS code if needed.
Here is my website: Tom Falzani Portfolio Website.
Whenever you run into a performance issue, don't forget the Network tab of your browser's developer tools. See Google Network Performance Documentation.
Here is a recording of the initialisation and loading of your website. Taking a closer look at it, we can see that the philly_scene.png file weighs 6.3 MB and takes 25 seconds to load, and that the philly.png file weighs 6.6 MB and takes 18 seconds.
To avoid this long loading time, you can try to reduce the size of both images, using this link for example (I have never used it myself).
Or you can load them asynchronously and deliver a first render of your website while they are still loading.
If you're using jQuery, take a look at the .load() function; there are also many other ways to do it, such as Axios promises, or the Async library if you're using Node.js.
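In plain JavaScript, a minimal sketch of deferring the heavy images could look like this (the data-src attribute and the placeholder markup are assumptions for illustration, not something your page already uses):

    // Markup assumption: each heavy image ships a tiny placeholder in `src`
    // and the real file in a `data-src` attribute, e.g.
    // <img src="placeholder.png" data-src="philly_scene.png">
    document.addEventListener("DOMContentLoaded", function () {
        var images = document.querySelectorAll("img[data-src]");

        Array.prototype.forEach.call(images, function (img) {
            var full = new Image();

            // Swap the real file in once it has finished downloading.
            full.onload = function () {
                img.src = full.src;
            };

            // Start the download after the first render, so the page shows up immediately.
            full.src = img.getAttribute("data-src");
        });
    });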
I am in the process of trying to speed up my HTML website, and the first thing I have done is resize the images to the exact dimensions at which they are displayed.
My PageSpeed Insights score for desktop is 90, which I am very happy with, but for mobile it is 24, which isn't so good (it was 19 before I resized the images)! It says that I can save 4 seconds if I "remove unused JavaScript", but I haven't added any JavaScript to my website yet (I have only just taught myself HTML and CSS, so I'm moving on to JavaScript soon).
So I am wondering what this unused JavaScript is and how I can remove it if I wasn't the one to add it in there in the first place. Any other tips on how to speed up the website will be much appreciated!
Thank you!
If by "speed up the page" you mean improve the performances of your website:
Changing the dimensions of images doesn't necessarily mean that they are "optimized for the web". You can find tools like https://optimage.app/ and information on the web about how to compress them.
You can use your browser's inspector (right-click on the page, then Inspect) to do this:
Go to the Network tab, clear all logged requests, then reload the page; you will be able to see which requests are the most time-expensive and maybe do something about them (see the sketch after these steps).
Go to the Performance tab, start recording, and reproduce the actions that you want to analyze.
(You can zoom in and out of the timeline and click on an element to see more details.)
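If you prefer to script that check, the Resource Timing API exposes the same data from the console; here is a small sketch (the cutoff of ten entries is arbitrary):

    // Run this in the console after the page has loaded: it lists the ten
    // slowest requests of the current page load, slowest first.
    var entries = performance.getEntriesByType("resource");

    entries
        .slice()
        .sort(function (a, b) { return b.duration - a.duration; })
        .slice(0, 10)
        .forEach(function (entry) {
            console.log(Math.round(entry.duration) + " ms  " + entry.name);
        });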
Delete all JavaScript from your website.
I've almost finished my first angular.js app. Currently I'm trying to optimize page speed. The whole app is a single-page app, so page content is loaded through AJAX. The initial load time on a computer is around 5 seconds (that might be a problem, but it's OK for now). Other pages load in 200-400 ms (the time to first byte alone takes about 300 ms), so that is quite fast.
The problem is the load time on mobile devices. It ranges from 10 to 20 seconds, and maybe even more on low-RAM phones. What should I do? I compressed all scripts and styles and merged them into one file. When I run a test on webpagetest.org I get the marks C A A A B, so that looks good; on gtmetrix.com I also get a PageSpeed Score of 97% and a YSlow Score of 87%. Huh?
The problem might be the size of the page. I'm transferring 1.4 MB on first load. The scripts are around 600 KB and the CSS is 340 KB (but it's all minified). I'm using angular-material, which is around 200 KB.
What should I do? I tried to optimize everything I found on Google, but it doesn't seem to help much. Is there anything else I can improve, or is it a lost cause and it's going to stay this slow forever?
I have a fairly extensive JavaScript file that I can load in Chrome (latest stable) and in IE11.
The load icon spins but the script eventually loads on my machine in both browsers.
I have 2 other people trying to load the page that contains the JavaScript in IE11, and neither of them can get the page to load. The loader icon spins forever, and when they mouse over the refresh icon a flyout states "long running script".
How can I analyze my javascript to identify how and where the script is taking forever to load?
Chrome's Developer Tools (F12) can profile your code. This will give you a lot of information (possibly a lot of noise), but it will identify two things for sure: 1) functions where a lot of time is spent, and 2) functions that are called often.
This is the first place I'd start: turn on the profiler and reload the page.
If that doesn't give you a good place to start, look into the Chrome Timeline and console.timeStamp( 'Some Note' ). Once you have started recording a timing session, every time the code reaches a "console.timeStamp" call, it annotates the timeline, allowing you to estimate the elapsed time between points in your execution. See here: https://developers.google.com/chrome-developer-tools/docs/console#measuring_how_long_something_takes
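For example, a sketch of how the annotations are used (parseData() and renderResults() are hypothetical stand-ins for your own long-running steps):

    // Run this while a Timeline/Performance recording is active: each call adds
    // a named marker to the recording, so you can read off the time elapsed
    // between markers.
    console.timeStamp("start parsing");
    parseData();
    console.timeStamp("parsing done");
    renderResults();
    console.timeStamp("rendering done");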
I'm making an HTML interface to upload images to a server, with drag & drop and multiple file selection. I want to display the pictures before sending them to the server. I first tried to use FileReader, but I had some problems like in this post, so I changed my approach and decided to use blob: URLs as ebidel recommends in that post, with window.URL.createObjectURL() and window.URL.revokeObjectURL() to release memory.
But now I've got another problem, which is similar to this one. I want a client to be able to upload 200 images at a time if they want. But the browser crashed and the RAM usage was very high! So I thought that maybe too many images were displayed at the same time, and I set up a waiting queue of files using an array, in order to process only 10 files at a time. But the problem still occurs.
On Google Chrome, if I check chrome://blob-internals/, the files (which should already have been released by window.URL.revokeObjectURL()) are released only after a delay of approximately 8 seconds. On Firefox I'm not sure, but it seems as if the files are not released at all (I check about:memory -> images for that).
Is it my code that is bad, or is it a problem outside of my control? Is there a way to force browsers to release the memory immediately? If it helps, this is the part of the JavaScript where the problem occurs: link expired because code was not included in question.
EDIT
This is a kind of self-answer, plus an answer to bennlich (the text is too long for a comment).
I understood from user1835582's answer that I could indeed release the Blob/File, but that as long as the browser needs the images it keeps them somewhere in memory (which is logical). So it's displaying the images (many and heavy) that gave me the crashes and slowdowns, not the revokeObjectURL method. Moreover, each browser manages memory in its own way, leading to different behaviors. Here is how I came to this conclusion.
First, let's verify that revokeObjectURL works well, with a simple example using the source code of https://developer.mozilla.org/en-US/docs/Using_files_from_web_applications#Example.3A_Using_object_URLs_to_display_images.
Using Chrome you can verify that the Blobs are properly revoked by checking chrome://blob-internals/, or by trying to open a displayed image in a new tab, which will be blank. Note: to fully release the Blob references, add document.getElementById("fileElem").value = "". When I posted my question some years ago, it took about 8 seconds to release a blob; now it's almost immediate (probably due to improvements in Chrome and/or a better computer).
Then, time for a load test. I did it with a hundred JPEGs of ~2.5 MB each. After the images had been displayed, I scrolled the page. Chrome crashed and Firefox was slow (not tested on other browsers). However, when I commented out li.appendChild(img), everything went well, even with a huge batch of images. This shows that the problems do not come from the revokeObjectURL method, which in fact works properly, but from displaying lots of heavy images. You can also test this by creating a simple HTML page with hundreds of heavy images and scrolling it: same result (crash / slowdown).
Finally, to dig deeper into image memory management, it's interesting to look at about:memory in Firefox. For example, I saw that while the window is active, Firefox decompresses the images (images -> uncompressed-heap goes up), while raw (images -> raw) stays stable (relative to the quantity of images loaded). There is a good discussion about memory management here: http://jeff.ecchi.ca/blog/2010/09/19/free-my-memory.
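Boiled down, the working pattern looks like this (a minimal sketch; "fileElem" is the input ID from the MDN example above, and "fileList" is an assumed container):

    document.getElementById("fileElem").addEventListener("change", function () {
        var list = document.getElementById("fileList");
        var input = this;

        Array.prototype.forEach.call(input.files, function (file) {
            var img = document.createElement("img");
            img.src = window.URL.createObjectURL(file);

            img.onload = function () {
                // The blob URL is no longer needed once the image has been decoded.
                window.URL.revokeObjectURL(img.src);
            };

            list.appendChild(img);
        });

        // Clear the input so the File references themselves can be collected.
        input.value = "";
    });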
With window.URL.revokeObjectURL() you only release the reference to the [Blob] or [File] object. You cannot force the browser to remove it from memory.
Note:
Browsers are not perfect and can leak memory through these mechanisms. If you implement this kind of animation, you have to understand that you do so at your own risk.
This isn't an answer, but I just want to say that, as far as I can tell, this is still an issue in the latest version of Chrome (35). I made a test page that exposes the problem:
http://ecobyte.com/tmp/chromecrash-1a.html
If you select a large number (say, 600) of high resolution photos on your computer and drop them into the box on that page, it will crash Chrome (tried on Windows 7 and Mac OS X 10.8.5).
If you look at the source, you can see that the sequence of operations is:
createObjectURL
load the img (don't add to DOM!)
revokeObjectURL to free the ref
Lose the img ref
Repeat all steps for next dropped image
Seems like only a single image should be in memory/referenced at any given moment, but eventually this crashes Chrome.
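In code, the sequence described above boils down to something like this (a hypothetical sketch, not the actual source of the test page; processNext is an invented name):

    // One image at a time: create a blob URL, load it into an <img> that is
    // never attached to the DOM, revoke the URL, drop the reference, and only
    // then move on to the next file.
    function processNext(files, index) {
        if (index >= files.length) {
            return;
        }

        var url = window.URL.createObjectURL(files[index]);
        var img = new Image();

        img.onload = function () {
            window.URL.revokeObjectURL(url); // free the blob reference
            img = null;                      // lose the only reference to the image
            processNext(files, index + 1);   // start the next image
        };

        img.src = url;
    }

    // e.g. from the drop handler: processNext(event.dataTransfer.files, 0);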
I have implemented the Facebook Like button on my site using the asynchronous JS SDK and it's working great! However, it takes a long time to load, which is not a big problem (it would be nicer if it loaded quicker, though) as the rest of the page loads fine.
However, if you view the site in any version of IE, the whole page is unresponsive until the Facebook Like / comments have loaded... All the images and other scripts are loaded, but the whole page is locked.
Any ideas on how I can rectify this for IE users?
I have seen this post: How do I keep the Facebook like button from delaying the loading on my website? That one was solved by switching to the async version, whereas mine IS already using it and is still hanging.
If it helps, I can post a link to my site / the page it appears on.
Well, my only advice here is to place your FB JS code just before the </body> tag (a sketch of one way to do this follows the tips below). But I have other "tips" for your site in general:
Try to minify/combine your CSS and JS files when possible
Try moving your JS code to the body tag (at the end)
Do you really need both the Prototype AND jQuery libraries?! Try removing one of them and porting its functionality to the other (almost all tasks can be done with either library)
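For the first point, one way to keep the SDK out of the critical path is to inject it only after the window's load event; a minimal sketch (the SDK URL is an assumption, use whichever URL Facebook's documentation currently gives):

    // Inject the SDK only after everything else has finished loading, so a
    // slow response from Facebook cannot block the rest of the page.
    window.onload = function () {
        var js = document.createElement("script");
        js.async = true;
        js.src = "//connect.facebook.net/en_US/all.js#xfbml=1";
        document.body.appendChild(js);
    };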
In the end, IE was hanging because I had a CSS3 transform on my images, and apparently this slows down IE (even though it cannot render the transform). So I can disable it via conditional comments in the CSS or, in my case, with Modernizr.