I was wondering if there are any other ways to compress my images, or any script that would make the page load faster, or load the images behind the scenes?
The site is very interactive and uses several very high-quality image layers for the main layout. I have already used Save for Web in Photoshop and re-compressed with ImageOptim; some are JPEGs, but the majority are PNG-24 to maintain transparency, and they are all set in CSS.
I have used JPEGs and CSS sprites where I can, but there is one image in particular, a tree illustration stretching the full length of the site, that is really slowing down the loading time. Is there any way I could compress these images further, or code them differently, that I have missed?
Any help would be great, thanks!
You said you are spriting. That is good.
You can also use tools such as pngcrush, which attempt to make files smaller by dropping things such as metadata.
You should also send far-future Expires headers and use a cache buster in your image URLs, so that browsers keep cached copies and only re-download an image when it actually changes.
In Photoshop, choose File > Save for Web; there you can find the best compromise between size and quality.
Do you really need the transparency there? PNG alpha transparency is unsupported in some older browsers (notably IE6) and makes rendering processing-intensive and slow even on high-end computers, depending on image size and the number of layers. If you can show us something of your site, maybe someone can give more specific hints about how to optimize it.
You can compress them on the fly with Apache if that's your web server. One of many available articles on the subject: http://www.samaxes.com/2009/01/more-on-compressing-and-caching-your-site-with-htaccess/
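For example, a minimal .htaccess sketch (assuming Apache with mod_expires and mod_deflate enabled; the types and lifetimes here are just examples). Note that JPEG and PNG data is already compressed, so on-the-fly gzip mostly pays off for text assets, while the Expires directives cover the caching advice above:

    # Far-future Expires headers (mod_expires)
    ExpiresActive On
    ExpiresByType image/png "access plus 1 year"
    ExpiresByType image/jpeg "access plus 1 year"

    # Gzip text assets on the fly (mod_deflate)
    AddOutputFilterByType DEFLATE text/html text/css application/javascript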
I am a full-time developer, but am building a site for my photography hobby. I don't want people to download my images, and besides the usual measures (disabling right-click, blocking hotlinks to my images, etc.) I was thinking about a solution that would work 99% of the time.
The idea was to render the images in a canvas, or load them as binary files, or something similar.
How would the performance compare to standard src linking?
And are there better solutions to my problem?
If a picture is displayed on someone's screen, there is no way you can prevent them from saving it to their computer (even if you disable everything).
Trying to obfuscate the images will only result in a loss of time and performance, and could make your website much less user-friendly.
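To see why, here is a minimal sketch (the image URL is hypothetical): anything drawn onto a same-origin canvas can be read straight back out as a saveable data: URL.

    var img = new Image();
    img.onload = function () {
        var canvas = document.createElement('canvas');
        canvas.width = img.width;
        canvas.height = img.height;
        canvas.getContext('2d').drawImage(img, 0, 0);
        // The "protected" picture comes right back out:
        var recovered = canvas.toDataURL('image/png');
        console.log(recovered.slice(0, 50)); // data:image/png;base64,...
    };
    img.src = 'photo.jpg'; // hypothetical same-origin image

And even if you somehow blocked that, a simple screenshot still defeats everything.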
I have an existing website (a photo blog) that loads the majority of the photos from Flickr. I'd like to enhance the experience for users with high-resolution screens and load higher-res versions of the photos, but I'm not sure what would be the best way to go.
Since the images in question are not UI elements, there is no sensible way to solve this problem with CSS. That leaves either client side JavaScript, or a server side find-and-replace for specific image patterns (since they come from Flickr, it's easy to detect and easy enough to figure out the url for a double-sized image).
For the client side, my concern is that even the regular-sized images are 500-800 KB each, and there can be 10-30 images per gallery; swapping sources in after the page loads means retina users download both versions, wasting a lot of bandwidth.
For the server side, it's obviously tricky to determine whether the request comes from a retina device or not. One idea I had (which I have yet to test out) was to run a JavaScript function that checks window.devicePixelRatio and sets a cookie accordingly; on each subsequent page request, the server would then know whether the device is high-res. That leaves the entry page with non-retina images, but at least all the following pages will have high-res images loaded right away.
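Roughly what I had in mind (the cookie name is just a placeholder):

    // Run as early as possible, e.g. in the <head>:
    if (window.devicePixelRatio && window.devicePixelRatio > 1) {
        // The server reads this cookie on subsequent requests
        document.cookie = 'device-pixel-ratio=' + window.devicePixelRatio +
                          '; path=/; max-age=31536000'; // one year
    }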
Are there any pitfalls to this proposed solution? Are there better ways to handle it?
You can generate CSS on the server that references both the regular-size and the double-size images in background-image properties, behind resolution media queries. This CSS can easily be different for every page by including it in a <style> tag and referencing classes/ids that only exist on that page. This deals with the traffic issue, since the majority of modern browsers don't download background images for media queries that don't match.
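A minimal sketch of such generated CSS (the selector and file names are made up):

    .gallery-photo-1 {
        background-image: url(photo-1.jpg);
    }
    @media (-webkit-min-device-pixel-ratio: 2), (min-resolution: 2dppx) {
        .gallery-photo-1 {
            background-image: url(photo-1@2x.jpg); /* fetched only by high-DPI devices */
        }
    }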
Another solution (though a worse one) would be to not just set the cookie, but to load the images with JavaScript. This solves the first-page issue, but slows down page rendering a little.
Does anyone know why JavaScript performance would be affected by loading lots of external JPG/PNG images into HTML5 Image() objects, totalling approximately 200-250 MB? Performance also seems to be affected by the cache, i.e. if the cache is full(-ish) from previous browsing, performance on the current site is greatly reduced.
There are two ways I can crudely solve it:
Clear the cache manually.
Minimise the browser, wait about 20 seconds, and re-open it; by then iOS/the browser has reclaimed the memory and runs the JS as it should.
I would have expected iOS to reclaim the memory required to run the current task, but apparently not. Another workaround is to load 200 MB of 'cache clearing' images into Image() objects, then remove them by setting src = "". This does seem to help, but it's not an elegant solution...
Please help!
First and foremost, read the excellent post on the LinkedIn Engineering blog. Read it carefully and check whether there are some optimizations you can also try in your application. If you have tried all of them and they still haven't solved your performance issues, read on.
I assume that you have some image gallery or magazine-style content area on your page.
How about having this image area in a separate iframe? What you could do then is this:
Have two iframes. Only one should be visible and active at a time.
Load images into the first iframe. Track the size of the loaded images. If exact size tracking is hard,
numberOfLoadedImages * averageImageSize
might be a pretty good approximation.
As that number approaches some threshold, start preloading the currently visible content into the second iframe.
Flip the visibility of the iframes so the second one becomes active.
Clear the inner content of the first frame.
Repeat the whole procedure as necessary.
I don't know for sure whether this would work for you, but I hope the WebKit engine on the iPad reclaims the memory of each frame independently.
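A rough sketch of the double-buffering idea (the frame ids, average size, and threshold are all made up; the real bookkeeping depends on your app):

    // Only one iframe is visible at a time.
    var active = document.getElementById('frame-a');   // hypothetical ids
    var standby = document.getElementById('frame-b');
    var loadedImages = 0;
    var AVG_IMAGE_SIZE = 500 * 1024;                   // assumed average, in bytes
    var THRESHOLD = 100 * 1024 * 1024;                 // assumed memory budget

    function imageLoaded() {                           // call from each image's onload
        loadedImages++;
        if (loadedImages * AVG_IMAGE_SIZE > THRESHOLD) {
            loadedImages = 0;
            standby.onload = swapFrames;
            standby.src = active.src;                  // preload content into hidden frame
        }
    }

    function swapFrames() {
        standby.style.display = 'block';
        active.style.display = 'none';
        active.src = 'about:blank';                    // let the old frame's memory go
        var tmp = active; active = standby; standby = tmp;
    }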
EDIT: It turned out you're writing a game.
If it's a game, I assume you want to have many game objects on the screen at the same time, and you won't be able to simply unload some of them. Here are some suggestions for that case:
Don't use DOM for games: it's too memory-heavy. Fortunately, you're using canvas already.
Sprite your images. Image sprites don't just reduce the number of requests; they also reduce the number of Image objects and keep the per-file overhead lower. Read about using sprites for canvas animations on the IE blog (see the sketch after this list).
Optimize your images. There are several file-size optimizers for images; Smush.it is one of them. Try it on your images. Also pay attention to the other techniques discussed in this great series by Stoyan Stefanov on the YUI blog.
Try vector graphics. SVG is awesome and canvg can draw it on top of canvas.
Try simplifying your game world. Maybe some background objects don't need to be that detailed. Or maybe you can get away with fewer sprites for them. Or you can use image filters and masks for different objects of the same group. As Dave Newton said, the iPad is a very constrained device, and chances are you can get away with relatively low-quality sprites.
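Here is a minimal sketch of drawing one frame from a sprite sheet onto a canvas (the canvas id, file name, and frame geometry are all made up):

    var ctx = document.getElementById('game').getContext('2d'); // hypothetical canvas
    var sheet = new Image();
    sheet.onload = function () {
        // drawImage(img, sx, sy, sw, sh, dx, dy, dw, dh):
        // copy one 64x64 frame out of the sheet and draw it at (100, 100).
        var frame = 3, SIZE = 64;
        ctx.drawImage(sheet, frame * SIZE, 0, SIZE, SIZE, 100, 100, SIZE, SIZE);
    };
    sheet.src = 'sprites.png'; // hypothetical sprite sheet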
These were all suggestions for reducing the amount of data you have to load. Here are some other suggestions that might work for you.
Preload images that you will need and unload images that you no longer need. If your game has "levels" or "missions", load only the sprites needed for the current one.
Try loading "popular" images first and download the remaining ones in the background. You can use a separate <iframe> for that, so your main game loop won't be interrupted by downloads, and use cross-frame messaging to coordinate your downloader frame.
You can store the most popular images in localStorage, the Application Cache, and WebSQL. They can provide you with 5 MB of storage each; that's 15 MB of persistent cache for you. Note that you can use typed arrays for localStorage and WebSQL. Also keep in mind that the Application Cache is quite hard to work with.
Try packaging your game as a PhoneGap application. This way you can save your users from downloading a huge amount of data before playing the game. 200 MB as a single download just to open a page is way too much; most people won't even bother to wait for that.
Other than that, your initial suggestion to override the cache with your own images is actually valid. Just don't do it straight away; explore the possibilities for reducing your game's download size first.
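If you do end up overriding the cache that way, a minimal sketch of the trick from the question (the URLs, count, and delay are all made up):

    // Load throwaway images to push old entries out of the cache,
    // then drop the references so the memory can be reclaimed.
    var flushers = [];
    for (var i = 0; i < 50; i++) {
        var img = new Image();
        img.src = 'flush/' + i + '.jpg';               // hypothetical filler images
        flushers.push(img);
    }
    setTimeout(function () {
        for (var j = 0; j < flushers.length; j++) {
            flushers[j].src = '';                      // the src = "" trick from above
        }
        flushers.length = 0;
    }, 5000);                                          // arbitrary: let them finish loading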
I managed to reduce the impact by setting all the images that aren't currently in the viewport to display: none. This was with background images, though, and I haven't tested with over 100 MB of images, so I can't say whether it truly helps. But it's definitely worth trying.
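A minimal sketch of that idea (the wrapper class is made up, and a real version should throttle the scroll handler):

    // Each image sits in a fixed-size wrapper that stays in the layout, so we
    // can still measure positions while the image itself is display:none.
    // This assumes a simple layout where offsetTop is measured from the page.
    function toggleOffscreenImages() {
        var viewTop = window.pageYOffset;
        var viewBottom = viewTop + window.innerHeight;
        var slots = document.querySelectorAll('.image-slot'); // hypothetical class
        for (var i = 0; i < slots.length; i++) {
            var slot = slots[i];
            var visible = slot.offsetTop < viewBottom &&
                          slot.offsetTop + slot.offsetHeight > viewTop;
            slot.firstElementChild.style.display = visible ? '' : 'none';
        }
    }
    window.addEventListener('scroll', toggleOffscreenImages, false);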
I have 8 MB of PNG files, but the problem is that I can't compress them any further; I've tried every PNG compressor. Maybe there is also something in the code? The site takes forever to load. Is there a way I can compress the JavaScript or HTML, or something like that?
Just take a look at the source code; maybe there is something there I can do to speed it up?
There are like a zillion HTTP requests on your page. Try reducing the number of requests by using sprite sheets and by inlining CSS and JavaScript.
I see you are also using images for your menu? Try using a font for that, with the @font-face directive. (This will also prevent the flash you are seeing when you hover over the menu items.)
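A minimal @font-face sketch (the font name, file path, and selector are made up):

    @font-face {
        font-family: 'MenuFont';                       /* hypothetical name */
        src: url('fonts/menufont.woff') format('woff');
    }
    .menu li a {
        font-family: 'MenuFont', sans-serif;           /* hypothetical selector */
    }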
And try to use JPEGs for the slideshow (as they allow much better compression for photographs in general). Use a PNG for the frame border to allow for transparency. This would also allow much more flexibility, since you wouldn't have to manually add the frame border in Photoshop, should there be more photos to add to the slideshow.
There's a ton of other stuff you could do to improve your speed. Try to follow some of the best practices of the modern web industry.
Some useful resources for you:
Move the Web Forward
HTML5 Boilerplate
Did you use Photoshop to create your PNGs? If so, did you use File > Save for Web to save the images? That will reduce the size of your PNGs by a ton.
Try running the YSlow addon in Firefox.
YSlow analyses web pages and explains why they're slow, based on Yahoo!'s rules for high-performance web sites.
https://addons.mozilla.org/en-US/firefox/addon/yslow/
Things to do to make your page load faster:
Put all <script> tags at the end of the body; this ensures that the initial CSS and HTML are rendered before any JavaScript is loaded.
Convert your images to JPG. Since your gallery images have no transparency, there is no need for PNG.
Resize your gallery images; they are bigger than needed.
Put smaller icons, menu items and other graphics into sprites
Use @font-face instead of serving text as images
Use CSS gradients instead of images of gradients (see the sketch after this list)
Compress your JavaScript using tools like UglifyJS
Install Page Speed for Chrome and Firefox; it measures and rates your page's speed and gives you suggestions on how to improve it.
http://code.google.com/speed/page-speed/
N.B.: You must enable experimental extension APIs in Chrome, and install Firebug for Firefox.
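For the CSS gradients point in the list above, a minimal sketch (the selector and colors are made up; the prefixed forms cover older browsers):

    .header {
        background: #3a6ea5;                           /* fallback for old browsers */
        background: -webkit-linear-gradient(top, #3a6ea5, #1e3c5a);
        background: -moz-linear-gradient(top, #3a6ea5, #1e3c5a);
        background: linear-gradient(to bottom, #3a6ea5, #1e3c5a);
    }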
Enjoy!
I need to dynamically load and put on screen a huge number of images; it can be something like 1000-3000. Usually these pictures are of different sizes, and I'm getting their URLs from the user. So some of these pictures can be 1024x800 pixels, and others 10x40.
I wrote a good JS script that shows them nicely on one page (à la Google Image Search), but they are still very heavy on RAM (a hundred 500 KB images on one page is not good), so I thought about actually resizing the images: making an image that's 1000x800 pixels into something like 100x80, and then forgetting (freeing the RAM of) the original.
Can this be done using JavaScript (without server side processing)?
I would suggest a different approach: Use pagination.
Display, say, 15 images. Then the user clicks 'next page' and the next page is shown.
Or, even better, you can script it so that when the user reaches the end of the page, the next page is loaded automatically.
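A minimal sketch of that (loadNextPage is a hypothetical function of yours that fetches and appends the next batch):

    var loading = false;
    window.addEventListener('scroll', function () {
        var scrolled = window.pageYOffset + window.innerHeight;
        var nearBottom = scrolled >= document.body.offsetHeight - 200; // 200px margin
        if (nearBottom && !loading) {
            loading = true;                            // simple re-entrancy guard
            loadNextPage(function () {                 // hypothetical, takes a callback
                loading = false;
            });
        }
    }, false);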
If such a thing is not what you want to do, and you want a collage of images instead, maybe you can look into CSS3 transforms. I think they should be fast.
What you want to do is take some pressure off the client so that it can handle all the images. Letting it resize all the images (JavaScript is client-side) would do exactly the opposite, because actually resizing an image is usually far more expensive than just displaying it (and not possible with browser JS anyway).
Usually there is a better solution than displaying that many items at once. One is dynamic loading, e.g. when the user scrolls down the page (like the new Facebook profiles do); another is pagination. I can't imagine that all 1,000-3,000 images need to be visible at once.
There is no native JS way of doing this. You might be able to hack something together using Flash, but you really should resize the images on the server, because:
You will save on bandwidth transferring those large 500K images to the client.
The client will be able to cache those images.
You'll get a faster loading page.
You'll be able to fit a lot more thumbnail images in memory and therefore will require less pagination.
I'm (pretty) sure it can be done in browsers that support canvas. If this is a path you would like to take, you should start here.
I see two possible problems with the canvas approach:
It will probably take a really long time (relatively speaking) to resize many images. Because of this, you're probably going to have to look into utilizing web workers.
Will the browser actually free up any memory if you remove the image from the DOM and/or delete/null all references to those images? I don't know.
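A minimal sketch of the canvas downscale (assuming same-origin images, since cross-origin images taint the canvas and block toDataURL; the URL is made up):

    function makeThumbnail(img, maxWidth) {
        var scale = maxWidth / img.width;              // e.g. 1000px wide -> 100px
        var canvas = document.createElement('canvas');
        canvas.width = Math.round(img.width * scale);
        canvas.height = Math.round(img.height * scale);
        canvas.getContext('2d').drawImage(img, 0, 0, canvas.width, canvas.height);
        return canvas.toDataURL('image/jpeg');         // small, self-contained thumbnail
    }

    var img = new Image();
    img.onload = function () {
        var thumb = new Image();
        thumb.src = makeThumbnail(img, 100);
        document.body.appendChild(thumb);
        img = null; // drop the reference and hope the browser reclaims the memory
    };
    img.src = 'big-picture.jpg'; // hypothetical same-origin URL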
And some pretty pictures of a canvas-resized image: [screenshots not preserved]