Given a bunch of images of different sizes and ratios, but all rectangular, is there an easy way in JavaScript/CSS to assemble them so they fill the whole web page? Such as:
Given the following constraints:
Images can be resized down independently in the process, but not up.
The resizing down should have a size limit (so images remain visible).
Images can be reordered at will to achieve the best packing.
Ratio must be preserved.
Are there generic/formal terms to describe this process so I can better search for solutions?
I guess there is no easy, ready-to-use way to do this, like a JS library. But is there an algorithm that handles this operation, from a geometric point of view, of assembling rectangles to fill a bigger rectangle?
Have you thought about aligning the images in columns and then using CSS media queries to reorganize them according to the size of the web page?
Something like this: https://css-tricks.com/seamless-responsive-photo-grid/
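For reference, the core of that column approach is small enough to sketch here; the #photos container id and the breakpoints are just assumptions:

```css
/* Columns let images of any height stack into a seamless grid;
   media queries change the column count as the page gets narrower. */
#photos { column-count: 4; column-gap: 0; }
#photos img { width: 100%; display: block; }

@media (max-width: 1000px) { #photos { column-count: 3; } }
@media (max-width: 800px)  { #photos { column-count: 2; } }
@media (max-width: 400px)  { #photos { column-count: 1; } }
```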
I'm working on a JS/HTML game where all objects are HTML elements (mostly <img>). Their top and left CSS attributes (sometimes along with others, like margins, transformations, etc.) are dynamically modified by JS code (every frame, basically). I noticed a huge performance improvement when I switched from using .png files to .gif (22 fps -> 35 fps), but still:
Can a further reduction of the files' size (by 10-30%) actually noticeably improve CSS transformation performance? I would just test it, but I'm talking about ~250 GIF files, and I don't want to lose too much quality either.
Reducing the amount of data (= file size) that has to be processed will normally have a positive performance impact, but how much impact it has always needs to be tested and measured, as it depends on how well your code works together with the surrounding frameworks, the browser, and even how the underlying hardware does its calculations.
In the worst case, where you have to do a lot of resizing and recalculate new pixels, especially if you have to increase the display size of your image again, you might even end up with a loss in performance.
There is no absolute answer to how good or bad your change will be until you have tested and measured it in your system yourself.
About Reducing GIF File Size:
A positive factor here is that when you cut the width and height of the image in half, the actual file size can be reduced by around 75%.
When using GIF images, reducing the number of different colors used in the GIF will reduce the size as well. Often you can select the number of colors when exporting GIFs from image editor tools. In Photoshop there is an "Export for Web" option which tries to set the optimal settings for usage in web pages.
Regarding Game Performance and Images:
If you already know how you will resize the images (fixed sizes like +50% and +100%), using CSS image sprites with pre-resized copies can bring a good improvement, as you won't have to resize the image at runtime but just use an existing part of the sprite sheet. This reduces the amount of image-manipulation work and therefore increases performance.
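A minimal sketch of that idea, assuming a hypothetical sprite sheet ship-sprites.png that contains the same ship pre-rendered at 32px, 48px (+50%) and 64px (+100%), laid out side by side:

```css
/* Each class shows a different pre-rendered size; switching the class swaps
   the size in without any runtime resampling. */
.ship    { background: url("ship-sprites.png") no-repeat; }
.ship-32 { width: 32px; height: 32px; background-position:    0    0; }
.ship-48 { width: 48px; height: 48px; background-position: -32px   0; }
.ship-64 { width: 64px; height: 64px; background-position: -80px   0; }
```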
If you want performance, quality and small file size all at once, the best way to go is to replace as much raster graphics as possible with vector graphics. Most modern 2D games use a lot of vector graphics.
Vector graphics are easier to resize, as only the splines have to be recalculated and filled; resizing raster graphics requires a calculation for each new pixel.
All common modern browsers support the SVG format, and SVG images can be sized through CSS. If you want to try it out, a free open-source tool for creating vector graphics is Inkscape.
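For example (the file name is made up), an SVG can be dropped in like any other image and scaled purely through CSS without quality loss:

```html
<img class="enemy" src="enemy.svg" alt="enemy">
<style>
  .enemy       { width: 64px;  height: 64px;  }  /* base size */
  .enemy.large { width: 128px; height: 128px; }  /* doubled, still crisp */
</style>
```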
Hope this helps.
Why don't you use JPEG? It's compact. Yes, image sizes affect performance.
Also, check out "CSS image sprites": you can have a single image with all your possible icons/images and control what is shown using CSS. You'll get better performance because you'll be loading a single image.
I have a website which contains some images in index.php.
The problem I am facing is that the whole page does not load at once; I think the images are taking some time to load.
So what I have done is show a loading image at first and then, after some time, show the page, which works around the problem. But I am curious to know: is there a better way to do this?
I prefer to optimise the hell out of my images.
PNG images
You can use pngcrush to optimise your PNG files for you, but personally I find that once I'm done optimising an image myself, pngcrush only succeeds in making it bigger.
Use Indexed-PNG wherever possible. This will limit you to 256 colours, and most graphics editors won't allow partial transparency in Indexed-PNG (but it is possible - you just need the right editor. I use a custom PHP script with the GD image library) but you can expect to drop file size down to just a tiny fraction of what it was.
Reduce the amount of colours overall. PNG compression works best with blocks of the same colour, so reducing the number of colours improves compression.
GIF images
Especially for animations, there's a lot of things you can do.
Reduce the number of frames. Avoid duplicate frames at all costs, and just set the previous frame to have a longer display time.
Use combine rather than replace if possible. You will again have problems with transparent areas, but by using combine you can have each subsequent frame only change the stuff that... changes. This avoids the redundancy of re-writing the entire image if only a small part changes. GIMP has a useful filter "Animation > Optimize for GIF" which will do this for you.
Reduce colours as much as possible. GIF is limited to 256 colours, but if you can limit yourself to 32 or so, you'll get a much smaller file.
Using the above techniques, I once managed to shove 8MB of raw image data into a 125kb animated GIF.
JPG images
JPG is great for photos, but cameras have a tendency to write MASSIVE files.
Play around with the compression factor. Start at around 40%, and slowly bring it up until it looks acceptable. GIMP will show you a preview and the resulting filesize, so make use of that to find an acceptable compromise.
Scale the image down. You don't need 9 megapixels, or whatever massive resolution cameras capture these days...
The above should help you reduce the amount of size taken by your images. Obviously, you should also cache images appropriately, so they only need to be retrieved once. Also make sure that you specify width and height on image elements so that the browser can reserve the space for them and avoid jumping around as they load...
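For instance (the file name and dimensions are just placeholders):

```html
<!-- width/height let the browser reserve the box before the file arrives,
     so the surrounding content doesn't jump once the image loads -->
<img src="photos/header.jpg" width="640" height="480" alt="Header photo">
```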
And you should be pretty good.
It's hard to say what other options are available without knowing what the page looks like, but one option is to reserve space for the images so that the page text renders quickly in the correct position, and the images then load later.
I have several thousand images on disk and need to display a subset of them, given user search criteria. This means I could be showing hundreds at one time. Each image is large: 3510x1131, to be exact. And I need to do some light image processing before they are displayed (adjusting brightness and contrast).
My application to do this is a Web.API app using jQuery.
My first pass was to just pass the URLs to the browser and then size the images and make those adjustments using Pixastic. It works, but it's pretty slow.
I was thinking it would be much faster if I could resize the images before doing the adjustments, but I don't want to create copies and then link to those. Maybe there is a way to generate the images on the fly and serve them up? Via REST perhaps? Bottom line is I need a LOSSY image resize, not just setting width and height using CSS.
Are there tools out there that I am missing, or has anyone done something similar? I think what I'm looking for is exactly like Google image search results, but I don't know how they generate those thumbnails, and if I were to adjust brightness/contrast, should I do it on the thumbnail (saving time) or the full-size image?
I need to dynamically load and put on screen a huge number of images; it can be something like 1000–3000. Usually these pictures are of different sizes, and I'm getting their URLs from the user. So some of these pictures can be 1024x800 or 10x40 pixels.
I wrote a good JS script that shows them nicely on one page (à la Google Image Search), but they are still very heavy on RAM (a hundred 500K images on one page is not good), so I thought about actually resizing the images: making an image that is 1000x800 pixels into something like 100x80, and then forgetting (freeing the RAM of) the original one.
Can this be done using JavaScript (without server side processing)?
I would suggest a different approach: Use pagination.
Display, say, 15 images. Then the user clicks on 'next page' and the next page is shown.
Or, even better, you can script it so that when the user reaches the end of the page, the next page is loaded automatically.
If that is not what you want to do, and you would rather make a collage of the images, then maybe you can check out CSS3 transforms. I think they should be fast.
What you want to do is take some pressure off the client so that it can handle all the images. Letting it resize all the images (JavaScript is client-side) will do exactly the opposite, because actually resizing an image is usually way more expensive than just displaying it (and not possible with browser JS anyway).
Usually there is a better solution than displaying that many items at once. One would be dynamic loading, e.g. when a user scrolls down the page (like the new Facebook profiles do), or pagination. I can't imagine that all 1k - 3k images will be visible all at once.
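A minimal sketch of that dynamic-loading idea; the #gallery container and the /api/images endpoint are hypothetical placeholders for however you fetch the next batch of URLs:

```javascript
// Load the next page of images when the user nears the bottom of the document.
let nextPage = 1;
let loading = false;

function fetchNextImageUrls() {
  // hypothetical server call returning a JSON array of ~15 image URLs
  return fetch('/api/images?page=' + nextPage++).then(r => r.json());
}

function appendImages(urls) {
  const gallery = document.getElementById('gallery');  // assumed container element
  urls.forEach(url => {
    const img = document.createElement('img');
    img.src = url;
    gallery.appendChild(img);
  });
}

window.addEventListener('scroll', () => {
  const nearBottom =
    window.innerHeight + window.scrollY >= document.body.offsetHeight - 200;
  if (nearBottom && !loading) {
    loading = true;
    fetchNextImageUrls()
      .then(appendImages)
      .finally(() => { loading = false; });
  }
});
```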
There is no native JS way of doing this. You may be able to hack something using Flash but you really should resize the images on the server because:
You will save on bandwidth transferring those large 500K images to the client.
The client will be able to cache those images.
You'll get a faster loading page.
You'll be able to fit a lot more thumbnail images in memory and therefore will require less pagination.
I'm (pretty) sure it can be done in browsers that support canvas. If this is a path you would like to take you should start here.
I see two possible problems with the canvas approach:
It will probably take a really long time (relatively speaking) to resize many images. Because of this, you're probably going to have to look into utilizing web workers.
Will the browser actually free up any memory if you remove the image from the DOM and/or delete/null all references to those images? I don't know.
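For completeness, here is a minimal sketch of the canvas approach (the image URL and target width are placeholders); whether the browser actually releases the original bitmap afterwards is exactly the second concern above:

```javascript
// Draw the large image into a small canvas, show the small canvas instead,
// and drop references to the original so it *can* be garbage collected.
function makeThumbnail(img, targetWidth) {
  const scale = targetWidth / img.naturalWidth;
  const canvas = document.createElement('canvas');
  canvas.width = targetWidth;
  canvas.height = Math.round(img.naturalHeight * scale);
  canvas.getContext('2d').drawImage(img, 0, 0, canvas.width, canvas.height);
  return canvas;
}

const img = new Image();
img.onload = function () {
  document.body.appendChild(makeThumbnail(img, 100));  // e.g. 1000x800 -> 100x80
  img.src = '';  // hint that the full-size bitmap is no longer needed
};
img.src = 'large-photo.jpg';  // placeholder URL supplied by the user
```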
I want to take an image on a webpage and then use JavaScript (or whatever would be best suited) to dynamically 'pixelate' it (e.g. into 20px squares). Then, as the user scrolls down the page, I need the image to gradually increase in resolution until it is no longer pixelated.
Any ideas how I could go about doing this? I realise I could use PHP to resize the image several times and just switch out the images, but that would require loading several extra images. Also, I know I could probably do the effect with Flash & Pixel Bender, but I want to achieve it within the limitations of HTML5, CSS & JavaScript if possible.
Appreciate any thoughts!
Update: Something like this, but with Javascript instead of Flash? http://www.reflektions.com/miniml/template_permalink.asp?id=390
You could render the picture in a hidden <canvas> element, then use a derivation of the technique described at http://dev.opera.com/articles/view/html-5-canvas-the-basics/#pixelbasedmanipulation to create a pixelated version of the image in a second <canvas> element using ever-decreasing fillRects. This way you even buffer the original image data.
Edit: I would use two <canvas> elements so that you only have to fetch and draw the original image once. Perhaps you could buffer/cache the image in the same <canvas> element by drawing it outside of the viewport, but I am not sure if that is possible.
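A minimal sketch of that two-canvas idea, using a scaled-down buffer and non-smoothed upscaling in place of manual fillRect calls; the image URL, the canvas id and the block sizes are assumptions:

```javascript
// The further the user has scrolled, the smaller the blocks become (20px down to 1px).
const img = new Image();
const view = document.getElementById('pixelated');  // visible <canvas> in the page
const ctx = view.getContext('2d');
const buffer = document.createElement('canvas');    // hidden work canvas
const bctx = buffer.getContext('2d');

function draw(blockSize) {
  if (!img.complete) return;  // image not loaded yet
  // 1) shrink the image so one buffer pixel covers one block
  buffer.width = Math.max(1, Math.round(img.width / blockSize));
  buffer.height = Math.max(1, Math.round(img.height / blockSize));
  bctx.drawImage(img, 0, 0, buffer.width, buffer.height);
  // 2) blow it back up without smoothing, producing the blocky look
  ctx.imageSmoothingEnabled = false;
  ctx.clearRect(0, 0, view.width, view.height);
  ctx.drawImage(buffer, 0, 0, view.width, view.height);
}

img.onload = function () {
  view.width = img.width;
  view.height = img.height;
  draw(20);  // start fully pixelated
};
img.src = 'photo.jpg';  // placeholder image URL

window.addEventListener('scroll', function () {
  const progress = window.scrollY / (document.body.scrollHeight - window.innerHeight);
  draw(Math.max(1, Math.round(20 * (1 - progress))));
});
```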
I would use a calculation where you take the width in pixels divided by the square width and the height divided by the square height. This would give you the lower resolution you're looking for.
Then you can find a way to change the resolution to that result, or grab the color of the pixel at position (height, width)/2 of each square you're looking at, and generate div tags or a table with the appropriate color and size, eventually resulting in the image itself.
I have a probably faster idea, where you keep multiple versions of the image and change their z-index or visibility as you scroll. Basically, each image would be at a different resolution. If you have to do that for too many images, this solution won't be as efficient, as there would be a lot of image editing, but you can always do a batch edit.
Let me see if I can think of more ideas, then I will edit.
Have a look at http://close-pixelate.desandro.com/
Explanation here: https://stackoverflow.com/a/8372981/22470
Not in a portable way.
That might be doable in Flash. Firefox JS extensions allow reading images as JS arrays, Base64 strings, etc. You might experiment with the "1 DIV = 1 pixel" hack, but it's hard to get any reasonable size of image at any reasonable speed. If you feel really hyper, you could try creating base64-encoded images on the fly using the data: URI... many ways, but none good...