I am a full-time developer, but I'm building a site for my photography hobby. I don't want people to download my images, and besides the usual measures (disabling right click, blocking hotlinks to my images, etc.) I was thinking about a solution that would work 99% of the time.
The idea was to render the images in a canvas, or to load them as binary files, or something similar.
How would performance compare to standard src linking?
And are there better solutions to my problem?
If a picture is displayed on someone's screen, there is no way to prevent them from saving it to their computer (even if you disable everything).
Trying to obfuscate the images will only cost you time and performance, and could make your website much less user-friendly.
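To make the point concrete: even if you paint the image into a canvas instead of using an img tag, anyone can pull it straight back out from the developer console. A minimal sketch (the path is made up, and this assumes the image is same-origin, otherwise the canvas is tainted):

// Sketch: the image is drawn into a canvas instead of an <img> tag.
var img = new Image();
img.src = '/photos/sunset.jpg'; // made-up path, served from your own domain
img.onload = function () {
  var canvas = document.createElement('canvas');
  canvas.width = img.width;
  canvas.height = img.height;
  canvas.getContext('2d').drawImage(img, 0, 0);
  document.body.appendChild(canvas);

  // ...and anyone can still recover the file from the console in one line:
  var dataUrl = canvas.toDataURL('image/png'); // paste into the address bar, save as PNG
  console.log(dataUrl.substring(0, 50) + '...');
};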
I have the standard "responsive image serving" problem, but with some complex twists. I expect I'll need to build my own solution to the below, but it's a few months down the line so I thought I'd bring this by the community now for help with my approach and getting started. I also think the solution I'm looking for would have pretty wide appeal, so this could be valuable to the community as a whole.
The problem:
We'd like to provide users with images, embedded videos, etc (anything that takes a lot of time/bandwidth to load and takes less when lower res) but change the loaded dimensions depending on the size the element is actually allocated on the page. This is basic "responsive image serving" applied to a few other types of assets (though since we provide lower-bandwidth file versions to mobile devices, I think this also falls under "adaptive design"). But don't worry about other types of content for now, let's focus on images.
We need to determine the appropriate max-width for each specific asset placement, for each screen width breakpoint, without providing this info as configuration.
I'm creating a platform that will serve pages relying on HTML templates from many different parties. Images can be placed anywhere on the page, and pages can use any styling system they want, so we have no idea what the appropriate size for an image is just by looking at screen width. We need to actually evaluate the max width of the placement at each supported sizing breakpoint. Sure, this could be done manually in advance given a design template, but let's assume that's too much work for these 3rd parties.
For example, in Twitter Bootstrap 3 an image contained in a col-md-8 should be at most 720px wide when browser width is < 768, but if it was in a col-sm-8 it should be smaller than 470px. And if we're using a different framework altogether these would clearly be different too. I need a solution that automatically takes into account everything the CSS is doing, because I have no idea what the CSS will do.
We can't do any processing during the image request. We rely on a CDN (Cloudfront). They are not going to implement our custom code on each of their edge locations, and I don't want a visitor in New Delhi or Berlin to have to send yet another request halfway around the globe, for every sized asset, before they know what the final url is. So that rules out solutions like this controller-based solution and the PHP adaptive-images script.
We need this to be fast. There's a good amount of wiggle-room on the server side, since caching is so easy and flexible with Rails 3 & 4. But we probably can't use jQuery.width() on every element for performance reasons. After all, the entire reason we're serving responsive images is to decrease perceived page load time. But we do have access to jQuery in general, and we could probably load up Modernizr all the time if we needed to (currently only included for low IE with conditional HTML).
We don't trust User-Agent headers enough to base our browser width on them. I love the idea behind mobvious 1, 2 and its friend responsive-images, but there are SO many versions of browsers on SO many different devices out there. How complex would it be to build a truly reliable system to determine browser width based on this, as opposed to directly calculating it using JS?
Clients without javascript (and thus crawlers) will need access to an image. Easiest solution here seems to be to include a <noscript>....</noscript> with the canonical, largest version of the image inside.
The solution
It seems like the only way to do this is to:
Have the server pass all the available sizes, then calculate the width of each element on the client side using jQuery in some performance-efficient way (maybe using $.css_width() or some sort of specialized script). So the server would create:
<span data-respv-img-id="picture_of_unicorns"></span>
<noscript data-respv-img-id="picture_of_unicorns" data-img-720-url="//cdn.example.com/assets/picture_of_unicorns_720x480" data-img-320-url="//cdn.example.com/assets/picture_of_unicorns_320x260" data-img-120-url="//cdn.example.com/assets/picture_of_unicorns_120x80">
<img src="//cdn.example.com/assets/picture_of_unicorns_720x480" alt="Magical unicorns">
</noscript>
And if we're on a small screen and only the 120 fits, the JS would turn this into:
<span data-respv-img-id="picture_of_unicorns">
<img src="//cdn.example.com/assets/picture_of_unicorns_120x80" alt="Magical unicorns">
</span>
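To make the first option concrete, here is a rough sketch of that client-side pass (jQuery, since we have it anyway; the breakpoint list and alt text come from the example above, and measuring the placement via the span's parent is an assumption):

// Sketch only: pick the smallest candidate that still fills the placement.
var SIZES = [120, 320, 720]; // ascending; must match the data-img-*-url attributes

$('span[data-respv-img-id]').each(function () {
  var $placeholder = $(this);
  var id = $placeholder.attr('data-respv-img-id');
  var $source = $('noscript[data-respv-img-id="' + id + '"]');
  var slot = $placeholder.parent().width(); // assumed: the parent defines the placement width

  var size = SIZES[SIZES.length - 1]; // fall back to the largest we have
  for (var i = 0; i < SIZES.length; i++) {
    if (SIZES[i] >= slot) { size = SIZES[i]; break; }
  }

  // Alt text is hard-coded here; in practice you'd carry it in a data attribute too.
  $placeholder.append($('<img>', {
    src: $source.attr('data-img-' + size + '-url'),
    alt: 'Magical unicorns'
  }));
});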
OR have the server do some sort of pre-processing, so it knows exactly what size image fits each placement on each browser width, and delivers:
<span data-respv-img-id="picture_of_unicorns"></span>
<noscript data-respv-img-id="picture_of_unicorns" data-img-1200-url="//cdn.example.com/assets/picture_of_unicorns_720x480" data-img-1024-url="//cdn.example.com/assets/picture_of_unicorns_320x260" data-img-768-url="//cdn.example.com/assets/picture_of_unicorns_120x80">
<img src="//cdn.example.com/assets/picture_of_unicorns_720x480" alt="Magical unicorns">
</noscript>
And we end up with the same thing as the other approach. But this time jQuery's job was much easier, as we passed all the sizing work off to the server. But this requires loading up a full browser stack on the server side to generate each page. That's OK with caching, but it sure does bring a lot of complexity along.
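For what it's worth, a rough sketch of what that pre-processing step could look like, using PhantomJS purely as an example of "a full browser stack on the server" (the breakpoint list, selectors, and output format are all assumptions):

// measure_placements.js -- run with: phantomjs measure_placements.js http://example.com/page
var page = require('webpage').create();
var url = require('system').args[1];
var breakpoints = [1200, 1024, 768, 480]; // assumed supported widths

function measure(i, results) {
  if (i >= breakpoints.length) {
    console.log(JSON.stringify(results)); // placement widths, keyed by breakpoint then id
    return phantom.exit();
  }
  page.viewportSize = { width: breakpoints[i], height: 800 };
  page.open(url, function () {
    results[breakpoints[i]] = page.evaluate(function () {
      var out = {};
      var spans = document.querySelectorAll('span[data-respv-img-id]');
      for (var j = 0; j < spans.length; j++) {
        out[spans[j].getAttribute('data-respv-img-id')] = spans[j].parentNode.offsetWidth;
      }
      return out;
    });
    measure(i + 1, results);
  });
}
measure(0, {});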
Note that both of these solutions would allow for scroll-based image loading, which is another aspect I'll need to implement, but not something we need to discuss now.
Long story short: Which approach would you recommend? Can you think of a better way?
I have an existing website (a photo blog) that loads the majority of the photos from Flickr. I'd like to enhance the experience for users with high resolution screens and load higher res versions of photos, but I'm not sure what would be the best way to go.
Since the images in question are not UI elements, there is no sensible way to solve this problem with CSS. That leaves either client side JavaScript, or a server side find-and-replace for specific image patterns (since they come from Flickr, it's easy to detect and easy enough to figure out the url for a double-sized image).
For client side, my concern is that even the regular-sized images are 500-800 KB each and there can be 10-30 images per gallery, so this causes a lot of excess bandwidth use for retina users.
For server side, it's obviously tricky to determine if the request comes from a retina device or not. One idea I had (which I have yet to test out) was to run a JavaScript function that checks window.devicePixelRatio and sets a cookie accordingly; then on each subsequent page request the server would know whether the device is high-res or not. That leaves the entry page with non-retina images, but at least all the following ones will have high-res images loaded right away.
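Roughly what I have in mind (untested; the cookie name and lifetime are arbitrary):

// Runs on every page, as early as possible, so the server learns the pixel ratio.
(function () {
  var ratio = window.devicePixelRatio || 1;
  document.cookie = 'device_pixel_ratio=' + ratio + '; path=/; max-age=' + (60 * 60 * 24 * 30);
})();
// On later requests the server checks the cookie and rewrites the Flickr URLs
// to the larger size before rendering the page.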
Are there any pitfalls to this proposed solution? Are there better ways to handle it?
You can generate CSS on the server that links to both the regular and double-sized images in the background-image property, behind a device-pixel-ratio media query. This CSS can easily be different for every page by including it in a <style> tag and referencing classes/ids that only exist on that page. This deals with the traffic issue, since the majority of modern browsers don't download background images for media queries that don't match.
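For example, the generated CSS could look like the output of this sketch (shown as a JavaScript helper, but any server-side language works; the selector, URLs and the 1.5x cutoff are just examples):

// Sketch only: the kind of rule you'd emit into the page's <style> tag, one per photo.
function retinaRule(selector, normalUrl, doubleUrl) {
  return selector + ' { background-image: url(' + normalUrl + '); }\n' +
    '@media (-webkit-min-device-pixel-ratio: 1.5), (min-resolution: 144dpi) {\n' +
    '  ' + selector + ' { background-image: url(' + doubleUrl + '); background-size: cover; }\n' +
    '}';
}

// e.g. retinaRule('#photo-42', '/img/photo42.jpg', '/img/photo42@2x.jpg')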
Another (though worse) solution would be to not just set the cookie, but to load the images with JavaScript. This solves the first-page issue, but slows down page rendering a little.
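A minimal sketch of that JavaScript approach (the data attribute names are made up):

// Markup carries both URLs: <img data-src-1x="..." data-src-2x="...">
var retina = (window.devicePixelRatio || 1) > 1.5;
var imgs = document.querySelectorAll('img[data-src-1x]');
for (var i = 0; i < imgs.length; i++) {
  imgs[i].src = imgs[i].getAttribute(retina ? 'data-src-2x' : 'data-src-1x');
}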
Is the ability of a browser (Chrome, for example) to handle a large number of images limited only by the hardware of the computer it runs on, or also by the software itself?
I'm trying to develop an image viewer that has to contain a lot of files, which must be accessible instantly on the user's demand, and sometimes when I go over 350 files of 300 KB each, the page freezes.
Thank you all for your help!!
It is probably limited by both.
There are limits to how large a struct is allowed to be, for instance, because it has to fit in a fixed number of bytes.
(see this question for instance)
(also, you're not yet running into the max size of an int so that is probably not what is happening right now.)
Besides this there are several more constraints.
That said, loading a gigantic amount of images every time you open a page is probably not a good idea.
Take some inspiration from how others (like Google in their image search) have solved this problem, simply by not loading the images until they are needed.
I think the best approach would be to have a small thumbnail of each image and, on the user's request (a click), load the bigger one, like Timothy said.
If you need it to be faster you can preload images: for example, if you have a list of images and the user scrolls through it, you just load the next n images. To free space you can "delete" the ones the user has already passed.
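A bare-bones sketch of the thumbnail/click idea (the markup and URL scheme are assumptions):

// <img class="thumb" src="thumbs/photo1.jpg" data-full="full/photo1.jpg">  (made-up scheme)
document.addEventListener('click', function (e) {
  var el = e.target;
  if (el.className === 'thumb') {
    var full = new Image();
    full.src = el.getAttribute('data-full'); // the big file is only requested now
    full.className = 'full';
    el.parentNode.replaceChild(full, el);
  }
});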
I was wondering if there are any other ways to compress my images, or any script that would make the page load faster or load the images behind the scenes?
The site is very interactive and uses very high-quality layers of images for the main layout. I have already used Save for Web & Devices in Photoshop and re-compressed with ImageOptim; some are JPEGs, but the majority are PNG-24 to maintain transparency, and they are all set in CSS.
I have used JPEGs and CSS sprites where I can, but there is one image in particular, a tree illustration stretching the full length of the site, that is really slowing down the loading time. Is there any way I could compress these images further, or code them differently, that I've missed?
Any help would be great thanks!
You said you are spriting. That is good.
You can also use tools such as pngcrush, which attempt to make files smaller by dropping things such as metadata.
You should also send far-future expiry headers and use a cache breaker on your images, to ensure the images won't be downloaded again unnecessarily.
In Photoshop, choose File -> Save for Web; you will be able to find the best compromise between size and quality.
Do you really need the transparency there? PNG transparency is unsupported in some browsers and makes page processing intensive and slow even on high-end computers, depending on image size and the number of layers. If you can show something of your site, maybe someone can give more hints about how to optimize it.
You can compress them on the fly with Apache if that's your web server. One of many available articles on the subject: http://www.samaxes.com/2009/01/more-on-compressing-and-caching-your-site-with-htaccess/
I need to dynamically load and put on screen a huge number of images, something like 1000-3000. Usually these pictures are of different sizes, and I'm getting their URLs from the user. So some of these pictures can be 1024x800 or 10x40 pixels.
I wrote a good JS script that shows them nicely on one page (in a Google Image Search style), but they are still very heavy on RAM (a hundred 500 KB images on one page is not good), so I thought about actually resizing the images: making an image that's 1000x800 pixels something like 100x80, and then forgetting (freeing the RAM used by) the original.
Can this be done using JavaScript (without server side processing)?
I would suggest a different approach: Use pagination.
Display, say, 15 images. Then the user clicks on 'next page' and the next page is shown.
Or, even better, you can script it so that when the user reaches the end of the page, the next page is loaded automatically.
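A rough sketch of that scroll trigger, assuming jQuery is available (loadNextPage() is hypothetical and stands in for whatever appends the next batch of images):

var loading = false;
$(window).on('scroll', function () {
  var nearBottom = $(window).scrollTop() + $(window).height() > $(document).height() - 200;
  if (nearBottom && !loading) {
    loading = true;
    // loadNextPage() appends the next batch, then calls back so we can trigger again.
    loadNextPage(function () { loading = false; });
  }
});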
If that is not what you want to do, and you'd rather have a collage of images, then maybe you can look at CSS3 transforms. I think they should be fast.
What you want to do is take some pressure off the client so that it can handle all the images. Having it resize all the images (JavaScript is client-side) will do exactly the opposite, because actually resizing an image is usually far more expensive than just displaying it (and not possible with browser JS anyway).
There is usually a better solution than displaying that many items at once. One would be dynamic loading, e.g. when the user scrolls down the page (like the new Facebook profiles do), or pagination. I can't imagine that all 1k-3k images will be visible at once.
There is no native JS way of doing this. You may be able to hack something using Flash but you really should resize the images on the server because:
You will save on bandwidth transferring those large 500K images to the client.
The client will be able to cache those images.
You'll get a faster loading page.
You'll be able to fit a lot more thumbnail images in memory and therefore will require less pagination.
more...
I'm (pretty) sure it can be done in browsers that support canvas. If this is a path you would like to take, you should start here.
I see two possible problems with the canvas approach:
It will probably take a really long time (relatively speaking) to resize many images. Because of this, you're probably going to have to look into utilizing web workers.
Will the browser actually free up any memory if you remove the image from the DOM and/or delete/null all references to those images? I don't know.
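For reference, the canvas downscale itself is short; a sketch (the function name is made up, and it only works for same-origin images, otherwise toDataURL throws on the tainted canvas):

// Downscale one already-loaded <img> element to a small data URI.
function downscale(img, maxWidth) {
  var scale = Math.min(1, maxWidth / img.naturalWidth);
  var canvas = document.createElement('canvas');
  canvas.width = Math.round(img.naturalWidth * scale);
  canvas.height = Math.round(img.naturalHeight * scale);
  canvas.getContext('2d').drawImage(img, 0, 0, canvas.width, canvas.height);
  return canvas.toDataURL('image/jpeg', 0.7);
}
// Whether the browser really releases the original's memory after you drop all
// references to it is exactly the open question from point 2 above.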