So, I made a link to download an image from a data URL (a LARGE one):
<a download='fileName' href="data:image/png;base64,/9j/4WSsRX...">something</a>
However, whenever I click that link I get an error about some network problem.
I have made a fiddle test, but it is LARGE (15 MB of text) and will take some time to load:
https://jsfiddle.net/jjydp1ek/
As the jsfiddle is hard to load, I also uploaded a file to mediafire:
http://www.mediafire.com/download/p85y1g442ne9v6m/new++7.html
The test page is an image with the same data URL value as the link. The image is visible, but I see that the "open image in new tab" option in Chrome isn't working.
I also tried it with a canvas in IE 11 and it is failing too.
Questions:
Is it even possible to make it work with the download link as it is now?
Is there a size limit on a data URL used to download a file, and if so, what is it?
How do I make the user able to download that image?
Also, feel free to ask questions here, or correct any errors in the text if you think it isn't understandable.
Thanks.
I have a 70 Mb broadband connection and a powerhouse of a PC, and that JSFiddle won't even open.
I don't think it's feasible to have a 15 MB encoded string, since it has to be downloaded onto the page on every visit. I would try the following:
Optimise the image. You could incorporate gulp-imagemin if you have, or want to have, Gulp as a build system; I think there are alternatives for Grunt if you wanted to go that way.
Store the file on the server and just place a link to its path; this is the preferred solution.
In response to your questions
The limit:
Length limitations
Although Mozilla supports data URIs of essentially
unlimited length, browsers are not required to support any particular
maximum length of data. For example, the Opera 11 browser limits data
URIs to around 65000 characters.
Source: data URIs - MDN
Downloading
See the suggestion above on optimising the image: get it as small as you can without losing quality, if that's a concern, and try again. Otherwise, it's not a problem to give the user a link to the image or display it on the page; the user can right-click and save it.
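If you do need to keep the image inline as a data URL, one workaround (my own sketch, not part of the original answer) is to decode it into a Blob and point the download link at an object URL instead, which avoids a multi-megabyte href attribute and the data URL length limits mentioned below:

(function () {
    // Rough sketch, assuming the <a download> link's href currently holds the full
    // "data:image/...;base64,..." string. Decode the base64 payload into a Blob and
    // swap the href for a short object URL.
    function dataUrlToBlob(dataUrl) {
        var parts = dataUrl.split(',');                  // [header, base64 payload]
        var mime = parts[0].match(/data:([^;]+)/)[1];    // e.g. "image/png"
        var binary = atob(parts[1]);                     // base64 -> binary string
        var bytes = new Uint8Array(binary.length);
        for (var i = 0; i < binary.length; i++) {
            bytes[i] = binary.charCodeAt(i);
        }
        return new Blob([bytes], { type: mime });
    }

    var link = document.querySelector('a[download]');
    link.href = URL.createObjectURL(dataUrlToBlob(link.href));
})();

Note that IE 11 doesn't support the download attribute at all (it has its own msSaveBlob), so this is only a sketch of the idea for browsers that do.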
Note
By the time I finished writing this response JSFiddle timed out.
I'm currently trying to preload images for a webpage I'm creating as those images are quite big.
Currently I know (thanks to another post here) how to handle the images themselves by preloading them via JavaScript and then displaying them in a canvas.
BUT whenever I switch the page, the preloaded images need to be preloaded again, so they are not cached.
So my question is: Is there any possibility to cache these images?
(or is it even best to put them into a session variable?)
The images themselves are quite big and can take up 1.5 MB each (in total there are 20 images in the part that already exists, which takes about 4 seconds to preload).
Some info, in case it's needed:
I'm using an Apache server and PHP as the primary language, with JavaScript as support.
Edit:
As I forgot to mention it: the web server I will finally host the site on is an external one (hosting provider), so I won't be able to edit the web server settings themselves there.
If the images don't change, try something like this in .htaccess:
# Cache image files for 11 months (requires mod_expires and mod_headers)
<FilesMatch "\.(ico|gif|jpg|png)$">
    ExpiresActive On
    ExpiresDefault "access plus 11 month"
    Header append Cache-Control "public"
</FilesMatch>
If you think this is not the right approach, for example because the images may change, just eager-load the images as soon as the page loads (warning, definitely a hack):
(function () {
    // Hidden container so the preloaded images don't show up on the page
    var hiddenCache = document.createElement("div");
    hiddenCache.style.display = "none";
    document.body.appendChild(hiddenCache);

    // myEagerLoadedImageUrls is assumed to be an array of image URL strings;
    // use a plain for loop instead of forEach if you need ECMAScript 3 support
    myEagerLoadedImageUrls.forEach(function (urlStr) {
        var hiddenImg = document.createElement("img");
        hiddenImg.src = urlStr; // triggers the request; the browser caches the response
        hiddenCache.appendChild(hiddenImg);
    });
})();
The browser already caches images in its memory and/or disk cache, as long as the headers coming from the server aren't telling it to avoid caching, and the browser cache endures across page loads. So, if your images were loaded once on the first page, they should already be in the browser cache for the second page; when they are requested there, they should load locally rather than be fetched over the internet again.
If you're looking for client-side code that can be used to preload images, there are many examples:
How do you cache an image in Javascript
Image preloader javascript that supports events
Is there a way to load images to user's cache asynchronously?
FYI, it is possible in newer browsers to use a combination of Local Storage and data URIs to implement your own image caching, but I'd be surprised if there was any real-world situation where that was required, and if you have a lot of images, you may run into the storage limits on Local Storage sooner than the limits on the size of the browser cache.
I have a webapp that shows Google+ profile pics of various users.
I just came across an error saying '403 rate limit exceeded' and all the images on my webapp are broken.
Note: I noticed that after a few minutes this error goes away.
I don't want users to get broken images when they visit my webapp, so I was thinking of caching those images in HTML5 localStorage, but I'm not sure if it will help.
Suppose I have an image (Note : I have several such images of different users on my webpage)
<img src='/url_to_g+_dp'>
. . . other html . . .
<script>JavaScript</script>
and some JavaScript that saves the image to localStorage after it finishes loading. The next time the user visits the page, the browser will make a request to '/url_to_g+_dp' before the JavaScript gets a chance to check whether the image exists in localStorage. This way, even if I implement caching, it won't help me much.
Please correct me if I'm wrong.
Suggestions are welcome.
Cheers :)
Caching will help, but not the way you propose it. Don't use an img tag directly; use some other element and set DOM data attributes on it so you know what to load. Later, find all such elements and replace them with img elements, using the appropriate URL (cached or uncached).
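A minimal sketch of that idea (the class, attribute and storage-key names below are my own, not the answerer's):

<span class="lazy-avatar" data-src="/url_to_g+_dp"></span>

<script>
// For each placeholder, use the cached copy from localStorage if there is one,
// otherwise fall back to the network URL.
var placeholders = document.querySelectorAll('.lazy-avatar');
Array.prototype.forEach.call(placeholders, function (el) {
    var url = el.getAttribute('data-src');
    var cached = localStorage.getItem('avatar:' + url);   // base64 data URL or null
    var img = document.createElement('img');
    img.src = cached || url;
    el.parentNode.replaceChild(img, el);
});
</script>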
I'm not sure what the bandwidth limit is for viewing Google + profile images on an external domain but your method would at least seem to buy you some time before running into the 403 error.
You will probably want to solely use javascript to load your image data.
First give your image an id and empty out the src attribute:
<img id='profile-img' src=''>
Then always load the image with your JavaScript. (I don't believe localStorage will allow you to store binary data, so you will want to base64 encode your image before storing it):
<script>
var profile_img;
if (localStorage.getItem('profile_image')) {
    //if the image is already locally stored then use it
    profile_img = localStorage.getItem('profile_image');
}
else {
    //grab the google + profile image and store it locally
    profile_img = [base64 encoded image from google +]; //i'm not sure off of the top of my head how you would want to go about accomplishing this.
    localStorage.setItem('profile_image', profile_img);
}
document.getElementById("profile-img").src = 'data:image/png;base64,' + profile_img;
</script>
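One way to obtain that base64 string (my own sketch, not part of the answer above) is to draw the loaded image onto a canvas and call toDataURL. This only works if the image is same-origin, or is served with CORS headers and loaded with crossOrigin set; otherwise the canvas is "tainted" and toDataURL throws:

<script>
// Sketch: load the remote profile picture, convert it to base64, cache it, display it.
var img = new Image();
img.crossOrigin = 'anonymous';   // required, or toDataURL() will throw on a tainted canvas
img.onload = function () {
    var canvas = document.createElement('canvas');
    canvas.width = img.width;
    canvas.height = img.height;
    canvas.getContext('2d').drawImage(img, 0, 0);
    // toDataURL returns "data:image/png;base64,...."; strip the prefix before storing
    var base64 = canvas.toDataURL('image/png').split(',')[1];
    localStorage.setItem('profile_image', base64);
    document.getElementById('profile-img').src = 'data:image/png;base64,' + base64;
};
img.src = '/url_to_g+_dp';
</script>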
I am trying to clone an image which is generated randomly.
Although I am using the exact same URL, a different image is loaded (tested in Chrome and Firefox).
I can't change the image server so I am looking for a pure javascript/jQuery solution.
How do you force the browser to reuse the first image?
(Screenshots omitted: the Firefox and Chrome captures each show a different image loaded from the same URL.)
Try it yourself (maybe you have to reload it several times to see it)
Code:
http://jsfiddle.net/TRUbK/
$("<img/>").attr('src', img_src)
$("<div/>").css('background', background)
$("#source").clone()
Demo:
http://jsfiddle.net/TRUbK/embedded/result/
You can't change the image server if it isn't yours, but you can trivially write something on your own server to handle it for you.
First write something in your server-side language of choice (PHP, ASP.NET, whatever) that:
Hits http://a.random-image.net/handler.aspx?username=chaosdragon&randomizername=goat&random=292.3402&fromrandomrandomizer=yes and downloads it. You generate a key in one of two ways: either take a hash of the whole thing (MD5 should be fine; this isn't a security-related use, so worries that it's too weak these days don't apply), or take the size of the image. The latter could produce a few duplicates, but is faster to compute.
If the image isn't already stored, save it in a location using that key as part of its filename, and the content-type as another part (in case there's a mixture of JPEGs and PNGs)
Respond with an XML or JSON response with the URI for the next stage.
In your client-side code, you hit that URI through XmlHttpRequest to obtain the URI to use for your images. If you want a new random one, hit the first URI again; if you want the same image in two or more places, reuse the same result.
That URI hits something like http://yourserver/storedRandImage?id=XXX where XXX is the key (hash or size as decided above). The handler for that looks up the stored copies of the images, and sends the file down the response stream, with the correct content-type.
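A rough Node.js sketch of that idea (the endpoint names, cache directory and port are my own assumptions, not the answerer's design): one route downloads the random image, keys it by an MD5 hash of its bytes and returns that key as JSON; the other route serves the stored copy, so the same key always yields the same image.

// Minimal caching-proxy sketch for the random-image service described above.
const http = require('http');
const crypto = require('crypto');
const fs = require('fs');
const path = require('path');
const url = require('url');

const CACHE_DIR = path.join(__dirname, 'image-cache');
const SOURCE = 'http://a.random-image.net/handler.aspx?username=chaosdragon&randomizername=goat&random=292.3402&fromrandomrandomizer=yes';
if (!fs.existsSync(CACHE_DIR)) fs.mkdirSync(CACHE_DIR);

http.createServer((req, res) => {
    const parsed = url.parse(req.url, true);

    if (parsed.pathname === '/newRandImage') {
        // Fetch one random image, store it under its hash, and tell the client the key.
        http.get(SOURCE, (upstream) => {
            const chunks = [];
            upstream.on('data', (c) => chunks.push(c));
            upstream.on('end', () => {
                const body = Buffer.concat(chunks);
                const key = crypto.createHash('md5').update(body).digest('hex');
                const type = upstream.headers['content-type'] || 'image/jpeg';
                const file = path.join(CACHE_DIR, key + '.' + type.split('/')[1]);
                if (!fs.existsSync(file)) fs.writeFileSync(file, body);
                res.setHeader('Content-Type', 'application/json');
                res.end(JSON.stringify({ uri: '/storedRandImage?id=' + key }));
            });
        });
    } else if (parsed.pathname === '/storedRandImage') {
        // Serve the stored copy; the same id always returns the same image bytes.
        const match = fs.readdirSync(CACHE_DIR).find((f) => f.startsWith(parsed.query.id + '.'));
        if (!match) { res.statusCode = 404; return res.end(); }
        res.setHeader('Content-Type', 'image/' + path.extname(match).slice(1));
        fs.createReadStream(path.join(CACHE_DIR, match)).pipe(res);
    } else {
        res.statusCode = 404;
        res.end();
    }
}).listen(8080);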
This is all really easy technically, but the possible issue is a legal one, since you're storing copies of the images on another server, you may no longer be within the terms of your agreement with the service sending the random images.
You can try saving the base64 representation of the image.
Load the image into a hidden div/canvas, then convert it to base64. (I'm not sure whether a canvas can be hidden, nor whether it is possible to convert the img using an HTML4 tag.)
Now you can store the "stringified" image in a cookie, and use it unlimited times...
The headers being sent from your random image generator script include a Cache-Control: max-age=0 declaration which is in essence telling the browser not to cache the image.
You need to modify your image generator script/server to send proper caching headers if you want the result to be cached.
You also need to make sure that the URL stays the same (I didn't look at that aspect since there were tons of parameters being passed).
There seems to be two workarounds:
If you go with the Canvas method, see if you can get the image to load onto the Canvas itself so that you can manipulate the image data directly instead of making a 2nd http request for the image. You can feed the image data directly onto a 2nd Canvas.
If you're going to build a proxy, you can have the proxy remove the No-Cache directive so that subsequent requests by your browser use the cache (no guarantees here - depends on browser/user settings).
First off, you can't "force" anything on the web. If you need to force things, then web development is the wrong medium for you.
What you could try, is to use a canvas element to copy the image. See https://developer.mozilla.org/en-US/docs/Web/Guide/HTML/Canvas_tutorial/Using_images for examples.
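For example, something along these lines (a sketch of the canvas approach, not the answerer's code; the clone element IDs are placeholders) copies the already-loaded image once and reuses that copy everywhere, assuming the image server permits cross-origin canvas use:

// Draw the random image once onto a canvas, then reuse the resulting data URL
// for every element that should show the *same* image.
var source = new Image();
source.crossOrigin = 'anonymous';   // needed for toDataURL() on a cross-origin image
source.onload = function () {
    var canvas = document.createElement('canvas');
    canvas.width = source.width;
    canvas.height = source.height;
    canvas.getContext('2d').drawImage(source, 0, 0);
    var copy = canvas.toDataURL();  // one fixed snapshot of the random image

    $("#clone1").attr('src', copy);
    $("#clone2").css('background-image', 'url("' + copy + '")');
};
source.src = img_src;               // img_src as extracted in the fiddle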
Tell it to stop getting a random image; it seems to work the way you want when I add this third replace call:
// Get the canvas element.
var background = ($("#test").css('background-image')),
img_src = background.replace(/^.+\('?"?/, '').replace(/'?"?\).*$/, '').replace(/&fromrandomrandomizer=yes/,'')
try:
var myImg = new Image();
myImg.src = img_src;
and then append "myImg" to where you want:
$('body').append(myImg); // append to the body; appending to the document node itself won't work
I did this with your fiddle scripts and got the same image every time:
#test {
background:url(http://a.random-image.net.nyud.net/handler.aspx?username=chaosdragon&randomizername=goat&random=292.3402&fromrandomrandomizer=yes);
width: 150px;
height: 150px;
}
note the .nyud.net after the domain name.
I know there are many ways to prevent image caching (such as via META tags), as well as a few nice tricks to ensure that the current version of an image is shown with every page load (such as image.jpg?x=timestamp), but is there any way to actually clear or replace an image in the browsers cache so that neither of the methods above are necessary?
As an example, let's say there are 100 images on a page and that these images are named "01.jpg", "02.jpg", "03.jpg", etc. If image "42.jpg" is replaced, is there any way to replace it in the cache so that "42.jpg" will automatically display the new image on successive page loads? I can't use the META tag method, because I need everything that ISN'T replaced to remain cached, and I can't use the timestamp method, because I don't want ALL of the images to be reloaded every time the page loads.
I've racked my brain and scoured the Internet for a way to do this (preferably via JavaScript), but no luck. Any suggestions?
If you're writing the page dynamically, you can add the last-modified timestamp to the URL:
<img src="image.jpg?lastmod=12345678" ...
<meta> is absolutely irrelevant. In fact, you shouldn't try to use it for controlling the cache at all (by the time anything reads the content of the document, it's already cached).
In HTTP each URL is independent. Whatever you do to the HTML document, it won't apply to images.
To control caching you could change URLs each time their content changes. If you update images from time to time, allow them to be cached forever and use a new filename (with a version, hash or a date) for the new image — it's the best solution for long-lived files.
If your image changes very often (every few minutes, or even on each request), then send Cache-control: no-cache or Cache-control: max-age=xx where xx is the number of seconds that image is "fresh".
A random URL for short-lived files is a bad idea. It pollutes caches with useless files and forces useful files to be purged sooner.
If you have Apache and mod_headers or mod_expires then create .htaccess file with appropriate rules.
<Files ~ "-nocache\.jpg">
Header set Cache-control "no-cache"
</Files>
The above will make *-nocache.jpg files non-cacheable.
You could also serve images via a PHP script (they have awful cacheability by default ;)
Contrary to what some of the other answers have said, there IS a way for client-side javascript to replace a cached image. The trick is to create a hidden <iframe>, set its src attribute to the image URL, wait for it to load, then forcibly reload it by calling location.reload(true). That will update the cached copy of the image. You may then replace the <img> elements on your page (or reload your page) to see the updated version of the image.
(Small caveat: if updating individual <img> elements, and if there are more than one having the image that was updated, you've got to clear or remove them ALL, and then replace or reset them. If you do it one-by-one, some browsers will copy the in-memory version of the image from other tags, and the result is you might not see your updated image, despite its being in the cache).
I posted some code to do this kind of update here.
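A rough sketch of that trick (my own code, not the code linked above; location.reload(true) is non-standard and the whole approach only works for same-origin images, so treat it as legacy behaviour):

// Force the cached copy of imageUrl to be refetched, then swap it back into the page.
function refreshCachedImage(imageUrl, onDone) {
    var frame = document.createElement('iframe');
    frame.style.display = 'none';
    frame.src = imageUrl;
    frame.onload = function () {
        if (frame.dataset.reloaded) {
            // Second load finished: the browser cache now holds the fresh image.
            document.body.removeChild(frame);
            onDone();
        } else {
            frame.dataset.reloaded = 'yes';
            frame.contentWindow.location.reload(true);  // reload, bypassing the cache
        }
    };
    document.body.appendChild(frame);
}

refreshCachedImage('42.jpg', function () {
    // Per the caveat above: clear ALL matching <img> tags first, then restore them,
    // so no tag hands its old in-memory copy to the others.
    var imgs = Array.prototype.slice.call(document.querySelectorAll('img[src$="42.jpg"]'));
    var originals = imgs.map(function (img) { var s = img.src; img.src = ''; return s; });
    imgs.forEach(function (img, i) { img.src = originals[i]; });
});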
Change the image URL like this, appending a value that changes to the query string:
"image1.jpg?" + DateTime.Now.ToString("ddMMyyyyhhmmsstt");
I'm sure most browsers respect the Last-Modified HTTP header. Send it with your images; the browser will re-request the image and keep using its cached copy as long as the Last-Modified value doesn't change.
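For reference, this is roughly what the exchange looks like when the image hasn't changed (illustrative headers, not taken from the original answer): the browser revalidates with If-Modified-Since, and the server's 304 response lets it keep using the cached copy; if the file has changed, the server instead sends a normal 200 with the new image and a new Last-Modified value.

GET /images/42.jpg HTTP/1.1
If-Modified-Since: Tue, 01 Mar 2016 10:00:00 GMT

HTTP/1.1 304 Not Modified
Last-Modified: Tue, 01 Mar 2016 10:00:00 GMT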
You can append a random number to the image URL, which is like giving it a new version. I have implemented similar logic and it's working perfectly.
<script>
var num = Math.random();
var imgSrc= "image.png?v="+num;
$(function() {
$('#imgID').attr("src", imgSrc);
})
</script>
I found this article on how to cache bust any file
There are many ways to force a cache bust in this article but this is the way I did it for my image:
fetch('/thing/stuck/in/cache', {method:'POST', credentials:'include'});
The reason the ?x=timestamp trick is used is that it's the only way to do it on a per-image basis. That, or dynamically generate image names and point them at an application that outputs the image.
I suggest you figure out, server side, if the image has been changed/updated, and if so then output your tag with the ?x=timestamp trick to force the new image.
No, there is no way to force a file in a browser cache to be deleted, either by the web server or by anything that you can put into the files it sends. The browser cache is owned by the browser, and controlled by the user.
Hence, you should treat each file and each URL as a precious resource that should be managed carefully.
Therefore, porneL's suggestion of versioning the image files seems to be the best long-term answer. The ETAG is used under normal circumstances, but maybe your efforts have nullified it? Try changing the ETAG, as suggested.
Change the ETAG for the image.
See http://en.wikipedia.org/wiki/URI_scheme
Notice that you can provide a unique username:password@ combo as a prefix to the domain portion of the URI. In my experimentation, I've found that including this with a fake ID (or password, I assume) results in the resource being treated as unique, thus breaking the caching as you desire.
Simply use a timestamp as the username and as far as I can tell the server ignores this portion of the uri as long as authentication is not turned on.
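As a concrete illustration of that trick (the domain, filename and element id are placeholders; note also that modern browsers block or warn about subresource URLs with embedded credentials, so this is very much a legacy technique):

// Each page load uses a fresh fake "username", so the browser treats the URL as a new resource.
var bustedUrl = 'http://' + Date.now() + '@www.example.com/images/42.jpg';
document.getElementById('imgID').src = bustedUrl;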
Btw, I also couldn't use the tricks above with a Google Maps marker icon caching problem I was having, where the ?param=timestamp trick worked but caused issues with disappearing overlays. I never could figure out why that was happening, but so far so good using this method. What I'm unsure of is whether passing fake credentials has any adverse effects on server performance. If anyone knows, I'd be interested to hear, as I'm not yet in high-volume production.
Please report back your results.
Since most, if not all, answers and comments here repeat parts of the question, or come close enough, I shall throw my 2 cents in.
I just want to point out that even if there is a way, it is going to be difficult to implement. The logic of it traps us. From a logical standpoint, telling the browser to replace its cached images for each changed image on a list since a certain date would be ideal, BUT... when would you take the list down, and how would you know whether everyone who might visit again has the latest version?
So my 1st "suggestion", as the OP asked for, is this list theory.
How I see doing this:
A.) Keep a list where our dynamically and manually changed image URLs can be stored.
B.) Set a dead date when the cache will be reset and the list will be truncated regardless.
C.0) On site entrance, check the list against the browser via an iframe that could run in the background, with a shorter cache header set, to re-cache everything against the farthest date on the list, or something of that nature.
C.1) Using the iframe or an AJAX/XHR request, I'm thinking you could loop through each image on the list, refreshing the page to show a different image and checking the cache against its own modified date. So on each image's onload, use the server side to decide whether it is the last image; if not, move on to the next image.
C.1a) This would mean our list may need more information per image, and the obvious addition is some server-side script to adjust the headers as required by each image, to minimize the footprint of re-caching changed site images.
My 2nd "suggestion" would be to notify users of changes and direct them to clear their cache. (Carefully: remove only images and files where possible, or warn them about data removal due to the process.)
P.S. This is just an educated ideation, a quick theory. If/when I build it I will post the final version, though probably not here, because it will require server-side scripting. At least it is a suggestion not already mentioned in the OP's question as something he says he already tried.
It sounds like the base of your question is how to get the old version of the image out of the cache. I've had success just making a new call and specifying in the header not to pull from cache. You're just throwing this away once you fetch it, but the browser's cache should have the updated image at that point.
var headers = new Headers()
headers.append('pragma', 'no-cache')
headers.append('cache-control', 'no-cache')
var init = {
method: 'GET',
headers: headers,
mode: 'no-cors',
cache: 'no-cache',
}
fetch(new Request('path/to.file'), init)
However, it's important to recognize that this only affects the browser this is called from. If you want a new version of the file for any browser once the image is replaced, that will need to be accomplished via server configuration.
Here is a solution using the PHP function filemtime():
<?php
$addthis = filemtime('myimg.jpg'); // modification time of the image file
?>
<img src="myimg.jpg?<?= $addthis; ?>">
Using the file's modification time as a parameter causes the browser to read from its cached version until the file has changed. This approach is better than using e.g. a random number, since caching will still work as long as the file has not changed.
In the event that an image is re-uploaded, is there a way to CLEAR or REPLACE the previously cached image client-side? In my example above, the goal is to make the browser forget what "42.jpg" is
You're running firefox right?
Find the Tools Menu
Select Clear Private Data
Untick all the checkboxes except make sure Cache is Checked
Press OK
:-)
In all seriousness, I've never heard of such a thing existing, and I doubt there is an API for it. I can't imagine it would be a good idea on the part of browser developers to let you go poking around in their cache, and there's no motivation that I can see for them to ever implement such a feature.
I CANNOT use the META tag method OR the timestamp method, because I want all of the images cached under normal circumstances.
Why can't you use a timestamp (or etag, which amounts to the same thing)? Remember you should be using the timestamp of the image file itself, not just Time.Now.
I hate to be the bearer of bad news, but you don't have any other options.
If the images don't change, neither will the timestamp, so everything will be cached "under normal circumstances". If the images do change, they'll get a new timestamp (which they'll need to for caching reasons), but then that timestamp will remain valid forever until someone replaces the image again.
When changing the image filename is not an option, use a server-side session variable and the JavaScript window.location.reload() function, as follows:
After Upload Complete:
Session("reload") = "yes"
On page_load:
If Session("reload") = "yes" Then
    Session("reload") = Nothing
    ClientScript.RegisterStartupScript(Me.GetType(), "ReloadImages", "window.location.reload();", True)
End If
This allows the client browser to refresh only once, because the session variable is reset after one occurrence.
Hope this helps.
To replace the cached picture, you can store a version value on the server side, and when you load the picture just send this value instead of a timestamp. When your image changes, change its version.
Try this code snippet:
var url = imgUrl + "?" + Math.random();
This will make sure that each request URL is unique, so you will always get the latest image.
After much testing, the solution I found works in the following way:
1- I create a temporary folder and copy the images into it, prefixing their names with time(). (If the folder already exists, I delete its contents.)
2- I load the images from that temporary local folder.
This way I always make sure that the browser never caches the images, and it works 100% correctly.
// create the temp folder if needed, otherwise empty it
if (!is_dir(getcwd().'/articulostemp')) {
    $oldmask = umask(0);
    mkdir(getcwd().'/articulostemp', 0775);
    umask($oldmask);
} else {
    rrmfiles(getcwd().'/articulostemp'); // rrmfiles() is the author's helper for deleting the files in a folder
}

foreach ($images as $image) {
    // prefix the copy with the current timestamp so its URL changes on every request
    $tmpname = time().'-'.$image;
    $srcimage = getcwd().'/articulos/'.$image;
    $tmpimage = getcwd().'/articulostemp/'.$tmpname;
    copy($srcimage, $tmpimage);

    $urlimage = 'articulostemp/'.$tmpname;
    echo ' <img loading="lazy" src="'.$urlimage.'"/> ';
}
Try the solution below:
myImg.src = "http://localhost/image.jpg?" + new Date().getTime();
The above solution works for me :)
I usually do the same as #Greg told us, and I have a function for that:
function addMagicRefresh(url)
{
var symbol = url.indexOf('?') == -1 ? '?' : '&';
var magic = Math.random()*999999;
return url + symbol + 'magic=' + magic;
}
This will work as long as your server accepts it and you don't use the "magic" parameter in any other way.
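For example (my own usage sketch; the element id and filename are just illustrative):

// Re-request the image with a throwaway "magic" parameter so the cached copy is bypassed.
document.getElementById('imgID').src = addMagicRefresh('image.jpg');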
I hope it helps.
I have tried something ridiculously simple:
Go to the FTP folder of the website and rename the IMG folder to IMG2. Refresh your website and you will see that the images are missing. Then rename the folder IMG2 back to IMG and it's done; at least it worked for me in Safari.