I have been trying to enable caching for my web page. I found many posts about caching static files in the browser cache, but I haven't had any success.
I tried both server-side and client-side approaches:
SERVER SIDE
I tried to set a "Cache-Control" header in the server-side page load (code in C#):
DateTime dt = DateTime.Now.AddMinutes(30);
HttpContext.Current.Response.Cache.SetCacheability(HttpCacheability.Public);
HttpContext.Current.Response.Cache.SetExpires(dt);
HttpContext.Current.Response.Cache.SetMaxAge(TimeSpan.FromMinutes(30)); // same 30-minute window as the Expires date
CLIENT SIDE
In JavaScript, after some googling, I found a "preloading images" technique, but applying that code did not store the files in the cache either.
HTML META TAGS
I added the following tags to my page header:
<meta http-equiv="Cache-control" content="private"/>
<meta http-equiv="EXPIRES" content="Wed, 16 oct 2013 11:12:01 GMT"/>
This didn't work either.
Can anyone tell me what I am doing wrong here?
And can anyone suggest a complete solution or tutorial for storing static files in the browser cache?
Thanks in advance!
First, I think you have confused caching with preloading images.
If what you really need is caching, check whether caching is disabled in your browser, because scripts, images, and CSS are cached by the browser by default.
Next, how did you verify whether those files were cached?
You could also use a "cache manifest", which relies on the browser's AppCache.
It even lets your website run offline.
Head to http://diveintohtml5.info/offline.html for more information.
Hope it helps!!
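As a sketch, a minimal manifest (the file names here are placeholders for your own assets) looks like the following; it must be served with the text/cache-manifest MIME type and referenced from your page as <html manifest="site.appcache">:

```
CACHE MANIFEST
# v1 - change this comment to force the browser to refetch everything
index.html
styles.css
app.js
```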
I had an idea for vastly speeding up my website and easing the load on my server for cached objects, but I'm wondering whether it's an extremely stupid idea before I spend all day thinking about it. Hear me out:
The localStorage object holds at least 2.5 MB in Chrome (other browsers vary), which is enough for most small sites. So I am wondering: for small, static websites, would it be a good idea to save the entire site in one big localStorage entry and load the site from the visitor's own computer on each visit? To better explain the idea, here's some pseudo-code:
<!DOCTYPE html>
<html>
<head>
<meta charset="UTF-8">
<!--other head information-->
</head>
<body>
<script>
if (localStorage.getItem('cached_site') !== null) {
    // Replace the whole document with the stored copy.
    document.documentElement.innerHTML = localStorage.getItem('cached_site');
    // Somehow stop loading the rest of the site (ajax?)
} else {
    // Set a localStorage entry to the source code of the site.
    // **This will be fleshed out to data/base64 all image resources as well**
    localStorage.setItem('cached_site', document.documentElement.innerHTML);
    // load the site as normal
}
</script>
</body>
</html>
This saves the page's markup in a localStorage entry called cached_site. There are some issues with the logic (like the "stop loading the site" part), but you get the idea.
Is this worth it? Requests for static content could potentially be cut entirely, because everything is loaded from the user's own computer. To update the content, another value, perhaps a version number, could be saved and checked in the else block.
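To sketch that version check: CACHE_VERSION here is a hypothetical constant I'd bump on each deploy, and the storage/render/fetch callbacks are placeholders for the real browser calls:

```javascript
// Sketch of a version-checked localStorage page cache.
// CACHE_VERSION is a hypothetical constant bumped on every deploy.
var CACHE_VERSION = "5";

function loadPage(storage, render, fetchFresh) {
  if (storage.getItem("cached_version") === CACHE_VERSION &&
      storage.getItem("cached_site") !== null) {
    render(storage.getItem("cached_site")); // serve the stored copy
    return "cache";
  }
  // Missing or stale: fetch fresh markup, store it, record the version.
  var html = fetchFresh();
  storage.setItem("cached_site", html);
  storage.setItem("cached_version", CACHE_VERSION);
  render(html);
  return "network";
}
```

In the browser, storage would be window.localStorage and fetchFresh would be whatever grabs the live markup.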
My webhost limits the amount of requests I can get per day, which is why I thought of this. It's not a huge deal, but an interesting project nonetheless.
This idea will run into trouble with dynamic web sites: how will you handle dynamically generated pages?
Another idea would be to save each page into a static html file on the server, and then serve the static page, which will be regenerated when needed.
You should also cache the static parts of your website, i.e. images, scripts, CSS, and store these in the cache of your visitors' browsers.
In Apache, you could use mod_expires, and then set up an .htaccess file like this:
### turn on the Expires engine
ExpiresActive On
<FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|swf)(\.gz)?$">
ExpiresDefault "access plus 1 month"
FileETag None
</FilesMatch>
This will basically cache all of the static parts of your website in your visitors' browser cache, so they will stop refetching the same files over and over.
You should also try to combine your CSS/JavaScript files, ideally ending up with one or two JavaScript files and one CSS file, thus limiting the number of requests per client.
I would also suggest using Yahoo's YSlow and Google's PageSpeed tools to profile your site. Both document best practices for speeding up your pages.
Here are some suggestions from Yahoo + Google, at a glance:
Minimize HTTP Requests (use combined files for JS/CSS, CSS sprites, image maps and inline images, where possible)
Add an Expires or a Cache-Control Header (like what has been explained above)
Properly configure ETags, or remove them
Gzip Components (e.g. with mod_deflate in Apache)
Make JavaScript and CSS External
Minify JavaScript and CSS (i.e. remove whitespace and newlines) - in PHP, you can use the minify script to do this
Optimize images
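Two of those items (Expires headers and gzip) can be sketched in .htaccess; this assumes mod_deflate is enabled on your Apache build:

```
<IfModule mod_deflate.c>
    # compress text-based responses before sending them
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```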
I inherited a website, and I'm trying to serve its content over HTTPS, but when I do so I get an error that some content is being delivered insecurely. The certificate and all that good stuff is set up correctly.
<script type="text/javascript" src="https://domain.com/?dynamic=js"></script>
This doesn't seem to reference an actual file. I've googled but can't find anything to point me in the right direction. Can anyone provide some insight, or better yet explain why this causes the security warning?
Yes, it is valid so long as https://domain.com/?dynamic=js generates a javascript file. See this page for more info on dynamic files:
http://www.dynamicdrive.com/forums/showthread.php?21617-Dynamic-external-js-scripts-and-css-stylesheets-with-PHP
If you are serving the page over a secure connection (HTTPS), then all the resources on it must also be served via HTTPS: images, scripts, etc.
Check whether some resource is using http: instead of https:.
There's no problem with the script tag. You don't actually need a .js extension for it to be valid; as long as the URL returns JavaScript, the browser will be happy.
Also, this line has nothing to do with the HTTPS error you're getting. Make sure that ALL the content linked on that page is delivered over HTTPS.
Make sure ALL of the assets on the page are served with relative paths: images, CSS, scripts, etc. Then they will load whether or not you are on HTTPS.
Relative = "/images/test.jpg" instead of "http://test.com/images/test.jpg"
You can also use a protocol-relative URL: "//test.com/images/test.jpg" (thanks to a commenter).
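If you have a lot of absolute URLs to fix, the rewrite is just stripping the scheme; a hypothetical helper:

```javascript
// Strip the scheme so the asset URL inherits the page's own protocol
// (http or https) - i.e. turn an absolute URL into a protocol-relative one.
function toProtocolRelative(url) {
  return url.replace(/^https?:/i, "");
}
```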
How do I cache specific files in HTML?
I have tried
meta http-equiv="cache-control" content="private" max-age="604800"
but when I run an "Audit" in Chrome's developer tools, it reports:
Leverage browser caching (4)
The following resources are missing a cache expiration. Resources that do not specify an expiration may not be cached by browsers:
some.css
some.js
The following resources are explicitly non-cacheable. Consider making them cacheable if possible:
some.html
some-hosted.html
How do I cache them?
Your syntax is invalid.
<meta http-equiv="cache-control" content="private, max-age=604800" />
Mind you, this won't work for CSS/JS or anything else that isn't an HTML file. For those you need to set real HTTP headers server-side (usually via web server settings or a server-side language like PHP, .NET, ColdFusion, etc.).
The best way to do this is to serve your files through some kind of server-side mechanism that attaches the right cache-control headers to the HTTP response.
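As a sketch of what such a mechanism might decide (the extension list and lifetimes below are assumptions, not a standard):

```javascript
// Sketch: choose a Cache-Control value per file type.
// Static assets get a long lifetime; HTML always revalidates.
function cacheHeaderFor(file) {
  var ext = file.split(".").pop().toLowerCase();
  var staticExts = ["css", "js", "png", "jpg", "jpeg", "gif", "ico"];
  if (staticExts.indexOf(ext) !== -1) {
    return "public, max-age=604800"; // one week for static assets
  }
  return "no-cache"; // HTML and everything else: revalidate with the server
}
```

In Node you would then call res.setHeader("Cache-Control", cacheHeaderFor(req.url)); in ASP.NET the equivalent is the Response.Cache calls shown in the first question.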
I have a situation where our media files are stored on a global CDN. Our web app is hosted on its own server, and the media assets are loaded from the CDN URL when needed. Recently we built a page where the user can download file attachments, but some file types (such as MP3) were opening in the browser instead of downloading. The only way around this was to manually set the HTTP response to attach the file, but the only way I could achieve that was to download the file from the CDN to my server and then feed it back to the user, which defeats the purpose of having it on the global CDN. Is there some client-side solution for this?
EDIT: I just found this somewhere, though I'm not sure whether it works in all browsers:
<body>
<script>
function downloadme(x){
    var myTempWindow = window.open(x, '', 'left=10000,screenX=10000');
    myTempWindow.document.execCommand('SaveAs', null, 'download.pdf');
    myTempWindow.close();
}
</script>
<a href="javascript:downloadme('/test.pdf');">Download this pdf</a>
</body>
RE-EDIT: Oh well, so much for that idea -> Does execCommand SaveAs work in Firefox?
Does your CDN allow you to specify the HTTP headers? Amazon cloudfront does, for example.
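If it does (with CloudFront, for instance, you can set response headers in the S3 object metadata), the header you want is Content-Disposition; a hypothetical helper to build the value:

```javascript
// Build the Content-Disposition value that tells the browser to
// download the file instead of rendering it inline.
function attachmentHeader(filename) {
  return 'attachment; filename="' + filename + '"';
}
```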
I found an easy solution that worked for me: add a URL parameter to the file name. This tricks the browser into bypassing its built-in file mappings. For example, instead of http://mydomain.com/file.pdf, set your client-side link to point to http://mydomain.com/file.pdf? (with an added question mark).
Can anyone help? I have been building a site using JavaScript, but the rest of the HTML content is static, i.e. images etc.
When I load my page in Firefox, I have to clear the cache.
I remember that a long time ago there was something you could add to the HTML to force a reload.
My question is: is this a good thing? I presume things are cached for a reason, i.e. to cache images etc., but this stops my pages from refreshing.
And how do I do it?
I'd really appreciate any feedback.
If you want only the JS to be fetched afresh every time, while everything else loads from the cache, you can add a version number to the script include, like so:
<script src="scripts.js?v=5643" type="text/javascript"></script>
Change the version number (the ?v=num part) each time you change the JS file. This forces the browser to fetch the file from the server again.
Note: the actual file name stays the same - scripts.js
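A small hypothetical helper to build such URLs (it also handles files that already carry a query string):

```javascript
// Append (or extend) a query string with a version number so the
// browser treats the file as a brand-new URL and refetches it.
function versioned(src, v) {
  return src + (src.indexOf("?") === -1 ? "?" : "&") + "v=" + v;
}
```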
To disable caching for all files, if you're using Apache, put this in your httpd.conf (it requires mod_headers):
<Directory "/home/website/cgi-bin/">
Header Set Cache-Control "max-age=0, no-store"
</Directory>
You can also put a meta tag on your html like so:
<meta http-equiv="pragma" content="no-cache" />
<meta http-equiv="cache-control" content="no-cache" />
For web pages, you can control how the page is cached via the HTTP headers. Look at Expires if you have a particular date for the cache to expire, or Cache-Control for dynamic expiration based on when the page was requested. There are good tutorials covering how caching works and how to set it up on the major web servers.
Try pressing Ctrl + F5 when you load your page in Firefox; this bypasses the browser's cache of the page and reloads it cleanly.