jQuery getScript caching - javascript

By default, $.getScript() disables caching; you can use $.ajaxSetup() to turn caching back on. When testing with Firebug whether the script is actually cached, most of the time the script comes back with a 200 (meaning a fresh copy was downloaded), and only one time in maybe 20 or 30 does it come back with a 304 (meaning the cached version was used). Why is it getting a new copy the vast majority of the time?
$.ajaxSetup({
cache: true
});
$.getScript( scriptFile );
The files that getScript retrieves have not been edited and the requests are a page change apart.

First, let's clarify what it means for jQuery to disable caching.
When jQuery disables the cache, it forces the browser to load the file again by using a simple trick: it appends an extra random (timestamp) parameter to the end of the URL, so the browser treats every request as a new resource.
When caching is enabled in jQuery, it does not force anything; it simply leaves the decision to the cache headers you send with the file. This means that if the file's headers do not tell the browser to keep it in its cache, the browser may still try to load it again.
So when you enable the cache on the jQuery side, you must also send the correct cache headers with your static files so they are kept in the browser cache; otherwise the browser may re-request them.
For files where the browser only sees a last-modified date in the headers, it reconnects to the server, asks for the headers again and compares them; if the file has not changed it does not download it again, but it still makes one round trip to the server (a 304 response).
For files where you have set a max-age, the browser does not ask the server again until that time has passed; it loads the file directly from its cache if it finds it there.
To summarize:
cache: true lets the browser decide how to cache the file, based on the headers you send.
cache: false forces the file to be downloaded again every time.
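To see the difference concretely, here is a minimal sketch (the "/js/app.js" path is just an example) that requests the same script both ways; compare the resulting URLs in the Network tab:
// cache:false (the getScript default) appends a timestamp, so the URL changes every time,
// e.g. /js/app.js?_=1700000000000, and the browser can never reuse its cached copy.
$.ajax({ url: "/js/app.js", dataType: "script", cache: false });
// cache:true leaves the URL alone, so normal HTTP caching (max-age, 304 revalidation) applies.
$.ajax({ url: "/js/app.js", dataType: "script", cache: true });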
Some related caching questions:
caching JavaScript files
IIS7 Cache-Control
The code behind getScript()
getScript() calls jQuery.get(), which is a shorthand Ajax function equivalent to:
$.ajax({
    url: url,
    data: data,
    success: success,
    dataType: dataType
});
So calling getScript() simply makes an Ajax call; jQuery itself does not keep any kind of cache of your files, if that is what you were assuming in the first place.
Custom function to load the scripts
If you do not want to set cache: true globally and you only need some files to be loaded with cache: true, you can write a custom function like this:
function getScriptCcd(url, callback) {
    jQuery.ajax({
        type: "GET",
        url: url,
        success: callback,
        dataType: "script",
        cache: true
    });
}
This is not affected by the global cache setting, and it loads the script files without appending any cache-busting parameter to the URL.
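A hypothetical call to that helper (the script path is only a placeholder):
getScriptCcd("/js/example-plugin.js", function () {
    console.log("example-plugin.js loaded; repeat requests can come from the browser cache");
});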

There is a bug, as of the date this question was posted, where both Firefox and Chrome would state that a script is not being loaded from cache when it actually is. As of the date of this answer the issue still exists. The easiest way to test is to use console.log and print a version number from the loaded script.
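As a minimal sketch of that version-number trick (the file name and number are made up): put a line like this at the top of the script you load, bump the number whenever you edit the file, and watch the console to see which copy the browser actually executed.
console.log("my-script.js version 3");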
Caching a dynamically loaded script is then simply done with the following code.
function onDemandScript(url, callback) {
    // default to a no-op so jQuery's "success" option always receives a function
    callback = (typeof callback !== 'undefined') ? callback : function () {};
    $.ajax({
        type: "GET",
        url: url,
        success: callback,
        dataType: "script",
        cache: true
    });
}
For development you should comment out cache: true.
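A hypothetical usage of the helper above (the path is just an example):
onDemandScript("/js/heavy-widget.js", function () {
    console.log("heavy-widget.js is ready; later requests for it can be served from cache");
});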

By default, $.getScript() sets the cache setting to false. This appends a timestamped query parameter to the request URL to ensure that the browser downloads the script each time it is requested.
The jQuery doc site has a nice extension that avoids appending the timestamp, so the browser cache is not bypassed:
jQuery.cachedScript = function( url, options ) {
    // Allow user to set any option except for dataType, cache, and url
    options = $.extend( options || {}, {
        dataType: "script",
        cache: true,
        url: url
    });
    // Use $.ajax() since it is more flexible than $.getScript
    // Return the jqXHR object so we can chain callbacks
    return jQuery.ajax( options );
};
// Usage
$.cachedScript( "ajax/test.js" ).done(function( script, textStatus ) {
    console.log( textStatus );
});
Source

There's actually a better option: you can turn caching ON only for certain requests, for example:
$.ajaxPrefilter(function( options ) {
    if ( options.type === 'GET' && options.dataType === 'script' ) {
        options.cache = true;
    }
});
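With that prefilter registered, an ordinary $.getScript() call no longer gets the anti-cache "_=" parameter appended; a sketch (the script path is just an example):
$.getScript("/js/example.js").done(function () {
    console.log("loaded without cache busting");
});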

I know this is an old post, and the existing answer is the real answer, but touching on Iscariot's concern: IT REALLY IS CACHING (at least kinda sorta). This is just a quirk of Firefox. Maybe this will prove useful to others who are confused by it.
I tested this concept with a REALLY LARGE JavaScript file that defines Google Maps polygons for the Idaho DOT district boundaries, based on arrays of tens of thousands of lat/lons (the uncompressed file size is 2,806,257 bytes, but I run it through a compression process), using the following JavaScript:
// Grab polys if not already loaded
if (typeof(defaults.data.polys) === 'undefined') {
    /*$.getScript('/Scripts/ScriptMaster.php?file=Districts', function () {});*/
    $.ajax({
        type: "GET",
        url: '/Scripts/ScriptMaster.php?file=Districts',
        success: function() {
            defaults.data.polys = getPolys();
            data.polys = defaults.data.polys;
        },
        dataType: "script",
        cache: true
    });
}
and you can see the relevant php (you don't want the actual Districts.js file it would take too much space on this post, so here's ScriptMaster.php)
<?php
require_once('../settings.php');
if (!isset($_GET['file'])) die();
$file = $_GET['file'];
$doCache = $file == 'Districts';
header('Content-type: application/x-javascript');
if ($doCache) {
    // This is a luxury for loading Districts.js into cache to improve speed
    // It is at the top because firefox still checks the server for
    // headers even when it's already cached
    $expires = 7 * 60 * 60 * 24; // set cache control to expire in a week (this is not likely to change)
    header('Cache-Control: max-age='.$expires.', must-revalidate');
    header('Last-modified: Fri, 3 May 2013 10:12:37 GMT');
    header('Expires: '.gmdate('D, d M Y H:i:s', time() + $expires).' GMT');
    header('Pragma: public');
}
ob_start("compress");
require_once($file.".js");
ob_end_flush();

function compress($buffer) {
    global $doCache;
    if (DEV_MODE && !$doCache) return $buffer;
    /* remove comments */
    $buffer = preg_replace('/\/\/.+?$/m', '', preg_replace('!/\*[^*]*\*+([^/][^*]*\*+)*/!', '', $buffer));
    /* remove tabs, new lines and runs of spaces */
    $buffer = str_replace(array("\r\n", "\r", "\n", "\t", '    ', '   ', '  '), '', $buffer);
    /* remove unnecessary spaces */
    $buffer = str_replace(': ', ':', $buffer);
    $buffer = str_replace(' :', ':', $buffer);
    $buffer = str_replace(', ', ',', $buffer);
    $buffer = str_replace(' ,', ',', $buffer);
    $buffer = str_replace('; ', ';', $buffer);
    $buffer = str_replace(' ;', ';', $buffer);
    $buffer = str_replace('{ ', '{', $buffer);
    $buffer = str_replace(' {', '{', $buffer);
    $buffer = str_replace('} ', '}', $buffer);
    $buffer = str_replace(' }', '}', $buffer);
    if ($doCache) { header('Content-Length: '.strlen($buffer)); }
    return $buffer;
}
?>
It's important to note that you should call PHP's header functions BEFORE the script even builds the string you're going to print, because, unlike Chrome and possibly (probably, I'm just too lazy to check) other browsers, Firefox appears to ping the server to check the headers before using the cache. With more research you could determine whether this applies equally to elements included in the page as it does to Ajax (probably not).
So I did several test runs showing the load times for this script via Ajax, as reported by Firebug. Here are the results:
#results loading the script after clearing cache (yes those are seconds, not ms)
200 OK 4.89s
200 OK 4.9s
200 OK 5.11s
200 OK 5.78s
200 OK 5.14s
#results loading the page with control+r
200 OK 101ms
200 OK 214ms
200 OK 24ms
200 OK 196ms
200 OK 99ms
200 OK 109ms
#results loading the page again by navigating (not refreshing)
200 OK 18ms
200 OK 222ms
200 OK 117ms
200 OK 204ms
200 OK 19ms
200 OK 20ms
As you can see, my localhost server to web client connection is not the most consistent and my laptop specs are a little shabby (single core processor and all, it's a few years old too) BUT THE POINT is there is a significant drop in load time after the cache is loaded.
[Also, in case anyone's curious: without the compression the script (it's not as if tabs, spaces or new lines waste much, the file just stays readable) takes somewhere between 7-8 seconds to load, but I'm not going to run that five times]
So never fear, it really is caching. For smaller scripts that only take milliseconds to load you may honestly not notice the difference in Firefox, simply because it still checks the headers with the server. I know this because of the change in load time when I moved those header functions from the end of the script to the start: if you call them after PHP has built the whole string, it takes longer to load.
Hope this helps!

What you are perhaps looking for is a getScriptOnce function which, if it knows that a file was already loaded successfully, does not load that file again on subsequent calls.
I wrote such a function. You can verify it with the Network tab in Firebug or the Chrome dev tools: it loads the same file only once. You just need to copy the getScriptOnce function into your code; it keeps its own private array of the URLs it has already loaded.
var getScriptOnce = (function() {
    var scriptArray = []; // urls of scripts that have already loaded successfully
    return function (url, callback) {
        // the url is not in the array yet
        if (scriptArray.indexOf(url) === -1) {
            if (typeof callback === 'function') {
                return $.getScript(url, function(script, textStatus, jqXHR) {
                    scriptArray.push(url);
                    callback(script, textStatus, jqXHR);
                });
            } else {
                return $.getScript(url, function() {
                    scriptArray.push(url);
                });
            }
        }
        // the file is already there, so do nothing,
        // but return a stub that supports the jQuery 1.5+ methods .done().fail()
        else {
            return {
                done: function () {
                    return {
                        fail: function () {}
                    };
                }
            };
        }
    };
}());
/*#####################################################################*/
/*#####################################################################*/
//TEST - tries to load the same jQuery file twice
var jQueryURL = "https://code.jquery.com/jquery-3.2.1.js";
console.log("Tries to load #1");
getScriptOnce(jQueryURL, function(){
    console.log("Loaded successfully #1");
});
//waits 2 seconds and tries to load again
window.setTimeout(function(){
    console.log("Tries to load #2");
    getScriptOnce(jQueryURL, function(){
        console.log("Loaded successfully #2");
    });
}, 2000);
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>

Related

How to ensure that the javascript is loaded only once

I am loading a JS file using jQuery's getScript().
Sometimes I can see that the file has already been loaded (a cached resource).
So, on refreshing, the cached copy is not removed and the same file is loaded again as well.
Because of these multiple includes of the same file I am getting errors.
How do I avoid that?
$.getScript("http://localhost:8888//../../demo.js", function()
{
console.log('Script is loaded.');
});
By default, $.getScript sets the cache setting to false. Try setting it to true to see if this solves your problem:
$.ajaxSetup({
cache: true
});
Add the above before your call like:
$.ajaxSetup({
cache: true
});
$.getScript("http://localhost:8888//../../demo.js", function() { console.log('Script is loaded.'); });
Directly from the jQuery docs:
Caching Responses
By default, $.getScript() sets the cache setting to false. This
appends a timestamped query parameter to the request URL to ensure
that the browser downloads the script each time it is requested. You
can override this feature by setting the cache property globally using
$.ajaxSetup():
$.ajaxSetup({ cache: true });
Alternatively, you could define a new method that uses the more flexible $.ajax() method.
Example: Define a $.cachedScript() method that allows fetching a cached script:
jQuery.cachedScript = function( url, options ) {
    // Allow user to set any option except for dataType, cache, and url
    options = $.extend( options || {}, {
        dataType: "script",
        cache: true,
        url: url
    });
    // Use $.ajax() since it is more flexible than $.getScript
    // Return the jqXHR object so we can chain callbacks
    return jQuery.ajax( options );
};
// Usage
$.cachedScript( "ajax/test.js" ).done(function( script, textStatus ) {
    console.log( textStatus );
});
I believe that if it is cached, the browser will not make a new request for it; it will know to load the cached version, so you are fine just firing off your $.getScript as you have it.
It may appear in the Network tab of the Chrome developer tools again, but the time will be 0 and the Size (Content) value will say '(from cache)'. This would be a good way to test what is actually going on.
Assuming your demo.js file contains at least one function or variable, you could check for presence before loading again:
if (typeof(your_variable) === "undefined") {
    $.getScript("http://localhost:8888//../../demo.js", function() {
        console.log('Script is loaded.');
    });
}
(where your_variable is the name of a function or variable inside demo.js)
Caching is a good feature; it solved my problem of a JS file loading multiple times when loaded through jQuery. My issue was that a file I loaded via jQuery itself loaded another jQuery file; now it loads only once, so the events are now invoked only once. Thanks a lot, have a nice day.

Ajax progress bar with a big list

I have an ajax call that is grabbing a large json list. Is there any way I can make a progress bar that gets the real value of json load (for example a status bar that says 1 out of 200 loaded)?
Right now I have a pretty basic Ajax call
function SendAjax(urlMethod, jsonData, returnFunction) {
    $.ajax({
        type: "GET",
        contentType: "application/json; charset=utf-8",
        url: urlMethod,
        data: jsonData,
        dataType: "json",
        success: function (msg) {
            if (msg != null) {
                ReturnJson(msg);
            }
        },
        error: function (xhr, status, error) {
            // Boil the ASP.NET AJAX error down to JSON.
            var err = eval("(" + xhr.responseText + ")");
            // Display the specific error raised by the server
            alert(err.Message);
        }
    });
}
Try using ajaxStart at your application's global scope. That means you can put the code in your layout file, and if the processing takes long, it will show the progress indicator...
$(document).ajaxStart(function() {
    $( "#loading" ).show();
});
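You would normally pair it with ajaxStop to hide the indicator again once all outstanding requests have finished; a minimal sketch (assuming the same #loading element):
$(document).ajaxStop(function() {
    $( "#loading" ).hide();
});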
You can see the example and answer at preload with percentage - javascript/jquery.
There are a few states in an Ajax request, but they do not represent any percentage of progress through the request.
The network traffic should really be your main concern: you do not want to send 200 separate requests (although this would allow for a progress bar, it would make the whole thing take significantly longer, depending on your network connection to the server).
You are best off just showing an activity indicator and removing it when the request completes and try to optimise your code server side to return the 200 items as fast as possible.
If 200 items really is too big (takes more than X seconds to return), you could split the request into halves or quarters to show some progress, as sketched below; however this wastes time on the extra requests (network round trips, page headers, etc.).
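A rough sketch of that chunking idea (the URL, parameter names and page size are all assumptions):
function loadInChunks(total, pageSize, onProgress, onDone) {
    var items = [], page = 0, pages = Math.ceil(total / pageSize);
    (function next() {
        $.getJSON("/items", { page: page, size: pageSize }, function (chunk) {
            items = items.concat(chunk);
            onProgress(items.length, total);   // e.g. update "57 out of 200 loaded"
            if (++page < pages) { next(); } else { onDone(items); }
        });
    })();
}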
If your server-side code has a way of sharing application state (such as the $_SESSION in PHP) you could make 2 separate requests, one that asks for the data, and one that checks on progress of the first request. Repeat the second request on a timer until the first completes, and update the $_SESSION (or whatever works in your server code) as each item is processed.
For example:
The initial page must start a session, so that the subsequent AJAX calls have the cookie, and can access the shared data:
<?php
session_start();
session_write_close(); // close the session so other scripts can access the file (doesn't end the session)
// your page content here
?>
First AJAX Call to start the processing:
<?php
function updateSession($count){
    session_start(); // open the session file
    $_SESSION['progress'] = $count;
    session_write_close(); // let other requests access the session
}
// as you process each item, call the above function, ex:
for ($i = 1; $i <= 10; $i++) {
    updateSession($i);
}
?>
Second AJAX call (repeated every X seconds) looks like:
<?php
session_start(); // open the session file
echo isset($_SESSION['progress']) ? $_SESSION['progress'] : 0; // echo the progress count, or 0 if not set yet
session_write_close(); // let other requests access the session
?>
Sorry I don't know ASP.NET, but hopefully the above code is useful to you.
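For completeness, a client-side sketch of how the two requests could be wired together (the URLs, the one-second interval and the #progress element are assumptions):
// Kick off the long-running request, then poll the progress endpoint until it finishes.
var poll = setInterval(function () {
    $.get("get_progress.php", function (count) {
        $("#progress").text(count + " out of 200 loaded");
    });
}, 1000);
$.getJSON("start_processing.php", function (items) {
    clearInterval(poll);              // stop polling once the full list has arrived
    $("#progress").text("done");
    ReturnJson(items);                // hand the data to the existing success handler
});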

Chrome not handling jquery ajax query

I have the following query in jquery. It is reading the "publish" address of an Nginx subscribe/publish pair set up using Nginx's long polling module.
function requestNextBroadcast() {
    // never stops - every reply triggers next.
    // and silent errors restart via long timeout.
    getxhr = $.ajax({
        url: "/activity",
        // dataType: 'json',
        data: "id="+channel,
        timeout: 46000, // must be longer than max heartbeat to only trigger after silent error.
        error: function(jqXHR, textStatus, errorThrown) {
            alert("Background failed "+textStatus); // should never happen
            getxhr.abort();
            requestNextBroadcast(); // try again
        },
        success: function(reply, textStatus, jqXHR) {
            handleRequest(reply); // this is the normal result.
            requestNextBroadcast();
        }
    });
}
The code is part of a chat room. Every message sent is answered with a null reply (200/OK), but the data is published. This is the code that reads the subscribe address as the data comes back.
Using a timeout, all people in the chatroom send a simple message every 30 to 40 seconds, even if they don't type anything, so there is plenty of data for this code to read - at least 2 and possibly more messages per 40 seconds.
The code is 100% rock solid in IE and Firefox. But about one read in 5 fails in Chrome.
When Chrome fails it is with the 46 seconds timeout.
The log shows one /activity network request outstanding at any one time.
I've been crawling over this code for 3 days now, trying various idea. And every time IE and Firefox work fine and Chrome fails.
One suggestion I have seen is to make the call synchronous - but that is clearly impossible because it would lock up the user interface for too long.
Edit - I have a partial solution: The code is now this
function requestNextBroadcast() {
    // never stops - every reply triggers next.
    // and silent errors restart via long timeout.
    getxhr = jQuery.ajax({
        url: "/activity",
        // dataType: 'json',
        data: "id="+channel,
        timeout: <?php echo $delay; ?>,
        error: function(jqXHR, textStatus, errorThrown) {
            window.status="GET error "+textStatus;
            setTimeout(requestNextBroadcast,20); // try again
        },
        success: function(reply, textStatus, jqXHR) {
            handleRequest(reply); // this is the normal result.
            setTimeout(requestNextBroadcast,20);
        }
    });
}
The result is that sometimes the reply is delayed until the $delay (15000) expires; then the queued messages arrive too quickly to follow. I have been unable to make it drop messages with this new arrangement (only tested with network optimisation off).
I very much doubt that the delays are due to networking problems - all machines are VMs within my one real machine, and there are no other users of my local LAN.
Edit 2 (Friday 2:30 BST) - Changed the code to use promises - and the POST of actions started to show the same symptoms, but the receive side started to work fine! (????!!!???).
This is the POST routine - it is handling a sequence of requests, to ensure only one at a time is outstanding.
function issuePostNow() {
    // reset heartbeat to dropout to send setTyping(false) in 30 to 40 seconds.
    clearTimeout(dropoutat);
    dropoutat = setTimeout(function() { sendTyping(false); },
                           30000 + 10000*Math.random());
    // and do send
    var url = "handlechat.php?";
    if (postQueue.length > 0) {
        postData = postQueue[0];
        var postxhr = jQuery.ajax({
            type: 'POST',
            url: url,
            data: postData,
            timeout: 5000
        });
        postxhr.done(function(txt){
            postQueue.shift(); // remove this task
            if ((txt != null) && (txt.length > 0)) {
                alert("Error: unexpected post reply of: "+txt);
            }
            issuePostNow();
        });
        postxhr.fail(function(){
            alert(window.status="POST error "+postxhr.statusText);
            issuePostNow();
        });
    }
}
About one action in 8 the call to handlechat.php will timeout and the alert appears. Once the alert has been OKed, all queued up messages arrive.
And I also noticed that the handlechat call was stalled before it wrote the message that others would see. I'm wondering if it could be some strange handling of session data by php. I know it carefully queues up calls so that session data is not corrupted, so I have been careful to use different browsers or different machines. There are only 2 php worker threads however php is NOT used in the handling of /activity or in the serving of static content.
I have also thought it might be a shortage of nginx workers or php processors, so I have raised those. It is now more difficult to get things to fail - but still possible. My guess is the /activity call now fails one in 30 times, and does not drop messages at all.
And thanks guys for your input.
Summary of findings.
1) It is a bug in Chrome that has been in the code for a while.
2) With luck the bug can be made to appear as a POST that is not sent, and, when it times out it leaves Chrome in such a state that a repeat POST will succeed.
3) The variable used to store the return from $.ajax() can be local or global. The new (promises) and the old format calls both trigger the bug.
4) I have not found a work around or way to avoid the bug.
Ian
I had a very similar issue with Chrome. I am making an Ajax call in order to get the time from a server every second. Obviously the Ajax call must be asynchronous because it will freeze up the interface on a timeout if it's not. But once one of the Ajax calls is a failure, each subsequent one is as well. I first tried setting a timeout to be 100ms and that worked well in IE and FF, but not in Chrome. My best solution was setting the type to POST and that solved the bug with chrome for me:
setInterval(function(){
    $.ajax({
        url: 'getTime.php',
        type: 'POST',
        async: true,
        timeout: 100,
        success: function() { console.log("success"); },
        error: function() { console.log("error"); }
    });
}, 1000);
Update:
I believe the actual underlying problem here is Chrome's way of caching. It seems that when one request fails, that failure is cached, and therefore subsequent requests are never made because Chrome will get the cached failure before initiating subsequent requests. This can be seen if you go to Chrome's developer tools and go to the Network tab and examine each request being made. Before a failure, ajax requests to getTime.php are made every second, but after 1 failure, subsequent requests are never initiated. Therefore, the following solution worked for me:
setInterval(function(){
    $.ajax({
        url: 'getTime.php',
        cache: false,
        async: true,
        timeout: 100,
        success: function() { console.log("success"); },
        error: function() { console.log("error"); }
    });
}, 1000);
The change here is that I am disabling caching for this Ajax query, but in order to do so the type option must be either GET or HEAD; that's why I removed type: 'POST' (GET is the default).
Try moving your polling function into a Web Worker to prevent freezing up in Chrome.
Otherwise you could try using the Ajax .done() method of the jQuery object; that one always works for me in Chrome.
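A minimal sketch of the Web Worker idea (the poller.js file name and the message shape are made up; note that jQuery is not available inside a worker, so it would use fetch or XMLHttpRequest):
// main page: start the worker and feed every reply to the existing handler
var poller = new Worker("poller.js");
poller.postMessage({ channel: channel });
poller.onmessage = function (e) { handleRequest(e.data); };

// poller.js (inside the worker):
// self.onmessage = function (e) {
//     (function next() {
//         fetch("/activity?id=" + e.data.channel)
//             .then(function (r) { return r.text(); })
//             .then(function (txt) { self.postMessage(txt); next(); });
//     })();
// };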
I feel like getxhr should be prefixed with "var". Don't you want a completely separate & new request each time rather than overwriting the old one in the middle of success/failure handling? Could explain why the behavior "improves" when you add the setTimeout. I could also be missing something ;)
Comments won't format code, so reposting as a 2nd answer:
I think Michael Dibbets is on to something with $.ajax.done -- the Deferred pattern pushes processing to the next turn of the event loop, which I think is the behavior that's needed here. see: http://www.bitstorm.org/weblog/2012-1/Deferred_and_promise_in_jQuery.html or http://joseoncode.com/2011/09/26/a-walkthrough-jquery-deferred-and-promise/
I'd try something like:
function requestNextBroadcast() {
    // never stops - every reply triggers next.
    // and silent errors restart via long timeout.
    getxhr = jQuery.ajax({
        url: "/activity",
        // dataType: 'json',
        data: "id="+channel,
        timeout: <?php echo $delay; ?>
    });
    getxhr.done(function(reply){
        handleRequest(reply);
    });
    getxhr.fail(function(e){
        window.status="GET error " + e;
    });
    getxhr.always(function(){
        requestNextBroadcast();
    });
}
Note: I'm having a hard time finding documentation on the callback arguments for Promise.done & Promise.fail :(
Perhaps it can be worked around by changing the push module settings (there are a few) - Could you please post these?
From the top of my head:
setting it to interval poll, would kinda uglily solve it
the concurrency settings might have some effect
message storage might be used to avoid missing data
I would also use something like Charles to see what exactly does happen on the network/application layers

Fail callback called although Ajax request is performed and server returns 200 with data

I have a HTML5 test webpage test.html with a cache manifest. The webpage does an Ajax request to the same server, to a webpage do_get_data.php that is listed under the section NETWORK: in the cache manifest.
The request is performed by both Firefox 10 and iPhone iOS 5 Safari (this is logged in the serving PHP script do_get_data.php). Firefox 10 calls the success callback function after 10 seconds, that is, when data from the server is returned. However, my iPhone iOS 5 Safari calls the fail callback function immediately after it started the request and doesn't call the success callback function.
For iPhone iOS 5 Safari, the textStatus is error and JSON.stringify(jqXHR) is {"readyState":0,"responseText":"","status":0,"statusText":"error"}.
The request is performed using the following code in test.html:
<script type="text/javascript">
function test_ok(data) {
alert('Test OK, data: ' + JSON.stringify(data));
}
function testFail(jqXHR, textStatus) {
alert(textStatus + ' | ' + JSON.stringify(jqXHR));
}
function get_data(testurl) {
var senddata, request;
alert('Request for ' + testurl + ' started.');
window.testid = new Date().getTime();
senddata = {
background: true,
requestId: window.testid
};
request = $.ajax({
url: testurl,
cache: false,
type: "GET",
data: senddata,
success: test_ok
});
request.fail(testFail);
}
</script>
<input type="button" onclick="get_data('do_get_data.php')" value="test sending" />
For reference, do_get_data.php looks like this:
<?php
$id = md5(rand() . rand());
trigger_error(implode("\t", array('start', $id, $_SERVER['REQUEST_URI'], $_SERVER['REMOTE_ADDR'], $_SERVER['HTTP_USER_AGENT'])));
sleep(10);
header('Content-Type: application/json');
$json = json_encode(array('msg'=>'Test was OK'));
trigger_error(implode("\t", array('echo', $id, $json)));
echo $json;
?>
I've been given to understand that the causes of status code 0 are (1) loading from file://, (2) an unreachable network resource and (3) cross-domain policy. Since you load PHP, we can safely rule out number 1, and since your server logs Safari's request we can rule out number 2 as well, which leaves us with 3. Does all of the above code sit on the same domain? If not, use the Access-Control-Allow-Origin HTTP header in the PHP to allow cross-domain requests.
header('Access-Control-Allow-Origin: http://example.org');
Also, you should make sure, the click on the button input performs only the onclick and not any other default behavior (whatever that may be on the iOS). Returning false from the onclick handler would prevent it:
<input type="button" onclick="get_data('do_get_data.php'); return false" ... />
UPDATE:
As a last resort you can always simply disable the cache manifest to move its maybe buggy implementation out of the way.
I have struggled with this for a while and the answer was in the bug report answers:
use
NETWORK:
*
in the cache manifest to avoid caching the ajax request too.
What if you change your $.ajax call to
$.ajax({
    url: testurl,
    cache: false,
    type: "GET",
    data: senddata
}).then(
    function(result) { test_ok(result); },
    function(result) { testFail(result); }
);
Not sure how you are running your site under iOS, but there's been an issue with jQuery and AJAX requests while using manifest files: http://bugs.jquery.com/ticket/8412
Although you are already listing the resource under the NETWORK section of the manifest, I would suggest trying out the other workaround listed there:
jQuery.ajaxSetup({
isLocal: true
});

Test if URL is accessible from web browser i.e. make sure not blocked by Proxy server

I am serving my website from mywebsite.com. I host images on flickr so all images are loaded in the user's browser via get requests to flickr. Many of my websites users access mywebsite.com from corporate networks, which block access to flickr.com. This means users get very annoying blank placeholders instead of the images. I get the same problem with the Facebook like button. This makes my site look very unattractive to such users.
Is there a way I can run a client side script which will check if flickr.com, facebook.com, etc. are accessible. If not I could change the href attribute of the image to load from an alternate source, or replace with a standard image explaining that their network is blocking access. I could also remove the Facebook like button.
I thought an XML http request would do the trick, but then I'd hit cross domain issues I think. I guess I could also set up a proxy to serve the images, but I don't want to do that; the idea of this is that flickr takes the bandwidth hit.
TLDR: How do I determine if flickr.com is accessible from a user's browser, using client side technology.
You could try this...
var image = new Image();
image.onerror = function() {
    var images = document
        .getElementById('flicker-images')
        .getElementsByTagName('img');
    for (var i = 0, imagesLength = images.length; i < imagesLength; i++) {
        images[i].src = 'images/flickr_is_blocked.gif';
    }
};
image.src = 'http://flickr.com/favicon.ico';
Hacky, but it seems to work. However, it relies on the assumption that if favicon.ico fails to load, the main site is blocked too.
jsFiddle.
Working example: http://jsfiddle.net/peeter/pW5wB/
JS:
$(document).ready(function() {
    var callbackOnSuccess = function(src) {
        alert("Successfully loaded " + src);
        return false;
    };
    var callbackOnFailure = function(src) {
        alert("Failed loading " + src);
        // Here you can do whatever you want with your flickr images. Lets change the src and alt tags
        $(".flickr").attr("src", "flickr_is_blocked.gif");
        $(".flickr").attr("alt", "Flicker is blocked");
        // Lets change the parents href to #
        $(".flickr").parent().removeAttr("href");
        return false;
    };
    checkAvailability("http://flickr.com/favicon.ico", callbackOnSuccess, callbackOnFailure);
});

function checkAvailability(src, callbackSuccess, callbackFailure) {
    $("<img/>").attr("src", src).load(function() {
        callbackSuccess(src);
    }).error(function() {
        callbackFailure(src);
    });
}
HTML:
<a href="http://flickr.com/favicon.ico">
<img class="flickr" src="http://flickr.com/favicon.ico" alt="Flickr"/>
</a>
For facebook you can simply include the Facebook JS API and then test if one of the objects/functions it exports exists.
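A minimal sketch of that idea (the five-second timeout and the #fb-like selector are assumptions): load the Facebook SDK as usual, then check whether its global FB object actually appeared.
setTimeout(function () {
    if (typeof window.FB === "undefined") {
        // facebook.com is probably blocked on this network: hide the Like button
        $("#fb-like").hide();
    }
}, 5000);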
It would be better if you did not (ab-)use external hosts for your stuff. If you want a CDN, better use a real one...
Flickr and Facebook both have APIs that support JSONP, so cross-domain isn't an issue.
i.e. Here's a request that just echoes some dummy data from flickr's API.
$.ajax({
    url: "http://www.flickr.com/services/rest/?jsoncallback=?",
    dataType: 'json',
    data: {method: "flickr.test.echo", format: "json", api_key: "02de950d65ec54a7a057af0e992de790"},
    success: callback
});
You can't reliably set error handlers on a jsonp reqest, so show a "loading" image until that success callback gets called. Set some timeout that will show an error message if the response doesn't come back fast enough.
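A sketch of that timeout fallback (the delay, the #flickr-status element and the flag name are made up):
var flickrAnswered = false;
var fallbackTimer = setTimeout(function () {
    if (!flickrAnswered) {
        $("#flickr-status").text("flickr.com looks blocked; showing local images instead");
    }
}, 5000);
function callback(data) {           // the success handler used in the request above
    flickrAnswered = true;
    clearTimeout(fallbackTimer);
    // ...use the echoed data / show the real images...
}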
This works, but timeout must be set!
$.ajax({
    url: "http://example.com/ping.html",
    type: 'GET',
    dataType: 'jsonp',
    jsonpCallback: 'jsonCallback',
    timeout: 1000,
    cache: false,
    success: function(response) {
        console.log("SERVER UP!");
    },
    error: function(e) {
        console.log("SERVER DOWN!");
    }
});
ping.html should return:
jsonCallback({response:'PONG'});
