Refresh image with Javascript, but only if changed on server - javascript

I want to reload an image on a page if it has been updated on the server. In other questions it has been suggested to do something like
newImage.src = "http://localhost/image.jpg?" + new Date().getTime();
to force the image to be re-loaded, but that means that it will get downloaded again even if it really hasn't changed.
Is there any Javascript code that will cause a new request for the same image to be generated with a proper If-Modified-Since header so the image will only be downloaded if it has actually changed?
UPDATE: I'm still confused. If I just request the typical URL, I'll get the locally cached copy (unless I make the server mark it as not cacheable, but I don't want to do that, because the whole idea is to not re-download it unless it really changes). If I change the URL, I'll always re-download, because the point of the new URL is to break the cache. So how do I get the in-between behavior I want, i.e. download the file only if it doesn't match the locally cached copy?

Javascript can't listen for an event on the server. Instead, you could employ some form of long-polling, or sequential calls to the server to see if the image has been changed.
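As a rough illustration of the sequential-call approach (my own sketch, not part of this answer): issue a HEAD request every few seconds and only swap the image when the Last-Modified header changes. The URL, element id and interval are placeholders.
let lastModified = null;

async function pollImage() {
    try {
        const res = await fetch('image.jpg', { method: 'HEAD', cache: 'no-cache' });
        const modified = res.headers.get('Last-Modified');
        if (modified && modified !== lastModified) {
            if (lastModified !== null) {
                // Only force a reload once a baseline value has been stored.
                document.getElementById('img').src = 'image.jpg?t=' + Date.now();
            }
            lastModified = modified;
        }
    } catch (err) {
        console.error('Polling failed:', err);
    } finally {
        setTimeout(pollImage, 5000);
    }
}

pollImage();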

You should have a look at the xhr.setRequestHeader() method. It's a method of any XMLHttpRequest object, and can be used to set headers on your Ajax queries. In jQuery, you can easily add a beforeSend property to your ajax object and set up some headers there.
That being said, caching with Ajax can be tricky. You might want to have a look at this thread on Google Groups, as there's a few issues involved with trying to override a browser's caching mechanisms. You'll need to ensure that your server is returning the proper cache control headers in order to be able to get something like this to work.
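For example, a conditional request with jQuery might look like this (a sketch; the URL is a placeholder, and with ifModified set jQuery tracks Last-Modified/ETag itself and reports "notmodified" on a 304):
$.ajax({
    url: 'image.jpg',           // placeholder URL
    ifModified: true,           // jQuery sends If-Modified-Since / If-None-Match itself
    beforeSend: function (xhr) {
        // additional headers could be set here if needed
        xhr.setRequestHeader('Cache-Control', 'no-cache');
    },
    success: function (data, textStatus) {
        if (textStatus === 'notmodified') {
            // server answered 304: the cached copy is still current
        } else {
            // image changed on the server: trigger a reload of the <img> here
        }
    }
});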

One way of doing this is to use server-sent events (SSE) to have the server push a notification whenever the image has changed. For this you need a server-side script that periodically checks whether the image has been modified. The server-side script below also sends an event at least once every (approximately) 60 seconds to prevent timeouts, and the client-side HTML handles navigating away from and back to the page:
sse.py
#!/usr/bin/env python3
import time
import os.path

print("Content-Type: text/event-stream\n\n", end="")

IMG_PATH = 'image.jpg'
modified_time = os.path.getmtime(IMG_PATH)
seconds_since_last_send = 0

while True:
    time.sleep(1)
    new_modified_time = os.path.getmtime(IMG_PATH)
    if new_modified_time != modified_time:
        modified_time = new_modified_time
        print('data: changed\n\n', end="", flush=True)
        seconds_since_last_send = 0
    else:
        seconds_since_last_send += 1
        if seconds_since_last_send == 60:
            print('data: keep-alive\n\n', end="", flush=True)
            seconds_since_last_send = 0
And then your HTML would include some JavaScript code:
sse.html
<html>
<head>
<meta charset="UTF-8">
<title>Server-sent events demo</title>
</head>
<body>
<img id="img" src="image.jpg">
<script>
const img = document.getElementById('img');
let evtSource = null;

function setup_sse() {
    console.log('Creating new EventSource.');
    evtSource = new EventSource('sse.py');

    evtSource.onopen = function() {
        console.log('Connection to server opened.');
    };

    // if we navigate away from this page:
    window.onbeforeunload = function() {
        console.log('Closing connection.');
        evtSource.close();
        evtSource = null;
    };

    evtSource.onmessage = function(e) {
        if (e.data == 'changed')
            img.src = 'image.jpg?version=' + new Date().getTime();
    };

    evtSource.onerror = function(err) {
        console.error("EventSource failed:", err);
    };
}

window.onload = function() {
    // if we navigate back to this page:
    window.onfocus = function() {
        if (!evtSource)
            setup_sse();
    };
    setup_sse(); // first time
};
</script>
</body>
</html>

Here I am loading an image, tree.png, as binary data dynamically with AJAX and saving the Last-Modified header. Periodically (every 5 seconds in the code below) I issue another download request, sending back an If-Modified-Since header using the saved Last-Modified value. I check whether data has been returned and re-create the image from that data if present:
<!doctype html>
<html>
<head>
<title>Test</title>
<script>
window.onload = function() {
    let image = document.getElementById('img');
    var lastModified = ''; // e.g. 'Sat, 11 Jun 2022 19:15:43 GMT'

    function _arrayBufferToBase64(buffer) {
        var binary = '';
        var bytes = new Uint8Array(buffer);
        var len = bytes.byteLength;
        for (var i = 0; i < len; i++) {
            binary += String.fromCharCode(bytes[i]);
        }
        return window.btoa(binary);
    }

    function loadImage() {
        var request = new XMLHttpRequest();
        request.open("GET", "tree.png", true);
        if (lastModified !== '')
            request.setRequestHeader("If-Modified-Since", lastModified);
        request.responseType = 'arraybuffer';
        request.onload = function(/* oEvent */) {
            // Keep the previous value if a 304 response omits Last-Modified:
            lastModified = request.getResponseHeader('Last-Modified') || lastModified;
            var response = request.response;
            if (typeof response !== 'undefined' && response.byteLength !== 0) {
                var encoded = _arrayBufferToBase64(response);
                image.src = 'data:image/png;base64,' + encoded;
            }
            window.setTimeout(loadImage, 5000);
        };
        request.send();
    }

    loadImage();
};
</script>
</head>
<body>
<img id="img">
</body>
</html>

You can write a server-side method which just returns the last modified date of the image resource.
Then use polling to check the modified date and reload the image if it is newer than the previously seen date.
Pseudo code (ASP.NET):
// server-side ajax method
[WebMethod]
public static string GetModifiedDate(string resource)
{
    string path = HttpContext.Current.Server.MapPath("~" + resource);
    FileInfo f = new FileInfo(path);
    return f.LastWriteTimeUtc.ToString("yyyy-MM-ddTHH:mm:ss", CultureInfo.InvariantCulture); // e.g. 2020-05-12T23:50:21
}
var pollingInterval = 5000;

function getPathFromUrl(url) {
    return url.split(/[?#]/)[0];
}

function CheckIfChanged() {
    $(".img").each(function (i, e) {
        var $e = $(e);
        var jqxhr = $.ajax({
            type: "POST",
            contentType: "application/json; charset=utf-8",
            url: "/Default.aspx/GetModifiedDate",
            data: "{'resource':'" + getPathFromUrl($e.attr("src")) + "'}"
        }).done(function (data, textStatus, jqXHR) {
            var dt = jqXHR.responseJSON.d;
            var dtCurrent = $e.attr("data-lastwrite");
            if (dtCurrent) {
                var curDate = new Date(dtCurrent);
                var dtLastWrite = new Date(dt);
                // refresh if the modified date is newer than the stored date
                if (dtLastWrite > curDate) {
                    // fool the browser with a date querystring to reload the image
                    $e.attr("src", getPathFromUrl($e.attr("src")) + "?d=" + new Date().getTime());
                }
            }
            $e.attr("data-lastwrite", dt);
        });
    }).promise().done(function () {
        window.setTimeout(CheckIfChanged, pollingInterval);
    });
}

$(document).ready(function () {
    window.setTimeout(CheckIfChanged, pollingInterval);
});
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
<img class="img" src="/img/rick.png" alt="rick" />

If you are going to check whether a file has changed on the server, you have to make an HTTP request for the file's modification time, because there is no other way to check it once the page has loaded in the browser.
So the time-check script will look like this:
filetimecheck.php
<?php
echo filemtime('image.jpg');
?>
Then you can check the file time from your JavaScript. Below I use jQuery's $.get to fetch it.
displayimage.php
<img id="badge" src="image.jpg" />
<script>
var image_time = <?php echo filemtime('image.jpg'); ?>;
var timerdelay = 5000;

function imageloadFunction() {
    $.get("filetimecheck.php", function(data, status) {
        console.log("Data: " + data + "\nStatus: " + status);
        if (image_time < parseInt(data)) {
            image_time = parseInt(data);
            document.getElementById('badge').src = "image.jpg?random=" + new Date().getTime();
        }
    });
    setTimeout(imageloadFunction, timerdelay);
}

imageloadFunction();
</script>
You will be making an extra call to the server to check the file time, which you can't avoid, but you can adjust the delay to fine-tune the polling interval.

Yes, you can customize this behavior. Even with virtually no change to your client code.
So, you will need a ServiceWorker (caniuse 96.59%).
ServiceWorker can proxy your http requests. Also, ServiceWorker has already built-in storage for the cache. If you have not worked with ServiceWorker, then you need to study it in detail.
The idea is the following:
When requesting a picture (in fact, any file), check the cache.
If there is no such picture in the cache, send a request and fill the cache storage with the date of the request and the file.
If the cache contains the required file, then send only the date and path of the file to the special API to the server.
The API returns either the file and modification date at once (if the file was updated), or the response that the file has not changed {"changed": false}.
Then, based on the response, the worker either writes a new file to the cache and resolves the request with the new file, or resolves the request with the old file from the cache.
Here is some example code (not working as-is, but enough to convey the idea):
s-worker.js
self.addEventListener('fetch', (event) => {
    if (event.request.method !== 'GET') return;
    event.respondWith(
        (async function () {
            const cache = await caches.open('dynamic-v1');
            const cachedResponse = await cache.match(event.request);
            if (cachedResponse) {
                // check if the file on the server has changed
                const isChanged = await fetch('...');
                if (isChanged) {
                    // serve the new file, and write it to the cache in the background
                } else {
                    // return the cached data
                }
                return cachedResponse;
            } else {
                // request the data, respond from the worker and write to the cache in the background
            }
        })()
    );
});
In any case, look for "ways to cache statics using ServiceWorker" and change the examples for yourself.
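As a rough sketch of how the revalidation step could look (my own illustration, not part of the original answer; the cache name, the header handling and the reliance on a 304 response are assumptions), the worker might issue a conditional request itself:
self.addEventListener('fetch', (event) => {
    if (event.request.method !== 'GET') return;
    event.respondWith((async () => {
        const cache = await caches.open('dynamic-v1');           // cache name is illustrative
        const cached = await cache.match(event.request);
        const headers = {};
        if (cached && cached.headers.get('Last-Modified')) {
            headers['If-Modified-Since'] = cached.headers.get('Last-Modified');
        }
        const response = await fetch(event.request.url, { headers });
        if (response.status === 304 && cached) {
            return cached;                                        // unchanged: serve the cached copy
        }
        await cache.put(event.request, response.clone());         // new or changed: refresh the cache
        return response;
    })());
});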

WARNING: this solution is like taking a hammer to crush a fly.
You can use socket.io to push information to the browser.
In this case you need to monitor image file changes on the server side, and then emit an event whenever a change occurs.
On the client (browser) side, listen for that event and refresh the image each time you receive it.
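A minimal client-side sketch of that idea (the event name 'image-changed', the element id and the image path are placeholders I made up; the server side would watch the file and emit the event):
const socket = io();                              // socket.io client, served by the same server
const img = document.getElementById('img');       // placeholder element id

socket.on('image-changed', function () {
    // Cache-busting query string forces the browser to fetch the new file.
    img.src = 'image.jpg?t=' + Date.now();        // placeholder image path
});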

Set your image source in a data-src attribute, and use JavaScript to periodically copy it to the src attribute of the image with an anchor (#) appended; the anchor part of the URL isn't sent to the server.
Your web server (Apache / nginx) should respond with an HTTP 304 if the image wasn't changed, or a 200 OK with the new image in the body if it was.
setInterval(function() {
    l = document.getElementById('logo');
    l.src = l.dataset.src + '#' + new Date().getTime();
}, 1000);
<img id="logo" alt="awesome-logo" data-src="https://upload.wikimedia.org/wikipedia/commons/1/11/Test-Logo.svg" />
EDIT
Chrome ignores HTTP cache-control headers for subsequent image reloads, but the Fetch API works as expected:
fetch('https://upload.wikimedia.org/wikipedia/commons/1/11/Test-Logo.svg', { cache: "no-cache" }).then(console.log);
The no-cache option instructs the browser to always revalidate with the server; if the server responds with 304, the locally cached version is used.
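Building on that, here is a rough sketch (my own, not from the answer) that revalidates with fetch and points the <img> at whatever bytes come back via an object URL, so no cache-busting query string is needed; the URL, element id and interval are placeholders:
async function refreshLogo() {
    // cache: 'no-cache' forces a revalidation; on a 304 the browser transparently
    // reuses the cached body, so an unchanged image is not re-downloaded.
    const res = await fetch('https://upload.wikimedia.org/wikipedia/commons/1/11/Test-Logo.svg',
                            { cache: 'no-cache' });
    const blob = await res.blob();
    const img = document.getElementById('logo');
    const oldUrl = img.src;
    img.src = URL.createObjectURL(blob);
    if (oldUrl.startsWith('blob:')) URL.revokeObjectURL(oldUrl); // avoid leaking object URLs
}

setInterval(refreshLogo, 5000);                   // 5-second interval is a placeholder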

Related

How to ping ip address from java script [duplicate]

I'm making a web app that requires that I check to see if remote servers are online or not. When I run it from the command line, my page load goes up to a full 60s (for 8 entries, it will scale linearly with more).
I decided to go the route of pinging on the user's end. This way, I can load the page and just have them wait for the "server is online" data while browsing my content.
If anyone has the answer to the above question, or if they know a solution to keep my page loads fast, I'd definitely appreciate it.
I have found someone that accomplishes this with a very clever usage of the native Image object.
From their source, this is the main function (it has dependencies on other parts of the source, but you get the idea).
function Pinger_ping(ip, callback) {
    if (!this.inUse) {
        this.inUse = true;
        this.callback = callback;
        this.ip = ip;
        var _that = this;
        this.img = new Image();
        this.img.onload = function() { _that.good(); };
        this.img.onerror = function() { _that.good(); };
        this.start = new Date().getTime();
        this.img.src = "http://" + ip;
        this.timer = setTimeout(function() { _that.bad(); }, 1500);
    }
}
This works on all types of servers that I've tested (web servers, ftp servers, and game servers). It also works with ports. If anyone encounters a use case that fails, please post in the comments and I will update my answer.
Update: Previous link has been removed. If anyone finds or implements the above, please comment and I'll add it into the answer.
Update 2: @trante was nice enough to provide a jsFiddle.
http://jsfiddle.net/GSSCD/203/
Update 3: @Jonathon created a GitHub repo with the implementation.
https://github.com/jdfreder/pingjs
Update 4: It looks as if this implementation is no longer reliable. People are also reporting that Chrome no longer supports it at all, throwing a net::ERR_NAME_NOT_RESOLVED error. If someone can verify an alternate solution I will put that as the accepted answer.
Ping is ICMP, but if there is any open TCP port on the remote server it could be achieved like this:
function ping(host, port, pong) {
    var started = new Date().getTime();
    var http = new XMLHttpRequest();
    http.open("GET", "http://" + host + ":" + port, /*async*/true);
    http.onreadystatechange = function() {
        if (http.readyState == 4) {
            var ended = new Date().getTime();
            var milliseconds = ended - started;
            if (pong != null) {
                pong(milliseconds);
            }
        }
    };
    try {
        http.send(null);
    } catch (exception) {
        // this is expected
    }
}
You can try this: put a ping.html file on the server (with or without any content), then in your JavaScript do something like the following:
<script>
function ping() {
    $.ajax({
        url: 'ping.html',
        success: function(result) {
            alert('reply');
        },
        error: function(result) {
            alert('timeout/error');
        }
    });
}
</script>
You can't directly "ping" in javascript.
There may be a few other ways:
Ajax
Using a java applet with isReachable
Writing a server-side script which pings and using AJAX to communicate with it
You might also be able to ping in flash (actionscript)
You can't do a regular ping in browser JavaScript, but you can find out whether a remote server is alive by, for example, loading an image from it. If loading fails, the server is (probably) down.
You can even calculate the loading time by using the onload event.
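A rough sketch of that image-timing idea (my own illustration; the host, the favicon path and the timeout value are assumptions):
function pingByImage(host, onResult) {
    const img = new Image();
    const start = Date.now();
    const timer = setTimeout(() => onResult(false, null), 2000);   // 2-second timeout is arbitrary

    img.onload = img.onerror = function () {
        // Even an error response proves that something answered on the host.
        clearTimeout(timer);
        onResult(true, Date.now() - start);
    };
    img.src = 'http://' + host + '/favicon.ico?' + start;          // query string busts the cache
}

pingByImage('example.com', (up, ms) => console.log(up ? ms + ' ms' : 'down'));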
Pitching in with a websocket solution...
function ping(ip, isUp, isDown) {
    var ws = new WebSocket("ws://" + ip);
    ws.onerror = function(e) {
        isUp();
        ws = null;
    };
    setTimeout(function() {
        if (ws != null) {
            ws.close();
            ws = null;
            isDown();
        }
    }, 2000);
}
Update: this solution does not work anymore on major browsers, since the onerror callback is executed even if the host is a non-existent IP address.
To keep your requests fast, cache the server-side results of the ping and update the ping file or database every couple of minutes (or however accurate you want it to be). You can use cron to run a shell command with your 8 pings and write the output into a file; the web server will then include this file in your view.
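For example, if the cron job writes its results to a small JSON status file in the web root (the file name and its structure are assumptions of mine), the page could load it asynchronously:
fetch('/ping-status.json', { cache: 'no-cache' })
    .then((res) => res.json())
    .then((servers) => {
        // e.g. [{ "host": "db1.example.com", "up": true }, ...]
        servers.forEach((s) => {
            console.log(s.host + ' is ' + (s.up ? 'online' : 'offline'));
        });
    })
    .catch((err) => console.error('Could not load status file:', err));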
The problem with standard pings is they're ICMP, which a lot of places don't let through for security and traffic reasons. That might explain the failure.
Ruby prior to 1.9 had a TCP-based ping.rb, which will run with Ruby 1.9+. All you have to do is copy it from the 1.8.7 installation to somewhere else. I just confirmed that it would run by pinging my home router.
There are many crazy answers here, especially about CORS.
You could do an HTTP HEAD request (like GET but without a payload).
See https://ochronus.com/http-head-request-good-uses/
It does NOT need a preflight check; the confusion comes from an old version of the specification, see
Why does a cross-origin HEAD request need a preflight check?
So you could use the answer above which uses the jQuery library (it didn't say so), but with
type: 'HEAD'
--->
<script>
function ping() {
    $.ajax({
        url: 'ping.html',
        type: 'HEAD',
        success: function(result) {
            alert('reply');
        },
        error: function(result) {
            alert('timeout/error');
        }
    });
}
</script>
Of course you can also use vanilla JS or Dojo or whatever...
If what you are trying to see is whether the server "exists", you can use the following:
function isValidURL(url) {
    var encodedURL = encodeURIComponent(url);
    var isValid = false;
    $.ajax({
        url: "http://query.yahooapis.com/v1/public/yql?q=select%20*%20from%20html%20where%20url%3D%22" + encodedURL + "%22&format=json",
        type: "get",
        async: false,
        dataType: "json",
        success: function(data) {
            isValid = data.query.results != null;
        },
        error: function() {
            isValid = false;
        }
    });
    return isValid;
}
This will return a true/false indication whether the server exists.
If you want response time, a slight modification will do:
function ping(url) {
    var encodedURL = encodeURIComponent(url);
    var startDate = new Date();
    var endDate = null;
    $.ajax({
        url: "http://query.yahooapis.com/v1/public/yql?q=select%20*%20from%20html%20where%20url%3D%22" + encodedURL + "%22&format=json",
        type: "get",
        async: false,
        dataType: "json",
        success: function(data) {
            if (data.query.results != null) {
                endDate = new Date();
            } else {
                endDate = null;
            }
        },
        error: function() {
            endDate = null;
        }
    });
    if (endDate == null) {
        throw "Not responsive...";
    }
    return endDate.getTime() - startDate.getTime();
}
The usage is then trivial:
var isValid = isValidURL("http://example.com");
alert(isValid ? "Valid URL!!!" : "Damn...");
Or:
var responseInMillis = ping("example.com");
alert(responseInMillis);
const ping = (url, timeout = 6000) => {
    return new Promise((resolve, reject) => {
        const urlRule = new RegExp('(https?|ftp|file)://[-A-Za-z0-9+&##/%?=~_|!:,.;]+[-A-Za-z0-9+&##/%=~_|]');
        if (!urlRule.test(url)) reject('invalid url');
        try {
            fetch(url)
                .then(() => resolve(true))
                .catch(() => resolve(false));
            setTimeout(() => {
                resolve(false);
            }, timeout);
        } catch (e) {
            reject(e);
        }
    });
};
use like this:
ping('https://stackoverflow.com/')
.then(res=>console.log(res))
.catch(e=>console.log(e))
I don't know what version of Ruby you're running, but have you tried implementing ping for ruby instead of javascript? http://raa.ruby-lang.org/project/net-ping/
const https = require('https'); // Node.js https module

let webSite = 'https://google.com/'
https.get(webSite, function (res) {
    // If you get here, you have a response.
    // If you want, you can check the status code here to verify that it's `200` or some other `2xx`.
    console.log(webSite + ' ' + res.statusCode)
}).on('error', function (e) {
    // Here, an error occurred. Check `e` for the error.
    console.log(e.code)
});
If you run this with Node, it will log 200 to the console as long as Google is not down.
You can run the DOS ping.exe command from JavaScript using the following (this relies on ActiveX, so it only works in Internet Explorer):
function ping(ip) {
    var input = "";
    var WshShell = new ActiveXObject("WScript.Shell");
    var oExec = WshShell.Exec("c:/windows/system32/ping.exe " + ip);
    while (!oExec.StdOut.AtEndOfStream) {
        input += oExec.StdOut.ReadLine() + "<br />";
    }
    return input;
}
Is this what was asked for, or am I missing something?
just replace
file_get_contents
with
$ip = $_SERVER['xxx.xxx.xxx.xxx'];
exec("ping -n 4 $ip 2>&1", $output, $retval);
if ($retval != 0) {
    echo "no!";
} else {
    echo "yes!";
}
It might be a lot easier than all that. If you want your page to load then check on the availability or content of some foreign page to trigger other web page activity, you could do it using only javascript and php like this.
yourpage.php
<?php
if (isset($_GET['urlget'])) {
    if ($_GET['urlget'] != '') {
        $foreignpage = file_get_contents('http://www.foreignpage.html');
        // you could also use curl for more fancy internet queries or if http wrappers aren't active in your php.ini
        // parse $foreignpage for data that indicates your page should proceed
        echo $foreignpage; // or a portion of it as you parsed
        exit(); // this is very important, otherwise you'll get the contents of your own page returned back to you on each call
    }
}
?>
<html>
mypage html content
...
<script>
var stopmelater = setInterval("getforeignurl('?urlget=doesntmatter')", 2000);

function getforeignurl(url) {
    var handle = browserspec();
    handle.open('GET', url, false);
    handle.send();
    var returnedPageContents = handle.responseText;
    // parse the page contents for what you're looking for and trigger javascript events accordingly.
    // use handle.open('GET', url, true) to allow javascript to continue executing; you must then provide a
    // callback function to accept the page contents via handle.onreadystatechange
}

function browserspec() {
    if (window.XMLHttpRequest) {
        return new XMLHttpRequest();
    } else {
        return new ActiveXObject("Microsoft.XMLHTTP");
    }
}
</script>
That should do it.
The triggered javascript should include clearInterval(stopmelater)
Let me know if that works for you
Jerry
You could try using PHP in your web page...something like this:
<html><body>
<form method="post" name="pingform" action="<?php echo $_SERVER['PHP_SELF']; ?>">
<h1>Host to ping:</h1>
<input type="text" name="tgt_host" value='<?php echo $_POST['tgt_host']; ?>'><br>
<input type="submit" name="submit" value="Submit" >
</form></body>
</html>
<?php
$tgt_host = $_POST['tgt_host'];
$output = shell_exec('ping -c 10 ' . $tgt_host);
echo "<html><body style=\"background-color:#0080c0\">
<script type=\"text/javascript\" language=\"javascript\">alert(\"Ping Results: " . $output . ".\");</script>
</body></html>";
?>
This is not tested so it may have typos etc...but I am confident it would work. Could be improved too...

multipart HTTP request with microsoft graph javascript sdk

I'm trying to use the Microsoft Graph JavaScript SDK to create a page in OneNote with images, which OneNote requires a multipart request for. I've created a FormData object with all the data I'm trying to send.
The request goes through when I send it up myself as follows:
var xhr = new XMLHttpRequest();
xhr.open("POST", url, true);
xhr.setRequestHeader("Authorization", "Bearer " + token);
xhr.onreadystatechange = function() {
    // Call a function when the state changes
    if (xhr.readyState == XMLHttpRequest.DONE && xhr.status == 200) {
        // Request finished. Do processing here.
    } else {
        // handle case
    }
};
// dataToSend = FormData object containing data
// (as Blobs), including the page HTML in a
// "Presentation" part as specified
xhr.send(dataToSend);
However, since I'm using the Graph SDK to make all my other requests, I'm wondering if there's a way to do the multipart request with the SDK as well. So far, this is what I've tried:
this.client
.api(pagesURL)
.version("beta")
.header("Content-Type", "text/html")
.post(dataToSend);
Investigating the request in Fiddler shows that the request body contains [object, Object], not the data formatted as a multipart request. Any help on how to get the FormData object into the request properly using the SDK/ guidance on whether this is possible would be greatly appreciated!
I believe this is what you're looking for:
this.client
.api("https://graph.microsoft.com/beta/me/notes/sections/{Section ID}/pages")
.header("Content-Type", "application/xhtml+xml")
.header("boundary", "MyPartBoundary")
.post(dataToSend);
This snippet was adapted from the multi-part unit test used by the SDK itself. You can find that test at https://github.com/microsoftgraph/msgraph-sdk-javascript/blob/dev/spec/types/OneNote.ts
Update the microsoft-graph-client to latest version and try something like this.
const HTMLPageContent =
`<!DOCTYPE html>
<html>
<head>
<title>A page with rendered images</title>
</head>
<body>
<p>Here is an image uploaded as <b>binary data</b>:</p>
<img src="name:imageBlock1" alt="an image on the page" />
</body>
</html>`;
let sectionId = "<Your_OneNote_Page_Section_Id>";
let formData = new FormData();
let htmlBlob = new Blob([HTMLPageContent], {
type: "text/html"
});
formData.append("Presentation", htmlBlob);
formData.append("imageBlock1", file);
client
.api(`/me/onenote/sections/${sectionId}/pages`)
.post(formData)
.then((json) => {
console.log(json);
return Promise.resolve();
});

AjaxChat: Image Upload code hangs, freezes browser, crashes server

This is a tangent from the question here:
Returning value to Javascript from PHP called from XMLHttpRequest
I am adding an "image upload" button to my AjaxChat. I am using an XMLHttpRequest to send the image to the server, where I run a PHP script to move it to my images folder. Below is the Javascript function in charge of opening the XMLHttpRequest connection and sending the file:
function uploadImage() {
    var form = document.getElementById('fileSelectForm');
    var photo = document.getElementById('photo');
    var uploadButton = document.getElementById('imageUploadButton');
    form.onsubmit = function(event) {
        event.preventDefault();
        // Update button text
        uploadButton.innerHTML = 'Uploading...';
        // Get selected files from input
        var files = photo.files;
        // Create a new FormData object
        var formData = new FormData();
        // Loop through selected files
        for (var i = 0; files.length > i; i++) {
            var file = files[i];
            // Check file type; only images are allowed
            if (!file.type.match('image/*')) {
                continue;
            }
            // Add file to request
            formData.append('photo', file, file.name);
        }
        // Set up request
        var xhr = new XMLHttpRequest();
        // Open connection
        xhr.open('POST', 'sites/all/modules/ajaxchat/upload.php', true);
        // Set up handler for when request finishes
        xhr.onload = function () {
            if (xhr.status === 200) {
                // File(s) uploaded
                uploadButton.innerHTML = 'Upload';
                var result = xhr.responseText;
                ajaxChat.insertText('\n\[img\]http:\/\/www.mysite.com\/images' + result + '\[\/img\]');
                ajaxChat.sendMessage();
            } else {
                alert('An error occurred!');
            }
            form.reset();
        };
        // Send data
        xhr.send(formData);
    }
}
Here is upload.php:
<?php
$valid_file = true;
if ($_FILES['photo']['name']) {
    // if no errors...
    if (!$_FILES['photo']['error']) {
        // now is the time to modify the future file name and validate the file
        $new_file_name = strtolower($_FILES['photo']['tmp_name']); // rename file
        if ($_FILES['photo']['size'] > (1024000)) { // can't be larger than 1 MB
            $valid_file = false;
        }
        // if the file has passed the test
        if ($valid_file) {
            // move it to where we want it to be
            move_uploaded_file($_FILES['photo']['tmp_name'], '/var/www/html/images'.$new_file_name);
            $message = $new_file_name;
            exit("$message");
        }
    }
}
?>
I currently have the multiple image upload disabled, so the "Loop through selected files" only executes once.
The upload worked for a little bit on my PC, but then I tried uploading an image from my phone. When I did so, the entire server (and my browser) crashed, presumably due to an infinite loop somewhere. Every time I close my browser and log back in, or restart the server, or restart my computer, it hangs and eventually crashes again (on my PC or on my phone). I have been unable to find the script that is causing the issue. I get the feeling it's right under my nose. Does anyone see the problem? If you need the HTML form code then I can provide that, but I don't think it's necessary.

Load External Script and Style Files in a SPA

I have a type of SPA which consumes an API in order to fetch data. There are several instances of this SPA, and all of them use common style and script files. So my problem is that when I change a single line in those files, I have to open each and every instance and update the files. It's really time consuming for me.
One approach is to put those files in a folder on the server and then change the version based on the time, but I will lose the browser cache if I use this solution:
<link href="myserver.co/static/main.css?ver=1892471298" rel="stylesheet" />
<script src="myserver.co/static/script.js?ver=1892471298"></script>
The ver value is produced based on the time, so I cannot use the browser cache. I need a solution to update these files from the API so that all of the SPAs get updated.
In your head tag, you can add the code below:
<script type="text/javascript">
var xmlhttp = new XMLHttpRequest();
var url = "http://localhost:4000/getLatestVersion"; // api path to get the latest version

xmlhttp.onreadystatechange = function() {
    if (xmlhttp.readyState == 4 && xmlhttp.status == 200) {
        var tags = JSON.parse(xmlhttp.responseText);
        for (var i = 0; i < tags.length; i++) {
            var tag = document.createElement(tags[i].tag);
            if (tags[i].tag === 'link') {
                tag.rel = tags[i].rel;
                tag.href = tags[i].url;
            } else {
                tag.src = tags[i].url;
            }
            document.head.appendChild(tag);
        }
    }
};

xmlhttp.open("POST", url, false);
xmlhttp.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
xmlhttp.send();
</script>
Your api path should allow "CORS" from your website that handles the code above.
And your api should return a json data like below:
var latestVersion = '1892471298'; // this can be stored in the database
var jsonData = [
    {
        tag: 'link',
        rel: 'stylesheet',
        url: 'http://myserver.co/static/main.css?ver=' + latestVersion
    },
    {
        tag: 'script',
        rel: '',
        url: 'http://myserver.co/static/script.js?ver=' + latestVersion
    }
];
// return jsonData to the client here
If you change anything in your JS or CSS, then you do have to bust the browser cache; all you need to do is update the version on that particular file, not all of them, and the change will show up in the browser.
How about adding a method to your API that returns the files' last modified time, and then inserting that value into the "src"/"href" attributes after the "ver="?
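A rough sketch of that idea (the endpoint path and the JSON shape are assumptions of mine, not an existing API):
fetch('/api/assets/last-modified')                 // hypothetical endpoint
    .then((res) => res.json())
    .then(({ version }) => {                       // e.g. { "version": "1892471298" }
        const link = document.createElement('link');
        link.rel = 'stylesheet';
        link.href = 'https://myserver.co/static/main.css?ver=' + version;
        document.head.appendChild(link);

        const script = document.createElement('script');
        script.src = 'https://myserver.co/static/script.js?ver=' + version;
        document.head.appendChild(script);
    });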

How to check if page exists using JavaScript

I have a link: Hello.
When someone clicks the link I'd like to check via JavaScript if the page the href-attribute points to exists or not. If the page exists the browser redirects to that page ("www.example.com" in this example) but if the page doesn't exist the browser should redirect to another URL.
It depends on whether the page exists on the same domain or not. If you're trying to determine if a page on an external domain exists, it won't work – browser security prevents cross-domain calls (the same-origin policy).
If it is on the same domain however, you can use jQuery like Buh Buh suggested. Although I'd recommend doing a HEAD-request instead of the GET-request the default $.ajax() method does – the $.ajax() method will download the entire page. Doing a HEAD request will only return the headers and indicate whether the page exists (response codes 200 - 299) or not (response codes 400 - 499). Example:
$.ajax({
    type: 'HEAD',
    url: 'http://yoursite.com/page.html',
    success: function() {
        // page exists
    },
    error: function() {
        // page does not exist
    }
});
See also: http://api.jquery.com/jQuery.ajax/
A pretty good work around is to proxy. If you don't have access to a server side you can use YQL. Visit: http://developer.yahoo.com/yql/console/
From there you can do something like: select * from htmlstring where url="http://google.com". You can use the "REST query" they have on that page as a starting point for your code.
Here's some code that would accept a full URL and use YQL to detect if that page exists:
function isURLReal(fullyQualifiedURL) {
    var URL = encodeURIComponent(fullyQualifiedURL),
        dfd = $.Deferred(),
        checkURLPromise = $.getJSON('http://query.yahooapis.com/v1/public/yql?q=select%20*%20from%20htmlstring%20where%20url%3D%22' + URL + '%22&format=json');

    checkURLPromise
        .done(function(response) {
            // results should be null if the page 404s or the domain doesn't work
            if (response.query.results) {
                dfd.resolve(true);
            } else {
                dfd.reject(false);
            }
        })
        .fail(function() {
            dfd.reject('failed');
        });

    return dfd.promise();
}

// usage
isURLReal('http://google.com')
    .done(function(result) {
        // yes, or request succeeded
    })
    .fail(function(result) {
        // no, or request failed
    });
Update August 2nd, 2017
It looks like Yahoo deprecated "select * from html", although "select * from htmlstring" does work.
Based on the documentation for XMLHttpRequest:
function returnStatus(req, status) {
    //console.log(req);
    if (status == 200) {
        console.log("The url is available");
        // send an event
    } else {
        console.log("The url returned status code " + status);
        // send a different event
    }
}

function fetchStatus(address) {
    var client = new XMLHttpRequest();
    client.onreadystatechange = function() {
        // in case of network errors this might not give reliable results
        if (this.readyState == 4)
            returnStatus(this, this.status);
    };
    client.open("HEAD", address);
    client.send();
}

fetchStatus("/");
This will however only work for URLs within the same domain as the current URL. Do you want to be able to ping external services? If so, you could create a simple script on the server which does your job for you, and use javascript to call it.
If it is in the same domain, you can make a head request with the xmlhttprequest object [ajax] and check the status code.
If it is in another domain, make an xmlhttprequest to the server and have it make the call to see if it is up.
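A minimal sketch of the same-domain HEAD check (the URL and callback are placeholders of mine):
function headCheck(url, onResult) {
    const xhr = new XMLHttpRequest();
    xhr.open('HEAD', url, true);
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4) {
            // Any 2xx status means the page is there.
            onResult(xhr.status >= 200 && xhr.status < 300, xhr.status);
        }
    };
    xhr.send();
}

headCheck('/ping.html', (up, status) => console.log(up ? 'up' : 'down', status));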
Why not just create a custom 404 handler on the web server? This is probably the more "good-bear" way to do this.
$.ajax({
    url: "http://something/whatever.docx",
    method: "HEAD",
    statusCode: {
        404: function () {
            alert('not found');
        },
        200: function () {
            alert("found, file exists");
        }
    }
});
If you are happy to use jQuery you could do something like this.
When the page loads make an ajax call for each link. Then just replace the href of all the links which fail.
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.4/jquery.min.js"></script>
<script type="text/javascript">
<!--
$.fn.checkPageExists = function(defaultUrl) {
    $.each(this, function() {
        var $link = $(this);
        $.ajax({
            url: $link.attr("href"),
            error: function() {
                $link.attr("href", defaultUrl);
            }
        });
    });
};

$(document).ready(function() {
    $("a").checkPageExists("default.html");
});
//-->
</script>
You won't be able to use an ajax call to ping the website because of same-origin policy.
The best way to do it is to use an image and if you know the website you are calling has a favicon or some sort of icon to grab, you can just use an html image tag and use the onerror event.
Example:
function pingImgOnWebsite(url) {
    var img = document.createElement('img');
    img.style.visibility = 'hidden';
    img.style.position = 'fixed';
    img.src = url;
    img.onerror = continueBtn; // What to do on error function
    document.body.appendChild(img);
}
Another way to do this is with PHP.
You could add
<?php
if (file_exists('/index.php')) {
    $url = '/index.php';
} else {
    $url = '/notindex.php';
}
?>
And then
<a href="<?php echo $url; ?>">Link</a>
