JavaScript - Service Workers not working correctly

I am using service workers to create an offline page for my website.
At the moment I am saving offline.html into the cache so that the browser can show this file when there is no internet connection.
In the fetch event of my service worker I attempt to load index.html, and if this fails (no internet connection) I load offline.html from cache.
However, whenever I check offline mode in developer tools and refresh the page, index.html still shows...
The request isn't failing, and it looks like index.html is being cached even though I didn't specify it to be.
Here is my HTML for index.html:
<!DOCTYPE html>
<html>
<head>
    <title>Service Workers - Test</title>
</head>
<body>
    <h1> Online page! </h1>
    <h3> You are connected to the internet. </h3>
</body>
<script>
    if ('serviceWorker' in navigator)
    {
        navigator.serviceWorker.register('service-worker.js');
    }
</script>
</html>
Here is my HTML for offline.html:
<!DOCTYPE html>
<html>
<head>
    <title>You are Offline - Service Workers - Test</title>
</head>
<body>
    <h1> Welcome to the Offline Page!</h1>
    <h2> You are not connected to the internet but you can still do certain things offline. </h2>
</body>
</html>
Here is my JavaScript for service-worker.js:
const PRECACHE = "version1";
const CACHED = ["offline.html"];

// Caches "offline.html" in case there is no internet
self.addEventListener('install', event => {
    console.log("[Service Worker] Installed");
    caches.delete(PRECACHE)
    event.waitUntil(
        caches.open(PRECACHE)
            .then(cache => cache.addAll(CACHED))
            .then(_ => self.skipWaiting())
    );
});
// Clears any caches that do not match this version
self.addEventListener("activate", event => {
    event.waitUntil(
        caches.keys()
            .then(keys => {
                return Promise.all(
                    keys.filter(key => {
                        return !key.startsWith(PRECACHE);
                    })
                    .map(key => {
                        return caches.delete(key);
                    })
                );
            })
            .then(() => {
                console.log('[Service Worker] Cleared Old Cache');
            })
    );
});
this.addEventListener('fetch', function(event) {
    if (event.request.method !== 'GET') return;
    console.log("[Service Worker] Handling Request ");

    // If the request to `index.html` works it shows it, but if it fails it shows the cached version of `offline.html`
    // This isn't working because `fetch` doesn't fail when there is no internet for some reason...
    event.respondWith(
        fetch(event.request)
            .then(response => {
                console.log("[Service Worker] Served from NETWORK");
                return response;
            }, () => {
                console.log("[Service Worker] Served from CACHE");
                return catches.match(event.request.url + OFFLINE_URL);
            })
    );
});
I am running a server using Python's SimpleHTTPServer, like so:
python -m SimpleHTTPServer
Does anyone know why the offline page isn't working and how I can fix this?
Thanks for the help,
David
EDIT:
These images show that index.html (localhost) still loads without internet, which means it must be cached. (Screenshots omitted.)
EDIT 2:
I've tried adding no-cache to the fetch of index.html, and it still fetches index.html when I have offline checked.
fetch(event.request, {cache: "no-cache"}) ...

I think we have all forgotten how the network request works from a browser's point of view.
The issue here is that index.html is served from the disk cache when the service worker intercepts requests.
browser ===> Service Worker ===> fetch event
Inside the fetch event, we have:
Check if there is network connectivity
If there is, fetch from network and respond
Else, fetch from cache and respond
Now, how does "if there is network connectivity, fetch from network" actually work?
Service Worker OnFetch ===> Check in Disk Cache ===> Nothing? Fetch Online
The page being fetched here is index.html, and the cache-control headers for index.html do not specify no-cache.
Hence the whole issue of the offline page not showing up.
Solution
Set a cache-control header with limiting values for index.html, on the server side (see the sketch below)
Or, add headers to the fetch request to this effect:
pragma: no-cache
cache-control: no-cache
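For the server-side option, here is a hypothetical sketch. The question serves files with SimpleHTTPServer, but an Express equivalent illustrates the idea; the routes and file paths are assumptions, not the asker's setup:

// Hypothetical Express server that sends Cache-Control: no-cache for the HTML shell
const express = require('express');
const app = express();

app.get(['/', '/index.html'], (req, res) => {
    res.set('Cache-Control', 'no-cache');
    res.sendFile('index.html', { root: __dirname });
});

app.use(express.static(__dirname));
app.listen(8000);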
How do I add these headers to fetch?
Apparently, fetch and the browser have their own reservations about the request body when it comes to a GET.
Also, weirdness and utter chaos happen if you reuse the event.request object for a fetch request and add custom headers: you get a list of uncaught exceptions caused by the fetch event's request.mode attribute, which bars you from adding custom headers to a fetch under a no-cors or navigate mode.
Our goal is to identify that the browser is truly offline and then serve a page that says so.
Here's how:
Check if you can fetch a dummy HTML page, say test-connectivity.html, under your origin with a custom cache: no-cache header. If you can, proceed; otherwise serve the offline page.
self.addEventListener('fetch', (event) => {
    let headers = new Headers();
    headers.append('cache-control', 'no-cache');
    headers.append('pragma', 'no-cache');

    var req = new Request('test-connectivity.html', {
        method: 'GET',
        mode: 'same-origin',
        headers: headers,
        redirect: 'manual' // don't follow redirects automatically
    });

    event.respondWith(
        fetch(req, { cache: 'no-store' })
            .then(function (response) {
                return fetch(event.request);
            })
            .catch(function (err) {
                return new Response('<div><h2>Uh oh that did not work</h2></div>', {
                    headers: { 'Content-type': 'text/html' }
                });
            })
    );
});
The {cache: 'no-store'} object as the second parameter to fetch is an unfortunate no-op: it just doesn't work. Keep it anyway for the sake of a future scenario; it is really optional as of today.
If that worked, you would not need to build a whole new Request object for fetch.
cheers!
The code piece that creates a new request is generously borrowed from #pirxpilot's answer here.
The offline worker for this specific question is on Pastebin:
https://pastebin.com/sNCutAw7

David, you have two errors in one line.
Your line
return catches.match(event.request.url + OFFLINE_URL);
should be
return caches.match('offline.html');
It's caches, not catches; you haven't defined OFFLINE_URL; and you don't need the event request URL.
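For clarity, here is the whole fetch handler with that line corrected (a sketch; the disk-cache caveat from the first answer still applies):

self.addEventListener('fetch', function(event) {
    if (event.request.method !== 'GET') return;
    event.respondWith(
        fetch(event.request)
            .then(response => {
                return response;
            }, () => {
                // network failed: serve the precached offline page
                return caches.match('offline.html');
            })
    );
});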

I tried your code and got the same result as you in the dev tools network tab. The network tab says it loaded index.html from the service worker, but actually the service worker returns the cached offline page as expected!

Related

Serviceworker event.respondWith network error

I'm running my radio site and just created service-worker.js, and it works. But the first time I go to the site, HTML5 audio works; then, if I don't clear cookies and site data, it stops working and this error comes out in the console (I always need to clear site data if I want to hear music):
Failed to load 'http://myradio.com:8000/radio'. A ServiceWorker passed a promise to FetchEvent.respondWith() that rejected with 'TypeError: NetworkError when attempting to fetch resource.'.
Service-worker.js
var cacheName = 'Myradio';
var filesToCache = [
    '/',
    '/index.php',
    '/assets/css/all.min.css',
    '/assets/js/jquery.min.js',
    '/assets/js/jquery.ui.touch-punch.min.js',
    '/main.js'
];

/* Start the service worker and cache all of the app's content */
self.addEventListener('install', function(e) {
    e.waitUntil(
        caches.open(cacheName).then(function(cache) {
            return cache.addAll(filesToCache);
        })
    );
});

/* Serve cached content when offline */
self.addEventListener('fetch', function(e) {
    e.respondWith(
        caches.match(e.request).then(function(response) {
            return response || fetch(e.request);
        })
    );
});
HTML
<audio id="myAudio" preload="metadata">
    <source src="http://myradio.com:8000/radio" />
</audio>
Service workers are required to have a secure origin, hosted on "https:". I assume your site has that, since you can register your service worker.
The problem arises when you attempt to make the request to "http://myradio.com:8000/radio". This is an "http:" URL and not a secure origin; when you do this in a page you will get a "mixed content" warning in the browser UI.
Mixed content, however, is not permitted at all in a service worker. Making a fetch() call to an "http:" URL from a service worker script will return a NetworkError.
There are two solutions to your problem:
Host the radio stream on https.
Check for the http URL before calling respondWith() and early return from the service worker script.
Option (2) is probably less work here. Something like:
/* Serve cached content when offline */
self.addEventListener('fetch', function(e) {
    // we cannot fetch mixed-content from a service worker, so early return
    if (e.request.url.startsWith('http:')) {
        return;
    }
    e.respondWith(
        caches.match(e.request).then(function(response) {
            return response || fetch(e.request);
        })
    );
});

Service Worker: Append header to requests for CSS & JS files

I've been trying for what seems like hours and hours to use service workers to attach a simple header to all requests. What's frustrating is, it sort of works.
Attempt 1:
self.addEventListener("fetch", event => {
const modifiedHeaders = new Headers({
...event.request.headers,
'API-Key': '000000000000000000001'
});
const modifiedRequest = new Request(event.request, {
headers: modifiedHeaders,
});
event.respondWith((async () => {
return fetch(modifiedRequest);
})());
});
The above code works for HTML files; however, for CSS & JS files I get the following error:
ReferenceError: headers is not defined
If I disable the header requirement, the page loads with images and JavaScript and I can interact with it like normal.
Attempt 2:
var req = new Request(event.request.url, {
    headers: {
        ...event.request.headers,
        'API-Key': '000000000000000000001'
    },
    method: event.request.method,
    mode: event.request.mode,
    credentials: event.request.credentials,
    redirect: event.request.redirect,
    referrer: event.request.referrer,
    referrerPolicy: event.request.referrerPolicy,
    bodyUsed: event.request.bodyUsed,
    cache: event.request.cache,
    destination: event.request.destination,
    integrity: event.request.integrity,
    isHistoryNavigation: event.request.isHistoryNavigation,
    keepalive: event.request.keepalive
});
In this attempt I simply built a new request, which successfully included the new header on CSS & JS file requests. However, when I do a POST or a redirect, things stop working and behave strangely.
What is the correct approach for this? I feel that attempt 1 is the better path, but I can't seem to create the Headers object on the request no matter what I do.
The version of Chrome I am using is Version 78.0.3904.70 (Official Build) (64-bit).
The site is an internal developer tool, so cross-browser compatibility isn't required. I'm happy to load any additional libs, enable experimental features, etc.
The problem is that your modified requests reuse the mode of the original request in both of your attempts.
For embedded resources where the request is initiated from markup (unless the crossorigin attribute is present), the request is in most cases made using the no-cors mode, which only allows a very limited set of simple headers.
no-cors — Prevents the method from being anything other than HEAD, GET
or POST, and the headers from being anything other than simple
headers. If any ServiceWorkers intercept these requests, they may not
add or override any headers except for those that are simple headers...
Source and more info on request modes: https://developer.mozilla.org/en-US/docs/Web/API/Request/mode
Simple headers are the following: accept (only some values), accept-language, content-language (only some values), and content-type.
Source: https://fetch.spec.whatwg.org/#simple-header
Solution:
You need to make sure to set the mode to something other than no-cors when creating the modified request. You can pick either cors or same-origin, depending on whether you want to allow cross-origin requests. (The navigate mode is reserved for navigation only and it is not possible to create a request with that mode.)
Why your code worked for HTML files:
The request issued when navigating to a new page uses the navigate mode. Chrome does not allow creating requests with this mode using the new Request() constructor, but seems to automatically silently use the same-origin mode when an existing request with the navigate mode is passed to the constructor as a parameter. This means that your first (HTML load) modified request had same-origin mode, while the CSS and JS load requests had the no-cors mode.
Working example:
'use strict';

/* Auxiliary function to log info about requests to the console */
function logRequest(message, req) {
    const headersString = [...req.headers].reduce((outputString, val) => `${outputString}\n${val[0]}: ${val[1]}`, 'Headers:');
    console.log(`${message}\nRequest: ${req.url}\nMode: ${req.mode}\n${headersString}\n\n`);
}

self.addEventListener('fetch', (event) => {
    logRequest('Fetch event detected', event.request);
    const modifiedHeaders = new Headers(event.request.headers);
    modifiedHeaders.append('API-Key', '000000000000000000001');
    const modifiedRequestInit = { headers: modifiedHeaders, mode: 'same-origin' };
    const modifiedRequest = new Request(event.request, modifiedRequestInit);
    logRequest('Modified request', modifiedRequest);
    event.respondWith((async () => fetch(modifiedRequest))());
});
I would try this:
self.addEventListener("fetch", event => {
const modifiedRequest = new Request(event.request, {
headers: {
'API-Key': '000000000000000000001'
},
});
event.respondWith((async () => {
return fetch(modifiedRequest);
})());
});

Cannot construct a Request with a Request whose mode is 'navigate' and a non-empty RequestInit

Consider this sample index.html file.
<!DOCTYPE html>
<html><head><title>test page</title>
<script>navigator.serviceWorker.register('sw.js');</script>
</head>
<body>
<p>test page</p>
</body>
</html>
Using this Service Worker, designed to load from the cache, then fall back to the network if necessary.
cacheFirst = (request) => {
    var mycache;
    return caches.open('mycache')
        .then(cache => {
            mycache = cache;
            cache.match(request);
        })
        .then(match => match || fetch(request, {credentials: 'include'}))
        .then(response => {
            mycache.put(request, response.clone());
            return response;
        });
};

addEventListener('fetch', event => event.respondWith(cacheFirst(event.request)));
This fails badly on Chrome 62. Refreshing the HTML fails to load in the browser at all, with a "This site can't be reached" error; I have to shift-refresh to get out of this broken state. In the console, it says:
Uncaught (in promise) TypeError: Failed to execute 'fetch' on 'ServiceWorkerGlobalScope': Cannot construct a Request with a Request whose mode is 'navigate' and a non-empty RequestInit.
"construct a Request"?! I'm not constructing a request. I'm using the event's request, unmodified. What am I doing wrong here?
Based on further research, it turns out that I am constructing a Request when I call fetch(request, {credentials: 'include'})!
Whenever you pass an options object to fetch, that object is the RequestInit, and it creates a new Request object when you do so. And apparently you can't ask fetch() to create a new Request in navigate mode with a non-empty RequestInit.
In my case, the event's navigation Request already allowed credentials, so the fix is to convert fetch(request, {credentials: 'include'}) into fetch(request).
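Applied to the worker above, the fixed cacheFirst looks roughly like this (a sketch; note the original also dropped the result of cache.match by not returning it, which this version fixes too):

cacheFirst = (request) => {
    var mycache;
    return caches.open('mycache')
        .then(cache => {
            mycache = cache;
            // return the match so the next .then actually sees it
            return cache.match(request);
        })
        .then(match => match || fetch(request)) // no RequestInit: safe for navigations
        .then(response => {
            mycache.put(request, response.clone());
            return response;
        });
};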
I was fooled into thinking I needed {credentials: 'include'} due to this Google documentation article.
When you use fetch, by default, requests won't contain credentials such as cookies. If you want credentials, instead call:
fetch(url, {
    credentials: 'include'
})
That's only true if you pass fetch a URL, as they do in the code sample. If you have a Request object on hand, as we normally do in a Service Worker, the Request knows whether it wants to use credentials or not, so fetch(request) will use credentials normally.
https://developers.google.com/web/ilt/pwa/caching-files-with-service-worker
var networkDataReceived = false;

// fetch fresh data
var networkUpdate = fetch('/data.json').then(function(response) {
    return response.json();
}).then(function(data) {
    networkDataReceived = true;
    updatePage(data);
});

// fetch cached data
caches.match('/data.json').then(function(response) {
    if (!response) throw Error("No data");
    return response.json();
}).then(function(data) {
    // don't overwrite newer network data
    if (!networkDataReceived) {
        updatePage(data);
    }
}).catch(function() {
    // we didn't get cached data, the network is our last hope:
    return networkUpdate;
}).catch(showErrorMessage);
This is the best example of what you are trying to do, though you have to update your code accordingly. The example above is taken from the "Cache then network" section of that page.
For the service worker:
self.addEventListener('fetch', function(event) {
    event.respondWith(
        caches.open('mycache').then(function(cache) {
            return fetch(event.request).then(function(response) {
                cache.put(event.request, response.clone());
                return response;
            });
        })
    );
});
Problem
I came across this problem when trying to override fetch for all kinds of different assets. The navigate mode was set for the initial Request that gets the index.html (or other HTML) file, and I wanted the same caching rules applied to it as to several other static assets.
Here are the two things I wanted to be able to accomplish:
When fetching static assets, I want to sometimes be able to override the url, meaning I want something like: fetch(new Request(newUrl))
At the same time, I want them to be fetched just as the sender intended; meaning I want to set second argument of fetch (i.e. the RequestInit object mentioned in the error message) to the originalRequest itself, like so: fetch(new Request(newUrl), originalRequest)
However, the second part is not possible for requests in navigate mode (i.e. the initial HTML file); at the same time it is not needed since, as explained by others, such a request will already keep its cookies, credentials, etc.
Solution
Here is my work-around: a versatile fetch that...
can override the URL
can override the RequestInit config object
works with both navigate and any other requests
function fetchOverride(originalRequest, newUrl) {
    const fetchArgs = [new Request(newUrl)];
    if (originalRequest.mode !== 'navigate') {
        // customize the request only if NOT in navigate mode
        // (since in "navigate" that is not allowed)
        fetchArgs.push(originalRequest);
    }
    return fetch(...fetchArgs);
}
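A hypothetical usage example, rewriting requests under one path prefix while leaving navigations intact ('/v1/' and '/v2/' are made-up prefixes for illustration):

self.addEventListener('fetch', (event) => {
    // rewrite the URL, then fetch with the original request's settings where allowed
    const newUrl = event.request.url.replace('/v1/', '/v2/');
    event.respondWith(fetchOverride(event.request, newUrl));
});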
In my case I was constructing a request from a serialized form in a service worker (to handle failed POSTs). The original request had its mode attribute set, which is read-only, so before reconstructing the request, delete the mode attribute:
delete serializedRequest["mode"];
request = new Request(serializedRequest.url, serializedRequest);
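For context, the serialized form assumed here could come from a hypothetical helper like this (the field names are illustrative, not from the original answer):

function serializeRequest(request) {
    return {
        url: request.url,
        method: request.method,
        headers: Object.fromEntries(request.headers),
        mode: request.mode, // read-only on a live Request; delete before reconstructing
        credentials: request.credentials
    };
}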

Chrome install Service Worker addAll failed to fetch

I am using a service worker to provide caching for my site's assets (HTML, JS, CSS).
When I use Firefox, my sw.js is installed correctly and the required files are cached. If I go into offline mode I get the site styled correctly with everything present except the data (which is correct, as the data is not being cached).
However, when I use Chrome I get a TypeError: Failed to fetch error. I'm really unsure why, since it works in Firefox. In addition, the same error is thrown whenever the fetch event fires and the request is for an asset that is not in the cache (and the fetch function is being called).
If I pass an empty array to the cache.addAll function I don't get any errors until actually handling the fetch event.
It's maybe worth noting that all of the files I'm caching are coming from localhost and not any other origin, so I can't see this being a cross-domain issue.
(Console output screenshots, from installing the service worker and from refreshing the page afterwards, are omitted here.)
This is the code for my service worker:
const CACHE_NAME = 'webapp-v1';
const CACHE_FILES = [
    '/',
    '/public/app.css',
    '/public/img/_sprites.png',
    '/public/js/app.min.js',
    '/public/js/polyfills.min.js'
];

self.addEventListener('install', event => {
    console.log("[sw.js] Install event.");
    event.waitUntil(
        caches.open(CACHE_NAME)
            .then(cache => cache.addAll(CACHE_FILES))
            .then(self.skipWaiting())
            .catch(err => console.error("[sw.js] Error trying to pre-fetch cache files:", err))
    );
});

self.addEventListener('activate', event => {
    console.log("[sw.js] Activate event.");
    event.waitUntil(
        self.clients.claim()
    );
});

self.addEventListener('fetch', event => {
    if (!event.request.url.startsWith(self.location.origin)) return;

    console.log("[sw.js] Fetch event on", event.request.url);
    event.respondWith(
        caches.match(event.request).then(response => {
            console.info("[sw.js] Responded to ", event.request.url, "with", response ? "cache hit." : "fetch.");
            return response || fetch(event.request);
        }).catch(err => {
            console.error("[sw.js] Error with match or fetch:", err);
        })
    );
});
Any help would be great.
cache.addAll(CACHE_FILES) will fail when one of the files is not accessible (HTTP 400, 401, etc., and sometimes 5XX and 3XX). To avoid failing everything when one file fails, use an individual catch statement in a map loop, as in the sketch below and in https://github.com/GrosSacASac/server-in-the-browser/blob/master/client/js/service_worker.js#L168
The fact that it does not fail with an empty array probably means you have an inaccessible resource in CACHE_FILES.
Maybe Firefox is less restrictive and caches the body of the 400 response.
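A minimal sketch of that per-file approach, reusing the names from the question's code:

self.addEventListener('install', event => {
    event.waitUntil(
        caches.open(CACHE_NAME).then(cache =>
            Promise.all(
                CACHE_FILES.map(url =>
                    // catch per file so one bad URL doesn't reject the whole install
                    cache.add(url).catch(err => console.warn('[sw.js] Skipping', url, err))
                )
            )
        )
    );
});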
Inside your fetch handler you try to use caches.match directly, but I think that is not legal: you must open the cache first, and then from the opened cache you can do cache.match. See https://github.com/GrosSacASac/server-in-the-browser/blob/master/client/js/service_worker.js#L143

Service worker offline support with pushstate and client side routing

I'm using a service worker to introduce offline functionality for my single page web app. It's pretty straightforward: use the network when available, or try to fetch from the cache if not:
service-worker.js:
self.addEventListener("fetch", event => {
if(event.request.method !== "GET") {
return;
}
event.respondWith(
fetch(event.request)
.then(networkResponse => {
var responseClone = networkResponse.clone();
if (networkResponse.status == 200) {
caches.open("mycache").then(cache => cache.put(event.request, responseClone));
}
return networkResponse;
})
.catch(_ => {
return caches.match(event.request);
})
)
})
So it intercepts all GET requests and caches them for future use, including the initial page load.
Switching to "offline" in DevTools and refreshing at the root of the application works as expected.
However, my app uses HTML5 pushstate and a client side router. The user could navigate to a new route, then go offline, then hit refresh, and will get a "no internet" message, because the service worker was never told about this new URL.
I can't think of a way around it. As with most SPAs, my server is configured to serve the index.html for a number of catch-all URLs. I need some sort of similar behaviour for the service worker.
Inside your fetch handler, you need to check whether event.request.mode is set to 'navigate'. If so, it's a navigation, and instead of responding with a cached response that matches the specific URL, you can respond with a cached response for your index.html. (Or app-shell.html, or whatever URL you use for the generic HTML for your SPA.)
Your updated fetch handler would look roughly like:
self.addEventListener('fetch', event => {
    if (event.request.method !== 'GET') {
        return;
    }

    if (event.request.mode === 'navigate') {
        event.respondWith(caches.match('index.html'));
        return;
    }

    // The rest of your fetch handler logic goes here.
});
This is a common use case for service workers, and if you'd prefer to use a pre-packaged solution, the NavigationRoute class in the workbox-routing module can automate it for you.
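A rough sketch of that Workbox approach, based on the workbox-routing and workbox-precaching docs (it assumes your build puts index.html in the precache manifest; adjust the URL to your app shell):

import { registerRoute, NavigationRoute } from 'workbox-routing';
import { createHandlerBoundToURL, precacheAndRoute } from 'workbox-precaching';

// precache the app shell, then answer every navigation with it
precacheAndRoute([{ url: '/index.html', revision: null }]);
registerRoute(new NavigationRoute(createHandlerBoundToURL('/index.html')));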
