I am using a service worker to provide caching for my site's assets (HTML, JS, CSS).
When I use Firefox, my sw.js is installed correctly and the required files are cached. If I go into offline mode, I get the site styled correctly with everything present except the data (which is expected, as the data is not being cached).
However, when I use Chrome, I get a TypeError: Failed to fetch error. I'm really unsure why, since it works in Firefox. In addition, the same error is thrown whenever the fetch event fires and the request is for an asset that is not in the cache (and the fetch function is being called).
If I pass an empty array to the cache.addAll function, I don't get any errors until actually handling the fetch event.
It's maybe worth noting that all of the files I'm caching come from localhost and not any other origin, so I can't see this being a cross-origin issue.
This is the console output when installing the service worker:
This is the console output when refreshing the page after installing the service worker:
This is the code for my service worker:
const CACHE_NAME = 'webapp-v1';
const CACHE_FILES = [
'/',
'/public/app.css',
'/public/img/_sprites.png',
'/public/js/app.min.js',
'/public/js/polyfills.min.js'
];
self.addEventListener('install', event => {
console.log("[sw.js] Install event.");
event.waitUntil(
caches.open(CACHE_NAME)
.then(cache => cache.addAll(CACHE_FILES))
.then(() => self.skipWaiting())
.catch(err => console.error("[sw.js] Error trying to pre-fetch cache files:", err))
);
});
self.addEventListener('activate', event => {
console.log("[sw.js] Activate event.");
event.waitUntil(
self.clients.claim()
);
});
self.addEventListener('fetch', event => {
if (!event.request.url.startsWith(self.location.origin)) return;
console.log("[sw.js] Fetch event on", event.request.url);
event.respondWith(
caches.match(event.request).then(response => {
console.info("[sw.js] Responded to ", event.request.url, "with", response ? "cache hit." : "fetch.");
return response || fetch(event.request);
}).catch(err => {
console.error("[sw.js] Error with match or fetch:", err);
})
);
});
Any help would be great.
cache.addAll(CACHE_FILES)
will fail when one of the files is not accessible (HTTP 400, 401, etc., and sometimes 5XX and 3XX). To avoid failing everything when one file fails, use an individual catch in a map loop, like here: https://github.com/GrosSacASac/server-in-the-browser/blob/master/client/js/service_worker.js#L168
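A minimal sketch of that approach (reusing your CACHE_NAME and CACHE_FILES), where each file is added individually so a single failure does not reject the whole install:
self.addEventListener('install', event => {
  event.waitUntil(
    caches.open(CACHE_NAME).then(cache =>
      // Add each file separately and swallow individual failures,
      // instead of letting one bad URL reject the whole cache.addAll().
      Promise.all(
        CACHE_FILES.map(url =>
          cache.add(url).catch(err => console.warn('[sw.js] Skipped', url, err))
        )
      )
    )
  );
});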
The fact that it does not fail with an empty array probably means you have an inaccessible resource in CACHE_FILES.
Maybe Firefox is less restrictive and caches the body of the 400 response.
Inside your fetch handler you try to use caches.match directly, but I think that is not legal; you must open the cache first, and then from an opened cache you can call cache.match. See https://github.com/GrosSacASac/server-in-the-browser/blob/master/client/js/service_worker.js#L143
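A sketch of that suggested pattern, opening the named cache first and then calling cache.match on it (CACHE_NAME is the constant from your worker):
self.addEventListener('fetch', event => {
  event.respondWith(
    // Open the named cache, then match against that specific cache.
    caches.open(CACHE_NAME).then(cache =>
      cache.match(event.request)
        .then(response => response || fetch(event.request))
    )
  );
});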
When running a local server from Visual Studio, Firefox will error out and not load the page correctly. It shows this error:
Failed to load ‘https://localhost/js/mqtt.js’. A ServiceWorker passed a promise to FetchEvent.respondWith() that resolved with non-Response value ‘undefined’. serviceworker.js:19:10
It works fine on Chrome. It also works fine on Firefox when the server is on a cloud-based Azure server. Here is the code for the service worker:
// Service worker file for PWA
var CACHE_NAME = 'v5';
var urlsToCache = [
'/index.html'
];
self.addEventListener('install', function (event) {
// Perform install steps
event.waitUntil(
caches.open(CACHE_NAME)
.then(function (cache) {
console.log('Opened cache');
return cache.addAll(urlsToCache);
})
);
});
self.addEventListener('fetch', event => {
event.respondWith(
fetch(event.request).then(response => {
cache.put(event.request, response.clone());
return response;
}).catch(_ => {
return caches.match(event.request);
})
)
});
I am unsure what is causing this error. One workaround is to go to "about:debugging#workers" and unregister the service workers manually. Then refreshing the page will allow it to load correctly. However, I need a solution, not a workaround.
Based on that code, I'd expect to see that error message if the following two conditions are true:
fetch(event.request) rejects.
caches.match(event.request) results in a cache miss, which causes the promise to resolve with undefined.
This isn't really out of the ordinary; it's just the logic you've written in your service worker, and the behavior depends both on the current network/server conditions and on the state of your local cache.
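If you want respondWith() to always receive a Response, one option is an explicit fallback; here's a minimal sketch (the 503 status and message are illustrative, not your code):
self.addEventListener('fetch', event => {
  event.respondWith(
    fetch(event.request)
      .catch(() => caches.match(event.request))
      .then(response =>
        // If both the network and the cache fail, return an explicit
        // Response instead of resolving with undefined.
        response || new Response('Offline and not cached.', { status: 503 }))
  );
});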
The same error happened to me. The problem was that I had CORS disabled for some routes on the server.
Enabling CORS on the server for all routes solved it.
const express = require('express');
const cors = require('cors');
const app = express();
// corsOptions: your CORS configuration object
// Enable CORS for all routes
app.use(cors(corsOptions));
// To enable CORS for only a single route, add it to the app.get call:
// app.get('/', cors(corsOptions), nextCallback)
I am using a service worker to provide a fallback page that shows the user is offline. When intercepting a request, the service worker fetches that same request, and if the fetch errors, it responds with 'offline.html' from the cache. A small snippet of doing this:
self.addEventListener("fetch", (event) => {
event.respondWith(
caches.match(event.request).then(() => {
return fetch(event.request).catch((err) => {
return caches.match("offline.html");
});
})
);
});
Now, if the offline HTML makes other requests, for example to its CSS files or images, how do I load those from the cache? I've tried doing the following:
self.addEventListener("fetch", (event) => {
event.respondWith(
caches.match(event.request).then(() => {
return fetch(event.request).catch((err) => {
let url = event.request.url;
if(url.endsWith('.css')) return caches.match('offline.css');
if(url.endsWith('.jpg') || url.endsWith('.png')) return caches.match('images/banner.jpg');
return caches.match("offline.html");
});
})
);
});
But is there a better way of doing this? Is there a standard way of doing this?
First off, I would recommend checking to see whether event.request.destination === 'document' before you decide whether or not to use offline.html as the fallback content. That ensures that you're not accidentally returning an HTML document to satisfy, say, a random API request that happens to fail.
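For illustration, a minimal sketch of that check in isolation (the complete handler follows further down):
self.addEventListener('fetch', (event) => {
  // Only consider the HTML fallback for document requests.
  if (event.request.destination === 'document') {
    event.respondWith(
      fetch(event.request).catch(() => caches.match('offline.html'))
    );
  }
});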
Additionally, your current code includes caches.match(event.request) but then doesn't actually use the cached response, which is likely not what you intend.
That said, let's walk through what I think is your desired logic:
Your service worker attempts to make a request against the network.
If that request returns a valid response, use it, and you're done.
If that request fails, then:
If it was a navigation request, regardless of the destination URL, use the cached offline.html for the response.
Otherwise, for non-navigation requests (like CSS or JS requests), use the cached entry matching the desired URL for the response.
Here's a service worker that implements that. You'll need to ensure that the CSS, JS, and offline.html assets are cached during service worker installation; this just includes the fetch handler logic.
self.addEventListener('install', (event) => {
event.waitUntil(
/* Cache your offline.html and the CSS and JS it uses here. */
);
});
async function fetchLogic(request) {
try {
// If the network request succeeds, just use
// that as the response.
return await fetch(request);
} catch(error) {
// Otherwise, implement fallback logic.
if (request.mode === 'navigate') {
// Use the cached offline.html for failed navigations.
return await caches.match('offline.html');
}
// Otherwise, return a cached copy of the actual
// subresource that was requested.
// If there's a cache miss for that given URL, you'll
// end up with a NetworkError, just like you would if
// there were no service worker involvement.
return await caches.match(request.url);
}
}
self.addEventListener('fetch', (event) => {
event.respondWith(fetchLogic(event.request));
});
There's also some formal guidance in this article.
I can't figure out any reason why a service worker would be deleted, given the code I have that registers it, or the service worker code itself.
But on this site, it shows up as deleted in the Service Workers section of the Chrome dev tools (image below).
Yet it is also registering properly as logged in the console (same image below).
Here is the service worker registration code:
if('serviceWorker' in navigator){
navigator.serviceWorker.register('/earnie.min.js', { scope: '/'}).then(function(registration){
console.log('Registration successful, scope is:', registration.scope);
}).catch(function(error){
console.log('Service worker registration failed, error:', error);
});
}
Here is the service worker code:
var cachename="e2",cachelist=";;.;/;./;/privacyPolicy.html;/css/main.css;/css/normalize.css;".split(";");
self.addEventListener("install",function(a){
a.waitUntil(caches.open(cachename).then(function(a){
return a.addAll(cachelist)
}))
});
self.addEventListener("fetch",function(a){
a.respondWith(caches.match(a.request).then(function(b){
return b?b:fetch(a.request.clone(), { credentials: 'include', redirect: 'follow' })
}))
});
What is causing it to be deleted instead of registering?
Registration succeeded, but installation actually fails. Your waitUntil() promise is rejecting, which causes the install event to fail, thus deleting the service worker. cachelist probably contains invalid/empty values after you run split(';').
I recommend ensuring that cachelist is an array with valid URI values; then you can debug within the install event**
self.addEventListener("install", event => {
event.waitUntil(
caches.open(cachename)
.then(cache => cache.addAll(cachelist))
.catch(error => console.error('💩', error))
)
})
**You'll most likely need the "Preserve log" console option enabled in Chrome Dev Tools to see the console error.
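For reference, a sketch of what a valid cachelist might look like as a plain array; these entries are the non-empty values from your split string, minus the "." and "./" entries it also produces:
var cachename = "e2";
var cachelist = [
  "/",
  "/privacyPolicy.html",
  "/css/main.css",
  "/css/normalize.css"
];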
I am using service workers to create an offline page for my website.
At the moment I am saving offline.html into the cache so that the browser can show this file if there is no internet connection.
In the fetch event of my service worker I attempt to load index.html, and if this fails (no internet connection) I load offline.html from cache.
However, whenever I check offline mode in developer tools and refresh the page, index.html still shows...
The request isn't failing, and it looks like index.html is being cached even though I didn't specify it to be.
Here is my HTML for index.html:
<!DOCTYPE html>
<html>
<head>
<title>Service Workers - Test</title>
</head>
<body>
<h1> Online page! </h1>
<h3> You are connected to the internet. </h3>
</body>
<script>
if ('serviceWorker' in navigator)
{
navigator.serviceWorker.register('service-worker.js');
}
</script>
</html>
Here is my HTML for offline.html:
<!DOCTYPE html>
<html>
<head>
<title>You are Offline - Service Workers - Test</title>
</head>
<body>
<h1> Welcome to the Offline Page!</h1>
<h2> You are not connected to the internet but you can still do certain things offline. </h2>
</body>
</html>
Here is my javascript for service-worker.js:
const PRECACHE = "version1"
const CACHED = ["offline.html"];
// Caches "offline.html" in case there is no internet
self.addEventListener('install', event => {
console.log("[Service Worker] Installed");
caches.delete(PRECACHE)
event.waitUntil (
caches.open(PRECACHE)
.then(cache => cache.addAll(CACHED))
.then( _ => self.skipWaiting())
);
});
// Clears any caches that do not match this version
self.addEventListener("activate", event => {
event.waitUntil (
caches.keys()
.then(keys => {
return Promise.all (
keys.filter(key => {
return !key.startsWith(PRECACHE);
})
.map(key => {
return caches.delete(key);
})
);
})
.then(() => {
console.log('[Service Worker] Cleared Old Cache');
})
);
});
this.addEventListener('fetch', function(event) {
if (event.request.method !== 'GET') return;
console.log("[Service Worker] Handling Request ");
// If the request to `index.html` works it shows it, but if it fails it shows the cached version of `offline.html`
// This isn't working because `fetch` doesn't fail when there is no internet for some reason...
event.respondWith (
fetch(event.request)
.then(response => {
console.log("[Service Worker] Served from NETWORK");
return response;
}, () => {
console.log("[Service Worker] Served from CACHE");
return catches.match(event.request.url + OFFLINE_URL);
})
);
});
I am running a server using Python's SimpleHTTPServer, like so:
python -m SimpleHTTPServer
Does anyone know why the offline page isn't working and how I can fix this?
Thanks for the help,
David
EDIT:
These images show that index.html (localhost) is still loading without internet, which means it must be cached.
EDIT 2:
I've tried adding no-cache to the fetch of index.html, and it is still fetching index.html when I have offline checked.
fetch(event.request, {cache: "no-cache"}) ...
I think we have all forgotten how the network request works from a browser's point of view.
The issue here is that index.html is served from the disk cache when the service worker intercepts requests.
browser ===> Service Worker ===> fetch event
Inside the fetch event, we have:
Check if there is network connectivity
If there is, fetch from network and respond
Else, fetch from cache and respond
Now, how does "If there is network connectivity, fetch from network" work?
Service Worker OnFetch ===> Check in Disk Cache ===> Nothing? Fetch Online
The page being fetched here is index.html, and the cache-control headers for index.html do not specify no-cache.
Hence the whole issue of the offline page not showing up.
Solution
Set a cache-control header with limiting values for index.html - On the server side
Or, add headers to the fetch request to this effect:
pragma:no-cache
cache-control:no-cache
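For the first, server-side option, a minimal sketch assuming an Express server (the route and file path are illustrative):
const express = require('express');
const app = express();

app.get('/index.html', (req, res) => {
  // Tell the browser to revalidate index.html instead of
  // serving it straight from the disk cache.
  res.set('Cache-Control', 'no-cache');
  res.sendFile('index.html', { root: __dirname });
});

app.listen(8080);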
How do I add these headers to fetch?
Apparently, fetch and the browser have their own reservations about the request body when it comes to a GET.
Also, weirdness and utter chaos happen if you reuse the event.request object for a fetch request and add custom headers.
The chaos is a list of uncaught exceptions due to the fetch event's request.mode attribute, which bars you from adding custom headers to a fetch under no-cors or navigate mode.
Our goal is to:
Identify that the browser is truly offline, and then serve a page that says so.
Here's how:
Check if you can fetch a dummy HTML page, say test-connectivity.html, under your origin with a custom cache: no-cache header. If you can, proceed; else serve the offline page.
self.addEventListener( 'fetch', ( event ) => {
let headers = new Headers();
headers.append( 'cache-control', 'no-cache' );
headers.append( 'pragma', 'no-cache' );
var req = new Request( 'test-connectivity.html', {
method: 'GET',
mode: 'same-origin',
headers: headers,
redirect: 'manual' // don't follow redirects automatically
} );
event.respondWith( fetch( req, {
cache: 'no-store'
} )
.then( function ( response ) {
return fetch( event.request )
} )
.catch( function ( err ) {
return new Response( '<div><h2>Uh oh that did not work</h2></div>', {
headers: {
'Content-type': 'text/html'
}
} )
} ) )
} );
The {cache:'no-store'} object as the second parameter to fetch is, unfortunately, a no-op; it just doesn't work.
Just keep it for the sake of a future scenario. It is really optional as of today.
If that worked, then you do not need to build a whole new Request object for the fetch.
cheers!
The code piece that creates a new request is generously borrowed from #pirxpilot's answer here.
The offline worker for this specific question is on Pastebin:
https://pastebin.com/sNCutAw7
David, you have two errors in one line.
Your line
return catches.match(event.request.url + OFFLINE_URL);
should be
return caches.match('offline.html');
You wrote catches instead of caches, you haven't defined OFFLINE_URL, and you don't need the event request URL.
I tried your code and got the same result as you in the dev tools network tab. The network tab says it loaded index.html from the service worker, but the service worker actually returns the cached offline page, as expected!
I'm using a service worker to introduce offline functionality for my single page web app. It's pretty straightforward: use the network when available, or try to fetch from the cache if not:
service-worker.js:
self.addEventListener("fetch", event => {
if(event.request.method !== "GET") {
return;
}
event.respondWith(
fetch(event.request)
.then(networkResponse => {
var responseClone = networkResponse.clone();
if (networkResponse.status == 200) {
caches.open("mycache").then(cache => cache.put(event.request, responseClone));
}
return networkResponse;
})
.catch(_ => {
return caches.match(event.request);
})
)
})
So it intercepts all GET requests and caches them for future use, including the initial page load.
Switching to "offline" in DevTools and refreshing at the root of the application works as expected.
However, my app uses HTML5 pushstate and a client side router. The user could navigate to a new route, then go offline, then hit refresh, and will get a "no internet" message, because the service worker was never told about this new URL.
I can't think of a way around it. As with most SPAs, my server is configured to serve the index.html for a number of catch-all URLs. I need some sort of similar behaviour for the service worker.
Inside your fetch handler, you need to check whether event.request.mode is set to 'navigate'. If so, it's a navigation, and instead of responding with a cached response that matches the specific URL, you can respond with a cached response for your index.html. (Or app-shell.html, or whatever URL you use for the generic HTML for your SPA.)
Your updated fetch handler would look roughly like:
self.addEventListener('fetch', event => {
if (event.request.method !== 'GET') {
return;
}
if (event.request.mode === 'navigate') {
event.respondWith(caches.match('index.html'));
return;
}
// The rest of your fetch handler logic goes here.
});
This is a common use case for service workers, and if you'd prefer to use a pre-packaged solution, the NavigationRoute class in the workbox-routing module can automate it for you.
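For example, a sketch using Workbox (assuming Workbox v5+ and a build step that injects the precache manifest, and that /index.html is part of that manifest):
import { registerRoute, NavigationRoute } from 'workbox-routing';
import { precacheAndRoute, createHandlerBoundToURL } from 'workbox-precaching';

// Precache the assets listed in the build-injected manifest.
precacheAndRoute(self.__WB_MANIFEST);

// Respond to every navigation request with the precached index.html,
// mirroring the catch-all behavior of the server.
registerRoute(new NavigationRoute(createHandlerBoundToURL('/index.html')));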