Service worker blocks Backbone.js?

I have built a web app using Backbone.js and it has lots of calls to a RESTful service and it works like a charm.
I tried adding a ServiceWorker to cache all the previous calls so they'll be available offline.
What I actually get is that calls made for the first time die with this error:
Failed to load resource: net::ERR_FAILED
However, on page reload I get its cached data.
My service worker fetch handler:
self.addEventListener('fetch', function(e) {
  // e.respondWith responds to the fetch event
  e.respondWith(
    // Check in cache for the request being made
    caches.match(e.request)
      .then(function(response) {
        // If the request is in the cache
        if (response) {
          console.log("[ServiceWorker] Found in Cache", e.request.url, response);
          // Return the cached version
          return response;
        }
        // If the request is NOT in the cache, fetch and cache
        var requestClone = e.request.clone();
        fetch(requestClone)
          .then(function(response) {
            if (!response) {
              console.log("[ServiceWorker] No response from fetch");
              return response;
            }
            var responseClone = response.clone();
            // Open the cache
            caches.open(cacheName).then(function(cache) {
              // Put the fetched response in the cache
              cache.put(e.request, responseClone);
              console.log('[ServiceWorker] New Data Cached', e.request.url);
              // Return the response
              return response;
            }); // end caches.open
            console.log("Response is.. ?", response);
            return response;
          })
          .catch(function(err) {
            console.log('[ServiceWorker] Error Fetching & Caching New Data', err);
          });
      }) // end caches.match(e.request)
  ); // end e.respondWith
});
Edit:
I don't think there is a need for any Backbone.js web app code.
I use the fetch method from Backbone.js models and collections.
Calls like
https://jsonplaceholder.typicode.com/posts/1
and
https://jsonplaceholder.typicode.com/posts/2
will show this error the first time. After refreshing the page, I have that data without making a request; it all comes from the cache.
All other requests that I haven't made yet will keep failing.

I solved it after searching some more.
My Backbone.js views in the web app used to do:
this.listenTo(this.collection,"reset",this.render);
this.listenTo(this.collection,"add",this.addCollectionItem);
this.listenTo(this.collection,"error", this.errorRender);
while my Service worker is returning Promises.
I had to change some code in my web app views to something like this:
this.collection.fetch({reset: true})
  .then(_.bind(this.render, this))
  .fail(_.bind(this.errorRender, this))
more or less...

The only problem I see is that when the request is not in the cache, you do a fetch, but you do not return the result of that fetch to the enclosing then handler. You need to add a return so that you have:
return fetch(requestClone)
  .then(function(response) {
None of the data provided by the return statements inside your then handler for the fetch will get transferred up the chain otherwise.
I also see that you do not return the promise provided by caches.open(cacheName).then.... This may be fine if you want to decouple saving a response in the cache from returning a result up the chain, but at the very least I'd add a comment saying that this is intentional, rather than leave the reader to figure out whether the return statement is missing by accident or on purpose.
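Putting both points together, here is a rough sketch of the corrected handler (assuming cacheName is defined elsewhere in the worker, as in the original code):

self.addEventListener('fetch', function(e) {
  e.respondWith(
    caches.match(e.request).then(function(cached) {
      if (cached) {
        return cached;
      }
      // Not in the cache: fetch it, cache a clone, and return the response up the chain.
      return fetch(e.request.clone()).then(function(response) {
        var responseClone = response.clone();
        // Saving to the cache is deliberately not awaited here;
        // the fresh response is returned to the page immediately.
        caches.open(cacheName).then(function(cache) {
          cache.put(e.request, responseClone);
        });
        return response;
      });
    })
  );
});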

Related

How to load different files from cache?

I am using a service worker to provide a fallback page that shows the user is offline. When intercepting a request, the service worker fetches the same request, and on a fetch error it responds with 'offline.html' from the cache. A small snippet doing this:
self.addEventListener("fetch", (event) => {
event.respondWith(
caches.match(event.request).then(() => {
return fetch(event.request).catch((err) => {
return caches.match("offline.html");
});
})
);
});
Now, if the offline HTML makes other requests, for example to its CSS files or images, how do I load them from the cache? I've tried the following:
self.addEventListener("fetch", (event) => {
event.respondWith(
caches.match(event.request).then(() => {
return fetch(event.request).catch((err) => {
let url = event.request.url;
if(url.endsWith('.css')) return caches.match('offline.css');
if(url.endsWith('.jpg') || url.endsWith('.png')) return caches.match('images/banner.jpg');
return caches.match("offline.html");
});
})
);
});
But is there a better way of doing this? Is there a standard way of doing this?
First off, I would recommend checking whether event.request.destination === 'document' before you decide whether or not to use offline.html as the fallback content. That ensures you're not accidentally returning an HTML document to satisfy, say, a random API request that happens to fail.
Additionally, your current code includes caches.match(event.request) but then doesn't actually use the cached response, which is likely not what you intend.
That said, let's walk through what I think is your desired logic:
Your service worker attempts to make a request against the network.
If that request returns a valid response, use it, and you're done.
If that request fails, then:
If it was a navigation request, regardless of the destination URL, use the cached offline.html for the response.
Otherwise, for non-navigation requests (like CSS or JS requests), use the cached entry matching the desired URL for the response.
Here's a service worker that implements that. You'll need to ensure that the CSS, JS, and offline.html assets are cached during service worker installation; this just includes the fetch handler logic.
self.addEventListener('install', (event) => {
  event.waitUntil(
    /* Cache your offline.html and the CSS and JS it uses here. */
  );
});

async function fetchLogic(request) {
  try {
    // If the network request succeeds, just use
    // that as the response.
    return await fetch(request);
  } catch (error) {
    // Otherwise, implement fallback logic.
    if (request.mode === 'navigate') {
      // Use the cached offline.html for failed navigations.
      return await caches.match('offline.html');
    }
    // Otherwise, return a cached copy of the actual
    // subresource that was requested.
    // If there's a cache miss for that given URL, you'll
    // end up with a NetworkError, just like you would if
    // there were no service worker involvement.
    return await caches.match(request.url);
  }
}

self.addEventListener('fetch', (event) => {
  event.respondWith(fetchLogic(event.request));
});
There's also some formal guidance in this article.

Service worker sends two requests

I've implemented a service worker which caches all requests for offline usage, and this works fine. But every time I load a page there are two requests hitting my web server (one from the service worker and one from the browser)!
How can I cache the request and only load the page once?
service-worker.js
self.addEventListener('install', function(event) {
  // load the error page which will show if the user has no internet
  var errorPage = new Request('/?p=error&offline');
  event.waitUntil(pushToCache(errorPage));
});

// If any fetch fails, it will look for the request in the cache and serve it from there first
self.addEventListener('fetch', function(event) {
  event.waitUntil(pushToCache(event.request));
  event.respondWith(
    fetch(event.request) // try loading from the internet
      .catch(function (error) {
        return fetchFromCache(event.request);
      }) // no internet connection, try getting it from the cache
  );
});

function pushToCache(request) {
  if (request.method == "GET") {
    return caches.open('stm-app').then(function (cache) {
      return fetch(request).then(function (response) {
        return cache.put(request, response);
      });
    });
  }
}

function fetchFromCache(request) {
  return caches.open('stm-app').then(function (cache) {
    return cache.match(request).then(function (matching) {
      if (!matching || matching.status == 404) {
        return fetchFromCache(new Request('/?p=error&offline')); // show page that the user is offline
      } else {
        return matching;
      }
    });
  });
}
sw-register.js
if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('service-worker.js')
    .then(function(registration) {
      console.log('Registered:', registration);
    })
    .catch(function(error) {
      console.log('Registration failed: ', error);
    });
}
So here's what happens whenever you make a request:
The webpage sends a fetch request to the server,
the Service Worker intercepts the request on the 'fetch' event,
pushToCache() fires a fetch request to the server in order to cache the response,
then you respond to the event with a fetch request to the server which will return a Promise for a Response from the web server.
Yup, that makes sense: that setup sends two requests to the server for every request the page originally made.
One thing you might want to consider is responding from the cache first and then going on the network to get the latest data. This way you will avoid delays in loading in the case of connection issues and it will speed up the loading time of the page even when the user is online.
Let's consider the following scenario: either the user or the server is offline. Once you fire the request, it will have to time out before it reaches the catch part of the promise and gets the cached response.
What you could do once you intercept the event is check the caches for a match and if you find anything, respond to the event with that. Then start a fetch request in order to update the cache.
Now if you don't find anything, fire a fetch request, clone the response (because the response body can only be used once), respond with the original response and then update the cache with the cloned response.
What did we achieve with that?
The user gets an instant response, no matter whether they're online, offline, or on Lie-Fi!
The server gets at most one request and the caches will always get updated with the latest data from the server!
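A rough sketch of that cache-first, update-in-the-background approach (reusing the 'stm-app' cache name from your code):

self.addEventListener('fetch', function (event) {
  // Only GET requests can be stored with the Cache API.
  if (event.request.method !== 'GET') return;
  event.respondWith(
    caches.open('stm-app').then(function (cache) {
      return cache.match(event.request).then(function (cached) {
        // Always start a network request so the cache stays fresh.
        var network = fetch(event.request).then(function (response) {
          cache.put(event.request, response.clone());
          return response;
        });
        if (cached) {
          // Respond instantly from the cache; the fetch above updates the
          // cache in the background. Swallow its errors so an offline
          // reload doesn't surface as an unhandled rejection.
          network.catch(function () {});
          return cached;
        }
        // Nothing cached yet: wait for the network.
        return network;
      });
    })
  );
});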
serviceworke.rs is a great resource that can help you understand how to do many interesting things with Service Workers.
This page in particular explains in a bit more detail how what I said above works.

Cannot construct a Request with a Request whose mode is 'navigate' and a non-empty RequestInit

Consider this sample index.html file.
<!DOCTYPE html>
<html><head><title>test page</title>
<script>navigator.serviceWorker.register('sw.js');</script>
</head>
<body>
<p>test page</p>
</body>
</html>
Using this service worker, designed to load from the cache, then fall back to the network if necessary.
cacheFirst = (request) => {
  var mycache;
  return caches.open('mycache')
    .then(cache => {
      mycache = cache;
      cache.match(request);
    })
    .then(match => match || fetch(request, {credentials: 'include'}))
    .then(response => {
      mycache.put(request, response.clone());
      return response;
    })
}
addEventListener('fetch', event => event.respondWith(cacheFirst(event.request)));
This fails badly on Chrome 62. Refreshing the HTML fails to load in the browser at all, with a "This site can't be reached" error; I have to shift refresh to get out of this broken state. In the console, it says:
Uncaught (in promise) TypeError: Failed to execute 'fetch' on 'ServiceWorkerGlobalScope': Cannot construct a Request with a Request whose mode is 'navigate' and a non-empty RequestInit.
"construct a Request"?! I'm not constructing a request. I'm using the event's request, unmodified. What am I doing wrong here?
Based on further research, it turns out that I am constructing a Request when I call fetch(request, {credentials: 'include'})!
Whenever you pass an options object to fetch, that object is the RequestInit, and it creates a new Request object when you do that. And apparently you can't ask fetch() to create a new Request in navigate mode with a non-empty RequestInit.
In my case, the event's navigation Request already allowed credentials, so the fix is to convert fetch(request, {credentials: 'include'}) into fetch(request).
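Here's a sketch of the handler with that one change applied (it also returns the cache.match result, which the snippet above dropped, so the cache branch actually gets used):

const cacheFirst = (request) => {
  var mycache;
  return caches.open('mycache')
    .then(cache => {
      mycache = cache;
      // Return the match so it reaches the next .then
      return cache.match(request);
    })
    // No RequestInit here: the Request keeps its own credentials mode
    .then(match => match || fetch(request))
    .then(response => {
      mycache.put(request, response.clone());
      return response;
    });
};

addEventListener('fetch', event => event.respondWith(cacheFirst(event.request)));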
I was fooled into thinking I needed {credentials: 'include'} due to this Google documentation article.
When you use fetch, by default, requests won't contain credentials such as cookies. If you want credentials, instead call:
fetch(url, {
  credentials: 'include'
})
That's only true if you pass fetch a URL, as they do in the code sample. If you have a Request object on hand, as we normally do in a Service Worker, the Request knows whether it wants to use credentials or not, so fetch(request) will use credentials normally.
https://developers.google.com/web/ilt/pwa/caching-files-with-service-worker
var networkDataReceived = false;

// fetch fresh data
var networkUpdate = fetch('/data.json').then(function(response) {
  return response.json();
}).then(function(data) {
  networkDataReceived = true;
  updatePage(data);
});

// fetch cached data
caches.match('/data.json').then(function(response) {
  if (!response) throw Error("No data");
  return response.json();
}).then(function(data) {
  // don't overwrite newer network data
  if (!networkDataReceived) {
    updatePage(data);
  }
}).catch(function() {
  // we didn't get cached data, the network is our last hope:
  return networkUpdate;
}).catch(showErrorMessage);
This is the best example of what you are trying to do, though you'll have to adapt your code accordingly. The example above is taken from the "Cache then network" section.
for the service worker:
self.addEventListener('fetch', function(event) {
  event.respondWith(
    caches.open('mycache').then(function(cache) {
      return fetch(event.request).then(function(response) {
        cache.put(event.request, response.clone());
        return response;
      });
    })
  );
});
Problem
I came across this problem when trying to override fetch for all kinds of different assets. navigate mode was set for the initial Request that gets the index.html (or other HTML) file, and I wanted the same caching rules applied to it as to several other static assets.
Here are the two things I wanted to be able to accomplish:
When fetching static assets, I want to sometimes be able to override the url, meaning I want something like: fetch(new Request(newUrl))
At the same time, I want them to be fetched just as the sender intended, meaning I want to set the second argument of fetch (i.e. the RequestInit object mentioned in the error message) to the originalRequest itself, like so: fetch(new Request(newUrl), originalRequest)
However, the second part is not possible for requests in navigate mode (i.e. the initial HTML file); at the same time it is not needed, as explained by others, since the request will already keep its cookies, credentials, etc.
Solution
Here is my work-around: a versatile fetch that...
can override the URL
can override the RequestInit config object
works with navigate requests as well as any other kind of request
function fetchOverride(originalRequest, newUrl) {
  const fetchArgs = [new Request(newUrl)];
  if (originalRequest.mode !== 'navigate') {
    // customize the request only if NOT in navigate mode
    // (since in "navigate" mode that is not allowed)
    fetchArgs.push(originalRequest);
  }
  return fetch(...fetchArgs);
}
In my case I was constructing a request from a serialized form in a service worker (to handle failed POSTs). The original request had its mode attribute set, which is read-only, so before reconstructing the request, delete the mode attribute:
delete serializedRequest["mode"];
request = new Request(serializedRequest.url, serializedRequest);

Can service workers cache POST requests?

I tried to cache a POST request in a service worker on fetch event.
I used cache.put(event.request, response), but the returned promise was rejected with TypeError: Invalid request method POST..
When I tried to hit the same POST API, caches.match(event.request) was giving me undefined.
But when I did the same for GET methods, it worked: caches.match(event.request) for a GET request was giving me a response.
Can service workers cache POST requests?
In case they can't, what approach can we use to make apps truly offline?
You can't cache POST requests using the Cache API. See https://w3c.github.io/ServiceWorker/#cache-put (point 4).
There's a related discussion in the spec repository: https://github.com/slightlyoff/ServiceWorker/issues/693
An interesting solution is the one presented in the ServiceWorker Cookbook: https://serviceworke.rs/request-deferrer.html
Basically, the solution serializes requests to IndexedDB.
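The rough idea is to turn each Request into plain data that IndexedDB can store and later replay. A minimal, hypothetical sketch of such a serializer (assuming text or JSON bodies):

// Hypothetical helper: turn a Request into a plain object that can be
// stored in IndexedDB and replayed later with new Request(data.url, data).
async function serializeRequest(request) {
  const body = await request.clone().text();
  return {
    url: request.url,
    method: request.method,
    headers: [...request.headers.entries()],
    body: body
  };
}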
I've used the following solution in a recent project with a GraphQL API: I cached all responses from API routes in an IndexedDB object store using a serialized representation of the Request as cache key. Then I used the cache as a fallback if the network was unavailable:
// ServiceWorker.js
self.addEventListener('fetch', function(event) {
  // We will cache all POST requests to matching URLs
  if (event.request.method === "POST" || event.request.url.match(/* ... */)) {
    event.respondWith(
      // First try to fetch the request from the server
      fetch(event.request.clone())
        // If it works, put the response into IndexedDB
        .then(function(response) {
          // Compute a unique key for the POST request
          var key = getPostId(event.request);
          // Create a cache entry
          var entry = {
            key: key,
            response: serializeResponse(response),
            timestamp: Date.now()
          };
          /* ... save entry to IndexedDB ... */
          // Return the (fresh) response
          return response;
        })
        .catch(function() {
          // If it does not work, return the cached response. If the cache does not
          // contain a response for our request, it will give us a 503 response
          var key = getPostId(event.request);
          var cachedResponse = /* ... query IndexedDB using the key ... */;
          return cachedResponse;
        })
    );
  }
})
function getPostId(request) {
  /* ... compute a unique key for the request, including its body: e.g. serialize it to a string ... */
}
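For illustration, one possible (hypothetical) getPostId, again assuming text or JSON bodies; note that reading the body is asynchronous, so the callers above would have to await the result:

async function getPostId(request) {
  // Combine method, URL and body into a single string key.
  const body = await request.clone().text();
  return request.method + ':' + request.url + ':' + body;
}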
Here is the full code for my specific solution using Dexie.js as IndexedDB-wrapper. Feel free to use it!
If you are talking about form data, you could intercept the fetch event and read the form data as shown below, then save the data in IndexedDB.
// service-worker.js
self.addEventListener('fetch', function(event) {
  if (event.request.method === "POST") {
    var newObj = {};
    event.request.formData().then(formData => {
      for (var pair of formData.entries()) {
        var key = pair[0];
        var value = pair[1];
        newObj[key] = value;
      }
    }).then(/* ... save object in indexedDB ... */);
  }
})
Another approach to providing a full offline experience is to use Cloud Firestore offline persistence.
POST/PUT requests are executed on the local cached database and then automatically synchronised to the server as soon as the user's internet connectivity is restored (note though that there is a limit of 500 offline requests).
Another aspect to take into account with this solution is that if multiple users have offline changes that get synchronised concurrently, there is no guarantee that the changes will be executed in the right chronological order on the server, as Firestore uses first-come, first-served logic.
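For reference, enabling offline persistence is a single call at startup; a sketch, assuming the Firebase v8-style web SDK and an already-initialised app:

// Enable Cloud Firestore offline persistence.
// Assumes firebase.initializeApp(config) has already been called.
firebase.firestore().enablePersistence()
  .then(function () {
    // Reads and writes now work against the local cache while offline
    // and are synchronised automatically when connectivity returns.
  })
  .catch(function (err) {
    // 'failed-precondition': multiple tabs open; 'unimplemented': unsupported browser.
    console.log('Persistence not enabled:', err.code);
  });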
According to https://w3c.github.io/ServiceWorker/#cache-put (point 4), the cache-put algorithm rejects anything that is not a GET:
if (request.method !== "GET") {
  return Promise.reject('no-match');
}

How and When should we write to cache in Service Workers?

Cache all requests from an app without explicitly specifying urlsToCache, so I will cache things in the fetch event.
Respond to requests from the cache.
Update the cache when a fetch succeeds.
Initially,
this.addEventListener('fetch', function(event) {
  var fetchReq = event.request.clone(),
      cacheReq = event.request.clone();
  event.respondWith(fetch(fetchReq).then(function(response) {
    var resp = response.clone();
    caches.open(CACHE_NAME).then(function(cache) {
      req = event.request.clone();
      cache.put(req, resp);
    });
    return response;
  }).catch(function() {
    return caches.match(cacheReq);
  }));
});
The offline situations were handled perfectly well. But the problem here was with slow connections: the user has to wait until the fetch times out or throws an error before getting the response from the cache.
self.addEventListener('fetch', function(event) {
  var cacheRequest = event.request.clone();
  event.respondWith(caches.match(cacheRequest).then(function(response) {
    if (response) return response;
    var fetchRequest = event.request.clone();
    return fetch(fetchRequest).then(function(response) {
      var responseToCache = response.clone();
      caches.open(cache_name).then(function(cache) {
        var cacheSaveRequest = event.request.clone();
        cache.put(cacheSaveRequest, responseToCache);
      });
      return response;
    });
  }));
});
With the cache taking precedence, the responses served were fine. But the problem now is what happens when the code updates: when /public/main.css served via the service worker is updated, on page reload only the cached copy is served; the updated content is not.
I also tried bumping the cache_name from cache-v1 to cache-v2 (so that a byte difference exists, the service worker is updated, and the old cache can be cleared), and cleared cache-v1 in the activate event. But that gave rise to new problems where two service workers were running at the same time under the same registration ID. More on this in this other SO question: How to stop older service workers?
Two service workers running at the same time are not technically a problem—it's working as designed. (See my answer to How to stop older service workers?) Make sure that you close other tabs that might have an older version of your service worker active.
You're running into the inevitable tradeoffs between the different cache vs. network scenarios here. If you haven't yet read through the offline cookbook, it's a great starting point when trying to decide which caching strategy works best for your specific resources.
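As for clearing out stale caches when you bump the cache name, a common pattern (a sketch, assuming the new name is cache-v2) is to delete every other cache during activate:

var cache_name = 'cache-v2';

self.addEventListener('activate', function (event) {
  event.waitUntil(
    caches.keys().then(function (keys) {
      // Delete every cache that doesn't match the current name.
      return Promise.all(
        keys.filter(function (key) { return key !== cache_name; })
            .map(function (key) { return caches.delete(key); })
      );
    })
  );
});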
