Service worker sends two requests

I've implemented a service worker which caches all requests for offline usage, and this works fine. But every time I load a page, two requests hit my web server (one from the service worker and one from the browser)!
How can I cache the request and only load the page once?
service-worker.js
self.addEventListener('install', function(event) {
  // load error page which will show if the user has no internet
  var errorPage = new Request('/?p=error&offline');
  event.waitUntil(pushToCache(errorPage));
});

// If any fetch fails, it will look for the request in the cache and serve it from there first
self.addEventListener('fetch', function(event) {
  event.waitUntil(pushToCache(event.request));
  event.respondWith(
    fetch(event.request) // try loading from the internet
      .catch(function (error) {
        return fetchFromCache(event.request); // no internet connection, try getting it from the cache
      })
  );
});

function pushToCache(request) {
  if (request.method == "GET") {
    return caches.open('stm-app').then(function (cache) {
      return fetch(request).then(function (response) {
        return cache.put(request, response);
      });
    });
  }
}

function fetchFromCache(request) {
  return caches.open('stm-app').then(function (cache) {
    return cache.match(request).then(function (matching) {
      if (!matching || matching.status == 404) {
        return fetchFromCache(new Request('/?p=error&offline')); // show page that user is offline
      } else {
        return matching;
      }
    });
  });
}
sw-register.js
if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('service-worker.js')
    .then(function(registration) {
      console.log('Registered:', registration);
    })
    .catch(function(error) {
      console.log('Registration failed: ', error);
    });
}

So here's what happens whenever you make a request:
The webpage sends a fetch request to the server,
the Service Worker intercepts the request on the 'fetch' event,
pushToCache() fires a fetch request to the server in order to cache the response,
then you respond to the event with a fetch request to the server which will return a Promise for a Response from the web server.
Yup, that makes sense: that thing just sent two requests to the server for every request the page originally made.
One thing you might want to consider is responding from the cache first and then going to the network to get the latest data. This way you avoid delays when there are connection issues, and you speed up the loading time of the page even when the user is online.
Let's consider the following scenario: either the user or the server is offline. Once you fire the request, it will have to time out before it falls into the catch part of the promise and gets the cached response.
What you could do once you intercept the event is check the caches for a match and if you find anything, respond to the event with that. Then start a fetch request in order to update the cache.
Now if you don't find anything, fire a fetch request, clone the response (because the response body can only be used once), respond with the original response and then update the cache with the cloned response.
What did we achieve with that?
The user gets an instant response, no matter whether they're online, offline, or on Lie-Fi!
The server gets at most one request and the caches will always get updated with the latest data from the server!
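Here's a rough sketch of that cache-first, refresh-in-the-background approach, reusing the 'stm-app' cache and the '/?p=error&offline' fallback from your code. Treat it as an illustration rather than a drop-in replacement:
self.addEventListener('fetch', function(event) {
  if (event.request.method !== 'GET') return;

  // Exactly one network request per page request, used to refresh the cache
  var refresh = caches.open('stm-app').then(function (cache) {
    return fetch(event.request).then(function (response) {
      cache.put(event.request, response.clone());
      return response;
    });
  });

  // Keep the worker alive until the background refresh settles
  event.waitUntil(refresh.catch(function () {}));

  event.respondWith(
    caches.match(event.request).then(function (cached) {
      // Cached copy first; otherwise the network response, then the offline page
      return cached || refresh.catch(function () {
        return caches.match('/?p=error&offline');
      });
    })
  );
});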
serviceworke.rs is a great resource that can help you understand how to do many interesting things with Service Workers.
This page in particular explains in a bit more detail how what I said above works.

Related

Service Worker: Send message to client before redirect

I wish to give the client feedback before a redirect occurs, so they can store it in session storage; then, when the cached page arrives from the service worker, they check session storage while the page is being rendered (not after!), and can handle the cached response accordingly.
I tried:
Adding a custom header to the response, but the client JavaScript can't read it for security reasons.
Editing the response directly. This only works for GET requests. Unfortunately, when I sync a POST request it returns a redirect, so it then looks like a normal GET. I need some additional way of saying: this is a GET after a synced POST, tell the user the POST was saved, it's not just a normal "get the page".
postMessage, but it's slow (see the side note below).
localStorage and sessionStorage are not accessible from a service worker.
I could write to IndexedDB in the service worker, and then read from the client. But IndexedDB is such a confusing beast that I really don't want to go down this route (a rough sketch follows this list).
URL search parameters, plus a redirect and URL-cleaning strategy, became spaghetti code very quickly. The server would have to clean up URLs, and so would the service worker for the injected query args.
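For what it's worth, the IndexedDB option above doesn't have to be huge. This is only an illustrative sketch (the database and store names are made up), not a recommendation:
// Shared helper (works in both the service worker and the page)
function openFlagsDb() {
  return new Promise(function (resolve, reject) {
    var req = indexedDB.open('pwa-flags', 1);
    req.onupgradeneeded = function () { req.result.createObjectStore('flags'); };
    req.onsuccess = function () { resolve(req.result); };
    req.onerror = function () { reject(req.error); };
  });
}

// Service worker: set the flag before returning the redirect response
function setSyncSavedFlag() {
  return openFlagsDb().then(function (db) {
    return new Promise(function (resolve, reject) {
      var tx = db.transaction('flags', 'readwrite');
      tx.objectStore('flags').put(true, 'syncSaved');
      tx.oncomplete = resolve;
      tx.onerror = function () { reject(tx.error); };
    });
  });
}

// Page: read (and clear) the flag while rendering
function popSyncSavedFlag() {
  return openFlagsDb().then(function (db) {
    return new Promise(function (resolve, reject) {
      var tx = db.transaction('flags', 'readwrite');
      var getReq = tx.objectStore('flags').get('syncSaved');
      getReq.onsuccess = function () {
        tx.objectStore('flags').delete('syncSaved');
        resolve(Boolean(getReq.result));
      };
      getReq.onerror = function () { reject(getReq.error); };
    });
  });
}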
Is there any recommended mechanism for relaying information to a client that would suit this purpose?
Side note about postMessage being slow:
I currently use postMessage, but the problem is it's really slow, and the reason, I think, is this:
Client attempts offline POST
The service worker serializes and stores it for when the client is online again. In the fetch handler it responds with the cached response. It also calls postMessage asynchronously to tell the client the request was saved. Unfortunately, if I await the postMessage, it errors out the fetch, so it has to stay async, which means the message arrives only after the redirect.
Client receives the redirect response
Client redirects
Client paints the page
The cached page is shown
Only about two seconds later does the 'was saved' banner show
Here's some code, if applicable:
Note: Originally the code would set a value in session storage when receiving the message (assuming it would receive the message before the redirect), and then pop it after the redirect at page render. However, because the postMessage was arriving so much later, I changed to performing the change on the page directly.
async function msgClientSyncSaved(event) {
  const data = {
    type: 'MSG_SYNC_SAVED',
  };
  const client = await getClient(event);
  client.postMessage(data);
}
// Applicable parts of runFetch:
async function runFetch(event) {
  const urlObj = new URL(event.request.url);
  if (utils.getIsMethodTx(event.request.method)) {
    // If a Sync URL
    const clonedRequest = event.request.clone();
    const response = await new strategies.NetworkOnlyStratey(log, event, cacheMutator).run();
    if (!response.isDefaultResponse && !response.isCachedResponse) {
      event.waitUntil(syncAllRequests());
      return response;
    } else {
      const [syncKey, syncValue] = settings.PWA_SYNC_POST_URL_PARAM.split("=");
      if (urlObj.searchParams.get(syncKey) === syncValue) {
        // A failed POST that requires SYNCING
        console.log(`SW: Sync later: ${event.request.method} to ${event.request.url}`);
        event.waitUntil(storeRequest(clonedRequest)); // no need to wait for this to finish before returning the response
        event.waitUntil(msgClientSyncSaved(event)); // <--- HERE: message the client
        // After a POST, return a redirect
        urlObj.searchParams.delete(syncKey);
        const redirectUrl = String(urlObj);
        // 302 means GET the redirect, 307 means POST to the redirect
        console.log('REDIRECT TO', redirectUrl);
        return Response.redirect(redirectUrl, 302);
      }
    }
  }
}

function handleFetch(event) {
  event.respondWith(runFetch(event));
}

self.addEventListener("fetch", handleFetch);
Receiver on the client side:
async function handleMessage(event) {
  switch (event.data.type) {
    case 'MSG_SYNC_SAVED':
      document.body.setAttribute('data-pwa-cached-page', 'true data-tx');
      break;
  }
}

navigator.serviceWorker.addEventListener("message", handleMessage);

How to update the cached files in my service worker every 30 minutes?

I have this service worker:
//IMPORT POLYFILL
importScripts('cache-polyfill.js');
//INSTALL
self.addEventListener('install', function(e) {
  e.waitUntil(
    caches.open('stock_item_balance_v1').then(function(cache) {
      return cache.addAll([
        '/app/offline_content/purchase/stock_items/stock_items_balance.php',
        '/app/offline_content/purchase/stock_items/js/stock_items_balance.js'
      ]);
    })
  );
});

//FETCH (FETCH IS WHEN YOU CHECK FOR INTERNET)
self.addEventListener('fetch', function(event) {
  //console.log(event.request.url);
  event.respondWith(
    caches.match(event.request).then(function(response) {
      return response || fetch(event.request);
    })
  );
});
In "stock_items_balance.php" i fetch data from my DB. So in every 30 minutes i would like to update my cached pages and reload the window.
So first i have a script that checks for internet connection.
If true, i want to clean/update the cache and reload the page.
How can i do that?
//INTERVAL
setInterval(function(){
  //CLEAN/UPDATE CACHED FILES
  serviceworker.update(); // ???
  //RELOAD PAGE
  window.location.reload();
}, 30 * 60 * 1000); // 30 minutes
(I think you have a larger question as to whether the approach you describe is actually going to give a good, predictable, offline-capable experience for your users, but I'm just going to focus on the actual technical question you asked.)
Messaging the service worker
First off, you should keep in mind that it's possible to have multiple tabs open for the same URL, and if you do, your update code could end up running multiple times. The code in this answer handles the "reload" step for you from inside the service worker, after the asynchronous cache update has completed, by getting a list of all the active clients of the service worker and telling each one to navigate to its current URL (which is effectively a reload).
// Additions to your service worker code:
self.addEventListener('message', (event) => {
  // Optional: if you need to potentially send different
  // messages, use a different identifier for each.
  if (event.data === 'update') {
    event.waitUntil((async () => {
      // TODO: Move these URLs and cache names into constants.
      const cache = await caches.open('stock_item_balance_v1');
      await cache.addAll([
        '/app/offline_content/purchase/stock_items/stock_items_balance.php',
        '/app/offline_content/purchase/stock_items/js/stock_items_balance.js'
      ]);
      const windowClients = await clients.matchAll();
      for (const windowClient of windowClients) {
        // Optional: check windowClient.url first and
        // only call navigate() if it's the URL for one
        // specific page.
        windowClient.navigate(windowClient.url);
      }
    })());
  }
});
// Additions to your window/page code:
setInterval(() => {
  if (navigator.serviceWorker.controller) {
    navigator.serviceWorker.controller.postMessage('update');
  }
}, 30 * 60 * 1000); // 30 minutes
What won't work
The Cache Storage API is available from both inside a service worker and inside of your page's window scope. Normally what I'd recommend that folks do is to open up the same cache from the window context, and call cache.add() to update the cached entry with the latest from the network. However, calling cache.add() from the window context will cause the network request to be intercepted by your fetch handler, and at that point, your response won't actually come from the network. By calling cache.add() from inside your service worker, you can guarantee that the resulting network request won't trigger your fetch handler.
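For reference, that window-context approach would look roughly like the following, and it's exactly what won't work here, because the network request triggered by cache.add() gets routed through the fetch handler and answered from the cache:
// In the page (window) context. Not usable with the fetch handler above,
// since the request made by cache.add() would be intercepted:
async function updateCacheFromWindow() {
  const cache = await caches.open('stock_item_balance_v1');
  await cache.add('/app/offline_content/purchase/stock_items/stock_items_balance.php');
}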

JS Is it possible for fetch to not wait for response?

I have a logging API which is executed before a link is followed. The link will redirect the user somewhere else, and I'm executing fetch before the user is redirected. So the script looks like this now:
loggingAPI({
  timestamp: moment()
});

window.location = "http://.......com";
The logging API is just a normal fetch wrapper.
However, the server doesn't receive the API request right now. I think that's because the page doesn't even get the chance to send the request to the API before the redirect.
So can I wait for the request to be sent, but not wait for the response?
Using sendBeacon, it's very simple.
Without seeing the code for your loggingAPI function, the following is a best guess.
Note: sendBeacon uses a POST request, so the server side may need to be modified to accept such a request. Though, seeing as your loggingAPI is sending data, I imagine it is already using POST, so this may be a non-issue.
Somewhere in your code, set up an unload event listener on window:
window.addEventListener("unload", () => {
sendBeacon("same url as loggingAPI", JSON.stringify({timestamp: moment()}));
}, false);
Then, when you
window.location = "http://.......com"
the logging request is sent for you by the unload handler.
edit: sorry, I didn't flesh out the code fully, I missed a few steps!!
You can send the request in a service worker.
https://developers.google.com/web/fundamentals/primers/service-workers/
Here's some fetch specific information:
https://developer.mozilla.org/en-US/docs/Web/API/FetchEvent
You would register the service worker, and then send a message to it before redirecting.
The upside to the initial complexity is that once you start using service workers, they open up a whole new world of programming; you will end up using them for much more than queuing up messages to send.
Step 1 Register a service worker
index.html
if ('serviceWorker' in navigator) {
  window.addEventListener('load', function() {
    navigator.serviceWorker.register('service-worker.js').then(function(registration) {
      // Registration was successful
      console.log('ServiceWorker registration successful with scope: ', registration.scope);
    }, function(err) {
      // registration failed :(
      console.log('ServiceWorker registration failed: ', err);
    });
  });
}
Step 2 Create the service worker script
service-worker.js
self.addEventListener('install', function(e) {
  return Promise.resolve(null);
});
Step 3 Create a listener in the service worker script
service-worker.js
self.addEventListener('message', function (event) {
  console.log('message', event.data);
  // call fetch here, catching and responding to what you stashed in the message
});
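As a rough sketch of that last comment, the listener could do something like the following; the '/api/log' endpoint and the payload shape are assumptions, not part of the original answer:
self.addEventListener('message', function (event) {
  // Keep the worker alive until the logging request settles
  event.waitUntil(
    fetch('/api/log', { // assumed logging endpoint
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(event.data)
    }).catch(function (err) {
      console.log('Logging request failed', err);
    })
  );
});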
Step 4 Send the message before you redirect
index.html
Just a demo to simulate your client.
setTimeout(() => {
  navigator.serviceWorker.controller.postMessage({ message: 'A LOG MESSAGE' });
}, 2000);
After you put all the pieces in place, MAKE SURE YOU CLOSE ALL TABS AND REOPEN, or have Chrome DevTools set up to deal with refreshing the worker.
An old question, but if you're using fetch you can use the keepalive flag.
The keepalive option can be used to allow the request to outlive the page. Fetch with the keepalive flag is a replacement for the Navigator.sendBeacon() API.
https://developer.mozilla.org/en-US/docs/Web/API/fetch#keepalive
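For example, reusing the question's snippet (the logging URL here is a placeholder):
fetch("https://example.com/log", {
  method: "POST",
  keepalive: true, // allow the request to outlive the page
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ timestamp: moment() })
});

window.location = "http://.......com";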

Service-workers blocks backbonejs?

I have built a web app using Backbone.js and it has lots of calls to a RESTful service and it works like a charm.
I tried adding a ServiceWorker to cache all the previous calls so they'll be available offline.
What I actually get is that the calls I make for the first time die with this error:
Failed to load resource: net::ERR_FAILED
However, on page reload, I get its cached data.
My service worker's fetch handler:
self.addEventListener('fetch', function(e) {
  // e.respondWith responds to the fetch event
  e.respondWith(
    // Check in cache for the request being made
    caches.match(e.request)
      .then(function(response) {
        // If the request is in the cache
        if ( response ) {
          console.log("[ServiceWorker] Found in Cache", e.request.url, response);
          // Return the cached version
          return response;
        }
        // If the request is NOT in the cache, fetch and cache
        var requestClone = e.request.clone();
        fetch(requestClone)
          .then(function(response) {
            if ( !response ) {
              console.log("[ServiceWorker] No response from fetch ");
              return response;
            }
            var responseClone = response.clone();
            // Open the cache
            caches.open(cacheName).then(function(cache) {
              // Put the fetched response in the cache
              cache.put(e.request, responseClone);
              console.log('[ServiceWorker] New Data Cached', e.request.url);
              // Return the response
              return response;
            }); // end caches.open
            console.log("Response is.. ?", response);
            return response;
          })
          .catch(function(err) {
            console.log('[ServiceWorker] Error Fetching & Caching New Data', err);
          });
      }) // end caches.match(e.request)
  ); // end e.respondWith
});
edit:
I don't think there is a need for any Backbone.js web app code.
I use the fetch method from Backbone.js models and collections.
calls like
https://jsonplaceholder.typicode.com/posts/1
and
https://jsonplaceholder.typicode.com/posts/2
will show this error the first time. After refreshing the page, I do have the data without requesting it, all from the cache.
And all other requests that I still haven't made will keep failing with that error.
I solved it after searching some more.
My Backbone.js views in the web app used to do:
this.listenTo(this.collection,"reset",this.render);
this.listenTo(this.collection,"add",this.addCollectionItem);
this.listenTo(this.collection,"error", this.errorRender);
while my service worker is returning Promises.
I had to change some code in my web app views to something like this:
this.collection.fetch({},{reset:true})
.then(_.bind(this.render, this))
.fail(_.bind(this.errorRender,this))
more or less...
The only problem I see is that when the request is not in the cache, then you do a fetch, but you do not return the result of that fetch to the enclosing then handler. You need to add a return so that you have:
return fetch(requestClone)
.then(function(response) {
None of the data provided by the return statements inside your then handler for the fetch will get transferred up the chain otherwise.
I also see that you do not return the promise provided by caches.open(cacheName).then.... This may be fine if you want to decouple saving a response in the cache from returning a result up the chain, but at the very least I'd put a comment saying that that's what I'm doing here rather than leave it to the reader to figure out whether a return statement is missing by accident, or it was done on purpose.
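Putting both points together, a corrected version of the fetch handler could look roughly like this; the returns are the important part, the rest mirrors the original code:
self.addEventListener('fetch', function(e) {
  e.respondWith(
    caches.match(e.request).then(function(response) {
      if (response) {
        // Serve the cached version
        return response;
      }
      var requestClone = e.request.clone();
      // Return the fetch so its result propagates up the promise chain
      return fetch(requestClone).then(function(networkResponse) {
        if (!networkResponse) {
          return networkResponse;
        }
        var responseClone = networkResponse.clone();
        // Deliberately not returned: caching happens in the background
        caches.open(cacheName).then(function(cache) {
          cache.put(e.request, responseClone);
        });
        return networkResponse;
      });
    })
  );
});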

Service worker offline support with pushstate and client side routing

I'm using a service worker to introduce offline functionality for my single page web app. It's pretty straightforward - use the network when available, or try and fetch from the cache if not:
service-worker.js:
self.addEventListener("fetch", event => {
if(event.request.method !== "GET") {
return;
}
event.respondWith(
fetch(event.request)
.then(networkResponse => {
var responseClone = networkResponse.clone();
if (networkResponse.status == 200) {
caches.open("mycache").then(cache => cache.put(event.request, responseClone));
}
return networkResponse;
})
.catch(_ => {
return caches.match(event.request);
})
)
})
So it intercepts all GET requests and caches them for future use, including the initial page load.
Switching to "offline" in DevTools and refreshing at the root of the application works as expected.
However, my app uses HTML5 pushstate and a client side router. The user could navigate to a new route, then go offline, then hit refresh, and will get a "no internet" message, because the service worker was never told about this new URL.
I can't think of a way around it. As with most SPAs, my server is configured to serve the index.html for a number of catch-all URLs. I need some sort of similar behaviour for the service worker.
Inside your fetch handler, you need to check whether event.request.mode is set to 'navigate'. If so, it's a navigation, and instead of responding with a cached response that matches the specific URL, you can respond with a cached response for your index.html. (Or app-shell.html, or whatever URL you use for the generic HTML for your SPA.)
Your updated fetch handler would look roughly like:
self.addEventListener('fetch', event => {
  if (event.request.method !== 'GET') {
    return;
  }
  if (event.request.mode === 'navigate') {
    event.respondWith(caches.match('index.html'));
    return;
  }
  // The rest of your fetch handler logic goes here.
});
This is a common use case for service workers, and if you'd prefer to use a pre-packaged solution, the NavigationRoute class in the workbox-routing module can automate it for you.
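If you go the Workbox route, the equivalent setup is only a few lines. A minimal sketch, assuming Workbox v5+ module imports and that index.html ends up in your precache manifest:
import { registerRoute, NavigationRoute } from 'workbox-routing';
import { precacheAndRoute, createHandlerBoundToURL } from 'workbox-precaching';

// Precache the app shell (the manifest is injected by the Workbox build step)
precacheAndRoute(self.__WB_MANIFEST);

// Serve index.html for all navigation requests
registerRoute(new NavigationRoute(createHandlerBoundToURL('/index.html')));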
