Progressive Web App doesn't fetch my cached files - javascript

I am trying to turn my weather app into a PWA, and I want to show an offline page if the user loses the connection.
I've managed to put the HTML and related resources (like scripts or SVGs) into the browser's cache, but when I go offline, only the HTML page loads, not the other assets...
Here are the files that are in the cache:
And here are the errors that occur in the console and in the network tab when I go offline:
As you can see, the only things that load are the KUTE.js library (which doesn't work even though it is apparently loaded?), which comes from a CDN, and the resources imported by the CSS (the CSS itself is inlined in my HTML page).
--- If you are wondering what the "en" file is: I made a translation system with Express, EJS, and cookies, so when you go to /en or /fr in the URL, the page is translated into English or French. ---
Finally, here is the code of my service worker :
const OFFLINE_VERSION = 1;
const CACHE_NAME = "offline";
const OFFLINE_URL = "offline.html";
const BASE = location.protocol + "//" + location.host;
const CACHED_FILES = [
  "https://cdn.jsdelivr.net/npm/kute.js@2.1.2/dist/kute.min.js",
  `${BASE}/src/favicon/favicon.ico`,
  `${BASE}/src/favicon/android-chrome-192x192.png`,
  `${BASE}/src/favicon/android-chrome-512x512.png`,
  `${BASE}/src/favicon/apple-touch-icon.png`,
  `${BASE}/src/favicon/favicon-16x16.png`,
  `${BASE}/src/favicon/favicon-32x32.png`,
  `${BASE}/src/svg/layered-waves.svg`,
  `${BASE}/js/background.js`,
  `${BASE}/js/animation-blob.js`
];
self.addEventListener('install', (event) => {
  event.waitUntil((async () => {
    const cache = await caches.open(CACHE_NAME);
    await Promise.all(
      [...CACHED_FILES, OFFLINE_URL].map((path) => {
        return cache.add(new Request(path, {cache: "reload"}));
      })
    );
  })());
  self.skipWaiting();
});
self.addEventListener('activate', (event) => {
  event.waitUntil((async () => {
    if ("navigationPreload" in self.registration) {
      await self.registration.navigationPreload.enable();
    }
  })());
  self.clients.claim();
});
self.addEventListener('fetch', (event) => {
  if (event.request.mode === "navigate") {
    event.respondWith((async () => {
      try {
        const preloadResponse = await event.preloadResponse;
        if (preloadResponse) {
          return preloadResponse;
        }
        return await fetch(event.request);
      } catch (e) {
        const cache = await caches.open(CACHE_NAME);
        return await cache.match(OFFLINE_URL);
      }
    })());
  }
});
It's the "regular" code for creating an offline page, except that I add multiple files to the cache.
So do you know why I can't fetch my other cached files?
Thank you in advance!

This function acts as a proxy that decides whether it should fetch data from the cache or from the network:
self.addEventListener('fetch', (event) => {
  if (event.request.mode === "navigate") {
    event.respondWith((async () => {
      try {
        const preloadResponse = await event.preloadResponse;
        if (preloadResponse) {
          return preloadResponse;
        }
        return await fetch(event.request);
      } catch (e) {
        const cache = await caches.open(CACHE_NAME);
        return await cache.match(OFFLINE_URL);
      }
    })());
  }
});
But this line, if (event.request.mode === "navigate"), restricts the special behavior to navigation requests (when the browser loads a new page). So you need a minor change to make it work; you can try something like this:
self.addEventListener('fetch', (event) => {
  event.respondWith((async () => {
    try {
      const preloadResponse = await event.preloadResponse;
      if (preloadResponse) {
        return preloadResponse;
      }
      return await fetch(event.request);
    } catch (e) {
      const cache = await caches.open(CACHE_NAME);
      return await cache.match(event.request);
    }
  })());
});
This should normally make it work, but note that every request, not just navigations, will now pass through the service worker.
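If you want to keep the offline page for navigations and still serve the other cached assets, the two branches can be combined. Below is a sketch of that logic, pulled out into a plain function with the cache and fetch dependencies injected (my own refactoring, so the flow can be exercised outside a real service worker); `CACHE_NAME` and `OFFLINE_URL` are assumed to be the constants from the question:

```javascript
// Network-first with cache fallback: navigations fall back to the offline
// page, all other requests fall back to their cached copy (if any).
// Dependencies are injected so the flow can be tested without a browser.
async function respondWithFallback(request, deps) {
  const { preload, fetchFn, openCache, offlineUrl } = deps;
  try {
    const preloaded = await preload; // navigation preload, if enabled
    if (preloaded) return preloaded;
    return await fetchFn(request);
  } catch (e) {
    const cache = await openCache();
    if (request.mode === "navigate") {
      return cache.match(offlineUrl);
    }
    return cache.match(request);
  }
}

// Wiring it into the service worker would look like:
// self.addEventListener('fetch', (event) => {
//   event.respondWith(respondWithFallback(event.request, {
//     preload: event.preloadResponse,
//     fetchFn: fetch,
//     openCache: () => caches.open(CACHE_NAME),
//     offlineUrl: OFFLINE_URL,
//   }));
// });
```

This keeps the original navigate behavior intact while giving scripts, SVGs, and favicons a cache fallback when the network is down.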

Related

What's the correct way to write a fetch event listener in a service worker for my PWA?

I'm writing a progressive web app and cannot figure out how to properly implement a fetch event listener that serves cached files if they are already stored, or fetches the file from the network if they are not in the cache. I believe this is the cache-first approach.
self.addEventListener('fetch', (e) => {
  console.log(e.request.url);
  e.respondWith(
    caches.match(e.request).then((response) => response || fetch(e.request)),
  );
});
const assets = ["/", "styles.css", "app.js", "sw-register.js"];
self.addEventListener("install", event => {
  event.waitUntil(
    caches.open("assets").then(cache => {
      cache.addAll(assets);
    })
  );
});
// Stale while revalidate strategy
self.addEventListener('fetch', event => {
  event.respondWith(
    caches.match(event.request)
      .then(cachedResponse => {
        const fetchPromise = fetch(event.request).then(
          networkResponse => {
            caches.open("assets").then(cache => {
              cache.put(event.request, networkResponse.clone());
              return networkResponse;
            });
          });
        return cachedResponse || fetchPromise; // cached or a network fetch
      })
  );
});
// cache first strategy
// self.addEventListener("fetch", event => {
//   event.respondWith(
//     caches.match(event.request)
//       .then(response => {
//         if (response) {
//           // The request is in the cache
//           return response;
//         } else {
//           return fetch(event.request);
//         }
//       })
//   );
// });
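One subtle bug worth flagging in the stale-while-revalidate snippet above: the outer `.then` never returns `networkResponse` (the `return` sits inside the inner `caches.open` callback), so on a cache miss `fetchPromise` resolves to `undefined`. Below is a sketch of the corrected flow, written as a plain function with injected cache/fetch stubs (my own restructuring) so it can be checked outside a browser:

```javascript
// Stale-while-revalidate: serve the cached copy immediately when present,
// and refresh the cache from the network in the background either way.
async function staleWhileRevalidate(request, { cacheMatch, cachePut, fetchFn }) {
  const cached = await cacheMatch(request);
  const networkPromise = fetchFn(request).then(async (response) => {
    // A Response body can only be read once, so cache a clone
    await cachePut(request, typeof response.clone === "function" ? response.clone() : response);
    return response; // the return the original snippet was missing
  });
  // Cached copy wins if present; otherwise wait for the network
  return cached || networkPromise;
}
```

In the service worker this would be wired up with `cacheMatch: (r) => caches.match(r)`, `cachePut` opening the `"assets"` cache, and `fetchFn: fetch`.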

Http requests being dropped in Chrome Extension

Summary:
I've built a chrome extension that reaches out to external API to fetch some data. Sometimes that data returns quickly, sometimes it takes 4 seconds or so. I'm often doing about 5-10 in rapid succession (this is a scraping tool).
Previously, a lot of requests were dropped because the Manifest V3 service worker randomly shuts down. I thought I had resolved that. Then I realized there was a race condition because local storage doesn't have a proper queue.
Current Error - Even with all these fixes, requests are still being dropped. The external API returns the correct data successfully, but it seems like the extension never gets it. Hoping someone can point me in the right direction.
Relevant code attached, I imagine it will help someone dealing with these queue and service worker issues.
Local Storage queue
let writing: Map<string, Promise<any>> = new Map();
let updateUnsynchronized = async (ks: string[], f: Function) => {
  let m = await new Promise((resolve, reject) => {
    chrome.storage.local.get(ks, res => {
      let m = {};
      for (let k of ks) {
        m[k] = res[k];
      }
      maybeResolveLocalStorage(resolve, reject, m);
    });
  });
  // Guaranteed to have not changed in the meantime
  let updated = await new Promise((resolve, reject) => {
    let updateMap = f(m);
    chrome.storage.local.set(updateMap, () => {
      maybeResolveLocalStorage(resolve, reject, updateMap);
    });
  });
  console.log(ks, 'Updated', updated);
  return updated;
};
export async function update(ks: string[], f: Function) {
  let ret = null;
  // Global lock for now
  await navigator.locks.request('global-storage-lock', async lock => {
    ret = await updateUnsynchronized(ks, f);
  });
  return ret;
}
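As an aside, the serialization that navigator.locks.request provides here can be sketched with a plain promise-chain mutex. This is a generic pattern (my own sketch, not the extension's code), useful where the Web Locks API isn't available:

```javascript
// Minimal async mutex: callers are serialized in arrival order by
// chaining each one onto the previous call's promise, mimicking
// navigator.locks.request with a single named lock.
function createLock() {
  let tail = Promise.resolve();
  return function withLock(fn) {
    const run = tail.then(() => fn());
    // Keep the chain alive even if fn rejects
    tail = run.catch(() => {});
    return run;
  };
}
```

With `const withLock = createLock()`, every `withLock(async () => { ... })` call runs only after the previous one has settled, which is the property the global storage lock relies on.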
Here's the main function
export async function appendStoredScrapes(
  scrape: any,
  fromHTTPResponse: boolean
) {
  let updated = await update(['urlType', 'scrapes'], storage => {
    const urlType = storage.urlType;
    const scrapes = storage.scrapes;
    const {url} = scrape;
    if (fromHTTPResponse) {
      // We want to make sure that the url type at time of scrape, not time of return, is used
      scrapes[url] = {...scrapes[url], ...scrape};
    } else {
      scrapes[url] = {...scrapes[url], ...scrape, urlType};
    }
    return {scrapes};
  });
  chrome.action.setBadgeText({text: `${Object.keys(updated['scrapes']).length}`});
}
Keeping the service worker alive
let defaultKeepAliveInterval = 20000;
// To avoid GC
let channel;
// To be run in content scripts
export function contentKeepAlive(name: string) {
  channel = chrome.runtime.connect({ name });
  channel.onDisconnect.addListener(() => contentKeepAlive(name));
  channel.onMessage.addListener(msg => { });
}
let deleteTimer = (chan: any) => {
  if (chan._timer) {
    clearTimeout(chan._timer);
    delete chan._timer;
  }
}
let backgroundForceReconnect = (chan: chrome.runtime.Port) => {
  deleteTimer(chan);
  chan.disconnect();
}
// To be run in background scripts
export function backgroundKeepAlive(name: string) {
  chrome.runtime.onConnect.addListener(chan => {
    if (chan.name === name) {
      channel = chan;
      channel.onMessage.addListener((msg, chan) => { });
      channel.onDisconnect.addListener(deleteTimer);
      channel._timer = setTimeout(backgroundForceReconnect, defaultKeepAliveInterval, channel);
    }
  });
}
// "Always call sendResponse() in your chrome.runtime.onMessage listener even if you don't need
// the response. This is a bug in MV3." — https://stackoverflow.com/questions/66618136/persistent-service-worker-in-chrome-extension
export function defaultSendResponse(sendResponse: Function) {
  sendResponse({ farewell: 'goodbye' });
}
Relevant parts of background.ts
backgroundKeepAlive('extension-background');
let listen = async (request, sender, sendResponse) => {
  try {
    if (request.message === 'SEND_URL_DETAIL') {
      const {url, website, urlType} = request;
      await appendStoredScrapes({url}, false);
      let data = await fetchPageData(url, website, urlType);
      console.log(data, url, 'fetch data returned background');
      await appendStoredScrapes(data, true);
      defaultSendResponse(sendResponse);
    } else if (request.message === 'KEEPALIVE') {
      sendResponse({isAlive: true});
    } else {
      defaultSendResponse(sendResponse);
    }
  } catch (e) {
    console.error('background listener error', e);
  }
};
chrome.runtime.onMessage.addListener(function (request, sender, sendResponse) {
  listen(request, sender, sendResponse);
  // Return true so Chrome keeps the message channel open for the async
  // sendResponse; without it, responses sent after an await are dropped.
  return true;
});

Firefox service worker stops CSS images from showing if using JavaScript redirect

The service worker stops CSS images from showing when I use location.href (a JavaScript redirect) in Firefox; I have noticed some smaller loading problems in Safari too after other redirects, while Chrome works correctly.
Anyway, in Firefox 100.0.2 (64-bit, Windows 8.1), if you visit my site for the first time and I use location.href (after the onload event) to redirect you to the right place based on your language preference, none of the site's CSS images appear at all, even after a refresh: the design is pretty blank, img tags are loading, but there are no CSS backgrounds. And if I activate the service worker after the redirect, only part of the CSS images load.
Firefox's network panel shows the image as 200, loaded from the service worker, but with a blank response, and the CSS images still don't appear.
Can anyone help me with this? I have no idea why Firefox stops showing CSS backgrounds; is it something with fetch?
All the code:
//test.js
$(document).ready(function() {
  if (document.getElementById("xxx")) {
    location.href = "https://www.example.com/?xxx";
  }
});
//swin.js
if ('serviceWorker' in navigator) {
  window.addEventListener('load', function() {
    navigator.serviceWorker.register('/sw.js').then(function(registration) {
      // Registration was successful
    }, function(err) {
      // registration failed :(
    });
  });
}
//sw.js
var filevers = '1';
const OFFLINE_VERSION = filevers;
const CACHE_NAME = 'C' + filevers;
const expectedCaches = ['C' + filevers];
const OFFLINE_URL = 'offline.php';
const OFFLINE_URL_ALL = [
  'offline.php',
  '/.'
].map(url => new Request(url, {credentials: 'include'}));
self.addEventListener('install', (event) => {
  self.skipWaiting();
  event.waitUntil((async () => {
    const cache = await caches.open(CACHE_NAME);
    await cache.addAll(OFFLINE_URL_ALL);
  })());
});
self.addEventListener('activate', (event) => {
  event.waitUntil(
    caches.keys().then(keys => Promise.all(
      keys.map(key => {
        if (!expectedCaches.includes(key)) {
          return caches.delete(key);
        }
      })
    )).then(() => {
    })
  );
  self.clients.claim();
});
self.addEventListener('fetch', (event) => {
  const destination = event.request.destination;
  if (destination == "style" || destination == "script" || destination == "document" || destination == "image" || destination == "font") {
    event.respondWith((async () => {
      try {
        const cache = await caches.open(CACHE_NAME);
        const cachedResponse = await cache.match(event.request);
        if (cachedResponse) {
          return cachedResponse;
        } else {
          const networkResponse = await fetch(event.request);
          return networkResponse;
        }
      } catch (error) {
        if (event.request.mode === 'navigate') {
          const cache = await caches.open(CACHE_NAME);
          const cachedResponse = await cache.match(OFFLINE_URL);
          return cachedResponse;
        }
      }
    })());
  }
});
Current Conclusion:
Tested on 2 different sites and 2 different PCs; at least some of the CSS images are not loaded. It seems to be a Firefox bug. Can anyone test this too?
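One thing worth double-checking in the fetch handler above: when the catch block runs for a non-navigate request (a failing CSS background image, say), nothing is returned, so respondWith() resolves with undefined and the request fails with an empty response. Whether or not that is the Firefox issue, a fallback path that always produces a Response is safer. A sketch (my own restructuring; it takes the cache-lookup function as a parameter so it can be exercised with stubs, and assumes the same OFFLINE_URL constant):

```javascript
// Always return a Response from the fallback path, so respondWith()
// never resolves with undefined.
async function fallbackResponse(request, matchCache, offlineUrl) {
  if (request.mode === "navigate") {
    const offline = await matchCache(offlineUrl);
    if (offline) return offline;
  } else {
    const cached = await matchCache(request);
    if (cached) return cached;
  }
  // Last resort: an explicit network-error Response instead of undefined
  return Response.error();
}

// In the worker's catch block it would be used as:
//   return fallbackResponse(event.request, (r) => cache.match(r), OFFLINE_URL);
```

Returning `Response.error()` makes the failure explicit and visible in devtools rather than surfacing as a blank 200.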

DOMException: The operation is insecure during cache adding or retrieving

I want to add caching to my site. I am using the following two functions to add and get the cache. It works fine during development, but when I use Docker to build the app, the caching stops working and gives me the following error: Uncaught (in promise) DOMException: The operation is insecure.
Image showing the error
Here is my code.
export const addDataIntoCache = (cacheName, url, response) => {
  // Converting our response into an actual Response object
  const data = new Response(JSON.stringify(response));
  if ('caches' in window) {
    // Opening the given cache and putting our data into it
    caches.open(cacheName).then((cache) => {
      cache.put(url, data);
      console.log('Data added into cache!');
    });
  } else {
    console.log("not able to find caches in window");
    caches.open(cacheName).then((cache) => {
      cache.put(url, data);
      console.log('Data added into cache!');
    });
  }
};
export const getSingleCacheData = async (cacheName, url) => {
  if (typeof caches === 'undefined') {
    console.log("cache is undefined");
    return null;
  }
  const cacheStorage = await caches.open(cacheName);
  const cachedResponse = await cacheStorage.match(url);
  // If no cache exists
  if (!cachedResponse || !cachedResponse.ok) {
    console.log('Fetch failed!');
    return null;
  }
  return cachedResponse.json().then((item) => {
    console.log("fetched from cache");
    console.log(item);
    return item;
  });
};
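For what it's worth, the Cache API is only exposed in secure contexts (HTTPS or localhost), and "The operation is insecure" is the DOMException Firefox raises when storage APIs are used from an insecure context, so it's worth checking whether the dockerized app is being served over plain HTTP. A defensive variant of the add function might look like this (a sketch; the isSecureContext guard and the boolean return value are my additions):

```javascript
// The Cache API only exists in secure contexts (HTTPS or localhost).
// Guard before using it and report availability to the caller.
const addDataIntoCache = async (cacheName, url, response) => {
  const secure = typeof window === "undefined" || window.isSecureContext;
  if (typeof caches === "undefined" || !secure) {
    console.log("Cache API unavailable (insecure context or unsupported browser)");
    return false;
  }
  const data = new Response(JSON.stringify(response));
  const cache = await caches.open(cacheName);
  await cache.put(url, data);
  return true;
};
```

Callers can then fall back to in-memory or localStorage caching when the function returns false instead of crashing with an uncaught DOMException.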

Puppeteer - Wait for network requests to complete after page.select()

Is there a way to wait for network requests to resolve after performing an action on a page, before performing a new action in Puppeteer?
I need to interact with a select menu on the page using page.select() which causes dynamic images and fonts to load into the page. I need to wait for these requests to complete before executing the next action.
--
Caveats:
I cannot reload the page or go to a new url.
I do not know what the request types might be, or how many
--
// launch puppeteer
const browser = await puppeteer.launch({});
// load new page
const page = await browser.newPage();
// go to URL and wait for initial requests to resolve
await page.goto(pageUrl, {
  waitUntil: "networkidle0"
});
// START LOOP
for (let value of lotsOfValues) {
  // interact with select menu
  await page.select('select', value);
  // wait for network requests to complete (images, fonts)
  ??
  // screenshot page with new content
  await pageElement.screenshot({
    type: "jpeg",
    quality: 100
  });
} // END LOOP
// close
await browser.close();
The answer to this lies in using page.setRequestInterception(true) and monitoring subsequent requests, waiting for them to resolve before moving on to the next task (thanks @Guarev for the pointer in the right direction).
This module (https://github.com/jtassin/pending-xhr-puppeteer) does exactly that, but for XHR requests. I modified it to look for 'image' and 'font' types.
Final code looks something like this:
// launch puppeteer
const browser = await puppeteer.launch({});
// load new page
const page = await browser.newPage();
// go to URL and wait for initial requests to resolve
await page.goto(pageUrl, {
  waitUntil: "networkidle0"
});
// enable this here because we don't want to watch the initial page asset requests (which page.goto above triggers)
await page.setRequestInterception(true);
// custom version of pending-xhr-puppeteer module
let monitorRequests = new PuppeteerNetworkMonitor(page);
// START LOOP
for (let value of lotsOfValues) {
  // interact with select menu
  await page.select('select', value);
  // wait for network requests to complete (images, fonts)
  await monitorRequests.waitForAllRequests();
  // screenshot page with new content
  await pageElement.screenshot({
    type: "jpeg",
    quality: 100
  });
} // END LOOP
// close
await browser.close();
NPM Module
class PuppeteerNetworkMonitor {
  constructor(page) {
    this.promises = [];
    this.page = page;
    this.resourceType = ['image'];
    this.pendingRequests = new Set();
    this.finishedRequestsWithSuccess = new Set();
    this.finishedRequestsWithErrors = new Set();
    page.on('request', (request) => {
      request.continue();
      if (this.resourceType.includes(request.resourceType())) {
        this.pendingRequests.add(request);
        this.promises.push(
          new Promise(resolve => {
            request.resolver = resolve;
          }),
        );
      }
    });
    page.on('requestfailed', (request) => {
      if (this.resourceType.includes(request.resourceType())) {
        this.pendingRequests.delete(request);
        this.finishedRequestsWithErrors.add(request);
        if (request.resolver) {
          request.resolver();
          delete request.resolver;
        }
      }
    });
    page.on('requestfinished', (request) => {
      if (this.resourceType.includes(request.resourceType())) {
        this.pendingRequests.delete(request);
        this.finishedRequestsWithSuccess.add(request);
        if (request.resolver) {
          request.resolver();
          delete request.resolver;
        }
      }
    });
  }
  async waitForAllRequests() {
    if (this.pendingRequestCount() === 0) {
      return;
    }
    await Promise.all(this.promises);
  }
  pendingRequestCount() {
    return this.pendingRequests.size;
  }
}
module.exports = PuppeteerNetworkMonitor;
For anyone still interested in the solution @danlong posted above but who wants it in a more modern form, here is the TypeScript version:
import { HTTPRequest, Page, ResourceType } from "puppeteer";

export class PuppeteerNetworkMonitor {
  page: Page;
  resourceType: ResourceType[] = [];
  promises: Promise<unknown>[] = [];
  pendingRequests = new Set();
  finishedRequestsWithSuccess = new Set();
  finishedRequestsWithErrors = new Set();

  constructor(page: Page, resourceType: ResourceType[]) {
    this.page = page;
    this.resourceType = resourceType;
    this.finishedRequestsWithSuccess = new Set();
    this.finishedRequestsWithErrors = new Set();
    page.on(
      "request",
      async (
        request: HTTPRequest & { resolver?: (value?: unknown) => void },
      ) => {
        await request.continue();
        if (this.resourceType.includes(request.resourceType())) {
          this.pendingRequests.add(request);
          this.promises.push(
            new Promise((resolve) => {
              request.resolver = resolve;
            }),
          );
        }
      },
    );
    page.on(
      "requestfailed",
      (request: HTTPRequest & { resolver?: (value?: unknown) => void }) => {
        if (this.resourceType.includes(request.resourceType())) {
          this.pendingRequests.delete(request);
          this.finishedRequestsWithErrors.add(request);
          if (request.resolver) {
            request.resolver();
            delete request.resolver;
          }
        }
      },
    );
    page.on(
      "requestfinished",
      (request: HTTPRequest & { resolver?: (value?: unknown) => void }) => {
        if (this.resourceType.includes(request.resourceType())) {
          this.pendingRequests.delete(request);
          this.finishedRequestsWithSuccess.add(request);
          if (request.resolver) {
            request.resolver();
            delete request.resolver;
          }
        }
      },
    );
  }

  async waitForAllRequests() {
    if (this.pendingRequestCount() === 0) {
      return;
    }
    await Promise.all(this.promises);
  }

  pendingRequestCount() {
    return this.pendingRequests.size;
  }
}
I did change one thing, where instead of hard-coding what resource type to look for in the network requests, I am passing the resource types to look for as one of the constructor arguments. That should make this class more generic.
I've tested this code with my API that uses Puppeteer, and it works great.
For the usage of this class, it would be similar to what @danlong posted above:
// other necessary puppeteer code here...
const monitorNetworkRequests = new PuppeteerNetworkMonitor(page, ["image"]);
await monitorNetworkRequests.waitForAllRequests();
