So, like a thousand times before, I updated my service worker file version and opened the Chrome developer tools to check for errors, and this time the fetches behave in a new way... an ERROR?
The console shows Fetch finished loading: GET "https://www.example.com/favicon.ico" and a few more lines for CSS and images, but the last line of the console log is something I don't understand: Fetch failed loading: GET "https://www.example.com/".
Why does it need to request the domain's top root every time?
I then checked the headers (DevTools - Network - Headers) because the network status shows (failed):
Request URL: https://www.example.com/
Referrer Policy: unsafe-url
There is pretty much no header information at all, and no content.
If I enable navigation preload, there is an extra error in red (The service worker navigation preload request failed with network error: net::ERR_INTERNET_DISCONNECTED), so I have disabled preload for now; the service worker still works despite that error.
I had just updated to PHP 8.0, so maybe that was a factor, but rolling back to the old version changed nothing. Maybe my server started blocking some kind of request, but that seems unlikely; it looks more like a bad request from the Chrome service worker.
Could Chrome be using that last request to check some sort of offline capability? I display an offline page when fetch() throws, in case that has anything to do with it.
Anyway, despite the problems/errors described above, the service worker works like it should.
Here is the SW code:
const OFFLINE_VERSION = 1;
var filevers='xxxx';
const CACHE_NAME = 'offline'+filevers;
// Customize this with a different URL if needed.
const OFFLINE_URL = 'offlineurl.php';
const OFFLINE_URL_ALL = [
'stylesheet'+filevers+'.css',
'offlineurl.php',
'favicon.ico',
'img/logo.png'
].map(url => new Request(url, {credentials: 'include'}));
self.addEventListener('install', (event) => {
event.waitUntil((async () => {
const cache = await caches.open(CACHE_NAME);
// Note: unlike the original offline-page recipe, these requests don't set
// {cache: 'reload'}, so the responses may be fulfilled from the HTTP cache.
await cache.addAll(OFFLINE_URL_ALL);
})());
});
self.addEventListener('activate', (event) => {
event.waitUntil((async () => {
// Enable navigation preload if it's supported.
// See https://developers.google.com/web/updates/2017/02/navigation-preload
//removed for now
})());
// Tell the active service worker to take control of the page immediately.
self.clients.claim();
});
self.addEventListener('fetch', (event) => {
// We only want to call event.respondWith() if this is a navigation request
// for an HTML page.
const destination = event.request.destination;
if (destination == "style" || destination == "script" || destination == "document" || destination == "image" || destination == "font") {
event.respondWith((async () => {
try {
const cache = await caches.open(CACHE_NAME);
const cachedResponse = await cache.match(event.request);
if (cachedResponse) {
return cachedResponse;
} else {
// First, try to use the navigation preload response if it's supported.
//removed for now
const networkResponse = await fetch(event.request);
return networkResponse;
}
} catch (error) {
if (event.request.mode === 'navigate') {
// catch is only triggered if an exception is thrown, which is likely
// due to a network error.
// If fetch() returns a valid HTTP response with a response code in
// the 4xx or 5xx range, the catch() will NOT be called.
const cache = await caches.open(CACHE_NAME);
const cachedResponse = await cache.match(OFFLINE_URL);
return cachedResponse;
}
// Note: non-navigation requests that reach this catch fall through
// without a Response, which the browser reports as a failed fetch.
}
})());
}
});
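For reference, the preload parts marked "removed for now" were basically the standard recipe; roughly this sketch, not my exact code:
self.addEventListener('activate', (event) => {
  event.waitUntil((async () => {
    // Enable navigation preload if it's supported.
    if ('navigationPreload' in self.registration) {
      await self.registration.navigationPreload.enable();
    }
  })());
  self.clients.claim();
});

// ...and in the fetch handler, before falling back to fetch(event.request):
// const preloadResponse = await event.preloadResponse;
// if (preloadResponse) {
//   return preloadResponse;
// }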
Any suggestions as to what might be causing the error?
Related
I need my service worker to monitor fetch requests and include a header value in the response if the request contains a certain path; however, this value isn't known until run time. I can use postMessage() to send the value and receive it in the SW's message listener, but then I can't seem to use it in the fetch listener.
app.js
const customerUrl = "customerUrl.domain.com" //this will come from a setting
const swr = new Worker('/sw.js')
swr.postMessage(customerUrl)
sw.js
let customerDomain = "foo.domain.com"
self.addEventListener("message", (msg) => {
console.log("Service Worker Message Recieved", msg.data) // "customerUrl.domain.com"
customerDomain = msg.data
console.log(customerDomain) // "customerUrl.domain.com" ... awesome
})
self.addEventListener('fetch', evt => {
console.log("Fetch intercepted for " + customerDomain) // "foo.sample.com" but needs to be "customerUrl.domain.com"
if (evt.url.includes("/bar/")) {
// ... modify request and send with customerDomain in header
}
}
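For comparison, here is a sketch of the registration side if sw.js is meant to run as a service worker rather than a dedicated Worker (the message shape is made up):
// app.js (sketch)
const customerUrl = "customerUrl.domain.com"

navigator.serviceWorker.register('/sw.js')
  .then(() => navigator.serviceWorker.ready)
  .then((registration) => {
    // Post the runtime value to the active worker once it is ready.
    registration.active.postMessage({ type: 'SET_CUSTOMER_DOMAIN', domain: customerUrl })
  })
Note that a top-level variable in the service worker only lives as long as the worker itself; if the value has to survive the worker being stopped and restarted, it would need to be persisted (for example in IndexedDB) and re-read in the fetch handler.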
I am building a web extension (Chrome) that checks whether an external API has changed. This should happen periodically (e.g. every 10 minutes) and in the background. The plan is to have a service worker fire these requests and replace the extension's icon when a change in the response is detected. I have working code that does exactly that, but I am unable to persist the service worker and make it run on browser load (i.e. the moment the window opens). I managed to use the message API, but that requires the user to click the extension button to open it, and only then does the extension run continuously in the background.
This is my service worker code:
const browser = chrome || browser;
let communicationPort;
self.addEventListener("message", async (event) => {
if (event.data && event.data.type === 'CHECK_FOR_NEW_RESOURCES') {
communicationPort = event.ports[0];
const compareStructures = setInterval(async () => {
const currentStructure = await getStructure(event.data.baseURL + 'webservice/rest/server.php', event.data.token);
const {curr, newResources } = findDifferences(event.data.structure.base, currentStructure);
if(newResources > 0) {
communicationPort.postMessage({different: true, structure: curr, newResources,
time: new Date().toISOString()});
browser.action.setIcon({ path: { '48': event.data.newResourcesIcon } });
clearInterval(compareStructures);
} else {
communicationPort.postMessage({ different: false, time: new Date().toISOString() });
browser.action.setIcon({ path: { '48': event.data.noNewResourcesIcon } });
}
}, 900000);
}
});
const getStructure = async (url, token) => {
// Uses fetch() to get the resources
...
};
const findDifferences = (newStructure, oldStructure) => {
...
};
If it is not possible, what are the viable options to achieve my desired design? Could reverting to manifest ver. 2 help?
Hopefully my description makes sense, and I think it is possible, as I have seen extensions send notifications when the browser is opened.
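For reference, the pattern usually suggested for periodic work in an MV3 service worker is the chrome.alarms API (it requires the "alarms" permission) together with chrome.runtime.onInstalled/onStartup; this is only a sketch with assumed storage keys and icon names, not working code:
const browser = chrome || browser;

// Re-create the alarm whenever the extension is installed or the browser starts.
const scheduleChecks = () => browser.alarms.create('compare-structures', { periodInMinutes: 10 });
browser.runtime.onInstalled.addListener(scheduleChecks);
browser.runtime.onStartup.addListener(scheduleChecks);

browser.alarms.onAlarm.addListener(async (alarm) => {
  if (alarm.name !== 'compare-structures') return;
  // The worker may have been restarted, so baseURL, token, the last known
  // structure and the icon paths have to come from storage, not from a message.
  const { baseURL, token, structure, icons } = await browser.storage.local.get(
    ['baseURL', 'token', 'structure', 'icons']
  );
  const currentStructure = await getStructure(baseURL + 'webservice/rest/server.php', token);
  const { newResources } = findDifferences(structure.base, currentStructure);
  browser.action.setIcon({
    path: { '48': newResources > 0 ? icons.newResources : icons.noNewResources }
  });
});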
I have a page which communicates to a server every 10 seconds via XHR inside an iframe. I would like to monitor the responses (text/plain UTF-8).
Unlike the DevTools Network list, Puppeteer does not seem to "detect"
XHR responses from inside iframes with its normal page procedure:
page.on('response', async (response) => { ... })
When the ID of the iframe is known, is there any way to receive the XHR responses of the iframe? (with a JS web worker doing the XHR requests)
I have written this example, which shows that requests are indeed captured from inner frames of your page. You have to be careful: requests that fail won't trigger the page.on('response') handler but will trigger page.on('requestfailed').
Also make sure to call await page.setRequestInterception(true) before adding any request related handlers to your page!
Here is a working example (it will only trigger requestfailed due to cross-origin restrictions):
var puppeteer = require('puppeteer')
function simulateRequest() {
function doXmlHTTPRequest() {
function reqListener() {
console.log('xml done.')
console.log(this.responseText);
}
var oReq = new XMLHttpRequest();
oReq.addEventListener("load", reqListener);
oReq.open("GET", "https://www.google.com");
oReq.send();
}
setInterval(doXmlHTTPRequest, 2 * 1000)
}
// Easiest way to get an Iframe is by its name.
const findFrame = (frames, name) => {
return frames.find(f => f.name() === name)
}
const main = async() => {
const browser = await puppeteer.launch({ headless: false });
const page = await browser.newPage()
// have a page with only an iframe.
const pageContent = '<iframe id="myFrame" name="frameName">'
await page.setContent(pageContent)
// important call!
await page.setRequestInterception(true)
page.on('response', async (res) => {
console.log('Page response received..')
})
page.on('requestfailed', () => {
console.log('requestfailed received')
})
page.on('request', async (res) => {
console.log('page request received..')
// make sure to continue the request.
res.continue()
})
// find the iframe!
const targetFrame = findFrame(page.frames(), 'frameName')
// set some test content..
await targetFrame.setContent('<h1>hello</h1>')
// add a script to iframe which simulates requests!
await targetFrame.addScriptTag({
// execute the function immediatly..
content: `(${simulateRequest.toString()})()`
})
}
main()
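If you only care about responses coming from that specific iframe, you can filter on the request's frame inside the response handler; a rough sketch (reusing the frame name from above):
page.on('response', async (res) => {
  const frame = res.request().frame()
  if (frame && frame.name() === 'frameName') {
    // Only responses originating from the iframe end up here.
    const body = await res.text()
    console.log('iframe response:', body)
  }
})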
I'm playing around with service workers. The following code should proxy JS files to patch the imports so that they conform to platform standards (i.e., "./", "../", "/", or "http://...").
It works great in Chromium (67.0.3396.79 on Arch Linux). It also seems to work in Firefox (60.0.2, 64-bit, on Arch): from the network tab I can see all of the patched sources loading, but for some reason the JS modules aren't running; I can't console.log anything. I'm not sure how to get Firefox to bootstrap the application.
I noticed that the fetch headers are all lower-cased, but I read up on that here, and Mozilla also points out here that header names are case-insensitive.
I also wondered whether a changed content-length meant the file wasn't being received completely, but I didn't see any parse errors, and the network tab showed the correct content-length changes, so I ruled that out.
const maybeAppendJS = (x) =>
x.endsWith(".js")
? x
: `${x}.js`;
const maybePatchURL = (x) =>
x.match(/(^'#.*'(.)?$)|(^"#.*"(.)?$)/)
? `"/node_modules/${maybeAppendJS(eval(x))}";`
: x;
const maybePatchImport = (x) =>
x.startsWith("import ")
? x.split(/\s/).map(maybePatchURL).join(" ")
: x;
async function maybeRewriteImportStatements(event) {
let candidate = event.request.url;
const url = maybeAppendJS(candidate);
const resp = await fetch(url);
if (!resp.headers.get("content-type").startsWith("text")) {
const text = await resp.text();
const newText = text.split(/\n/g)
.map(maybePatchImport)
.join("\n");
return new Response(newText, {headers: resp.headers});
}
if (resp.headers.get("content-type").startsWith("text/")) {
const location = `${url.substring(0, url.length - 3)}/index.js`;
return new Response(null, {status: 302, headers: {location: location}});
}
console.log("Service worker should never get here");
}
this.addEventListener('fetch', (event) => {
if (event.request.destination === "script" || event.request.referrer.endsWith(".js") || event.request.url.endsWith(".js")) {
event.respondWith(maybeRewriteImportStatements(event));
}
});
This was fixed by upgrading to the Firefox nightly (62.0a1.20180611-1).
What cache strategies are you using? I read the Offline Cookbook, and the simplest strategy to use is to cache static content and leave out the API calls.
The strategy goes something like this:
1. Check whether the request is already in the cache
2. If not, add the request/response pair to the cache
3. Return the response
How do I update the cache when files have changed on the server side? Currently the clients always get the cached results.
Here is my cache strategy's code:
// You will need this polyfill, at least on Chrome 41 and older.
importScripts("serviceworker-cache-polyfill.js");
var VERSION = 1;
var CACHES = {
common: "common-cache" + VERSION
};
// an array of file locations we want to cache
var filesToCache = [
"font-cache.html",
"script.js",
];
var neededFiles = [
"index.html"
];
var errorResponse = function() {
return new Response([
"<h2>Failed to get file</h2>",
"<p>Could not retrieve response from cache</p>"
].join("\n"), {
status: 500,
headers: { "Content-Type": "text/html" }
});
};
var networkFetch = function(request) {
return fetch(request).then(function(response) {
// Clone the response: one copy goes into the cache, the original is returned to the page.
var responseToCache = response.clone();
caches.open(CACHES["common"]).then(function(cache) {
cache.put(request, responseToCache);
});
return response;
}).catch(function() {
console.error("Network fetch failed");
return errorResponse();
});
};
this.addEventListener("install", function(evt) {
evt.waitUntil(
caches.open(CACHES["common"]).then(function(cache) {
// Cache before
cache.addAll(filesToCache);
return cache.addAll(neededFiles);
})
);
});
this.addEventListener("activate", function(event) {
var expectedCacheNames = Object.keys(CACHES).map(function(key) {
return CACHES[key];
});
console.log("Activate the worker");
// Active worker won't be treated as activated until promise resolves successfully.
event.waitUntil(
caches.keys().then(function(cacheNames) {
return Promise.all(
cacheNames.map(function(cacheName) {
if (expectedCacheNames.indexOf(cacheName) === -1) {
console.log("Deleting out of date cache:", cacheName);
return caches.delete(cacheName);
}
})
);
})
);
});
self.addEventListener("fetch", function(event) {
console.log("Handling fetch event for", event.request.url);
event.respondWith(
// Opens Cache objects
caches.open(CACHES["common"]).then(function(cache) {
return cache.match(event.request).then(function(response) {
if (response) {
console.log("Found response in cache", response);
return response;
} else {
return networkFetch(event.request);
}
}).catch(function(error) {
// Handles exceptions that arise from match() or fetch().
console.error("Error in fetch handler:", error);
return errorResponse();
});
})
);
});
You may want to have a look at Jeff Posnick's excellent solution, sw-precache.
The strategy used there is:
1. Gulp generates the Service Worker file with checksums of the cached files
2. The Service Worker is registered (with its own checksum)
3. If files were added or updated, the SW file changes
4. On the next visit, the SW detects that its checksum differs, so it registers itself once again with the updated files
You can automate this flow with your backend in any way you want :)
He describes it much better in this article.
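The same idea can be sketched without the tooling; this is only a minimal illustration (the version string and file list are made up), not what sw-precache actually generates:
// sw.js - bump REVISION whenever any precached file changes; the byte
// difference makes the browser treat this worker as a new version.
var REVISION = "build-001"; // e.g. a checksum injected by a build step
var CACHE_NAME = "precache-" + REVISION;
var PRECACHE_URLS = ["index.html", "script.js", "font-cache.html"];

self.addEventListener("install", function(event) {
  event.waitUntil(
    caches.open(CACHE_NAME).then(function(cache) {
      return cache.addAll(PRECACHE_URLS);
    })
  );
});

self.addEventListener("activate", function(event) {
  // Delete caches from older revisions so clients stop getting stale files.
  event.waitUntil(
    caches.keys().then(function(keys) {
      return Promise.all(keys.filter(function(key) {
        return key !== CACHE_NAME;
      }).map(function(key) {
        return caches.delete(key);
      }));
    })
  );
});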
This is the code I use to cache. It fetches the resource, caches it, and serves it; if the network fails, it falls back to the cache.
this.addEventListener("fetch", function(event) {
event.respondWith(
fetch(event.request).then(function(response) {
return caches.open("1").then(function(cache) {
return cache.put(event.request, response.clone()).then(function() {
return response
})
})
}).catch(function() {
return caches.match(event.request)
})
)
})
You have to change your Service Worker file. According to Introduction to Service Worker:
When the user navigates to your site, the browser tries to redownload the script file that defined the service worker in the background. If there is even a byte's difference in the service worker file compared to what it currently has, it considers it 'new'.
So even if you only need to change static resources, you'll have to update your service worker file so that a new service worker is registered that updates the cache. (You'll want to make sure to delete any previous caches as well in your activate handler.) @Karol Klepacki's answer suggests a way to automate this.
Alternatively, you could implement logic in your service worker itself to periodically check cached resources for changes and update the entries appropriately.
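For the second approach, a stale-while-revalidate style handler is one way to do it; a rough sketch (the cache name is just an example):
self.addEventListener("fetch", function(event) {
  event.respondWith(
    caches.open("common-cache").then(function(cache) {
      return cache.match(event.request).then(function(cached) {
        // Always revalidate in the background; fall back to the cached copy if the network fails.
        var refresh = fetch(event.request).then(function(response) {
          cache.put(event.request, response.clone());
          return response;
        }).catch(function() {
          return cached;
        });
        // Keep the worker alive until the background refresh has finished.
        event.waitUntil(refresh);
        // Serve the cached copy immediately if there is one; otherwise wait for the network.
        return cached || refresh;
      });
    })
  );
});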