Serviceworker event.respondWith network error - javascript

I'm running my radio site and just created service-worker.js, and it works, but only the first time I visit the site: the HTML5 audio plays, but after that, unless I clear cookies and site data, it won't play and this error appears in the console (I always have to clear site data if I want to hear music):
Failed to load ‘http://myradio.com:8000/radio’. A ServiceWorker passed
a promise to FetchEvent.respondWith() that rejected with ‘TypeError:
NetworkError when attempting to fetch resource.’.
Service-worker.js
var cacheName = 'Myradio';
var filesToCache = [
  '/',
  '/index.php',
  '/assets/css/all.min.css',
  '/assets/js/jquery.min.js',
  '/assets/js/jquery.ui.touch-punch.min.js',
  '/main.js'
];

/* Start the service worker and cache all of the app's content */
self.addEventListener('install', function(e) {
  e.waitUntil(
    caches.open(cacheName).then(function(cache) {
      return cache.addAll(filesToCache);
    })
  );
});

/* Serve cached content when offline */
self.addEventListener('fetch', function(e) {
  e.respondWith(
    caches.match(e.request).then(function(response) {
      return response || fetch(e.request);
    })
  );
});
HTML
<audio id="myAudio" preload="metadata">
  <source src="http://myradio.com:8000/radio" />
</audio>

Service workers require a secure origin served over "https". I assume your site has that, since you can register your service worker.
The problem arises when you make the request to "http://myradio.com:8000/radio". This is an "http:" URL, not a secure origin. When you do this in a page you get a "mixed content" warning in the browser UI.
Mixed content, however, is not permitted at all in a service worker. Making a fetch() call to an "http:" URL from a service worker script will return a NetworkError.
There are two solutions to your problem:
1. Host the radio stream on https.
2. Check for the http: URL before calling respondWith() and return early from the fetch handler.
Option (2) is probably less work here. Something like:
/* Serve cached content when offline */
self.addEventListener('fetch', function(e) {
  // we cannot fetch mixed content from a service worker, so return early
  if (e.request.url.startsWith('http:')) {
    return;
  }
  e.respondWith(
    caches.match(e.request).then(function(response) {
      return response || fetch(e.request);
    })
  );
});

Related

Firefox "A ServiceWorker passed a promise to FetchEvent.respondWith() that resolved with non-Response value ‘undefined’" for Local Server

When running a local server from Visual Studio, Firefox errors out and does not load the page correctly. It shows this error:
Failed to load ‘https://localhost/js/mqtt.js’. A ServiceWorker passed a promise to FetchEvent.respondWith() that resolved with non-Response value ‘undefined’. serviceworker.js:19:10
It works fine in Chrome. It also works fine in Firefox when the server is a cloud-based Azure server. Here is the code for the service worker:
// Service worker file for PWA
var CACHE_NAME = 'v5';
var urlsToCache = [
  '/index.html'
];

self.addEventListener('install', function (event) {
  // Perform install steps
  event.waitUntil(
    caches.open(CACHE_NAME)
      .then(function (cache) {
        console.log('Opened cache');
        return cache.addAll(urlsToCache);
      })
  );
});

self.addEventListener('fetch', event => {
  event.respondWith(
    fetch(event.request).then(response => {
      cache.put(event.request, response.clone());
      return response;
    }).catch(_ => {
      return caches.match(event.request);
    })
  )
});
I am unsure what causes this error. One workaround is to go to "about:debugging#workers" and unregister the service workers manually; refreshing the page then allows it to load correctly. However, I need a solution, not a workaround.
Based on that code, I'd expect to see that error message if the following two conditions are true:
fetch(event.request) rejects.
caches.match(event.request) results in a cache miss, which causes the promise to resolve with undefined.
This isn't really out of the ordinary—it's just the logic you've written in your service worker, and the behavior depends on both the current network/server conditions, as well as the state of your local cache.
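If you want to guarantee that respondWith() always ends up with a real Response even on a cache miss, you can add a final fallback. A minimal sketch (the helper name and the synthetic 503 body are my own choices, not part of the original code):

```javascript
// If the cache lookup missed (undefined), substitute a synthetic 503
// response rather than resolving respondWith() with a non-Response value.
function ensureResponse(maybeResponse) {
  return maybeResponse || new Response('Offline and not cached', {
    status: 503,
    statusText: 'Service Unavailable',
    headers: { 'Content-Type': 'text/plain' },
  });
}

// Wired into a fetch handler it would look roughly like this (browser-only):
//
// self.addEventListener('fetch', event => {
//   event.respondWith(
//     fetch(event.request)
//       .catch(() => caches.match(event.request))
//       .then(ensureResponse)
//   );
// });
```

Response is a global in service worker scope (and in modern runtimes), so the helper can be exercised outside the worker as well.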
The same thing happened to me. The problem was that I had CORS disabled for some routes on the server; enabling CORS for all routes solved it (Express, using the cors middleware):
const cors = require('cors');

// Enable CORS for all routes
app.use(cors(corsOptions));

// To enable CORS for a single route, pass the middleware to that route instead:
// app.get('/', cors(corsOptions), handler);

How do I load a service worker before all other requests?

I'm trying to load a service worker before all subresource requests on the page so I can apply some optimizations to the way subresources are loaded (e.g. lazy-loading, loading minified versions of assets instead of full assets). However, I cannot find a way to load my SW before other subresource requests begin.
I created a simple proof of concept that responds 401 to any requests handled by my service worker (just to make it easier to see when my SW begins handling requests).
Here's my HTML:
<!doctype html>
<head>
  <script>
    if ('serviceWorker' in navigator) {
      navigator.serviceWorker.register('/dl-wps-sw.js', { scope: '/' }).then(function (registration) {
        console.log('Service Worker registration successful with scope: ', registration.scope);
      }, function (err) {
        console.error(err);
      });
    }
  </script>
...
and here's my Service Worker:
self.addEventListener('install', function (event) {
  self.skipWaiting();
});

self.addEventListener('activate', () => {
  self.clients.claim();
});

self.addEventListener('fetch', (event) => {
  const init = { status: 401, statusText: 'Blocked!' };
  event.respondWith(new Response(null, init));
});
Here's what happens in my browser: even though the code to register the Service Worker is at the very top of the page, it doesn't activate and begin handling requests until a bit later, and by then a large number of critical requests I need to catch have already fired.
I found someone else who seemed to be trying to do what I'm trying to accomplish (2 years earlier) in a Github issue for the Service Worker spec: https://github.com/w3c/ServiceWorker/issues/1282
It seems they suggested using a <link rel="serviceworker"... tag to do this, but it appears this link type has since been removed from Chrome for some reason: https://www.chromestatus.com/feature/5682681044008960
I've tried several other ideas to attempt to load my SW first:
- Preloading my SW with a <link rel="preload"... tag
- Preloading my SW with a fetch/XMLHttpRequest
- Inlining the Service Worker in the HTML (not possible, apparently)
- Delaying execution of the page by running a while loop for a few seconds (this kinda worked, but it's a terrible, unpredictable hack)
Any ideas or strategies I'm missing? My Google-fu has failed me on coming up with a solution.
You'll need to install the SW alone first, then refresh the page. The SW can then serve content for a SW-enabled page and intercept all other requests. Example: fetch-progress.anthum.com
index.html
<p>Installing Service Worker, please wait...</p>
<script>
  navigator.serviceWorker.register('sw.js')
    .then(reg => {
      if (reg.installing) {
        const sw = reg.installing || reg.waiting;
        sw.onstatechange = function() {
          if (sw.state === 'installed') {
            // SW installed. Refresh page so SW can respond with SW-enabled page.
            window.location.reload();
          }
        };
      } else if (reg.active) {
        // something's not right or SW is bypassed. previously-installed SW
        // should have redirected this request to a different page
        handleError(new Error('Service Worker is installed and not redirecting.'));
      }
    })
    .catch(handleError);

  function handleError(error) {}
</script>
sw.js
self.addEventListener('fetch', event => {
  const url = event.request.url;
  const scope = self.registration.scope;

  // serve index.html with the service-worker-enabled page
  if (url === scope || url === scope + 'index.html') {
    const newUrl = scope + 'index-sw-enabled.html';
    event.respondWith(fetch(newUrl));
  } else {
    // process other files here
  }
});
index-sw-enabled.html
<!--
This page shows after SW installs.
Put your main app content on this page.
-->
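The URL check in sw.js above can be pulled out into a plain function, which makes the scope comparison easy to exercise outside a service worker (the helper name and the null return convention are mine):

```javascript
// Mirrors the if/else in the sw.js fetch handler: returns the URL of the
// SW-enabled page for requests to the scope root or index.html,
// or null to fall through to normal handling.
function rewriteForSwEnabledPage(url, scope) {
  if (url === scope || url === scope + 'index.html') {
    return scope + 'index-sw-enabled.html';
  }
  return null;
}
```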

How to access scripts offline in a webworker?

I am trying to improve the performance of our web app by using web workers. I need to include some scripts in the worker, which I was doing with importScripts(). The conundrum is that importScripts fails when the app is offline. How do I access these files offline using the Cache API? Do I need to implement a custom reader for the ReadableStream? Is there a better, standard way to implement offline cache access inside web workers?
Details
These files are JavaScript scripts containing some custom JS and external libraries like CryptoJS and LocalForage. I would like to implement the network-falling-back-to-cache paradigm using the Cache API / service workers.
I initially implemented a standard service worker with install and fetch event listeners, but I believe the scope of the service worker and the web worker was not the same. After some research on MDN, I saw that the Cache API is available within the web worker scope, so I moved the cache call into the web worker.
I have tried various ways of accessing these files, using fetch events and just getting the files from the cache. I get a response back after my promises resolve, but the body of the response is a ReadableStream and I am not sure how to handle that.
Any help or pointers would be really appreciated.
My web worker invocation
var worker = new Worker('Path');
I have attempted to follow this write-up as a guide:
https://developers.google.com/web/ilt/pwa/caching-files-with-service-worker
// Web Worker
self.addEventListener('fetch', function(event) {
  event.respondWith(
    fetch(event.request).catch(function() {
      return caches.match(event.request);
    })
  );
});

caches.open('mcaseworker').then(function(cache) {
  var urlList = [
    '/resources/scripts/custom/globalConfig.js',
    '/resources/scripts/localforage/localforage.min.js',
    '/resources/scripts/utility/pako.js',
    '/resources/scripts/cryptojs/aes.js',
    '/resources/scripts/cryptojs/sha1.js'
  ];

  // Kick off a Promise.all over the list of urls
  Promise.all(urlList.map(function(url) {
    return fetch(url, {
      headers: {
        'Content-Type': 'application/x-javascript'
      }
    })
    .then(function(response) {
      if (response) {
        return response;
      }
    });
  }))
  .then(function(values) {
    Promise.all(values.map(function(value) {
      return value;
    }))
    .then(function(res) {
      // Custom code
      // Would like to access localforage and other javascript libraries.
    });
  });
});
After the promises resolve, I get Response objects whose bodies are ReadableStreams.
Web workers don't have a fetch event, so your code listening on the fetch event will never trigger. You should put your cache and fetch event listener in a service worker.
Main code:
if ('serviceWorker' in navigator) {
  // Register a service worker hosted at the root of the
  // site using the default scope.
  navigator.serviceWorker.register('/sw.js').then(function(registration) {
    console.log('Service worker registration succeeded:', registration);
  }, /*catch*/ function(error) {
    console.log('Service worker registration failed:', error);
  });
} else {
  console.log('Service workers are not supported.');
}

const worker = new Worker("/worker.js");
sw.js
self.addEventListener('fetch', function(event) {
  event.respondWith(
    fetch(event.request).catch(function() {
      return caches.match(event.request);
    })
  );
});

// Add cache opening code here
worker.js
// Import scripts here
importScripts('/resources/scripts/localforage/localforage.min.js');
You can see this answer for more information about the difference between web workers and service workers.
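The strategy in sw.js above (network first, cache on failure) can also be factored into a small standalone helper, with the network and cache lookups injected so the logic can be exercised outside a service worker. A sketch (the function and parameter names are my own):

```javascript
// Try the network first; if that rejects (e.g. offline), fall back to the
// cache lookup. fetchFn and cacheMatchFn stand in for fetch and caches.match.
async function networkFallingBackToCache(request, fetchFn, cacheMatchFn) {
  try {
    return await fetchFn(request);
  } catch (err) {
    return cacheMatchFn(request);
  }
}

// In the real service worker you would call it as:
// networkFallingBackToCache(event.request, fetch, r => caches.match(r))
```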

Javascript - Service Workers not working correctly

I am using service workers to create an offline page for my website.
At the moment I am saving offline.html into the cache so that the browser can show this file if there is no internet connection.
In the fetch event of my service worker I attempt to load index.html, and if this fails (no internet connection) I load offline.html from cache.
However, whenever I enable offline mode in developer tools and refresh the page, index.html still shows...
The request isn't failing, and it looks like index.html is being cached even though I didn't specify it to be.
Here is my HTML for index.html:
<!DOCTYPE html>
<html>
<head>
  <title>Service Workers - Test</title>
</head>
<body>
  <h1> Online page! </h1>
  <h3> You are connected to the internet. </h3>
</body>
<script>
  if ('serviceWorker' in navigator) {
    navigator.serviceWorker.register('service-worker.js');
  }
</script>
</html>
Here is my HTML for offline.html:
<!DOCTYPE html>
<html>
<head>
  <title>You are Offline - Service Workers - Test</title>
</head>
<body>
  <h1> Welcome to the Offline Page!</h1>
  <h2> You are not connected to the internet but you can still do certain things offline. </h2>
</body>
</html>
Here is my javascript for service-worker.js:
const PRECACHE = "version1";
const CACHED = ["offline.html"];

// Caches "offline.html" in case there is no internet
self.addEventListener('install', event => {
  console.log("[Service Worker] Installed");
  caches.delete(PRECACHE);
  event.waitUntil(
    caches.open(PRECACHE)
      .then(cache => cache.addAll(CACHED))
      .then(_ => self.skipWaiting())
  );
});

// Clears any caches that do not match this version
self.addEventListener("activate", event => {
  event.waitUntil(
    caches.keys()
      .then(keys => {
        return Promise.all(
          keys.filter(key => {
            return !key.startsWith(PRECACHE);
          })
          .map(key => {
            return caches.delete(key);
          })
        );
      })
      .then(() => {
        console.log('[Service Worker] Cleared Old Cache');
      })
  );
});

this.addEventListener('fetch', function(event) {
  if (event.request.method !== 'GET') return;
  console.log("[Service Worker] Handling Request ");

  // If the request to `index.html` works it shows it, but if it fails it
  // shows the cached version of `offline.html`.
  // This isn't working because `fetch` doesn't fail when there is no
  // internet for some reason...
  event.respondWith(
    fetch(event.request)
      .then(response => {
        console.log("[Service Worker] Served from NETWORK");
        return response;
      }, () => {
        console.log("[Service Worker] Served from CACHE");
        return catches.match(event.request.url + OFFLINE_URL);
      })
  );
});
I am running a server using Python's SimpleHTTPServer, like so:
python -m SimpleHTTPServer
Does anyone know why the offline page isn't working and how I can fix this?
Thanks for the help,
David
EDIT:
The network tab shows that index.html (localhost) still loads with no internet connection, which means it must be cached.
EDIT 2:
I've tried adding no-cache to the fetch of index.html, and it still fetches index.html when I have offline checked:
fetch(event.request, {cache: "no-cache"}) ...
I think we have all forgotten how network requests work from the browser's point of view. The issue here is that index.html is served from the disk cache when the service worker intercepts requests:
browser ===> Service Worker ===> fetch event
Inside the fetch event, we:
1. Check if there is network connectivity
2. If there is, fetch from the network and respond
3. Else, fetch from the cache and respond
Now, how does "if there is network connectivity, fetch from the network" actually work?
Service Worker OnFetch ===> Check the disk cache ===> Nothing? Fetch online
The page being fetched here is index.html, and the cache-control headers for index.html do not specify no-cache. Hence the whole issue of the offline page not showing up.
Solution
Either set a restrictive cache-control header for index.html on the server side, or add headers to the fetch request to the same effect:
pragma: no-cache
cache-control: no-cache
How do I add these headers to fetch?
Apparently, fetch and the browser have their own reservations about the request body when it comes to a GET. Also, utter chaos ensues if you reuse the event.request object for a fetch request and add custom headers: you get a list of uncaught exceptions caused by the fetch event's request.mode attribute, which bars you from adding custom headers to a fetch under a no-cors or navigate mode.
Our goal is to identify that the browser is truly offline and then serve a page that says so. Here's how: check if you can fetch a dummy HTML page, say test-connectivity.html, under your origin with a custom cache: no-cache header. If you can, proceed; else serve the offline page.
self.addEventListener('fetch', (event) => {
  let headers = new Headers();
  headers.append('cache-control', 'no-cache');
  headers.append('pragma', 'no-cache');

  var req = new Request('test-connectivity.html', {
    method: 'GET',
    mode: 'same-origin',
    headers: headers,
    redirect: 'manual' // let browser handle redirects
  });

  event.respondWith(fetch(req, {
    cache: 'no-store'
  })
  .then(function (response) {
    return fetch(event.request);
  })
  .catch(function (err) {
    return new Response('<div><h2>Uh oh that did not work</h2></div>', {
      headers: {
        'Content-type': 'text/html'
      }
    });
  }));
});
The {cache: 'no-store'} object as the second parameter to fetch is, unfortunately, a no-op: it just doesn't work today. Keep it anyway for the sake of a future scenario; it is really optional as of now. If that option did work, you would not need to build a whole new Request object for the fetch.
cheers!
The code that creates a new Request is generously borrowed from #pirxpilot's answer here. The offline worker for this specific question is on pastebin:
https://pastebin.com/sNCutAw7
David, you have two errors in one line. Your line
return catches.match(event.request.url + OFFLINE_URL);
should be
return caches.match('offline.html');
It's caches, not catches; you haven't defined OFFLINE_URL; and you don't need the request URL there.
I tried your code and got the same result as you in the dev tools network tab. The network tab says it loaded index.html from the service worker, but the service worker actually returns the cached offline page, as expected!

Service worker offline support with pushstate and client side routing

I'm using a service worker to introduce offline functionality for my single page web app. It's pretty straightforward - use the network when available, or try and fetch from the cache if not:
service-worker.js:
self.addEventListener("fetch", event => {
  if (event.request.method !== "GET") {
    return;
  }
  event.respondWith(
    fetch(event.request)
      .then(networkResponse => {
        var responseClone = networkResponse.clone();
        if (networkResponse.status == 200) {
          caches.open("mycache").then(cache => cache.put(event.request, responseClone));
        }
        return networkResponse;
      })
      .catch(_ => {
        return caches.match(event.request);
      })
  );
});
So it intercepts all GET requests and caches them for future use, including the initial page load.
Switching to "offline" in DevTools and refreshing at the root of the application works as expected.
However, my app uses HTML5 pushstate and a client side router. The user could navigate to a new route, then go offline, then hit refresh, and will get a "no internet" message, because the service worker was never told about this new URL.
I can't think of a way around it. As with most SPAs, my server is configured to serve the index.html for a number of catch-all URLs. I need some sort of similar behaviour for the service worker.
Inside your fetch handler, you need to check whether event.request.mode is set to 'navigate'. If so, it's a navigation, and instead of responding with a cached response that matches the specific URL, you can respond with a cached response for your index.html. (Or app-shell.html, or whatever URL you use for the generic HTML for your SPA.)
Your updated fetch handler would look roughly like:
self.addEventListener('fetch', event => {
  if (event.request.method !== 'GET') {
    return;
  }

  if (event.request.mode === 'navigate') {
    event.respondWith(caches.match('index.html'));
    return;
  }

  // The rest of your fetch handler logic goes here.
});
This is a common use case for service workers, and if you'd prefer to use a pre-packaged solution, the NavigationRoute class in the workbox-routing module can automate it for you.
