I am trying to 1) retrieve hundreds of separate JSON files from https://bioguide.congress.gov/, a site that holds data on U.S. legislators, 2) process them, and 3) combine them into one big JSON file that contains all the individual records.
Some of the files I am working with (each legislator has a different URL that serves their data in JSON format) can be found at these URLs:
https://bioguide.congress.gov/search/bio/F000061.json
https://bioguide.congress.gov/search/bio/F000062.json
https://bioguide.congress.gov/search/bio/F000063.json
https://bioguide.congress.gov/search/bio/F000064.json
https://bioguide.congress.gov/search/bio/F000091.json
https://bioguide.congress.gov/search/bio/F000092.json
My approach is to create a for loop that iterates over the different IDs and combines all the records into an array of objects. Unfortunately, I am stuck trying to access the data.
So far, I have tried the following methods, but I get a CORS error with each of them.
Using fetch:
url = "https://bioguide.congress.gov/search/bio/F000061.json"
fetch(url)
.then((res) => res.text())
.then((text) => {
console.log(text);
})
.catch((err) => console.log(err));
Using the no-cors mode in fetch and getting an empty response:
url = "https://bioguide.congress.gov/search/bio/F000061.json"
const data = await fetch(url, { mode: "no-cors" })
Using d3:
url = "https://bioguide.congress.gov/search/bio/F000061.json"
const data = d3.json(url);
With all of them I get a CORS-related error: blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource.
I would appreciate any suggestions and advice to work around this issue. Thanks.
Following on from what @code says in their answer, here's a contrived (but tested) Node.js example that fetches the range of records (F000060-F000069) from the server once a second and compiles them into one JSON file.
import express from 'express';
import fetch from 'node-fetch';
import { writeFile } from 'fs/promises';
const app = express();
const port = process.env.PORT || 4000;
let dataset;
let dataLoadComplete;
app.listen(port, () => {
console.log(`Server running on port ${port}`);
});
function getData() {
return new Promise((res, rej) => {
// Initialise the data array
let arr = [];
dataLoadComplete = false;
// Initialise the page number
async function loop(page = 0) {
try {
// Use the incremented page number in the url
const uri = `https://bioguide.congress.gov/search/bio/F00006${page}.json`;
// Get the data, parse it, and add it to the
// array we set up to capture all of the data
const response = await fetch(uri);
const data = await response.json();
arr = [ ...arr, data];
console.log(`Loading page: ${page}`);
// Call the function again to get the next
// set of data if we've not reached the end of the range,
// or return the finalised data in the promise response
if (page < 9) { // pages 0-9 map to F000060-F000069
setTimeout(loop, 1000, ++page);
} else {
console.log('API calls complete');
res(arr);
}
} catch (err) {
rej(err);
}
}
loop();
});
}
// Call the looping function and, once complete,
// write the JSON to a file
async function main() {
const completed = await getData();
dataset = completed;
dataLoadComplete = true;
writeFile('data.json', JSON.stringify(dataset, null, 2), 'utf8');
}
main();
You're getting a CORS (Cross-Origin Resource Sharing) error because the website you're sending an AJAX request to (bioguide.congress.gov) has not explicitly enabled CORS, which means you can't send client-side AJAX requests to it, for security reasons.
If you want to request data from that site, you must do it from the server side (PHP, Node, Python, etc.).
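For illustration, here is a minimal sketch of such a server-side proxy in Node/Express; the /bio/:id route name and the port are placeholders, not something the site provides:
import express from 'express';
import fetch from 'node-fetch';

const app = express();

// The browser calls /bio/:id on our own origin; the server,
// which is not subject to the browser's CORS policy, fetches
// the remote JSON and relays it back.
app.get('/bio/:id', async (req, res) => {
  try {
    const response = await fetch(`https://bioguide.congress.gov/search/bio/${req.params.id}.json`);
    res.json(await response.json());
  } catch (err) {
    res.status(502).json({ error: err.message });
  }
});

app.listen(3000);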
More on the subject
I'm developing an audio-based PWA and, since I'm not familiar with this technology, I have a couple of doubts regarding cache management and invalidation in the service worker.
The application needs to work offline, which I covered using a service worker precache.
My only doubt is the amount of data: the experience has 5 use-case scenarios. Each scenario has ~30MB of audio content, which means around 150MB, plus all the images, JS and CSS, to precache in total.
I know that this exceeds the limit of some browsers (see this question and this article), and in general you must be careful with storage size, which also depends on the available disk space on the user's device.
So here's my idea: since between one scenario and the next the users will stop by a desk with a WiFi connection, I could empty the cache at runtime after a user action (like pressing a button) and replace it with the new content.
This way I would store only one scenario at a time, which means ~35MB, a reasonable size.
Do you think that's a good approach?
What's the best way to implement this?
Here's my current code:
service-worker.js
const PRECACHE = 'precache-test-v1';
// A list of local resources we always want to be cached.
const PRECACHE_URLS = [
'/',
'/audio/scenario1.mp3',
'/audio/scenario2.mp3',
'/audio/scenario3.mp3',
'/audio/scenario4.mp3',
'/audio/scenario5.mp3',
'/css/style.css',
'/js/bundle.js',
'/img/favicon.png',
'/img/logo.png',
'/img/image1.png',
'/img/image2.png',
'/img/image3.png',
'/img/image4.png',
'/img/image5.png',
];
// never cache these resources
const TO_SKIP = [/* empty for now */];
// The install handler takes care of precaching the resources we always need.
self.addEventListener('install', event => {
const now = new Date();
console.log(`PWA Service Worker installing - :: ${now} ::`);
event.waitUntil(caches.open(PRECACHE).then(cache => {
return cache.addAll(PRECACHE_URLS).then(() => {
self.skipWaiting();
});
}));
});
// The activate handler takes care of cleaning up old caches.
self.addEventListener('activate', event => {
const now = new Date();
console.log(`PWA Service Worker activating - :: ${now} ::`);
const currentCaches = [PRECACHE];
event.waitUntil(
caches.keys().then(cacheNames => {
return cacheNames.filter(cacheName => !currentCaches.includes(cacheName));
}).then(cachesToDelete => {
return Promise.all(cachesToDelete.map(cacheToDelete => {
return caches.delete(cacheToDelete);
}));
}).then(() => self.clients.claim())
);
});
// The fetch handler serves responses for same-origin resources from a cache.
self.addEventListener('fetch', event => {
// Skip cross-origin requests, like those for Google Analytics and the other provided urls.
if (event.request.url.startsWith(self.location.origin) && TO_SKIP.every(url => !event.request.url.includes(url))) {
event.respondWith(
caches.match(event.request).then(resp => {
return resp || fetch(event.request).then(response => {
return caches.open(PRECACHE).then(cache => {
cache.put(event.request, response.clone());
return response;
});
});
})
);
}
});
index.js
if ('serviceWorker' in navigator) {
navigator.serviceWorker.register('/sw.js').then(registration => {
console.log('Registration successful, scope is:', registration.scope);
}).catch(error => {
console.log('Service worker registration failed, error:', error);
});
}
Thank you for your time,
Francesco
Hmm.. instead of precaching all 5 scenarios, you could provide a "Save for offline" button so that the user saves only the content they want to play later offline:
let videoUrl = /* the URL of the media file to save */;
button.addEventListener('click', function(event) {
  event.preventDefault();
  caches.open('myVideoCache').then(function(cache) {
    // cache.add() fetches the URL and stores the response in one step
    return cache.add(videoUrl);
  });
});
To delete an entry, you need to open your cache and call delete(), passing the path you stored:
caches.open('myVideoCache').then(function(cache) {
cache.delete('/path/to/audio.mp4').then(function(response) {
console.log("entry deleted");
});
})
You can find more details here: https://developers.google.com/web/ilt/pwa/caching-files-with-service-worker
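Applied to the scenario-swapping idea from the question, the two pieces above could be combined along these lines; the cache name and file paths are illustrative:
async function swapScenario(oldUrls, newUrls) {
  const cache = await caches.open('myVideoCache');
  // Drop the previous scenario's files to free up storage
  await Promise.all(oldUrls.map(url => cache.delete(url)));
  // Fetch and store the next scenario's files
  await cache.addAll(newUrls);
}

// For example, in the button handler at the WiFi desk:
// swapScenario(['/audio/scenario1.mp3'], ['/audio/scenario2.mp3']);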
I am fetching server data using AJAX in index.html, and it fetches that data perfectly. Now I am working with a service worker. I can cache all the static assets (images, JS, CSS) and check those cached assets in Cache Storage on the Application tab of Chrome DevTools. In the Network tab I can also see that those assets are cached (disk cache).
Now I want to cache the AJAX response (an array of image files) using the service worker. In the Network tab I can see it is calling the URL (type: xhr) but it is not cached. So far I have tried to fetch the URL and cache the response, but I have not been able to do it.
Here is my AJAX call in index.html:
<script type="text/javascript">
$(document).ready(function () {
var url = 'index.cfm?action=main.appcache';
$.ajax({
type:"GET",
url: url,
data: function(data){
var resData = JSON.stringify(data);
},
cache: true,
complete: doSomething
})
});
function doSomething(data) {
console.log(data.responseText);
}
</script>
Here is my serviceWorker fetch event:
self.addEventListener('fetch', (event) => {
if (event.request.mode === 'navigate') {
event.respondWith((async () => {
try {
const preloadResponse = await event.preloadResponse;
if (preloadResponse) {
return preloadResponse;
}
const normalizedUrl = new URL(event.request.url);
if (normalizedUrl.href.endsWith('index.cfm?action=main.appcache')) {
const fetchResponseP = fetch(normalizedUrl);
const fetchResponseCloneP = fetchResponseP.then(r => r.clone());
event.waitUntil(async function() {
const cache = await caches.open(precacheName);
await cache.put(normalizedUrl, await fetchResponseCloneP);
}());
return (await caches.match(normalizedUrl)) || fetchResponseP;
}
const networkResponse = await fetch(event.request);
return networkResponse;
} catch (error) {
console.log('Fetch failed; returning offline page instead.', error);
const cache = await caches.open(precacheName);
const cachedResponse = await cache.match(offlineDefaultPage);
return cachedResponse;
}
})());
}
});
Please help me understand what changes are needed to cache the response.
Your fetch event handler starts with
if (event.request.mode === 'navigate') {
// ...
}
That means the code inside of it will only execute if the incoming fetch event is for a navigation request. Only the initial request for an HTML document when first loading a page is a navigation request. Your AJAX requests for other subresources are not navigation requests.
If you want to cache your requests for index.cfm?action=main.appcache in addition to your logic in place for navigation requests, you can add another if statement after your first one, and check for that URL:
self.addEventListener('fetch', (event) => {
if (event.request.mode === 'navigate') {
// ...
}
if (event.request.url.endsWith('index.cfm?action=main.appcache')) {
// Your caching logic goes here. See:
// https://developers.google.com/web/fundamentals/instant-and-offline/offline-cookbook#serving-suggestions
}
});
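For completeness, a hedged sketch of what that caching logic could look like, using the stale-while-revalidate recipe from the offline cookbook linked above; the 'runtime-cache' name is a placeholder:
if (event.request.url.endsWith('index.cfm?action=main.appcache')) {
  event.respondWith(
    caches.open('runtime-cache').then((cache) =>
      cache.match(event.request).then((cached) => {
        // Always start a network request to refresh the cached copy
        const network = fetch(event.request)
          .then((response) => {
            cache.put(event.request, response.clone());
            return response;
          })
          .catch(() => cached); // offline: fall back to the cached copy, if any
        // Serve the cached copy immediately when available
        return cached || network;
      })
    )
  );
}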
I need to implement service worker file caching, but using the no-cors flag.
This is because I get a CORS error on my webserver with the code below. The code is from the standard Ionic 2 starter template (inside serviceworker.js). I can't use the standard code because, for some reason, the requests trigger an authentication flow in which there is a redirect to some URL, and that redirect fails because of a CORS error.
How would I do that in the nicest (or at least easiest) way?
// TODO: Implement this without CORS (set no-cors flag)
self.toolbox.precache(
[
// './build/main.js',
// './build/vendor.js',
// './build/main.css',
// './build/polyfills.js',
// 'index.html',
// 'manifest.json'
]
);
EDIT: It's not really an authentication error that happens; the user is definitely already authenticated. But because of the redirect during authentication, the request for the files above goes wrong. I found this article: What is an opaque request, and what it serves for?, which indicates that setting the no-cors flag would be the solution. The error I get, like on that page, is:
No 'Access-Control-Allow-Origin' header is present on the requested resource.
Origin 'http://abc' is therefore not allowed access.
If an opaque response serves your needs, set the request's mode to 'no-cors'
to fetch the resource with CORS disabled.
Solved it myself like below. The install event is what triggers the app to store files locally.
/**
* Check out https://googlechromelabs.github.io/sw-toolbox/ for
* more info on how to use sw-toolbox to custom configure your service worker.
*/
'use strict';
importScripts('./build/sw-toolbox.js');
self.toolbox.options.cache = {
name: 'ionic-cache'
};
// pre-cache our key assets
// TODO: Implement this without using CORS (set no-cors flag)
/*
self.toolbox.precache(
[
'./build/main.js',
'./build/vendor.js',
'./build/main.css',
'./build/polyfills.js',
'index.html',
'manifest.json'
]
);
*/
// MANUAL precaching in order to evade CORS error on SharePoint
var aFilesToCache = [
'./assets/json/propertyLUT.js',
'./assets/json/propertyvalues.js',
'./build/main.js',
'./build/vendor.js',
'./build/main.css',
'./build/polyfills.js',
'index.html',
'manifest.json'
];
self.addEventListener('fetch', function(event) {
  console.log('Handling fetch event for', event.request.url);
  event.respondWith(
    caches.open('pwa_').then(function(cache) {
      return cache.match(event.request).then(function(response) {
        if (response) {
          console.log('Found response in cache:', response);
          return response;
        }
        // Not in the cache: fall back to the network
        return fetch(event.request);
      }).catch(function(error) {
        // Handles exceptions that arise from match() or fetch().
        console.error('Error in fetch handler:', error);
        throw error;
      });
    })
  );
});
self.addEventListener('install', event => {
  function onInstall(event, filesToCache) {
    console.log('Hit event INSTALL');
    return Promise.all(filesToCache.map(function(aUrl) {
      return caches.open('pwa_').then(function(cache) {
        aUrl = resolveURL(aUrl, self.location.href);
        // no-cors fetches can produce opaque responses,
        // which can still be stored in the cache
        return fetch(aUrl, { mode: 'no-cors' }).then(function(response) {
          return cache.put(aUrl, response);
        });
      });
    }));
  }
  event.waitUntil(
    onInstall(event, aFilesToCache).then(() => self.skipWaiting())
  );
});
function resolveURL(relative, base) {
var stack = base.split("/"),
parts = relative.split("/");
stack.pop(); // remove current file name (or empty string)
// (omit if "base" is the current folder without trailing slash)
for (var i=0; i<parts.length; i++) {
if (parts[i] == ".")
continue;
if (parts[i] == "..")
stack.pop();
else
stack.push(parts[i]);
}
return stack.join("/");
}
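As an aside, modern browsers (and service worker contexts) ship a built-in URL constructor that performs the same resolution, so the helper above could arguably be reduced to:
// Equivalent using the built-in URL constructor
function resolveURL(relative, base) {
  return new URL(relative, base).href;
}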
/*
foreach(aUrl in aFilesToCache)
{
var corsRequest = new Request(url, {mode: 'no-cors'});
fetch(corsRequest).then(response => {
return cache.put("pwa_" + url, response);
}); // response won't be opaque.
}
*/
// dynamically cache any other local assets
self.toolbox.router.any('/*', self.toolbox.fastest);
// for any other requests go to the network, cache,
// and then only use that cached resource if your user goes offline
self.toolbox.router.default = self.toolbox.networkFirst;
We have a web app (built using AngularJS) that we're gradually adding PWA features to (service worker, launchable, notifications, etc.). One of the features our web app has is the ability to complete a web form while offline. At the moment, we store the data in IndexedDB when offline, and simply encourage the user to push that data to the server once they're online ("This form is saved to your device. Now you're back online, you should save it to the cloud..."). We will do this automatically at some point, but that's not necessary at the moment.
We are adding a feature to these web forms, whereby the user will be able to attach files (images, documents) to the form, perhaps at several points throughout the form.
My question is this: is there a way for a service worker to handle file uploads? To somehow, perhaps, store the path to the file to be uploaded when offline, and push that file up once the connection has been restored? Would this work on mobile devices, and do we even have access to that 'path' on those devices? Any help, advice or references would be much appreciated.
When the user selects a file via an <input type="file"> element, we are able to get the selected file(s) via fileInput.files. This gives us a FileList object, each item in it being a File object representing the selected file(s). FileList and File are supported by HTML5's Structured Clone Algorithm.
When adding items to an IndexedDB store, it creates a structured clone of the value being stored. Since FileList and File objects are supported by the structured clone algorithm, this means that we can store these objects in IndexedDB directly.
To perform those file uploads once the user goes online again, you can use the Background Sync feature of service workers. Here's an introductory article on how to do that. There are a lot of other resources for that as well.
In order to include file attachments in your request once your background sync code runs, you can use FormData. FormData allows adding File objects to the request that will be sent to your backend, and it is available from within the service worker context.
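Putting those pieces together, here is a minimal sketch of the flow, assuming a hypothetical 'outbox' database with a 'files' store and an /upload endpoint (none of these names come from the original answer; openOutbox would need to be defined in both the page and the SW script):
// Shared helper: open (and if needed create) the outbox database
function openOutbox() {
  return new Promise((resolve, reject) => {
    const req = indexedDB.open('outbox', 1);
    req.onupgradeneeded = () => req.result.createObjectStore('files', { autoIncrement: true });
    req.onsuccess = () => resolve(req.result);
    req.onerror = () => reject(req.error);
  });
}

// In the page: stash the selected file in IndexedDB, then register a sync
const fileInput = document.querySelector('input[type="file"]');
fileInput.addEventListener('change', async () => {
  const db = await openOutbox();
  // File objects survive structured cloning, so they can be stored directly
  db.transaction('files', 'readwrite').objectStore('files').add(fileInput.files[0]);
  const registration = await navigator.serviceWorker.ready;
  await registration.sync.register('upload-files');
});

// In the service worker: replay the stored files when connectivity returns
self.addEventListener('sync', (event) => {
  if (event.tag === 'upload-files') {
    event.waitUntil(uploadPending());
  }
});

async function uploadPending() {
  const db = await openOutbox();
  const files = await new Promise((resolve, reject) => {
    const req = db.transaction('files').objectStore('files').getAll();
    req.onsuccess = () => resolve(req.result);
    req.onerror = () => reject(req.error);
  });
  for (const file of files) {
    const body = new FormData();
    body.append('attachment', file, file.name);
    await fetch('/upload', { method: 'POST', body }); // hypothetical endpoint
  }
  db.transaction('files', 'readwrite').objectStore('files').clear();
}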
One way to handle file uploads/deletes, and almost everything else, is by keeping track of all the changes made while offline. We can create a sync object with two arrays inside: one for pending files that will need to be uploaded and one for deleted files that will need to be removed when we get back online.
tl;dr
Key phases
Service Worker Installation
Along with the static data, we make sure to fetch dynamic data such as the main listing of our uploaded files (in the example, a GET to /uploads returns JSON data with the files).
Service Worker Fetch
In the service worker fetch event, if the fetch fails, we have to handle the requests for the files listing, the requests that upload a file to the server, and the requests that delete a file from the server. For any other request, we return a match from the default cache.
Listing GET
We get the cached object of the listing (in our case /uploads) and the sync object. We concat the default listing files with the pending files, remove the deleted files, and return a new Response object with a JSON result as the server would have returned it.
Uploading PUT
We get the cached listing files and the sync pending files from the cache. If the file isn't present, we create a new cache entry for it, using the MIME type and the blob from the request to create a new Response object that will be saved to the default cache.
Deleting DELETE
We check the cached uploads, and if the file is present we delete the entry from both the listing array and the cached files. If the file is pending, we just delete the entry from the pending array; otherwise, if it's not already in the deleted array, we add it. We update the listing, files and sync object caches at the end.
Syncing
When the online event gets triggered, we try to synchronize with the server. We read the sync cache.
If there are pending files, then we get each file Response object from cache and we send a PUT fetch request back to the server.
If there are deleted files, then we send a DELETE fetch request for each file to the server.
Finally, we reset the sync cache object.
Code implementation
(Please read the inline comments)
Service Worker Install
const cacheName = 'pwasndbx';
const syncCacheName = 'pwasndbx-sync';
const pendingName = '__pending';
const syncName = '__sync';
const filesToCache = [
'/',
'/uploads',
'/styles.css',
'/main.js',
'/utils.js',
'/favicon.ico',
'/manifest.json',
];
/* Start the service worker and cache all of the app's content */
self.addEventListener('install', function(e) {
console.log('SW:install');
e.waitUntil(Promise.all([
caches.open(cacheName).then(async function(cache) {
let cacheAdds = [];
try {
// Get all the files from the uploads listing
const res = await fetch('/uploads');
const { data = [] } = await res.json();
const files = data.map(f => `/uploads/${f}`);
// Cache all uploads files urls
cacheAdds.push(cache.addAll(files));
} catch(err) {
console.warn('PWA:install:fetch(uploads):err', err);
}
// Also add our static files to the cache
cacheAdds.push(cache.addAll(filesToCache));
return Promise.all(cacheAdds);
}),
// Create the sync cache object
caches.open(syncCacheName).then(cache => cache.put(syncName, jsonResponse({
pending: [], // For storing the pending files that will be synced later
deleted: [] // For storing the files that later will be deleted on sync
}))),
])
);
});
Service Worker Fetch
self.addEventListener('fetch', function(event) {
// Clone request so we can consume data later
const request = event.request.clone();
const { method, url, headers } = event.request;
event.respondWith(
fetch(event.request).catch(async function(err) {
const { headers, method, url } = event.request;
// A custom header that we set to indicate the requests come from our syncing method
// so we won't try to fetch anything from cache, we need syncing to be done on the server
const xSyncing = headers.get('X-Syncing');
if(xSyncing && xSyncing.length) {
return caches.match(event.request);
}
switch(method) {
case 'GET':
// Handle listing data for /uploads and return JSON response
break;
case 'PUT':
// Handle upload to cache and return success response
break;
case 'DELETE':
// Handle delete from cache and return success response
break;
}
// If we meet no specific criteria, then lookup to the cache
return caches.match(event.request);
})
);
});
function jsonResponse(data, status = 200) {
return new Response(data && JSON.stringify(data), {
status,
headers: {'Content-Type': 'application/json'}
});
}
Service Worker Fetch Listing GET
if(url.match(/\/uploads\/?$/)) { // Failed to get the uploads listing
// Get the uploads data from cache
const uploadsRes = await caches.match(event.request);
let { data: files = [] } = await uploadsRes.json();
// Get the sync data from cache
const syncRes = await caches.match(new Request(syncName), { cacheName: syncCacheName });
const sync = await syncRes.json();
// Return the files from uploads + pending files from sync - deleted files from sync
const data = files.concat(sync.pending).filter(f => sync.deleted.indexOf(f) < 0);
// Return a JSON response with the updated data
return jsonResponse({
success: true,
data
});
}
Service Worker Fetch Uploading PUT
// Get our custom headers
const filename = headers.get('X-Filename');
const mimetype = headers.get('X-Mimetype');
if(filename && mimetype) {
// Get the uploads data from cache
const uploadsRes = await caches.match('/uploads', { cacheName });
let { data: files = [] } = await uploadsRes.json();
// Get the sync data from cache
const syncRes = await caches.match(new Request(syncName), { cacheName: syncCacheName });
const sync = await syncRes.json();
// If the file exists in the uploads or in the pendings, then return a 409 Conflict response
if(files.indexOf(filename) >= 0 || sync.pending.indexOf(filename) >= 0) {
return jsonResponse({ success: false }, 409);
}
caches.open(cacheName).then(async (cache) => {
// Write the file to the cache using the request we cloned at the beginning
const data = await request.blob();
cache.put(`/uploads/${filename}`, new Response(data, {
headers: { 'Content-Type': mimetype }
}));
// Write the updated files data to the uploads cache
cache.put('/uploads', jsonResponse({ success: true, data: files }));
});
// Add the file to the sync pending data and update the sync cache object
sync.pending.push(filename);
caches.open(syncCacheName).then(cache => cache.put(new Request(syncName), jsonResponse(sync)));
// Return a success response with fromSw set to true, so we know this response came from the service worker
return jsonResponse({ success: true, fromSw: true });
}
Service Worker Fetch Deleting DELETE
// Get our custom headers
const filename = headers.get('X-Filename');
if(filename) {
// Get the uploads data from cache
const uploadsRes = await caches.match('/uploads', { cacheName });
let { data: files = [] } = await uploadsRes.json();
// Get the sync data from cache
const syncRes = await caches.match(new Request(syncName), { cacheName: syncCacheName });
const sync = await syncRes.json();
// Check if the file is already pending or deleted
const pendingIndex = sync.pending.indexOf(filename);
const uploadsIndex = files.indexOf(filename);
if(pendingIndex >= 0) {
// If it's pending, then remove it from pending sync data
sync.pending.splice(pendingIndex, 1);
} else if(sync.deleted.indexOf(filename) < 0) {
// If it's not in pending and not already in sync for deleting,
// then add it for delete when we'll sync with the server
sync.deleted.push(filename);
}
// Update the sync cache
caches.open(syncCacheName).then(cache => cache.put(new Request(syncName), jsonResponse(sync)));
// If the file is in the uploads data
if(uploadsIndex >= 0) {
// Update the uploads data
files.splice(uploadsIndex, 1);
caches.open(cacheName).then(async (cache) => {
// Remove the file from the cache
cache.delete(`/uploads/${filename}`);
// Update the uploads data cache
cache.put('/uploads', jsonResponse({ success: true, data: files }));
});
}
// Return a JSON success response
return jsonResponse({ success: true });
}
Syncing
// Get the sync data from cache
const syncRes = await caches.match(new Request(syncName), { cacheName: syncCacheName });
const sync = await syncRes.json();
// If there are pending files, send them to the server
if(sync.pending && sync.pending.length) {
sync.pending.forEach(async (file) => {
const url = `/uploads/${file}`;
const fileRes = await caches.match(url);
const data = await fileRes.blob();
fetch(url, {
method: 'PUT',
headers: {
'X-Filename': file,
'X-Syncing': 'syncing' // Tell the SW fetch handler that we are syncing, so it ignores this fetch
},
body: data
}).catch(err => console.log('sync:pending:PUT:err', file, err));
});
}
// If there are deleted files, send a DELETE request for each to the server
if(sync.deleted && sync.deleted.length) {
sync.deleted.forEach(async (file) => {
const url = `/uploads/${file}`;
fetch(url, {
method: 'DELETE',
headers: {
'X-Filename': file,
'X-Syncing': 'syncing' // Tell the SW fetch handler that we are syncing, so it ignores this fetch
}
}).catch(err => console.log('sync:deleted:DELETE:err', file, err));
});
}
// Update and reset the sync cache object
caches.open(syncCacheName).then(cache => cache.put(syncName, jsonResponse({
pending: [],
deleted: []
})));
Example PWA
I have created an example PWA that implements all of this, which you can find and test here. I have tested it using Chrome and Firefox on desktop, and Firefox for Android on a mobile device.
You can find the full source code of the application (including an express server) in this GitHub repository: https://github.com/clytras/pwa-sandbox.
The Cache API is designed to store a request (as the key) and a response (as the value) in order to cache content from the server for the web page. Here, we're talking about caching user input for future dispatch to the server. In other words, we're not trying to implement a cache, but a message broker, and that's not currently something handled by the Service Worker spec (Source).
You can figure it out by trying this code:
HTML:
<button id="get">GET</button>
<button id="post">POST</button>
<button id="put">PUT</button>
<button id="patch">PATCH</button>
JavaScript:
if ('serviceWorker' in navigator) {
navigator.serviceWorker.register('/service-worker.js', { scope: '/' }).then(function (reg) {
console.log('Registration succeeded. Scope is ' + reg.scope);
}).catch(function (error) {
console.log('Registration failed with ' + error);
});
};
document.getElementById('get').addEventListener('click', async function () {
console.log('Response: ', await fetch('50x.html'));
});
document.getElementById('post').addEventListener('click', async function () {
console.log('Response: ', await fetch('50x.html', { method: 'POST' }));
});
document.getElementById('put').addEventListener('click', async function () {
console.log('Response: ', await fetch('50x.html', { method: 'PUT' }));
});
document.getElementById('patch').addEventListener('click', async function () {
console.log('Response: ', await fetch('50x.html', { method: 'PATCH' }));
});
Service Worker:
self.addEventListener('fetch', function (event) {
  event.respondWith(fetch(event.request).then(function (response) {
    // Clone synchronously, before the body can be consumed by the page
    var copy = response.clone();
    caches.open('v1').then(function (cache) {
      cache.put(event.request, copy);
    }).catch(e => console.error(e));
    return response;
  }));
});
Which throws:
TypeError: Request method 'POST' is unsupported
TypeError: Request method 'PUT' is unsupported
TypeError: Request method 'PATCH' is unsupported
Since the Cache API can't be used for these methods, and following the Google guidelines, IndexedDB is the best solution as a data store for pending requests.
The implementation of a message broker is then the responsibility of the developer, and there is no single generic implementation that will cover all use cases. Many parameters will determine the solution:
Which criteria will trigger the use of the message broker instead of the network? window.navigator.onLine? A certain timeout? Other?
Which criteria should be used to start trying to forward ongoing requests on the network? self.addEventListener('online', ...)? navigator.connection?
Should requests respect the order or should they be forwarded in parallel? In other terms, should they be considered as dependent on each other, or not?
If run in parallel, should they be batched to prevent a bottleneck on the network?
In case the network is considered available but the requests still fail for some reason, which retry logic should be implemented? Exponential backoff (see the sketch after this list)? Something else?
How should the user be notified that their actions are in a pending state while they are pending?
...
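To make the retry question concrete, here is a hedged sketch of exponential backoff; the initial delay, growth factor and attempt cap are arbitrary choices, not a prescribed policy:
async function fetchWithBackoff(request, maxRetries = 5) {
  let delay = 1000; // initial delay: 1 second (arbitrary)
  for (let attempt = 1; attempt <= maxRetries; attempt++) {
    try {
      // Clone so the request body can be re-sent on each attempt
      const response = await fetch(request.clone());
      if (response.ok) return response;
    } catch (e) {
      // Network failure: fall through and retry
    }
    if (attempt < maxRetries) {
      await new Promise(resolve => setTimeout(resolve, delay));
      delay *= 2; // double the wait after each failed attempt
    }
  }
  throw new Error('Request failed after ' + maxRetries + ' attempts');
}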
This is really very broad for a single StackOverflow answer.
That being said, here is a minimal working solution:
HTML:
<input id="file" type="file">
<button id="sync">SYNC</button>
<button id="get">GET</button>
JavaScript:
if ('serviceWorker' in navigator) {
navigator.serviceWorker.register('/service-worker.js', { scope: '/' }).then(function (reg) {
console.log('Registration succeeded. Scope is ' + reg.scope);
}).catch(function (error) {
console.log('Registration failed with ' + error);
});
};
document.getElementById('get').addEventListener('click', function () {
fetch('api');
});
document.getElementById('file').addEventListener('change', function () {
fetch('api', { method: 'PUT', body: document.getElementById('file').files[0] });
});
document.getElementById('sync').addEventListener('click', function () {
navigator.serviceWorker.controller.postMessage('sync');
});
Service Worker:
self.importScripts('https://unpkg.com/idb@5.0.1/build/iife/index-min.js');
const { openDB, deleteDB, wrap, unwrap } = idb;
const dbPromise = openDB('put-store', 1, {
upgrade(db) {
db.createObjectStore('put');
},
});
const idbKeyval = {
async get(key) {
return (await dbPromise).get('put', key);
},
async set(key, val) {
return (await dbPromise).put('put', val, key);
},
async delete(key) {
return (await dbPromise).delete('put', key);
},
async clear() {
return (await dbPromise).clear('put');
},
async keys() {
return (await dbPromise).getAllKeys('put');
},
};
self.addEventListener('fetch', function (event) {
if (event.request.method === 'PUT') {
let body;
event.respondWith(event.request.blob().then(file => {
// Retrieve the body then clone the request, to avoid "body already used" errors
body = file;
return fetch(new Request(event.request.url, { method: event.request.method, body }));
}).then(response => handleResult(response, event, body)).catch(() => handleResult(null, event, body)));
} else if (event.request.method === 'GET') {
event.respondWith(fetch(event.request).then(response => {
return response.ok ? response : caches.match(event.request);
}).catch(() => caches.match(event.request)));
}
});
async function handleResult(response, event, body) {
const getRequest = new Request(event.request.url, { method: 'GET' });
const cache = await caches.open('v1');
await idbKeyval.set(event.request.method + '.' + event.request.url, { url: event.request.url, method: event.request.method, body });
const returnResponse = response && response.ok ? response : new Response(body);
cache.put(getRequest, returnResponse.clone());
return returnResponse;
}
// Function to call when the network is supposed to be available
async function sync() {
const keys = await idbKeyval.keys();
for (const key of keys) {
try {
const { url, method, body } = await idbKeyval.get(key);
const response = await fetch(url, { method, body });
if (response && response.ok)
await idbKeyval.delete(key);
}
catch (e) {
console.warn(`An error occurred while trying to sync the request: ${key}`, e);
}
}
}
self.addEventListener('message', sync);
Some words about the solution: it caches the PUT request for future GET requests, and it also stores the PUT request in an IndexedDB database for future sync. For the key, I was inspired by Angular's TransferHttpCacheInterceptor, which serializes backend requests made during server-side rendering for reuse by the browser-rendered page. It uses <verb>.<url> as the key, which assumes that a request will override any earlier request with the same verb and URL.
This solution also assumes that the backend does not return 204 No Content in response to a PUT request, but 200 with the entity in the body.
I also stumbled upon this recently. Here is what I am doing to store the upload in IndexedDB and return a response while offline.
const storeFileAndReturnResponse = async function (request, urlSearchParams) {
let requestClone = request.clone();
let formData = await requestClone.formData();
let tableStore = "fileUploads";
let fileList = [];
let formDataToStore = [];
// Use formData.entries() to iterate the collection; this assumes <input type="file"> was used
for (const pair of formData.entries()) {
let fileObjectUploaded = pair[1];
// content holds the ArrayBuffer of the uploaded file
formDataToStore.push({
key: pair[0],
value: fileObjectUploaded,
content: await fileObjectUploaded.arrayBuffer(),
});
let fileName = fileObjectUploaded.name;
fileList.push({
fileName: fileName,
});
}
let payloadToStore = {
parentId: parentId, // assumed to be available in the enclosing scope
fileList: fileList,
formDataKeyValue: formDataToStore,
};
// idbContext is a promise resolving to an open IndexedDB connection (defined elsewhere)
(await idbContext).put(tableStore, payloadToStore);
return {
UploadedFileList: fileList,
};
};
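For context, a hedged sketch of how such a helper might be wired into the service worker's fetch handler when the network is unavailable; the URL check and the JSON response shape are assumptions, not part of the original code:
self.addEventListener('fetch', (event) => {
  if (event.request.method === 'POST' && event.request.url.includes('/upload')) {
    event.respondWith(
      fetch(event.request.clone()).catch(async () => {
        // Offline: persist the form data and answer as the server would
        const urlSearchParams = new URL(event.request.url).searchParams;
        const result = await storeFileAndReturnResponse(event.request, urlSearchParams);
        return new Response(JSON.stringify(result), {
          headers: { 'Content-Type': 'application/json' }
        });
      })
    );
  }
});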