how to access localStorage from firebase-messaging-sw.js [duplicate] - javascript

I want to periodically call an API from my service worker to send data stored in localStorage. This data is produced and saved in localStorage while a user browses my website. Think of it as saving stats in localStorage and sending them periodically through the service worker. How should I do this? I understand that I can't access localStorage from the service worker and will have to use the postMessage API. Any help would be highly appreciated.

You cannot access localStorage (nor sessionStorage) from a web worker process; they will be undefined. This is for security reasons.
You need to use postMessage() back to the Worker's originating code, and have that code store the data in localStorage.
You should use localStorage.setItem() and localStorage.getItem() to save and get data from local storage.
More info:
Worker.postMessage()
Window.localStorage
Pseudo code below, hoping it gets you started:
// include your worker
var myWorker = new Worker('YourWorker.js'),
  data,
  changeData = function() {
    // save data to local storage
    localStorage.setItem('data', (new Date()).getTime().toString());
    // get data from local storage
    data = localStorage.getItem('data');
    sendToWorker();
  },
  sendToWorker = function() {
    // send data to your worker
    myWorker.postMessage({
      data: data
    });
  };
setInterval(changeData, 1000);
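For completeness, the receiving end is not shown above. A minimal sketch of what YourWorker.js could do with the message (the /api/stats endpoint is an assumption, not part of the original answer):
// YourWorker.js - hypothetical receiving side
self.onmessage = function(e) {
  // e.data is the object passed to myWorker.postMessage()
  var stats = e.data.data;
  // forward the stats to an assumed API endpoint; fetch() is available in workers
  fetch('/api/stats', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ stats: stats })
  });
};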

Broadcast Channel API is easier
There are several ways to communicate between the client and the controlling service worker, but localStorage is not one of them.
IndexedDB is, but it might be overkill for a PWA that by all means should remain slim.
Of all the options, the Broadcast Channel API is the easiest. It is by far easier to implement than the above-mentioned postMessage() with the MessageChannel API.
Here is how broadcasting works
Define a new broadcasting channel in both the service worker and the client.
const channel4Broadcast = new BroadcastChannel('channel4');
To send a broadcast message in either the worker or the client:
channel4Broadcast.postMessage({key: value});
To receive a broadcast message in either the worker or the client:
channel4Broadcast.onmessage = (event) => {
  value = event.data.key;
}
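Applied to the original question, a minimal sketch could look like this. Assumptions: the channel name 'stats', the localStorage key 'stats', the one-minute interval, and the /api/stats endpoint are all illustrative:
// page.js - runs in the window, where localStorage is available
const statsChannel = new BroadcastChannel('stats');
setInterval(() => {
  // broadcast the stats collected in localStorage to the service worker
  statsChannel.postMessage({ stats: localStorage.getItem('stats') });
}, 60000);

// service-worker.js
const statsChannel = new BroadcastChannel('stats');
statsChannel.onmessage = (event) => {
  // upload the stats received from the page to the assumed endpoint
  fetch('/api/stats', { method: 'POST', body: JSON.stringify(event.data.stats) });
};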

I've been using this package called localforage that provides a localStorage-like interface that wraps around IndexedDB. https://github.com/localForage/localForage
You can then import it by placing it in your public directory, so it is served by your web server, and calling self.importScripts('localforage.js'); within your service worker.
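A minimal sketch of how that could look inside the service worker (the push-event trigger, key name, and endpoint are assumptions for illustration):
// service-worker.js
self.importScripts('localforage.js'); // assumes localforage.js is served next to the worker

self.addEventListener('push', (event) => {
  event.waitUntil(
    // localforage.getItem() returns a promise, so it is usable in a service worker
    localforage.getItem('stats') // 'stats' is an assumed key
      .then((stats) => fetch('/api/stats', { // assumed endpoint
        method: 'POST',
        body: JSON.stringify(stats)
      }))
  );
});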

https://developer.mozilla.org/en-US/docs/Web/API/Service_Worker_API/Using_Service_Workers says
Note: localStorage works in a similar way to service worker cache, but it is synchronous, so not allowed in service workers.
Note: IndexedDB can be used inside a service worker for data storage if you require it.
Also there is a bit of discussion here: How do I access the local storage using service workers in angular?
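If you want to stay dependency-free, raw IndexedDB works in a service worker too. A bare-bones sketch with made-up database and store names:
// minimal promisified IndexedDB access, usable inside a service worker
function openDb() {
  return new Promise((resolve, reject) => {
    const req = indexedDB.open('stats-db', 1); // assumed database name
    // create the object store on first open; 'stats' is an assumed store name
    req.onupgradeneeded = () => req.result.createObjectStore('stats');
    req.onsuccess = () => resolve(req.result);
    req.onerror = () => reject(req.error);
  });
}

async function saveStat(key, value) {
  const db = await openDb();
  return new Promise((resolve, reject) => {
    const tx = db.transaction('stats', 'readwrite');
    tx.objectStore('stats').put(value, key); // out-of-line key
    tx.oncomplete = resolve;
    tx.onerror = () => reject(tx.error);
  });
}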

Stumbling over this question myself for a tiny webapp project, I considered the following solution:
When the user is online, the data can be sent immediately. When they are offline, I use the SyncEvent.tag property to send information from the client to the service worker. Like this:
//offline_page.html (loads only when the user is offline)
button.onclick = function() {
  // on click: store new value in localStorage and prepare new value for synchronization
  localStorage.setItem("actual", numberField.value);
  navigator.serviceWorker.ready.then(function(swRegistration) {
    return swRegistration.sync.register('newval:' + numberField.value);
  });
}
//sw.js
self.addEventListener('sync', function(event) {
  // let's say everything in front of ':' is the option, everything afterwards is the value
  let option = event.tag.replace(/(.*?)\:.*?$/, "$1");
  let value = event.tag.replace(/.*?\:(.*?)$/, "$1");
  if (option == "newval") {
    event.waitUntil(
      fetch("update.php?newval=" + value)
        .then(function(response) {
          console.log(response);
        })
    );
  }
});
update.php saves the new value to the backend as soon as the user goes online.
This won't give the service worker access to localStorage, but it will forward every change made to it.
I'm just starting to get used to this syncing topic, so I would really be interested in whether this is helpful and what others think about this solution.

Related

why to use indexeddb instead of cache api for dynamic content

I am using a service worker and precache assets in the install event.
I also have a fetch listener which intercepts requests and caches them dynamically at runtime. I know that people say to use IndexedDB for dynamic content such as JSON data and possibly images.
Question: Why isn't it good practice to use the Cache API for that JSON data too, even though it's request/response storage?
The reason I am asking is that I tried the following: I have index.html and main.js precached in the install event, and in main.js I have an axios request which returns some JSON and puts it in index.html. With dynamic caching, when the request to that JSON API endpoint gets made, it goes first to my service worker, which gets the response and puts it into the cache. I then tested this: when I refreshed the page in offline mode, I still got the same result (JSON data put into index.html accordingly).
So I guess even though the Cache API stores request/response pairs, it still worked flawlessly for JSON API endpoint URLs.
Any good reason to prefer IndexedDB over the Cache API when using a service worker?
It's perfectly fine to cache JSON data using the Cache Storage API, as an alternative to using IndexedDB. I would expect similar performance characteristics, and in both cases you could read/write the data from either the service worker or window context.
It would be slightly more awkward to use the Cache Storage API if you have JSON data that isn't already associated with a Response object, or that doesn't have a "real" request URL, since you'll have to effectively "fake" them. But that's not particularly hard to do:
const data = {
  // some data
};
const jsonString = JSON.stringify(data);
const jsonResponse = new Response(jsonString, {
  headers: {
    'content-type': 'application/json',
  },
});
const cache = await caches.open('json-cache');
await cache.put('/some-json-url', jsonResponse);
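Reading the data back later is symmetrical. A short sketch using the same made-up cache name and URL as above:
// retrieve and parse the previously cached JSON
const cache = await caches.open('json-cache');
const response = await cache.match('/some-json-url');
if (response) {
  const data = await response.json(); // parse the stored JSON body
  // ...use data
}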

Posting messages from a service worker to a client page

Today we can find many examples of sending messages from a service worker to a client, but always following the reception of a message event in the service worker.
I would like to know if we can send messages independently of this event, and if yes, how? I tried a few things, without success...
In my case, it's to redirect a client to a new page when my service worker receives a push event.
client.js :
const swListener = new BroadcastChannel('swListener');
swListener.onmessage = function(e) {
  console.log('swListener Received', e.data);
};
service-worker.js
const swListener = new BroadcastChannel('swListener');
swListener.postMessage('This is From SW');
The interface for Client.postMessage() is described at https://github.com/slightlyoff/ServiceWorker/issues/609. Unfortunately, it is not fully implemented in Google Chrome as of version 45, though I'd expect it to make it into a future version.
When the functionality's available, you could use self.clients.matchAll() to obtain a list of any open pages that are being controlled by the service worker, and call the postMessage() method on the specific client that you care about. You need to keep in mind that it's entirely possible that there won't be any tabs open with a page controlled by your service worker, in which case you'd want to do something like open a new client page with your target URL.
But there's a method that's probably more appropriate for your use case (though also not currently supported in Chrome 45): WindowClient.navigate(), which will instruct an open tab controlled by your service worker to navigate to a different URL on the same origin.
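A sketch of how those two pieces could fit together once the APIs are available (the push-event trigger and the /new-page URL are assumptions):
// service-worker.js
self.addEventListener('push', (event) => {
  event.waitUntil(
    self.clients.matchAll({ type: 'window' }).then((clients) => {
      if (clients.length > 0) {
        // an open, controlled tab exists: tell it to navigate
        return clients[0].navigate('/new-page');
      }
      // no controlled tabs open: open a new one instead
      return self.clients.openWindow('/new-page');
    })
  );
});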

Ember Data with Websockets

In the Ember guides on models it says (1):
Ember Data is also designed to work with streaming APIs like socket.io, Firebase, or WebSockets. You can open a socket to your server and push changes to records into the store whenever they occur.
I tried writing a custom adapter that uses a websocket, but I'm not getting very far. I couldn't find any working examples anywhere.
This is my totally unfinished prototype:
DS.WSAdapter = DS.Adapter.extend(Ember.Evented, {
  websocket: undefined,
  init: function () {
    if (this.websocket === undefined) {
      this.websocket = new WebSocket('ws://localhost:8887');
      this.websocket.onopen = function(e) {
        console.log("Connection established!");
      };
      this.websocket.onmessage = function(e) {
        // What to do here?
      };
    }
    this._loadData();
  },
  //....
//....
Can someone please help me with the websocket adapter?
My main problem is that I have no clue what to do when the websocket.onmessage() gets executed. I can't even access the store (using DS.get('defaultStore')) or anything
I don't have experience working directly with sockets in Ember; however, I have recently completed an Ember Data + Firebase adapter which should follow very similar methodologies.
You should, at the least, be able to use it as inspiration:
https://github.com/sandersonet/ember-data-firebase
Firebase does provide an additional layer of abstraction from the sockets underneath, but the methodologies are very similar.
Have a look at http://emberjs.com/guides/models/frequently-asked-questions/#toc_how-do-i-inform-ember-data-about-new-records-created-on-the-backend
Some applications may want to add or update records in the store without requesting the record via store.find. To accomplish this you can use the DS.Store's push, pushPayload, or update methods. This is useful for web applications that have a channel (such as SSE or Web Sockets) to notify it of new or updated records on the backend.
Basically, you need to deserialize data you receive in your onmessage hook and push new objects to the data store using store.push('model', record) or alternative methods.
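As a sketch only (the payload shape, model name, and how you obtain the store are all assumptions), the onmessage hook could look something like:
this.websocket.onmessage = function(e) {
  // assume the server sends e.g. {"type": "post", "record": {"id": 1, "title": "..."}}
  var payload = JSON.parse(e.data);
  // push the deserialized record into the store so the app sees it
  store.push(payload.type, payload.record); // 'store' must be injected or looked up first
};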

Synchronizing MongoDB server data to an IndexedDB local store

I'm trying to evaluate using IndexedDB to solve the offline issue. It would be populated with data currently stored in a MongoDB database (as is).
Once data is stored in IndexedDB, it may be changed on the MongoDB server and I need to propagate those changes. Is there any existing framework or library to do something like this for Mongo? I already know about CouchDB/PouchDB and am not exploring those two.
[Sync solution for 2021]
I know the question asked was for MongoDB specifically, but since this is an old thread I thought readers might be looking for other solutions for new apps or rebuilds. I can really recommend checking out AceBase because it does exactly what you were looking for back then.
AceBase is a free and open source realtime database that enables easy storage and synchronization between browser and server databases. It uses IndexedDB in the browser, its own binary db / SQL Server / SQLite storage on the server. Offline edits are synced upon reconnect and clients are notified of remote database changes in realtime through a websocket (FAST!).
On top of this, AceBase has a unique feature called "live data proxies" that allows all changes to in-memory objects to be persisted and synced to local and server databases, and remote changes to automatically update your in-memory objects. This means you can forget about database coding altogether and code as if you're only using local objects, no matter whether you're online or offline.
The following example shows how to create a local IndexedDB database in the browser, how to connect to a remote database server that syncs with the local database, and how to create a live data proxy that eliminates further database coding. AceBase supports authentication and authorization as well, but I left it out for simplicity.
const { AceBaseClient } = require('acebase-client');
const { AceBase } = require('acebase');

// Create local database with IndexedDB storage:
const cacheDb = AceBase.WithIndexedDB('mydb-local');

// Connect to server database, use local db for offline storage:
const db = new AceBaseClient({ dbname: 'mydb', host: 'db.myproject.com', port: 443, https: true, cache: { db: cacheDb } });

// Wait for remote database to be connected, or ready to use when offline:
db.ready(async () => {
  // Create live data proxy for a chat:
  const emptyChat = { title: 'New chat', messages: {} };
  const proxy = await db.ref('chats/chatid1').proxy(emptyChat); // Use emptyChat if chat node doesn't exist

  // Get object reference containing live data:
  const chat = proxy.value;

  // Update chat's properties to save to local database,
  // sync to server AND all other clients monitoring this chat in realtime:
  chat.title = `Changing the title`;
  chat.messages.push({
    from: 'ewout',
    sent: new Date(),
    text: `Sending a message that is stored in the database and synced automatically was never this easy! ` +
      `This message might have been sent while we were offline. Who knows!`
  });

  // To monitor and handle realtime changes to the chat:
  chat.onChanged((val, prev, isRemoteChange, context) => {
    if (val.title !== prev.title) {
      alert(`Chat title changed to ${val.title} by ${isRemoteChange ? 'someone else' : 'us'}`);
    }
  });
});
For more examples and documentation, see AceBase realtime database engine at npmjs.com
Open up a changeStream with the resumeToken. There's no guarantee of causal consistency however since we're talking multiple disparate databases.
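A sketch of that approach with the Node.js MongoDB driver (the collection name and how the token and changes are persisted are placeholders):
// watch a collection and resume from a token saved in a previous session
let savedResumeToken = loadResumeToken(); // hypothetical persistence helper
const changeStream = db.collection('items').watch([], {
  resumeAfter: savedResumeToken, // undefined on first run starts a fresh stream
});
changeStream.on('change', (change) => {
  savedResumeToken = change._id; // the resume token; persist it for reconnects
  pushToClient(change);          // hypothetical transport to update IndexedDB
});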
I haven't worked with IndexedDB, but the design problem isn't that uncommon. My understanding of your app is that when the client makes the connection to MongoDB, you pull a set of documents down for local storage and disconnect. The client can then do things locally (not connected to the data server), and then push up the changes.
The way I see it, you've got to handle two general cases:
- When the MongoDB server is updated and breaks continuity with the client, the client will have to either
  - poll for the data (timer?), or
  - keep a websocket open to let notifications free-flow over the pipe.
- When the user needs to push changed data back up the pipe, you can
  - reconnect asynchronously and check for state changes (resolving conflicts according to your business rules), or
  - have a light server-side interface for handling conflicts; depending on the complexity of your app, comparing timestamps of state changes in MongoDB to IndexedDB updates should suffice, as in the sketch below.
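A minimal sketch of such a timestamp comparison (the updatedAt field is an assumption; real resolution would follow your business rules):
// pick the newer of two versions of the same record
function resolveConflict(localDoc, remoteDoc) {
  // updatedAt is an assumed field maintained on every write
  return new Date(localDoc.updatedAt) >= new Date(remoteDoc.updatedAt)
    ? localDoc   // local edit wins
    : remoteDoc; // server wins
}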

How to avoid too many ajax calls and cache json data on the client side

I have a calendar application that loads all of its event data using ajax and JSON results. The issue is that I have different views, and right now I have to re-call the server when I change views.
Is there any recommended way to cache this data on the client side and check whether I have already loaded these events before firing off more ajax calls?
What is the best practice for this?
Like hvgotcodes said, an MVC framework would help; try backbone.js (http://documentcloud.github.com/backbone/), for instance.
Alternatively, you might want to consider using jStorage (http://www.jstorage.info/). Every time you need to make an AJAX call, check first if it's in your storage object, then run the AJAX call if it isn't. On the other end, whenever you finish an AJAX call, store the results in the storage object. Make sure you have some kind of index (a CalendarEvent id) to reference when looking it up in the data store. Might want to add some kind of "expire time" to the data in your storage, too ... a timestamp after the AJAX call, and re-request up front if it's out of date.
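A sketch of that pattern with jStorage (the key scheme, /events endpoint, and one-hour TTL are arbitrary choices for illustration):
function loadEvents(viewId, callback) {
  var cacheKey = 'events-' + viewId; // assumed key scheme
  var cached = $.jStorage.get(cacheKey, null);
  if (cached !== null) {
    callback(cached); // cache hit: no AJAX call needed
    return;
  }
  $.getJSON('/events', { view: viewId }, function(events) { // assumed endpoint
    // cache with a one-hour TTL so stale data gets re-requested
    $.jStorage.set(cacheKey, events, { TTL: 60 * 60 * 1000 });
    callback(events);
  });
}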
It's called MVC.
You need to construct a data model for your application and write some sort of Record objects; then you can determine their status. Your application would have some sort of CalendarEvent model, and when you load data from the server, you would instantiate instances.
So when changing views, you would first check to see if you had the model object for that view, and if you did, you wouldn't need to load it from the server (unless you want to check for changes).
Your scheme doesn't need to be that complicated. If you load events by Id, you can do something like
window.App = {};
window.App.Models = {};
when you load a record you could put
window.App.Models[id] = InstanceOfYourRecord
and that way it's pretty fast to look for records. Or just use a framework (like SproutCore) that has a robust data layer.
I had similar issues on a recent project.
Conceptually, I have the "real" data model (DM) kept on the server, persisted to a database.
To make life sane, the client keeps its own local data model. Outside of the client DM, all the client code thinks it's pulling results locally.
When reading data (GET) from the client DM it:
- checks the cache for existing results
- invokes appropriate AJAX queries when cached data is not available, then caches the results
When changing data (POST) via the client DM it:
- invalidates the cache as appropriate
- invokes appropriate AJAX queries
- emits a custom jQuery event indicating the client DM changed
Note that this client DM also:
- centralizes AJAX error handling
- tracks AJAX calls still in flight (lets us warn users when leaving pages with unsaved changes)
- allows a drop-in, dummy replacement for unit testing, where all the calls hit local data and are completely synchronous
Implementation notes:
- I coded this as a JavaScript class called DataModel. As the design becomes more complex, it makes sense to further break down the responsibilities into separate objects.
- jQuery's custom events let you easily implement the observer pattern. Client components update themselves from the client DM whenever it indicates data has changed.
- JSON in your remote API helps simplify the code. My client DM stores the JSON results directly in its cache.
- The client DM function arguments include callbacks so everything can naturally be passed along via AJAX when needed: function listAll( contactId, cb ) { ... }
- My project only allowed single-user logins. If outside parties can change the server data model, some sort of has-data-changed probe should be fired regularly to ensure the client cache is still valid.
- For my app, multiple client components would request the same data when receiving a client-DM-changed event. This resulted in multiple AJAX calls with the same info. I fixed this problem with a getJsonOnce() helper, which manages a queue of client component callbacks awaiting the same result.
Example function in my implementation:
listAll: function( contactId, cb ) {
  // pull from cache
  if ( contactId in this.notesCache ) {
    cb( this.notesCache[contactId] );
    return;
  }
  // init queue if needed
  this.listAllQueue[contactId] = this.listAllQueue[contactId] || [];
  // pull from server
  var self = this;
  dataModelHelpers.getJsonOnce(
    '/teafile/api/notes.php',
    {'req': 'listAll', 'contact': contactId},
    function(resp) { self.notesCache[contactId] = resp; },
    this.listAllQueue[contactId],
    cb
  );
}
The getJsonOnce() helper makes sure that if multiple client components request the exact same (uncached) data, that we only send out a single AJAX request and inform everyone once it comes in.
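The helper itself isn't shown in the answer; a hypothetical reconstruction matching the call signature above might look like this:
// hypothetical reconstruction of getJsonOnce(): only the first caller
// triggers the AJAX request; later callers just join the queue
dataModelHelpers.getJsonOnce = function(url, params, onData, queue, cb) {
  queue.push(cb);
  if (queue.length > 1) {
    return; // a request for this data is already in flight
  }
  $.getJSON(url, params, function(resp) {
    onData(resp); // let the caller cache the result
    // inform every queued callback, then leave the queue empty
    while (queue.length > 0) {
      queue.shift()(resp);
    }
  });
};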
The notesCache is just a simple JavaScript object:
this.notesCache = {};
